Rumor: Quantum Break to be 720p and 30FPS on Xbox One

Zirilius

Counterclockwork87

Is it a good game? That's all that really matters if you ask me.

#3  Edited By glots

30FPS shouldn't be a surprise, but I suppose 720p can be and will be to lots of people. Funny enough, I just listened to an old Bombcast recently, where Brad said that people shouldn't be expecting 1080p as a norm on new consoles, because some developers want to sacrifice on that front, if it means a steadier framerate or so.

If it was 720p on those Unfinished Videos, I couldn't tell, so I really shouldn't care about this personally. Thus I'll just agree with @counterclockwork87

#4 Firepaw

Oh well, it still looks really good. And whether it is a good game is more important to me anyway.

OMGFather

With the rumors about the PS4's 4K upgrade too, it just seems like the XB1 is stuck really far behind right now. I mean... 720p was last-gen tech. It's gotta be embarrassing for a big exclusive.

But yeah, the game didn't look bad from what little I've seen.

#6  Edited By Sackmanjones

If it's a solid 30fps and looks as good as it did in the unfinished videos I couldn't care less.

#7  Edited By bigsocrates

I have pretty good eyesight, and while I can tell the difference between 720p and 1080p on a big-screen television, I don't know why people care THAT much. A game's resolution does not determine how good it looks. Pong can run at 1080p/60fps on current consoles no problem, but it still looks like Pong.

If the game looks good and performs well (frame rate hitches below 30fps are a much bigger problem for me than a lower resolution, because they affect gameplay), I will be satisfied. So far the buzz I've seen has been very positive.

ASilentProtagonist

I couldn't care less. Remedy makes quality stuff. It's a good thing this comes out before Uncharted, though.

I've got a strong feeling UC4 will blow this game out of the water visually and gameplay-wise.

#9  Edited By Shivoa

Could not agree more with the above. 720p is not the end of the discussion but possibly the start. I'd prefer 1080p everywhere, but devs can absolutely justify why 720p output was the right move (for their real-time rendering pipeline to spit out that final frame, because they needed to do one expensive final pass and couldn't get more than 720p while hitting their frame-rate requirements).

Unfortunately, we normally find devs have dropped to 720p (with current tech) because they simply don't have the resources (in real-time rendering perf) to bring the desired scene to life within the limits of the rendering time. So what we end up with is 720p rendered with that as the starting buffer, which means any anti-aliasing is almost always going to become a travesty, and in motion the scene will shimmer in a way that is almost impossible not to notice at first. Luckily we are all able to adapt to scenes, and so we quickly suppress the bit of our brains getting freaked out by how the thing in front of us shimmers.

A quick example of 720p not being the end of the conversation. Here are two 720p Titanfall renders. One is from a PC (with an internal rendering stage going as high as 4K, if I remember the source correctly, but the final frame, with DSR tech, was 720p and that is the image presented here) and the other from the XBOne. Look at them at full size (the GB upload means the default view is a low-res version where it's harder to see this clearly) and note how the PC's 720p includes sharp details and smoothness (which persist when viewing it in motion), while the XBOne has a lot of low-resolution noise (especially in foliage and stair-step transitions on the angled geometry) that becomes even more apparent in motion when it starts to flicker.

[Image: Titanfall at 720p on PC]
[Image: Titanfall at 720p on Xbox One]
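To make the supersampling point concrete: the PC shot above is effectively a high-resolution internal render resolved down to 720p with a box filter, so each output pixel averages many samples. A toy numpy sketch of that resolve step (function name and dimensions are mine, purely illustrative):

```python
import numpy as np

def downsample_box(frame: np.ndarray, factor: int) -> np.ndarray:
    """Average every factor x factor block of samples down to one output
    pixel (an ordered-grid supersample resolve, i.e. a simple box filter)."""
    h, w, c = frame.shape
    assert h % factor == 0 and w % factor == 0
    return frame.reshape(h // factor, factor, w // factor, factor, c).mean(axis=(1, 3))

# Toy stand-in for a high-res internal render being resolved to the output
# buffer; e.g. a 2160-tall internal render to a 720-tall output is factor=3.
hi_res = np.random.rand(6, 8, 3)
out = downsample_box(hi_res, 2)
assert out.shape == (3, 4, 3)
```

In the factor-of-3 case every output pixel averages nine samples instead of one, which is what suppresses the stair-step and foliage noise visible in the single-sample console render.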

WelshCleats

@sackmanjones said:

If it's a solid 30fps and looks as good as it did in the unfinished videos I couldn't care less.

That's pretty much how I feel about it. The article mentions Alan Wake also ran at lower resolution on the 360 and I don't remember noticing in any sort of notable way. Seems like framerate stability would be pretty important for a game with the mechanics of Quantum Break.

Don't forget that it's no longer Xbox exclusive either. I imagine PC (if you can / have the specs) would be the way to go if you want the prettiest version. That's pretty much the standard at this point anyway.

OurSin_360

With all the crazy blur the game seemed to have, I doubt anybody will notice a difference.

JesusHammer

People are right that the game running at a stable framerate and actually being good is more important, but this is still a huge deal. The fact that an exclusive game is running like a last-gen game is kind of a huge blow to the platform as a whole.

#13  Edited By Lost_Remnant

If this means Remedy sacrificed some graphical fidelity to make the game run smoother on the Xbox One I'm completely fine with that. I'd prefer developers go that route with consoles instead of trying to push a game graphically and have it run poorly. The game struggling to run puts me more out of the game than some textures being low-res.

Atwa

Are we supposed to pretend the hardware in these consoles is better than it is in reality? The hardware is already quite old; 720p is not weird. I'd rather take this, and a solid 30, if that's the case.

So many PS4 games drop to excruciatingly low frame rates, which is far worse than a resolution decrease.

#15  Edited By Zirilius

I don't think graphics are the end-all, be-all of a game, but I do play on a TV large enough that you can clearly tell the difference between something running at 720 and 1080. I know people who get super upset about this gen not being fully 1080. I can typically overlook the vast majority of that stuff if the game is good.

/putsontinfoilhat

The only bummer for me is that it appears MS chose not to disclose this sooner, perhaps to allow the developer time to get it up to spec. This could have been a massive showcase for Microsoft, but like a lot of their PR this generation, they're having to fight an uphill battle with those who care to read about this.

/hatremoval

I'm still excited for this game, enough so that I've pre-ordered it. My general message was to set people's expectations in case they weren't expecting this. I'm a reasonable person, and having a good general coding background, I understand the limitations these developers face when making games. I think this is a great chance for their cross-buy initiative, and I look forward to trying this on my new gaming laptop as well as the Xbox.

bigsocrates

@jesushammer said:

People are right that the game running at a stable framerate and actually being good is more important, but this is still a huge deal. The fact that an exclusive game is running like a last-gen game is kind of a huge blow to the platform as a whole.

This is a meaningless comparison, because Quantum Break is much more detailed and resource-intensive than a 'last gen' game.

The NES and the SNES had (nearly) the same base resolution.

Now if you want everything to be 1080p/60fps, that's fine (though detail and draw distance may suffer), but resolution and FPS do not determine that something is "running like a last gen game." You have to factor in how detailed and complex the game is, too.

charlie_victor_bravo

@omgfather: "Rumors" being the key word here. If you are suggesting that PS4K will run games at native 4K, you need a reality check. In practice PS4 is just as far behind as Xbox One because the PC is taking huge leaps ahead of them both.

ll_Exile_ll

I keep seeing people commenting in this thread that Quantum Break is an Xbox exclusive. Is everyone forgetting that it's also on PC? I know it's Windows 10 store only, but it's still on PC (where it will run at any resolution or framerate your PC can handle).

#19 finaldasa  Moderator

Looked pretty great in the recent videos GB put up.

Digital Foundry was also looking at the pre-release version; no telling if that's final software or not.

conmulligan

Why not judge the game on how it actually looks and performs as opposed to some arbitrary rendering resolution?

Shoey920

jaycutlerdontcare.gif

Hunkulese

Resolution is still the dumbest thing to get hung up on. 99% of people would never know a game isn't running at 1080 if they weren't told.

GiantLizardKing

I think many people are upset that, this early in the generation, the systems are so weak that they already appear to be tapped out. MS is continuing to pay the price for forcing the constraint of a Kinect in the box on the general gaming public. But I don't own a PS4 or an XBO, so I can only speculate.

Resolution is one thing, but the fps situation is very disappointing. 60 just flat out looks so much smoother when you are used to it.

Shivoa

@hunkulese: Yes, this is definitely factually accurate and not a statement made by someone who really needs to see an optician.

*huge great big pixels flicker through the scene when walking up to any semi-transparency like foliage*

*watches as someone walks up the side of a building because the camera was giving the edge a bit of an angle and the stair-step edge was clearly traversable*

This is fine, this is all fine. We haven't worked out how to render scenes that look any better than this.

[Again, as to my post above, this is all about the internal rendering resolution and not the final output resolution]

Zirilius

@ll_exile_ll: Technically it was, until about four weeks ago when they announced it as a cross-buy option. I had forgotten about the cross-buy option if you pre-order, and even still, it's a "console" exclusive if not a "platform" exclusive, which is the stupidest terminology on the face of the planet.

#26 peacebrother

720/30 in 2016 is ridiculous, and as this generation goes on (at least until the Xbone One.5 and PS4.5) games will have to keep dropping res and framerate to keep "improving" the visuals.

#27  Edited By GERALTITUDE

Didn't Alan Wake - a game people fell over themselves talking about how good it looked - run at some hilarious resolution like 700x500 or some shit like that? It's lower than average, that's for sure, and looked way better than average. Same seems to apply here.

Zirilius

@geraltitude: Believe it was 900 x 554 on 360.

GERALTITUDE

@zirilius said:

@geraltitude: Believe it was 900 x 554 on 360.

I knew there was a 500 in there somewhere! Thanks.

#30  Edited By Shivoa

@peacebrother said:

720/30 in 2016 is ridiculous, and as this generation goes on (at least until the Xbone One.5 and PS4.5) games will have to keep dropping res and framerate to keep "improving" the visuals.

I'm actually pretty hopeful that visuals will continue to climb as this generation progresses, not go backwards.

In the early PS360 days we were often getting 720p renders that were exactly that (the render looked at the spot at the centre of each of the 1280 by 720 pixels and calculated a colour), which led to a lot of aliasing issues (this never totally went away). This was partially "fixed" by FXAA/SMAA/MLAA algorithms that traverse the frame looking for anything (pixel shapes) that resembles aliasing and apply a blur to soften the edges. It still wasn't real (e.g. traditional MSAA) anti-aliasing, but it helped: it made the scene seem to be rendered with higher fidelity, even if it wasn't as good as real MSAA. (8xMSAA means that for every pixel rendered it is, when drawing polygon edges, working with 8 different sample points inside the pixel and seeing how much of it is covered by which polygons, so it gets a lot more detail about the polygon edges right vs only sampling at the centre of the pixel to see if the polygon covers it.)

Right now we're seeing working and fast temporal anti-aliasing just starting to be good enough (see The Division for a good example), and that really does generate extra information for each pixel location (by looking back in time and warping the previous frame to get more samples of a block of geometry from a similar angle). It's not perfect, but it's a step up from FXAA and seems to be cheap enough (and no longer plagued by ghosting issues) to be the next big thing. It is effectively providing more "source resolution" to the render, which ends up with higher-quality scenes whatever the final output resolution.

Algorithmic improvements for the costly things we do enable upgraded visuals on fixed platforms (another example would be ambient occlusion, something we can currently do with much higher quality and less run-time cost than the initial algorithms developed only a few years ago - HBAO+ is used today as a high-quality variant of AO with all the sliders turned to 11, but it was actually first advertised as a much faster way of doing AO at the same quality settings). So it's not the case that framerates or resolutions have to drop in order to get nicer-looking games as a generation goes on. It's all about working out optimised and clever algorithms to improve fidelity and, in the case of anti-aliasing tech, the effective perceived resolution of the raw render. This generation is no different to the ones before it: the games that come out at the end of the generation will look much more impressive than the ones at the start.
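The temporal accumulation idea above can be sketched in a few lines of numpy. This is a toy model (the names, blend factor, and one-sample-per-frame setup are my own simplifications; real TAA reprojects with motion vectors and clamps history to suppress ghosting): a pixel half-covered by a polygon is point-sampled once per frame at a jittered position, and an exponential history blend recovers the in-between coverage value that no single frame can see.

```python
import numpy as np

rng = np.random.default_rng(0)

def temporal_resolve(history: float, current: float, alpha: float = 0.1) -> float:
    # Exponentially blend the (already reprojected) history buffer with the
    # current jittered frame. Real TAA also clamps history against a
    # neighbourhood of the current frame to suppress ghosting; omitted here.
    return (1.0 - alpha) * history + alpha * current

# Toy scene: one pixel half-covered by a polygon (true coverage 0.5).
# Each frame takes a single jittered point sample, so any one frame only
# ever sees 0.0 or 1.0; the accumulated history settles near the true value.
true_coverage = 0.5
history = 0.0
for _ in range(200):
    sample = 1.0 if rng.random() < true_coverage else 0.0
    history = temporal_resolve(history, sample)
```

After a couple of hundred frames the history buffer sits close to the supersampled 0.5 that a single point sample can never produce, which is exactly the "extra source resolution" effect described above.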

#31  Edited By Ry_Ry

So long as the game doesn't start dropping frames I'm fine.

jakob187

As long as the PC version isn't capped like this, then I'm fine.

#33  Edited By nasher27

@charlie_victor_bravo said:

@omgfather: "Rumors" being the key word here. If you are suggesting that PS4K will run games at native 4K, you need a reality check. In practice PS4 is just as far behind as Xbox One because the PC is taking huge leaps ahead of them both.

I'm under the impression that most people who believe the 4K gaming part of that rumor (I think the PS4K itself is probably real, just not the 4K gaming) simply don't know enough about where the hardware is right now.

Getting high fidelity games to run at 4K on a PS4 is simply not possible at this point in time (assuming a similar form factor and power consumption).

notnert427

@shivoa said:

Could not agree more with the above. 720p is not the end of the discussion but possibly the start. I'd prefer 1080p everywhere, but devs can absolutely justify why 720p output was the right move (for their real-time rendering pipeline to spit out that final frame, because they needed to do one expensive final pass and couldn't get more than 720p while hitting their frame-rate requirements).

Unfortunately, we normally find devs have dropped to 720p (with current tech) because they simply don't have the resources (in real-time rendering perf) to bring the desired scene to life within the limits of the rendering time. So what we end up with is 720p rendered with that as the starting buffer, which means any anti-aliasing is almost always going to become a travesty, and in motion the scene will shimmer in a way that is almost impossible not to notice at first. Luckily we are all able to adapt to scenes, and so we quickly suppress the bit of our brains getting freaked out by how the thing in front of us shimmers.

A quick example of 720p not being the end of the conversation. Here are two 720p Titanfall renders. One is from a PC (with an internal rendering stage going as high as 4K, if I remember the source correctly, but the final frame, with DSR tech, was 720p and that is the image presented here) and the other from the XBOne. Look at them at full size (the GB upload means the default view is a low-res version where it's harder to see this clearly) and note how the PC's 720p includes sharp details and smoothness (which persist when viewing it in motion), while the XBOne has a lot of low-resolution noise (especially in foliage and stair-step transitions on the angled geometry) that becomes even more apparent in motion when it starts to flicker.

[Image: Titanfall at 720p on PC]
[Image: Titanfall at 720p on Xbox One]

I like how you chose a game the PC community abandoned to make this point. Apparently actually being able to play the game with others is irrelevant as long as there's less foliage noise.

Shivoa

@notnert427: Hi, I'm a rendering geek/engineer talking about the technical factors of real-time rendering (ie the topic of this thread). The actual game I pick to make the point about 720p output not being the only factor is completely irrelevant to my point.

So are you just trolling or did this all go completely over your head? Thnx, please try again!

OurSin_360

720p doesn't matter to me, but the game should be running above 30fps at that resolution, even if it's a variable 40-45. I'm sure they could make the game look good on console by sacrificing something in the graphics department, like real-time shadows or particle effects. I'll take 720p/60fps over 1080p/30fps, tbh.

notnert427

@shivoa said:

@notnert427: Hi, I'm a rendering geek/engineer talking about the technical factors of real-time rendering (ie the topic of this thread). The actual game I pick to make the point about 720p output not being the only factor is completely irrelevant to my point.

So are you just trolling or did this all go completely over your head? Thnx, please try again!

I was simply making the point that perhaps too much is made of resolution/framerate/graphical minutiae. It's cool that you're really into this stuff, but some people are more concerned with the actual game. As an aside, you might consider taking the condescension down a peg or ten. Not everyone considers this rumor to be that big of a deal, and that's okay.

ArtisanBreads

I care much more about framerate, and on PC I can get it up above 30. Otherwise it looks great. Excited to play it.

deactivated-5e60e701b849a

I'm glad to see that people aren't concerned about this rumor. Kudos, people.

Even if this rumor turns out to be true, it doesn't necessarily mean the game will be worse, so let's just wait and see how it turns out.

zombievac

@grulet said:

30FPS shouldn't be a surprise, but I suppose 720p can be and will be to lots of people. Funny enough, I just listened to an old Bombcast recently, where Brad said that people shouldn't be expecting 1080p as a norm on new consoles, because some developers want to sacrifice on that front, if it means a steadier framerate or so.

If it was 720p on those Unfinished Videos, I couldn't tell, so I really shouldn't care about this personally. Thus I'll just agree with @counterclockwork87

You shouldn't be comparing game resolution using highly compressed online videos (you'll see no difference there, or only a very slight one), but I would be surprised if you couldn't tell the difference between 720p and 1080p in real life. Shit, I can tell the difference between 1080p and 920p - which is why the fact that even the new consoles can't do 1080p most of the time really irks me. Downscaling and upscaling are noticeable, and ugly!

#41  Edited By zombievac

@notnert427 said:
@shivoa said:

@notnert427: Hi, I'm a rendering geek/engineer talking about the technical factors of real-time rendering (ie the topic of this thread). The actual game I pick to make the point about 720p output not being the only factor is completely irrelevant to my point.

So are you just trolling or did this all go completely over your head? Thnx, please try again!

I was simply making the point that perhaps too much is made of resolution/framerate/graphical minutiae. It's cool that you're really into this stuff, but some people are more concerned with the actual game. As an aside, you might consider taking the condescension down a peg or ten. Not everyone considers this rumor to be that big of a deal, and that's okay.

Not only was that the topic of this thread, but the dude in particular you responded to was describing the technical details of resolution, rendering, scaling... and whether or not this matters, or when it does. It's not condescending of him to mention you missed the point entirely with your initial, off topic, snarky reply (and your 2nd reply, for that matter)... if you're trying to go after people who care too much about low resolution or framerate to the point that they miss out on experiences due to it, he wasn't one of them.

BTW, graphics are a big part of a game, whether you like it or not. Resolution and framerate affect a lot of things. Just because it's A concern for many of us does not mean we don't care about the quality of the gameplay. In fact, I care about the gameplay so much that I choose to play on PC for the best experience, so resolution and framerate woes don't hamper my enjoyment!

Jinoru

I'm surprised no one has linked the Digital Foundry article where they observed 720p rendering and Remedy confirmed the specs:

http://www.eurogamer.net/articles/digitalfoundry-2016-hands-on-with-quantum-break

"Quantum Break's 1080p output is a temporal reconstruction from four previous 720p 4x MSAA frames. This approach gets us high pixel quality in combination with complex shading and effects, allowing us to achieve a cinematic look. However, varying sample counts between passes and temporal upscaling makes talking about resolution, as it is traditionally understood, complicated in the case of Quantum Break. Since the start of Quantum Break's development, the most important thing for Remedy and Microsoft has been delivering a compelling gaming experience with superior artistic quality. This is what Remedy is renowned for. We're confident that we have achieved this, and can't wait to hear what fans think on April 5 when they play the game."

notnert427

@zombievac said:
@notnert427 said:
@shivoa said:

@notnert427: Hi, I'm a rendering geek/engineer talking about the technical factors of real-time rendering (ie the topic of this thread). The actual game I pick to make the point about 720p output not being the only factor is completely irrelevant to my point.

So are you just trolling or did this all go completely over your head? Thnx, please try again!

I was simply making the point that perhaps too much is made of resolution/framerate/graphical minutiae. It's cool that you're really into this stuff, but some people are more concerned with the actual game. As an aside, you might consider taking the condescension down a peg or ten. Not everyone considers this rumor to be that big of a deal, and that's okay.

Not only was that the topic of this thread, but the dude in particular you responded to was describing the technical details of resolution, rendering, scaling... and whether or not this matters, or when it does. It's not condescending of him to mention you missed the point entirely with your initial, off topic, snarky reply (and your 2nd reply, for that matter)... if you're trying to go after people who care too much about low resolution or framerate to the point that they miss out on experiences due to it, he wasn't one of them.

BTW, graphics are a big part of a game, whether you like it or not. Resolution and framerate affect a lot of things. Just because it's A concern for many of us does not mean we don't care about the quality of the gameplay. In fact, I care about the gameplay so much that I choose to play on PC for the best experience, so resolution and framerate woes don't hamper my enjoyment!

First off, I wasn't referring to just myself in regards to the condescension. I wouldn't have even posted in this thread were it not for the earlier snippy post towards hunkulese, followed by the post about how and why the PC version of Titanfall looks better (which I couldn't help but observe the irony of since the game is virtually unplayable on PC). The screenshots posted don't even give a good visual representation of that because they aren't from the same viewpoint or even from the same level (and were uploaded under the names "PC" and "notaPC" in case there was any question about the mindset behind this). System wars via jargon about foliage noise is still system wars. Still, that a PC can produce better graphics/performance (even at the same resolution) is news to no one, yet every single time even a rumor is posted about a console game being <1080p and/or <60 FPS, the PC master race folks come out of the woodwork to celebrate it and pat themselves on the back for being a PC gamer. And yeah, using Titanfall of all games to try and tout the PC certainly makes it worth pointing out that people on PC are indeed missing out on experiences there.

Graphics are a part of the game, sure. However, the level of importance there varies from person to person. I didn't claim that you don't care about the quality of the gameplay, either, but let's explore that anyway. The prior post you made here was about how the very existence of anything sub-1080p irks you. If you actually practice that graphical threshold and aren't just using it as a system wars punchline, then you've ceased to care about the gameplay of anything below 1080p, regardless of how good the game may be. At that point graphics are no longer "a" concern, they're "the" only concern. As for your supposed "best experience" on PC, the aforementioned Titanfall is a fine example of how that isn't necessarily the case, and that's not even counting gaming classics like Red Dead Redemption, the Forza games, most of the Halo series, etc. that have never made it to PC for any experience. Moreover, if your enjoyment is contingent upon a game meeting resolution/framerate standards that you set, you've inherently hampered yourself.

colourful_hippie

I wouldn't be surprised if it's true, because I was surprised to see the Xbox One even able to run the nice visuals that Quantum Break has. Maybe it's 900 instead of 720, but whatever, I'm playing this on PC anyway.

#45  Edited By Shivoa

@colourful_hippie: Possibly lost in someone's desire to turn a technical discussion about real-time rendering into a format war, @jinoru posted the Digital Foundry piece above. So it's no longer a rumour but confirmed. It's also quite an interesting explanation of exactly what's going on here.

DF did a test and then MS/Remedy responded explaining the temporal reconstruction (take an old frame, warp it roughly to match the camera's movement, use that as an extra point to sample information from on top of the new frame you've rendered to try and recover more detail) they're using on top of a 720p 4xMSAA native render for each frame to explain the results (which Drew and the crew commented on in the QL as being somewhat weird - definitely this sort of temporal anti-aliasing has had some weird results in the past, although the Division seems to get it pretty close to great and so we're hopefully out of the era where you'd get weird ghosting - I think Project Cars PS4 is where a lot of people saw this go horribly wrong before they patched it).

zombievac

@zombievac said:
@notnert427 said:
@shivoa said:

@notnert427: Hi, I'm a rendering geek/engineer talking about the technical factors of real-time rendering (ie the topic of this thread). The actual game I pick to make the point about 720p output not being the only factor is completely irrelevant to my point.

So are you just trolling or did this all go completely over your head? Thnx, please try again!

I was simply making the point that perhaps too much is made of resolution/framerate/graphical minutiae. It's cool that you're really into this stuff, but some people are more concerned with the actual game. As an aside, you might consider taking the condescension down a peg or ten. Not everyone considers this rumor to be that big of a deal, and that's okay.

Not only was that the topic of this thread, but the dude you responded to in particular was describing the technical details of resolution, rendering, and scaling, and whether or when any of it matters. It's not condescending of him to mention that you missed the point entirely with your initial off-topic, snarky reply (and your second reply, for that matter). If you're trying to go after people who care so much about low resolution or framerate that they miss out on experiences because of it, he wasn't one of them.

BTW, graphics are a big part of a game, whether you like it or not. Resolution and framerate affect a lot of things. Just because it's A concern for many of us does not mean we don't care about the quality of the gameplay. In fact, I care about the gameplay so much that I choose to play on PC for the best experience, so resolution and framerate woes don't hamper my enjoyment!

First off, I wasn't referring to just myself in regards to the condescension. I wouldn't have even posted in this thread were it not for the earlier snippy post towards hunkulese, followed by the post about how and why the PC version of Titanfall looks better (which I couldn't help but observe the irony of since the game is virtually unplayable on PC). The screenshots posted don't even give a good visual representation of that because they aren't from the same viewpoint or even from the same level (and were uploaded under the names "PC" and "notaPC" in case there was any question about the mindset behind this). System wars via jargon about foliage noise is still system wars. Still, that a PC can produce better graphics/performance (even at the same resolution) is news to no one, yet every single time even a rumor is posted about a console game being <1080p and/or <60 FPS, the PC master race folks come out of the woodwork to celebrate it and pat themselves on the back for being a PC gamer. And yeah, using Titanfall of all games to try and tout the PC certainly makes it worth pointing out that people on PC are indeed missing out on experiences there.

Graphics are a part of the game, sure. However, the level of importance there varies from person to person. I didn't claim that you don't care about the quality of the gameplay, either, but let's explore that anyway. The prior post you made here was about how the very existence of anything sub-1080p irks you. If you actually practice that graphical threshold and aren't just using it as a system wars punchline, then you've ceased to care about the gameplay of anything below 1080p, regardless of how good the game may be. At that point graphics are no longer "a" concern, they're "the" only concern. As for your supposed "best experience" on PC, the aforementioned Titanfall is a fine example of how that isn't necessarily the case, and that's not even counting gaming classics like Red Dead Redemption, the Forza games, most of the Halo series, etc. that have never made it to PC for any experience. Moreover, if your enjoyment is contingent upon a game meeting resolution/framerate standards that you set, you've inherently hampered yourself.

Hey, you can interpret it how you want to, but you should've responded to the people you thought were condescending, not the guy who knows more than anyone in the thread about rendering and trying to help people understand. Because the example he used, like countless other examples, was the PC version, does not mean he was engaging in system wars. I'm not either, I have both consoles and a PC, and I've played many games at substandard framerates and graphical settings. But when given the choice, as I said, I prefer PC because it's better in every case I've ever encountered in modern times (I've been playing since Atari and DOS) - except MK X, which wasn't graphics related but them refusing to support the PC version further. I play A LOT of games. Maybe Titanfall was broken at launch, I don't know or care really. Of course there are examples of bad PC ports.

One of my favorite games ever was a TERRIBLE PC port, but I loved the game even more because of it and the mod community who fixed the issues and then some within a DAY - that just can't be done on the consoles. One of the amazing reasons I love the flexibility of PCs.

Avatar image for colonel_pockets
Colonel_Pockets

1347

Forum Posts

22

Wiki Points

0

Followers

Reviews: 0

User Lists: 46

If this thread had not been made I would not have known. The whole number of Ps thing is just dumb. Is the game good? That should be what matters.

Avatar image for notnert427
notnert427

2382

Forum Posts

0

Wiki Points

0

Followers

Reviews: 4

User Lists: 1

#48  Edited By notnert427

@zombievac said:

Hey, you can interpret it how you want to, but you should've responded to the people you thought were condescending, not the guy who knows more than anyone in the thread about rendering and trying to help people understand. Because the example he used, like countless other examples, was the PC version, does not mean he was engaging in system wars. I'm not either, I have both consoles and a PC, and I've played many games at substandard framerates and graphical settings. But when given the choice, as I said, I prefer PC because it's better in every case I've ever encountered in modern times (I've been playing since Atari and DOS) - except MK X, which wasn't graphics related but them refusing to support the PC version further. I play A LOT of games. Maybe Titanfall was broken at launch, I don't know or care really. Of course there are examples of bad PC ports.

One of my favorite games ever was a TERRIBLE PC port, but I loved the game even more because of it and the mod community who fixed the issues and then some within a DAY - that just can't be done on the consoles. One of the amazing reasons I love the flexibility of PCs.

The guy who "knows more about rendering than anyone" is the same person I found to be condescending, and thus the one I responded directly to. Titanfall wasn't broken at launch to my knowledge; the PC community just kinda rejected it, and now you unfortunately can barely find a game on it, whereas it's still plenty active on the Xbox One. As such, highlighting graphical features the PC version has that the Xbox One version doesn't rubbed me a bit the wrong way, because the PC version doesn't really have......players, a feature I consider vastly more important to a multiplayer FPS. I really enjoy Titanfall and feel like it's a game that fell through the cracks a bit due to hyperfocus on its resolution instead of whether or not it was actually a quality game, which IMO it very much is. I want people playing good games, not losing interest in games before they even come out because they don't hit certain technical benchmarks. That's where I'm coming from here, so apologies if that didn't come across.

@shivoa said:

@colourful_hippie: Possibly lost in the desire of someone to turn a technical discussion about real-time rendering into a format war above,

Relax; I've made my point, so I'm through with this thread. It won't be too long anyway before we see how the "notaPC" version of this game actually turns out.

Avatar image for kryplixx
Kryplixx

288

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

How the hell is Doom going to be 1080p/60fps on Xbone, then?

Avatar image for thataintfalco
ThatAintFalco

4500

Forum Posts

444

Wiki Points

0

Followers

Reviews: 1

User Lists: 5

I'd be more upset about that weird blur they've got going on in the game than 720p or 30fps.
