CairnsyTheBeard

On the Topic of Framerate Versus Graphics

I have become annoyed recently at the trend of big devs/publishers being slimy and manipulative in their reasoning as to why a game runs at a certain resolution or framerate. They seem to think it's better to lie to the uninformed rather than just admit that their game can't hit 60fps because of its high graphical benchmark and the hardware constraints.

I would have no problem with devs telling the truth: "our game runs at 1080p30 because that's all we could get from the hardware, or because we decided graphics were more important". Problems arise, however, when they justify it with some flat-out bullshit about 30fps being a "design decision" or more "cinematic", and, worst of all, draw comparisons between the way a game engine works and the way capturing film works. They are not comparable; they are different mediums. A film at 24fps looks "cinematic" because all films follow that standard and we don't interact with the screen. The motion blur you notice in film exists because there are only 24 frames to make a second of movement, so objects "flutter" like a flip book and our brains, helped by the blur each frame captures during its exposure, fill in the missing movement. Game engines, by contrast, render each frame as a sharp instant in real-time, but you can't expect the average Joe to know that. Not only does lying cause an internet shit-storm that reaches people who wouldn't have known or cared if the company had just shut up, it also strengthens distrust of, or outright hate for, said company.
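To put rough numbers on the difference, here's some back-of-the-envelope frame timing (a throwaway sketch in Python; it ignores engine pipelining and display latency, so treat the delay figures as illustrative only):

```python
# Rough frame-timing arithmetic (no engine assumed). Each frame sits on
# screen for 1/fps seconds; worst case, an input that just misses a frame
# waits roughly one extra frame before the player sees any response.
for fps in (24, 30, 60):
    frame_ms = 1000.0 / fps
    print(f"{fps} fps: {frame_ms:.1f} ms per frame, "
          f"~{2 * frame_ms:.1f} ms worst-case input-to-screen delay")
```

Halving the frame time doesn't just smooth motion; it roughly halves how stale your input is by the time it reaches the screen, which is the whole "games are interactive" point.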

Assassin's Creed Unity Has Been the Catalyst for Several Internet Shit-Storms Lately

Something Ubisoft have been great at recently is building distrust and hate, particularly in the PC space. Why is this happening on such a fundamental level? Is the company so big it can't function without messing up? Are they too money-hungry to implement female character models and animations, or to code properly for each individual platform? Most puzzlingly, how could they possibly believe that saying they "decided to lock them [the PS4 and Xbox One versions of Assassin's Creed Unity] at the same specs to avoid all the debates and stuff" would avoid arguments? But allow me to spread the blame a little, as Ubisoft aren't the only culprit: Need for Speed: Rivals was locked to 30fps on PC, and The Evil Within has been confirmed to be locked at 30fps on PC as well.

What angers me the most is when the more powerful platforms' versions of a game are hampered by the weakest link in the chain. Just because the Xbox One version of Game X can't handle 1080p60, that shouldn't mean the PS4 version, and especially the PC version, should be held back! A locked 30fps framerate in a PC game is hard to fathom in the year 2014, to be honest, and few devs have legitimate excuses. One that does is Obsidian's South Park: The Stick of Truth, as its design suits a low framerate and its moment-to-moment interactivity doesn't require smooth movement. Games that have their physics or speed tied to the framerate, meanwhile, deserve to be laughed out of the room; it's 2014 and we have higher standards.

60fps vs 30fps Comparison

I vehemently disagree with the point of view that graphical fidelity should trump framerate: in most cases on a console platform, and in nearly all cases on PC, it shouldn't. If a game requires ANY kind of precision movement, it would definitively be improved by a higher framerate. Would you rather have complete agency and accuracy when you're playing a shooter? Or would you like a few more lines of pixels and a larger draw distance? Remember everyone, these are GAMES; they're meant to be PLAYED. Big devs treat them as art installations or advertisements for the console hardware, and I believe that attitude is a detriment to the gamers, to the customers, to the players.

South Park: The Stick of Truth

The exceptions, of course, are games like point n' clicks, specific cases like South Park: The Stick of Truth, or even late-generation games striving for more detail over fps. Even so, I often think it would have been a cool (and obviously impossible) pipe-dream idea for Sony or Microsoft to have imposed a 60fps minimum for all games on their platforms. OK, if not that, then at least a trade-off whereby the user gets more control over graphical options (like on PC) so I can choose for myself. Although I don't see that happening.

Overall I'm just fed up with the lies, misinformation, sliminess and gamer-unfriendly nature of big game development in recent years. Performance is something big publishers and developers should generally prioritize over lines of resolution, along with more transparency towards customers, and let's not even start talking about FOV sliders!

Relevant Videos:

TotalBiscuit: The Great Framerate Non-Debate

TotalBiscuit: Let's not play Need for Speed: Rivals

Jimquisition: A Sad History of PC Failures

57 Comments

alistercat

Of course, everything you say is true. There are also enough consumers who already believe the "30fps is more cinematic" line.

FFXIII on PC is 720p only. Normally I get annoyed about framerates but I didn't even consider the idea of a fixed res PC game.

sravankb

I just want a choice to switch between 60 and 30. Kinda like how TLOU:Remastered did.

Ever since I switched to PC gaming, 60fps has been absolutely necessary for me. Hell, I'm okay with low-res textures or blocky shadows as long as I get a smooth 60 frames.

CairnsyTheBeard

@sravankb: Yes, choice is something consoles sorely lack, although I remember the first BioShock allowed you to unlock the framerate and widen the FOV. Crazy!

FinalDasa (Moderator)

I think the "lies" they tell are just marketing. If a publisher came out and said "our graphics were at a high enough benchmark that we had to throttle the framerate", some might take it as a fault rather than a technical limitation. Even now, when games don't come out at 1080p and 60fps, people complain and claim it's the consoles, or it's a pay-off, etc.

When it comes to marketing, it's best to keep things positive rather than admit a developer couldn't accomplish something.

CairnsyTheBeard

@finaldasa: Very true, though it seems odd to avoid negativity by pushing buttons with the whole "30fps cinematic feel" argument.

FinalDasa (Moderator)

@cairnsythebeard: Oh for sure. Some of that might be the old habit of video games trying so hard to prove they're just like movies. It's an easy way now to get most people to just accept the limitations without having to actually explain them. I do see some smaller developers admitting the technical limitations more often now.

Slaegar

@cairnsythebeard said:

@sravankb: Yes, choice is something consoles sorely lack, although I remember the first BioShock allowed you to unlock the framerate and widen the FOV. Crazy!

I remember that. It was pretty cool. Only problem was if you unlocked the framerate it would tear like crazy. A little tearing doesn't bother me, but boy was it bad.

conmulligan

I understand your frustration, but I don't think you're taking into account just how difficult it is to sustain a steady 60fps on consoles, especially in comparison to other technical choices like the framebuffer resolution. When a game runs at 60fps, it has roughly 16.7 milliseconds to complete everything in the game loop — that includes updating AI routines, calculating physics, outputting sound, redrawing the UI and rendering the viewport. With a cycle that tight, the graphics pipeline becomes just one bottleneck among many. This is why it's incredibly rare to see games running at 60 on PS4 and only 30 on Xbox One — because the two CPUs are roughly equivalent, most of the time both systems end up CPU-limited long before the PS4's more capable GPU even becomes a factor. In other words, there's no point in adding customisable graphics options to console games, because it likely won't make a lick of difference to the framerate unless they also let you turn off positional audio, real-time physics, and advanced AI.
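To make that budget concrete, here's a minimal frame-budget loop sketch (plain Python with stub subsystems; illustrative structure only, not any particular engine's loop):

```python
import time

TARGET_S = 1.0 / 60.0  # ~16.7 ms per frame at 60 fps

# Stubs standing in for real subsystems; in a real engine each of these
# competes for the same 16.7 ms slice.
def update_ai(): pass        # AI routines
def step_physics(): pass     # physics simulation
def mix_audio(): pass        # sound output
def draw_ui(): pass          # UI redraw
def render_viewport(): pass  # the graphics pipeline - one cost among many

def run_frames(n=3):
    for _ in range(n):
        start = time.perf_counter()
        update_ai()
        step_physics()
        mix_audio()
        draw_ui()
        render_viewport()
        elapsed = time.perf_counter() - start
        if elapsed > TARGET_S:
            print(f"missed budget: {elapsed * 1000:.1f} ms > 16.7 ms")
        else:
            time.sleep(TARGET_S - elapsed)  # idle out the rest of the frame

run_frames()
```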

All that said, there really is no excuse for limiting the framerate on PCs, and if you do ship with a 30fps cap, at least be honest about why that's the case.

j0lter

60fps > fancy grass. 60fps makes the gaming experience completely different, and ever since I got a PC I buy all my games there, because it isn't worth dealing with the PS4/Xbone crap.

CairnsyTheBeard

@conmulligan: That's my main issue: not that every game has to be 60fps (although striving for it would be nice), but simply not lying about why it doesn't run at 60, and not dragging down superior hardware.

Bollard

I would like to remind you that it's likely the marketing team, or the publisher, that is pushing the "30fps as a design choice" notion. Getting angry at the developers for that is probably a little unjustified.

CairnsyTheBeard

@bollard: I wasn't exactly sure who was to blame, so I included both publisher and dev. I agree it probably wasn't the devs' fault; I was just using "dev" as a blanket term for the people involved in the development process who had something to do with this.

xanadu

I love the way 60fps feels, but I'm much more concerned with a locked framerate. I would take a locked 30fps over a fluctuating 45-60fps any day.


TheHBK

Here are some points. When the Dreamcast came out, the first thing I noticed was the framerate. These were arcade ports that ran as smoothly as they did in the arcade. Crazy Taxi, smooth at 60fps. Framerate makes for a better experience.

Also, people are getting all mad about a game being held back or something. Everything is fine. You just don't know. As the guys on the Bombcast pointed out, the first Xbox was way more powerful than the PS2, but we never saw any benefit in cross-platform games besides them running smoother, and you knew that was not all the Xbox could do. But it's fine; it's how it goes. The question is simple for AC: is it worth the extra money and effort to get the PS4 version up to 1080p? The PS4 has more consoles out there, so more people will buy AC on PS4 anyway. Do 180 more p's equal enough sales to warrant the extra cost of getting the PS4 version there? If you don't know the answer, then don't get mad and stop complaining.

CairnsyTheBeard

@thehbk: True, I don't know, but that's a small piece of my blog; I was using it as an example of a wider trend among devs and publishers.

The Xbox versions of PS2 games were mostly ports done post-development though, right? So that situation is slightly different.

Also, I'm not a dev, but I can't imagine it would take too much money to turn up the resolution a notch given the PS4's extra power, would it? I mean, what have all the other multi-plat games that are better on PS4 been doing?

notnert427

I find these debates way overblown. Someone needs to do an experiment and gather a bunch of the gamers obsessed with this crap. Show them various videos ranging from 900p-1080p and from 30-60 FPS randomly in succession and have them write down what they think the framerate and resolution is. I'd be willing to bet a bunch of people would be way off and would fail to spot differences that they've spent hours upon hours on the internet claiming are so damn important and obvious.

The reality is, a steady framerate of 30 FPS or better and anything 900p or better typically looks and plays just fine. Is a 1080p, 60 FPS game the best? Of course. Is a 900p 60 FPS or 1080p 30FPS game anywhere near as horrible or unacceptable as a bunch of entitled gamers with white people problems claim it is? Hell no.

wewantsthering

The whole 30 fps thing being cinematic is idiotic. Film is locked at 24 fps only because people are used to it in movies. It is in fact worse than a movie at 60 fps for numerous reasons. Hopefully James Cameron can help us get beyond this silly limitation.

fisk0 (Moderator)

@wewantsthering said:

The whole 30 fps thing being cinematic is idiotic. Film is locked at 24 fps only because people are used to it in movies. It is in fact worse than a movie at 60 fps for numerous reasons. Hopefully James Cameron can help us get beyond this silly limitation.

Yeah, The Hobbit movies have plenty of issues, but the whole "48 fps looks cheap!" argument is just crazy.

I wonder what people were saying back in the 1920s when films switched from 18 to 24 fps.

Crysack

@notnert427 said:

I find these debates way overblown. Someone needs to do an experiment and gather a bunch of the gamers obsessed with this crap. Show them various videos ranging from 900p-1080p and from 30-60 FPS randomly in succession and have them write down what they think the framerate and resolution is. I'd be willing to bet a bunch of people would be way off and would fail to spot differences that they've spent hours upon hours on the internet claiming are so damn important and obvious.

The reality is, a steady framerate of 30 FPS or better and anything 900p or better typically looks and plays just fine. Is a 1080p, 60 FPS game the best? Of course. Is a 900p 60 FPS or 1080p 30FPS game anywhere near as horrible or unacceptable as a bunch of entitled gamers with white people problems claim it is? Hell no.

I'd be willing to bet against you. If you've been playing PC games for any length of time you should be able to spot the difference immediately. In fact, I find it difficult to tolerate playing any games on consoles these days because of the technical limitations. TLoU was particularly grating last gen and I would go so far as to say that the performance issues ruined my enjoyment of the game.

cornbredx

I agree with your assessment. It's hard not to, given that most of it is factual.

I don't hate Ubisoft. Their PR tends to be bad, but it seems most of the big publishers' PR is awful these days.

I would start my own PR firm just to counter this issue (because it seems like PR firms don't actually care), but I don't care enough to do that.

The fact is, also, that resolution only matters because displays are starting to render natively at higher resolutions, and going below native resolution is noticeable on a display meant for it. Even if it's not a big deal in itself, with games it can mean the picture isn't right, which has its own issues for actual gameplay.

I don't care about the debate between 30 and 60 fps. Both are fine to me; even if the difference is noticeable over time, it doesn't affect gameplay for me as long as it's constant.


impartialgecko

@xanadu said:

I love the way 60fps feels, but I'm much more concerned with a locked framerate. I would take a locked 30fps over a fluctuating 45-60fps any day.

I'm the same way. All I care about is a consistent experience, which is why I've largely moved away from PC gaming. Everything from micro-stutter to frame-drops pulls me out of the experience when I know I'm able to mess with the settings to try and improve things. That's not the kind of experience I want to have.

nightriff

Don't worry about the PC version; give it 2 weeks and people will unlock the actual files of the game and run it at the higher resolution and framerate that they decided to "hold back".

tourgen

The "30fps games feel more cinematic" line is extra super strength dumb. Computers calculate discrete images at fixed points in time. Reality happens in a continuous, non-discrete manner. Cameras capture light over some set shutter duration. This is different from a 30fps slide-show. Anyone who tries to equate the two should be immediately ignored, as they are possibly too stupid to remember to breathe.
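A toy illustration of that shutter-versus-instant distinction (a rough Python sketch; the moving "object" and its speed are made up for the example):

```python
# A game engine samples one sharp instant per frame; a film camera averages
# all the light arriving while the shutter is open, which is what produces
# natural motion blur. 'position' stands in for any moving object.

def position(t):
    """Object position (pixels) at time t (seconds): moves at 600 px/s."""
    return 600.0 * t

def game_frame(t):
    # Discrete instantaneous sample: zero motion blur.
    return position(t)

def film_frame(t, shutter=1/48, samples=32):
    # A 180-degree shutter at 24 fps stays open for 1/48 s; approximate the
    # exposure by averaging many sub-samples across that interval.
    total = sum(position(t + i * shutter / samples) for i in range(samples))
    return total / samples

t = 0.5
print(f"game sample at t={t}: {game_frame(t):.1f} px (one sharp instant)")
print(f"film sample at t={t}: {film_frame(t):.1f} px (smeared over the exposure)")
```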

AdequatelyPrepared

I agree. 30fps, as long as it STAYS there, is fine with me for most games (the exceptions are especially fast-paced FPSs or hack-and-slash games like MGR; those need 60fps). However, I do prefer 60fps if possible, because it is better, at least in my eyes. When The Evil Within releases, I'll be using the tools Bethesda said it will release to unlock the 30fps cap on the PC version.

If Ubisoft had just said that AC Unity runs at 30fps because they decided to lock it to that framerate rather than have it stutter higher up, that's fine, whatever. But don't try to sell bullshit like "cinematic experience", or that 30fps is somehow superior to 60fps (it IS superior to a fluctuating 40-60fps, though). We're not idiots.

Edit: Also, it's not all bad, OP. Kojima had a few tweets stating that, as far as he is concerned, frame rate is what developers should be striving for first.

Corevi

@nightriff said:

Don't worry about the PC version; give it 2 weeks and people will unlock the actual files of the game and run it at the higher resolution and framerate that they decided to "hold back".

I highly doubt the PC version of Unity won't allow for 1080p (or 1440p) at 60fps. It's just a matter of how well it runs.

stonyman65

For me frame rate trumps all. I can deal with 30fps if I need to, but I prefer a solid 60. Playing under 30 is painful; it's so slow it becomes kind of unplayable at a certain point. A good recent example was GTA 5: when you were swimming or doing anything in the water, the frame rate would drop so low that you could actually count the frames, and it would just chug and chug. Having nice graphics or lots of stuff on screen is nice, but if the game plays like crap because the frame rate is all over the place, then all the pretty stuff doesn't really matter.

Locking the frame rate at 30 for South Park made sense because they wanted it to look/feel like the TV show (which runs at NTSC 29.97 fps), and I think that worked really well considering how those characters animate on the actual show. Anything faster or slower would look weird. All that being said, South Park was a very specific case and I don't think the same logic would play for anything else that wasn't trying to emulate a cartoon show.

Now, I will reserve judgment until the actual games come out, but nothing I have seen so far of the new Assassin's Creed or The Evil Within justifies the 30fps lock or the crazy requirements The Evil Within "needs" to run. The games look nice, but there is no reason why they shouldn't be running at 1080p/60fps; PCs have been doing that (and better) for the last 5 years now. Are the new consoles really not that powerful? I look at what is being done with games like Metal Gear Solid 5, and the more I see of that, the less I believe the "we could only get it to run at 1080p 30!" line.

Slag

Definitely #teamlockedin60fps here

mike

@stonyman65 said:

Now, I will reserve judgment until the actual games come out, but nothing I have seen so far of the new Assassin's Creed or The Evil Within justifies the 30fps lock or the crazy requirements The Evil Within "needs" to run. The games look nice, but there is no reason why they shouldn't be running at 1080p/60fps; PCs have been doing that (and better) for the last 5 years now. Are the new consoles really not that powerful?

I don't think they are. 1080p or under at 30 fps seems to be the norm for console games right now. I say under because so many console games are 792p or 900p just to achieve a stable 30 fps. Then there are cases like Wolfenstein: The New Order on Xbox One, which claims 1080p/60 but actually dynamically reduces its vertical resolution in order to maintain that frame rate. Or Killzone: Shadow Fall, which is 30 fps in single player and a reduced resolution with 60 fps in multiplayer. Even some games that claim 1080p/60 don't maintain it, with games like The Last of Us Remastered dipping into the low 40s and bouncing between there and 60 for a good portion of the game. It's pretty bad right now. I hope things improve as time goes on, because from where I stand the two new consoles look kind of bad, and we're only a year into a very long life cycle. That is especially true for the Xbox, a system that seemingly always comes in last for multiplatform games. They say DX12 will change that, but I'm skeptical to say the least.
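The dynamic-resolution trick mentioned above boils down to a simple feedback loop. A minimal sketch of the idea (hypothetical thresholds and step sizes; not id Software's actual algorithm):

```python
# If the last frame ran over budget, render the next one at a lower vertical
# resolution; if there's comfortable headroom, scale back up toward native.

TARGET_MS = 1000.0 / 60.0   # ~16.7 ms budget for 60 fps

def next_height(current_height, last_frame_ms,
                native=1080, floor=720, step=36):
    if last_frame_ms > TARGET_MS:          # over budget: shed pixels
        return max(floor, current_height - step)
    if last_frame_ms < TARGET_MS * 0.85:   # headroom: add pixels back
        return min(native, current_height + step)
    return current_height                  # close to budget: hold steady

# Example: two heavy frames (20 ms) followed by a light one (12 ms).
h = 1080
for ms in (20.0, 20.0, 12.0):
    h = next_height(h, ms)
    print(f"frame took {ms:.0f} ms -> render next frame at {h} lines")
```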

This is just part of the reason why I opted out of consoles entirely this generation and went strictly PC. I just keep wondering to myself: if the performance gap between PC and consoles is this great already, imagine what it's going to be like in three or four years, when the consoles are only halfway through their life cycles and PC hardware is still improving on practically a weekly basis.

m16mojo2

Here's my question... what happens when 4K televisions become the standard? It's not that far off, in my opinion. What's more concerning is that these consoles are at the very beginning of their lifecycle, and they're barely able to hit 1080. Are we going to have to suffer with an upscaled 900, or 1080? I feel like we've hit a wall with regards to the leap in console tech. Seeing some of the launch titles on 360 and PS3 compared to Xbox and PS2 was like night and day. I'm not seeing anything "amazing" with the Xbox One or PS4.

csl316

@wewantsthering said:

The whole 30 fps thing being cinematic is idiotic. Film is locked at 24 fps only because people are used to it in movies. It is in fact worse than a movie at 60 fps for numerous reasons. Hopefully James Cameron can help us get beyond this silly limitation.

Well, yeah, it looks "cinematic" because it's what we're used to in cinema. Seems like a reasonable adjective.

RonGalaxy

If a game is good, nothing else really matters. This stuff is all secondary in my opinion. I want developers to work on making new and interesting things for both consoles and PC, and if they choose to make certain concessions then I am fine with that.

If you are a person who really cares about frame rate and resolution, then buy a PC. It's as simple as that. Consoles are never going to be on the bleeding edge of performance, so if you expect that, you are naive.

mike

@m16mojo2 said:

Here's my question... what happens when 4K televisions become the standard? It's not that far off, in my opinion. What's more concerning is that these consoles are at the very beginning of their lifecycle, and they're barely able to hit 1080. Are we going to have to suffer with an upscaled 900, or 1080? I feel like we've hit a wall with regards to the leap in console tech. Seeing some of the launch titles on 360 and PS3 compared to Xbox and PS2 was like night and day. I'm not seeing anything "amazing" with the Xbox One or PS4.

I think 4K displays becoming the standard may be a ways off yet, but I still think it will happen before the next generation of consoles is announced. Let's say the PS4 and Xbox One are going to be around eight years before the new ones come out; they've been out a year now. Even if it's three or four more years before we commonly see 4K TVs in homes, that still leaves these consoles with years left in their cycle once 4K arrives.

I'm not even sure where that leaves us. Maybe more people will move to PC gaming, or maybe the next console generation won't be as far off as I think it's going to be. Who knows.

turboman

We are at a point now where most games generally look just fine. I don't get taken out of the experience at all if I spot a few jaggy pixels.

The Last of Us at 60 fps is way better than The Last of Us at 30 fps.

Icemael

@m16mojo2 said:

Here's my question... what happens when 4K televisions become the standard? It's not that far off, in my opinion. What's more concerning is that these consoles are at the very beginning of their lifecycle, and they're barely able to hit 1080. Are we going to have to suffer with an upscaled 900, or 1080? I feel like we've hit a wall with regards to the leap in console tech. Seeing some of the launch titles on 360 and PS3 compared to Xbox and PS2 was like night and day. I'm not seeing anything "amazing" with the Xbox One or PS4.

Ideally, video games should ignore 4k completely. The constant increase in resolution makes sense for film and TV shows (and therefore for TV screens, since that's what they're mainly used for), but it's just about the last thing one should be concerned about when it comes to video game graphics. In film, there aren't really many other visual improvements to be made -- in video games, however, we have to think about things like frame rate, animations, textures, model quality etc. and all of these, as I see it, are way more important than resolution. Even the most technically impressive video games running in 4K look less "real" and natural than any live-action film being shown in 480p, and the disproportionate focus on resolution in video games only means it'll take longer to close that gap.

I'm not saying we should be playing games in 240p in 2014, but I'd be completely fine with 720p remaining the standard for the next decade at the very least. Not that that's going to happen.

mike

@icemael: Except that we're way beyond 720p now because most consumers have 1080p TVs or monitors for gaming. Depending on the scaler, a 720p source on a 1080p monitor can look pretty awful. Whether developers like it or not, or whether the consoles are capable of it or not, 1080p is now the standard and it's only going to go up from here.

cornbredx

@icemael: To put this into context a little bit: I just recently bought a 1080p 40" TV for $350 (not including taxes or whatever).

1080p is the standard now if it's that cheap, and developing below the standard does have a significantly noticeable impact over time. We saw this when CRT displays were being phased out (especially with text; my personal experience was Dead Space on the 360 forcing 720p HD, which made it unplayable on the "SD" CRT TV I had at the time).

The difference between 720p and 1080p may seem small right now, but it is noticeable, especially when the display is 1080p native so the picture is being up- or downscaled. 4K is so much bigger (four times the pixels of 1080p, if I recall), and when it becomes standard the difference will be as noticeable as the switch from "SD" to "HD", so much so that it will be necessary to develop at that resolution.
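For reference, the raw pixel counts back that up; a throwaway calculation (standard resolution figures, nothing else assumed):

```python
# Pixel-count arithmetic behind the "four times the size" claim.
resolutions = {"720p": (1280, 720), "1080p": (1920, 1080), "4K UHD": (3840, 2160)}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels ({w * h / base:.2f}x 1080p)")
```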

TV manufacturers want to make money, so it will happen (the only way they can keep getting people to buy new TVs is to set new standards at higher resolutions). It won't be any time soon, but it will become standard at some point. Whether or not it should is moot at that point.

1080p will (within the next 10 years or so) be the new 480p.

Justin258

@notnert427 said:

I find these debates way overblown. Someone needs to do an experiment and gather a bunch of the gamers obsessed with this crap. Show them various videos ranging from 900p-1080p and from 30-60 FPS randomly in succession and have them write down what they think the framerate and resolution is. I'd be willing to bet a bunch of people would be way off and would fail to spot differences that they've spent hours upon hours on the internet claiming are so damn important and obvious.

The reality is, a steady framerate of 30 FPS or better and anything 900p or better typically looks and plays just fine. Is a 1080p, 60 FPS game the best? Of course. Is a 900p 60 FPS or 1080p 30FPS game anywhere near as horrible or unacceptable as a bunch of entitled gamers with white people problems claim it is? Hell no.

This guy's not claiming that 30FPS is unacceptable; he's angry that developers tell people "it's cinematic". That is a really, really dumb thing to say, and I can't believe that any developer worth his or her salt actually believes it. Console games generally run at 30FPS because the consoles are limited and 30FPS lets developers pull some extra power out of them.

For most games, yes, a locked and stable 30 is good. Forza Horizon is a game that runs and feels very smooth and looks great - that's because it never, ever drops a single frame. On the other hand, I just finished Grand Theft Auto V, a game which rarely actually reaches 30. Unlike Forza Horizon, GTA V feels sluggish and more difficult to control. It seems like there's a ridiculously cheap lock-on shooting function because the framerate just isn't good enough for the controls to feel smooth when in a firefight or when driving. Some people are going to say they didn't have problems with this, and that's fine, but how many PC gamers failed to notice it?

On a final note, Ratchet and Clank Future: A Crack in Time looks fucking gorgeous and it's a 60FPS game. The Legend of Zelda: A Link Between Worlds is a 60FPS 3DS game! Back on the PS2, God of War aimed for 60FPS and it looked fantastic. Yeah, I'd say that modern consoles should have 60FPS as a standard if these games on much weaker platforms can do it, but frankly I'd be happy if framerate simply wasn't an issue and developers would release games that play well and run at a consistent 30.

k-t

@tourgen: Obligatory shader demonstrating the difference: https://www.shadertoy.com/view/XdXXz4

geirr

Not to sound like a snooty cry-puppy but I die a little on the inside when games are locked at 30fps; especially when combined with narrow 40-60 FOVs. It almost feels claustrophobic.

Of course I just don't buy or play them and get over it pretty fast, but still, sometimes it's a good game that I just can't play since it's disorienting and motion blur is kinda sickening. My wife can somehow stomach motion blur, but I just can't. I'd rather have a stuttery frame rate.


Rowr

This whole 1080p/60fps thing is absolutely doing my head in. It needs to go away.

It's obvious console gamers have, for the most part, got it into their heads that it is an absolute necessity at this point. The funny thing is it seems to be an incredibly PC-centric expectation that has somehow been communicated and transferred over to the console crowd. I understand there is some overlap (I don't consider myself attached to either side) and that many console gamers know exactly what they are talking about, but I feel like a large part of this movement of disgust comes from people who don't even understand how it all fits together in the grand scheme of things to set a game's resolution while maintaining 60 fps.

I think this has likely all come about as a result of a 10-year console cycle last gen that finished with MANY people moving to PC; as soon as the new consoles hit, they got nothing but scrutiny from this crowd, and then all opinions are misinterpreted and carried over to the casual crowd who don't know much except how to argue in IGN comment sections.

I just don't understand how these expectations are reasonable. You don't buy a console to complain about the resolution and graphical settings of games; this is all PC territory, so buy a freaking PC. Of course, that is going to give you a million other things to complain about while the console crowd smugly chime in once in a while to tell you how well their new release runs on day one.

Naturally a company is going to do everything it can to make its game look as amazing as possible at a playable frame rate; there is no smart business reason not to do this.

Anybody who has owned a gaming PC understands the tweaking needed in the graphical settings to get the best possible outcome. Hell, most of the time, to get an adequate result you need to go outside the game and dedicate a few hours to your tweaking.

All these people will tell you from experience that at a certain point you can't have it all without the expense of framerate, and occasionally you are going to make hard decisions between turning graphics settings up or down. Sometimes you will even decide to drop the resolution so you can afford to turn on the other bells and whistles, which will likely make the game look much better and be far more playable. I KNOW, CRAZY RIGHT.

HEY, maybe this is why people who buy console games should shut up and enjoy the fact that the developer went to a hell of a lot of trouble to optimise the game for you and tweak those settings to present the game in the best possible light. GUESS WHAT: you don't need to search the forums for a day to work out why the game runs like shit on your computer, you don't need to pore over an in-depth write-up spanning twenty pages telling you the difference between each graphical setting and its performance hit with comparison screenshots, and you don't need to wait a week for drivers that optimise the game (IF THEY EVER EVEN COME).

I'm starting to wonder if developers are going to have to start implementing the same approach in their console versions, allowing console users to decide how the hell they want to run their game. If they decide they want the game maxed out at 20 fps, or running at 60 fps whilst everything on screen looks like a potato, power to them! No more blame game.

I mean, seriously: I built my PC at roughly the same time as the next-gen console releases, for about 5 times the cost, and I'm already having to make concessions on frame rate and graphics options. How can any console gamer seriously expect this 1080p-by-60fps dream they have conjured up to be the only acceptable standard when the hardware is going to do nothing but age from here on out whilst graphical expectations rise with every release? I understand developers learn to squeeze more and more out of the hardware, but Christ almighty, there is some disappointment in store for some people if their heads don't come out of the clouds on this issue.

CairnsyTheBeard

@rowr: Read the article, my point is that focusing on FPS over detail would benefit games because they're an interactive medium, and also that (on console) any resolution and FPS is fine as long as they don't justify it with lies or penalize those with versions on superior hardware.

Rowr

@cairnsythebeard said:

@rowr: Read the article, my point is that focusing on FPS over detail would benefit games because they're an interactive medium, and also that (on console) any resolution and FPS is fine as long as they don't justify it with lies or penalize those with versions on superior hardware.

Indeed, no argument against that at all, and I didn't intend my post as a point against the OP.

Sorry, didn't mean to hijack or anything; I just went off on a mini rant there. Reading some of the clickbait stories over on GameSpot, and seeing some of the ridiculous things people have to say in the comment sections of those sites, just does my head in with the ignorance. There is definitely some ridiculous entitlement there, and this has been a muddy issue forever; a few years ago the general consensus was that the naked eye couldn't even tell the difference between 30 and 60 fps, for example. I may have been better served to elaborate and write a blog or something. I apologise.

I will say that many people do prioritise graphics quality over framerate, depending especially on the type of game, and it increasingly seems to me that these games need to include graphics settings of some type given the range of preferences people have. I mean, when this hits PC, I'm probably going to prioritise resolution and graphical settings over the difference between 30 and 60 fps if it comes down to it. Given Ubisoft's track record on PC, and gauging from Watch Dogs, I'm probably going to have to take whatever I can get.

It's definitely going to feel less "cinematic" to me if all I can notice is how bad the textures are because I prioritised framerate. AC isn't exactly a racing or twitch shooting/fighting game; I personally find that once I'm past 45 FPS or so in a game like this, the difference up to 60 is virtually unnoticeable unless I'm really looking for it, and if 30 is what I can get, it's not exactly a deal-breaker if it looks that much better.

Ubisoft definitely have some communication issues; they seem to put their foot in their mouth with developer quotes every other month, and for years they have pulled all sorts of shady BS as far as PC performance is concerned. I do wonder how much of it is taken out of context and sensationalised to form shitty clickbait articles, though. I mean, this was basically one developer probably casually trying to downplay people forming some sort of "AC WILL PLAY AT LOW RESOLUTIONS" bullshit, and it backfired on him.

CairnsyTheBeard

@rowr: My apologies, and I agree with you there

wewantsthering

@csl316 said:

@wewantsthering said:

The whole 30 fps thing being cinematic is idiotic. Film is locked at 24 fps only because people are used to it in movies. It is in fact worse than a movie at 60 fps for numerous reasons. Hopefully James Cameron can help us get beyond this silly limitation.

Well, yeah, it looks "cinematic" because it's what we're used to in cinema. Seems like a reasonable adjective.

Nope. "Cinematic" has a lot more variables than just frame rate. It's a cop-out. They need to just admit that they prioritize visual fidelity over frame rate and call it a day. Pretending like it looked bad at 60 fps so they had to turn it down to 30 is just idiotic. Do they think most consumers are dumb enough to buy that line? They're probably right.

wewantsthering

@fisk0: The Hobbit issues have more to do with Peter Jackson's inability to make good movies lately and saddling the movies with crappy "3D" technology.

csl316

@wewantsthering said:

@csl316 said:

@wewantsthering said:

The whole 30 fps thing being cinematic is idiotic. Film is locked at 24 fps only because people are used to it in movies. It is in fact worse than a movie at 60 fps for numerous reasons. Hopefully James Cameron can help us get beyond this silly limitation.

Well, yeah, it looks "cinematic" because it's what we're used to in cinema. Seems like a reasonable adjective.

Nope. "Cinematic" has a lot more variables than just frame rate. It's a cop-out. They need to just admit that they prioritize visual fidelity over frame rate and call it a day. Pretending like it looked bad at 60 fps so they had to turn it down to 30 is just idiotic. Do they think most consumers are dumb enough to buy that line? They're probably right.

If 24fps is what we're used to, and people equate 30 fps as being similar to the films we're used to, then some consumers are going to say it's closer to being cinematic despite all the other variables. I'm not talking about developers; the TC did his detailed breakdown of that. This is a dumb argument we're having.