#1 Edited by expensiveham (290 posts) -

Story: http://www.videogamer.com/ps4/thief/news/ps4_thief_capped_at_30fps_producer_suggests_60fps_not_a_necessity.html

Apparently Thief, Killzone: Shadow Fall and other games are going for a 30 fps cap. So much for "phenomenal piece of hardware" and "world's best PC".

In terms of raw graphics we have not seen much of an improvement in "next-gen" titles, so the assumption was that we could at least get higher resolutions and better performance and frame rates. But seeing this makes you wonder. What are your thoughts on this? With all the hype surrounding the hardware it is strange to see articles like this; it does not seem like this is as big a leap as Sony would like you to believe.

#2 Posted by JasonR86 (9650 posts) -

That seems dumb. It can clearly run games at 60 fps. If current PCs can run games at 60 fps, so can the PS4. Maybe they have a good reason why they don't want to do 60 fps. Maybe the first few games will straddle that generation line until the older consoles are dead and gone and the majority of PC consumers have moved up to better hardware.

#3 Posted by The_Laughing_Man (13629 posts) -

I think that no system ever has been able to use all its capabilities on release. Game devs need time to learn about it and adapt to it.

#4 Edited by GrantHeaslip (1557 posts) -

@jasonr86 said:

That seems dumb. It can clearly run games at 60 fps. If current PCs can run games at 60 fps, so can the PS4. Maybe they have a good reason why they don't want to do 60 fps. Maybe the first few games will straddle that generation line until the older consoles are dead and gone and the majority of PC consumers have moved up to better hardware.

I don't get what you and the OP are trying to say here. Framerate's a result of tradeoffs, not a hardware feature. Faster processing gives them more graphical bandwidth, but they can choose to fill that series of tubes with more polygons, effects, post-processing, etc. rather than using it to hit 60 FPS.

I imagine that most people are more receptive (whether they're cognizant of it or not) to a solid 30 FPS and better graphics than a solid 60 FPS and worse graphics, but I'm sure that's a trade-off the developers don't make lightly, and make depending on the style of game. Thief and Killzone aren't particularly twitchy or fast games, and probably wouldn't benefit from a high framerate the same way a Call of Duty or Gran Turismo would.

This proves nothing about the PS4, it just proves that a couple of developers have decided against allocating resources (human and/or processing) into hitting 60 FPS.
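To put rough numbers on that trade-off, here's a back-of-envelope sketch (the per-pass costs below are invented purely for illustration, not real profiling data):

```python
# Frame-budget arithmetic: a 30 FPS target allows ~33.3 ms per frame,
# a 60 FPS target only ~16.7 ms. The render-pass costs are hypothetical.
def frame_budget_ms(target_fps: int) -> float:
    return 1000.0 / target_fps

render_passes_ms = {
    "geometry": 6.0,        # all numbers invented for illustration
    "shadows": 5.0,
    "lighting": 7.0,
    "post_processing": 8.0,
}

spent = sum(render_passes_ms.values())  # 26.0 ms of pretend work
for fps in (30, 60):
    budget = frame_budget_ms(fps)
    print(f"{fps} FPS: budget {budget:.1f} ms, headroom {budget - spent:+.1f} ms")
# 30 FPS: budget 33.3 ms, headroom +7.3 ms
# 60 FPS: budget 16.7 ms, headroom -9.3 ms  -> something has to be cut
```

The same imaginary frame fits comfortably at 30 FPS and blows the budget at 60, which is exactly the choice being described: cut passes, or cut the framerate target.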

#5 Edited by Vin (4 posts) -

It's the launch game syndrome.

Compare the early games for all the systems across the board. Some might even deem 'em unplayable compared to what we're used to now.

As long as the new games released are fun as hell and look at least like the devs tried to tap into some of that next gen magic juice, I'm in.

#6 Edited by SerHulse (685 posts) -

@the_laughing_man: Not only that, but when were the specifications finalised? A year ago? A couple of months ago?

Any developer making a next-gen game releasing in the launch window has probably been working on non-final hardware for a while. Engines need to be updated, middleware needs rewriting; there is more to it than just making the game at the start of a generation.

#7 Edited by MattyFTM (14363 posts) -

Console games are never going to run at 60 FPS as standard. Even when developers start to be able to squeeze more out of the hardware, there is always going to be a trade-off. They can either put more pixels, shaders, lighting effects and other graphical elements on screen at once, or they can include less shiny graphics stuff and have it run at 60 FPS.

And shiny graphics stuff sells to a far larger audience than high framerates. "Look at this sweet explosion" and "Look at the detail of the facial animation" speaks to people. "Look at the framerate" doesn't. As long as it isn't choppy, people don't care whether it's running at 30 or 60. While there will always be exceptions, on the whole developers are always going to push for more graphical detail than a higher framerate.

#8 Edited by FunkasaurasRex (847 posts) -

I just want better AI you guys.

#9 Posted by MedalOfMode (294 posts) -

This time it's x86 architecture; developers just need a month to learn it, so this is the system's problem. The PS4 will never run any game in 2K, but we will see 1080p 60 FPS games, like GoW 4 and the next-gen Call of Duty. Battlefield Next and demanding Unreal Engine 4 games will run at 30 FPS.

#10 Posted by McGhee (6094 posts) -

If it stays at 30 fps, then that's fine with me.

#11 Edited by JasonR86 (9650 posts) -

@grantheaslip said:

@jasonr86 said:

That seems dumb. It can clearly run games at 60 fps. If current PCs can run games at 60 fps, so can the PS4. Maybe they have a good reason why they don't want to do 60 fps. Maybe the first few games will straddle that generation line until the older consoles are dead and gone and the majority of PC consumers have moved up to better hardware.

I don't get what you and the OP are trying to say here. Framerate's a result of tradeoffs, not a hardware feature. Faster processing gives them more graphical bandwidth, but they can choose to fill that series of tubes with more polygons, effects, post-processing, etc. rather than using it to hit 60 FPS.

I imagine that most people are more receptive (whether they're cognizant of it or not) to a solid 30 FPS and better graphics than a solid 60 FPS and worse graphics, but I'm sure that's a trade-off the developers don't make lightly, and make depending on the style of game. Thief and Killzone aren't particularly twitchy or fast games, and probably wouldn't benefit from a high framerate the same way a Call of Duty or Gran Turismo would.

This proves nothing about the PS4, it just proves that a couple of developers have decided against allocating resources (human and/or processing) into hitting 60 FPS.

I would hope that that trade-off wouldn't be that black and white with the next consoles, especially with first-generation games that aren't pushing the hardware to its limits. I never said that it proves anything about the PS4.

#12 Edited by EXTomar (4629 posts) -

This is no more of a problem than noticing the latest version of a car is "only four cylinder".

#13 Edited by GrantHeaslip (1557 posts) -

@jasonr86: Okay, "proves" isn't the right word, but what I mean to say is that it demonstrates very little about the power of the PS4. There's not much point in reading too far into this.

#14 Posted by JasonR86 (9650 posts) -
#15 Edited by Cameron (596 posts) -

30fps is fine (though not ideal) for certain kinds of games, but it needs to be locked at 30. If it ever dips below that it is immediately noticeable.

I'd much rather have 60fps with slightly lower quality lighting (SSAO seems to be what really kills framerates right now) than 30fps with slightly better lighting. I was playing Bioshock Infinite (not a twitchy game by any means) with almost everything turned all the way up and I was getting drops from 60 to 30 in some areas. I turned the graphics down a bit and it held at 60 no problem. I find the smooth motion of 60fps makes the game look better than the slight graphics bump I got when running at 30. I'm sure 30 is fine in most games when you've never seen 60, but it's hard to go back once you have.

#16 Posted by Humanity (9014 posts) -
@mattyftm said:

Console games are never going to run at 60 FPS as standard. Even when developers start to be able to squeeze more out of the hardware, there is always going to be a trade-off. They can either put more pixels, shaders, lighting effects and other graphical elements on screen at once, or they can include less shiny graphics stuff and have it run at 60 FPS.

And shiny graphics stuff sells to a far larger audience than high framerates. "Look at this sweet explosion" and "Look at the detail of the facial animation" speaks to people. "Look at the framerate" doesn't. As long as it isn't choppy, people don't care whether it's running at 30 or 60. While there will always be exceptions, on the whole developers are always going to push for more graphical detail than a higher framerate.

I think it largely depends on what standards we adopt. If a majority of next gen titles coming out natively run at 60 FPS, then any game that comes out and runs at 30 FPS will be seen as a substandard product. The question remains whether developers for next gen consoles will want to make the transition to smoother gameplay, or whether they'll continue chasing PC standards, which, due to static hardware, they simply won't be able to match.

#17 Posted by FierceDeity (358 posts) -

@grantheaslip said:

@jasonr86 said:

That seems dumb. It can clearly run games at 60 fps. If current PCs can run games at 60 fps, so can the PS4. Maybe they have a good reason why they don't want to do 60 fps. Maybe the first few games will straddle that generation line until the older consoles are dead and gone and the majority of PC consumers have moved up to better hardware.

I don't get what you and the OP are trying to say here. Framerate's a result of tradeoffs, not a hardware feature. Faster processing gives them more graphical bandwidth, but they can choose to fill that series of tubes with more polygons, effects, post-processing, etc. rather than using it to hit 60 FPS.

I imagine that most people are more receptive (whether they're cognizant of it or not) to a solid 30 FPS and better graphics than a solid 60 FPS and worse graphics, but I'm sure that's a trade-off the developers don't make lightly, and make depending on the style of game. Thief and Killzone aren't particularly twitchy or fast games, and probably wouldn't benefit from a high framerate the same way a Call of Duty or Gran Turismo would.

This proves nothing about the PS4, it just proves that a couple of developers have decided against allocating resources (human and/or processing) into hitting 60 FPS.

Precisely. It's simply a decision of what the developer favors more: frame rate (which has both visual and gameplay benefits) or polygons, textures, etc. (mainly a visual benefit). 99% of NES and SNES games ran at 60/50 fps, so it's not really related to hardware...

#18 Edited by MariachiMacabre (7070 posts) -

Yep. Sounds like Launch Game Syndrome, alright.

#19 Posted by FierceDeity (358 posts) -

@mattyftm said:

Console games are never going to run at 60 FPS as standard. Even when developers start to be able to squeeze more out of the hardware, there is always going to be a trade-off. They can either put more pixels, shaders, lighting effects and other graphical elements on screen at once, or they can include less shiny graphics stuff and have it run at 60 FPS.

And shiny graphics stuff sells to a far larger audience than high framerates. "Look at this sweet explosion" and "Look at the detail of the facial animation" speaks to people. "Look at the framerate" doesn't. As long as it isn't choppy, people don't care whether it's running at 30 or 60. While there will always be exceptions, on the whole developers are always going to push for more graphical detail than a higher framerate.

How do you explain the massive success of Call of Duty, though? They may not realize that it's running at 60 fps, but plenty of people have commented on how much smoother the game feels compared to other FPSes. I don't think you can entirely discount that kind of thing in an industry that relies on word of mouth as much as the games industry does.

#20 Edited by Tarsier (1057 posts) -

@funkasaurasrex said:

I just want better AI you guys.

better AI, better physics, more expansive worlds, more things happening at once, better skyboxes, etc... all they seem to be doing at this point is taking the same old shit and adding some new lighting and textures.

#21 Posted by JazzyJeff (399 posts) -

@fiercedeity: It depends on the game. Call of Duty is a fast-paced, twitch shooter. Games that rely heavily on reflexes and speed are going to benefit more from running at 60 FPS. A game like Thief, which is way more meticulous in its gameplay, can make that trade-off because it's on the opposite end of the spectrum as far as player skillset goes. 60 FPS is great; I love it. But you'd have to tone down or cut out elements of what makes some games good for them to obtain that speed.

#22 Posted by SSully (4147 posts) -

@mattyftm said:

Console games are never going to run at 60 FPS as standard. Even when developers start to be able to squeeze more out of the hardware, there is always going to be a trade-off. They can either put more pixels, shaders, lighting effects and other graphical elements on screen at once, or they can include less shiny graphics stuff and have it run at 60 FPS.

And shiny graphics stuff sells to a far larger audience than high framerates. "Look at this sweet explosion" and "Look at the detail of the facial animation" speaks to people. "Look at the framerate" doesn't. As long as it isn't choppy, people don't care whether it's running at 30 or 60. While there will always be exceptions, on the whole developers are always going to push for more graphical detail than a higher framerate.

This. There will be some devs that strive for that framerate, but it's never going to be the standard for consoles, especially later in the life cycle.

#23 Edited by FFFFFFF (75 posts) -

If that surprises you, you haven't been paying attention.

Games were never going to jump to 60fps, regardless of what they are running on. Saying games are going to run at 30 frames has nothing to do with hardware. Games could easily run at 60 on PS3 or PS2 if developers were willing to trade visuals to do it, but publishers rarely traded visuals for control fidelity.

If you have a game running at 60, but you could cut that in half and make it look even better, you do it. Not saying I agree with it, but that's the practice.


It's kind of funny that when everyone was racing to copy Call of Duty, no one tried copying the great frame rate. Sure, most people buying it don't know what that is, but they can feel it whether they realize it or not. And it's probably a factor that helped keep CoD at the top of the pile while everyone else was throwing their money at looking better.

#24 Posted by GS_Dan (1402 posts) -

60 frames doesn't make trailers more impressive

#25 Posted by Bollard (5395 posts) -

@funkasaurasrex said:

I just want better AI you guys.

Did you see the Killzone demo? The bit where the guy shoots at an enemy, and it doesn't die, so it gets bored and walks away instead of trying to kill the player? Can't wait.

#26 Posted by Wampa1 (634 posts) -

Does anyone remember shit like Untold Legends, which launched with the PS3 but looked way more like a PSP game? Systems are rarely (if ever) pushed to the limits at launch. Uncharted looked strong at launch, but Uncharted 2 was when you saw how great that system could be.

#27 Posted by believer258 (11773 posts) -

@mattyftm said:

Console games are never going to run at 60 FPS as standard. Even when developers start to be able to squeeze more out of the hardware, there is always going to be a trade-off. They can either put more pixels, shaders, lighting effects and other graphical elements on screen at once, or they can include less shiny graphics stuff and have it run at 60 FPS.

And shiny graphics stuff sells to a far larger audience than high framerates. "Look at this sweet explosion" and "Look at the detail of the facial animation" speaks to people. "Look at the framerate" doesn't. As long as it isn't choppy, people don't care whether it's running at 30 or 60. While there will always be exceptions, on the whole developers are always going to push for more graphical detail than a higher framerate.

I can see where you are coming from with that, but I don't entirely agree with it. Call of Duty, the biggest video game franchise in the world right now, runs at a solid 60 and has for a while now. You might be able to write it off as an exception to the rule, but in this case it's a hell of an exception.

I do agree that developers will often push for better-looking more than they will for 60 frames. Shame, really.

#28 Edited by zoozilla (978 posts) -

@mattyftm said:

And shiny graphics stuff sells to a far larger audience than high framerates. "Look at this sweet explosion" and "Look at the detail of the facial animation" speaks to people. "Look at the framerate" doesn't. As long as it isn't choppy, people don't care whether it's running at 30 or 60. While there will always be exceptions, on the whole developers are always going to push for more graphical detail than a higher framerate.

Especially since the vast majority of video on the internet (and on TV) is not 60 frames per second, so the benefits of a higher framerate are not evident unless you're actually playing the game.

#29 Posted by JoeyRavn (4961 posts) -

@tarsier said:

@funkasaurasrex said:

I just want better AI you guys.

better AI, better physics, more expansive worlds, more things happening at once, better skyboxes, etc... all they seem to be doing at this point is taking the same old shit and adding some new lighting and textures.

I could settle for better stories. The reason I liked recent games such as The Walking Dead, 999/VLR, Bioshock Infinite, or even Spec Ops: The Line was not them shiney grafx. But if you're talking strictly technical stuff, yeah, sure: the difference between 30 and 60 FPS is still massive.

#30 Posted by MikeJFlick (441 posts) -

@believer258: "I can see where you are coming from with that, but I don't entirely agree with it. Call of Duty, the biggest video game franchise in the world right now, runs at a solid 60 and has for a while now. You might be able to write it off as an exception to the rule, but in this case it's a hell of an exception."

But... Call of Duty doesn't push the limits. It hasn't changed much visually since the first release on the 360, and it has been narrowing the FOV ever since to be able to get those frames.

Now ask yourself this: would you play some future game that offers 64v64 player matches at 30fps, or would you settle for 32v32 for that 60fps? I personally would prefer limits to be pushed.

#31 Edited by Seppli (10251 posts) -

It seems to me like many developers shy away from old tricks and indulge too much in fancy new-school rendering practices, which often add little or nothing worthwhile to the end result, and hence I feel like they choose to fight the wrong battles.

With this new hardware generation, I think it'd be best if developers worked with a mandate of 1080p rendering resolution @ 60 FPS, then went for the best-feeling results, and didn't shy away from yesteryear's lower-fidelity tricks if need be. Results-driven development, rather than always trying to go whole hog on the latest cutting-edge rendering fanciness and whatnot.

Blizzard should be held up as the poster child for getting the most out of a lean and efficient technical art direction. I'm not saying everyone should go full-on Blizzard, always choosing style and substance over fancy graphics, but please fight your battles a little more in favor of efficiency and playability, rather than going after flashy new rendering details at their cost. Many of the more subtle improvements are lost in the heat of battle anyway, and most consumers don't even have the eye to spot the fancier touches.

A higher rendering resolution increases all the detail you manage to put into the game, and a higher, smoother framerate increases playability in general. Less fuss about putting more minute little rendering details on screen; more focus on the all-encompassing detail increase of higher rendering resolutions, and on improved playability by hitting a smooth framerate around 40 to 60 FPS.

It's not like you can't still make games look absolutely jaw-dropping without cramming the latest dynamic refracted translucent volumetric radiant ambient-occluded whatever effects everywhere. Come on!

#32 Edited by believer258 (11773 posts) -

@believer258: "I can see where you are coming from with that, but I don't entirely agree with it. Call of Duty, the biggest video game franchise in the world right now, runs at a solid 60 and has for a while now. You might be able to write it off as an exception to the rule, but in this case it's a hell of an exception."

But............. Call of Duty doesn't push the limits, it hasn't changed much visually since the first release on the 360 and has been narrowing FOV ever since to be able to get those frames.

Now ask yourself this, if you could play some future game that offers 64vs64 player matches at 30fps or would you settle for 32v32 for that 60fps? I personally would prefer limits to be pushed.

You have completely missed my point. MattyFTM said that shinier graphics sell better to audiences, to which I was saying that the best-selling game series of all time does not have shinier graphics but rather has a faster framerate. I said nothing about personal preference.

But while we're on the subject of personal preference, I like aesthetics first, fast framerates second, higher resolutions third, and detail dead last as far as visuals go, and I'll sacrifice the latter two any day for the sake of a better game. As for multiplayer, I don't play that much anymore but player counts do not necessarily mean better multiplayer. Battlefield 3 has 64 players (if you're playing it on the PC) as opposed to MAG's 256 and guess which one feels more intense and interesting?

Finally, I'm wondering which one of those you consider pushing the limits since each one is actually a trade-off. You'd be pushing the limits by striving for both, not deciding which one you'd rather have.

#33 Posted by expensiveham (290 posts) -

Those of you saying that this will go away as developers get used to the hardware are forgetting that this is x86 architecture, and developers have been making games on it for decades. And the graphical change is barely noticeable. Killzone is being praised and pushed by Sony and gaming sites for its graphics, and it looks worse than plenty of current gen games on the PC.

There is not a major improvement in "polygons, effects, post-processing, etc". Unless all that new power is being pushed into fitting thousands of enemies on screen or some amazing super-intensive AI or physics, I don't see why these games should not be able to run at higher frame rates. If games like Killzone can't do better than 30 fps now, it won't be long until we have games on consoles that drop down to 15 fps during intensive moments.

Obviously things will be better with time as developers learn how to work around the hardware and do things more efficiently. I just think that, considering the very small leap in graphics, 60 fps should not be a problem.

#34 Posted by FFFFFFF (75 posts) -

@believer258 said:

@mattyftm said:

And shiny graphics stuff sells to a far larger audience than high framerates.

I can see where you are coming from with that, but I don't entirely agree with it. Call of Duty, the biggest video game franchise in the world right now, runs at a solid 60 and has for a while now. You might be able to write it off as an exception to the rule, but in this case it's a hell of an exception.

It's a thing people don't know they like until they try it. And even then, the people that like it don't know what it is that they like. They would probably just say that it feels more tight or responsive, without knowing why. Low controller latency is a big deal, and I think Call of Duty proves that people care about it -- it's just that no one knows that they care about it. Including the guys in suits that hold all the money saying, 'make it look better than Call of Duty.'

#35 Edited by believer258 (11773 posts) -

@cheh said:

@believer258 said:

@mattyftm said:

And shiny graphics stuff sells to a far larger audience than high framerates.

I can see where you are coming from with that, but I don't entirely agree with it. Call of Duty, the biggest video game franchise in the world right now, runs at a solid 60 and has for a while now. You might be able to write it off as an exception to the rule, but in this case it's a hell of an exception.

It's a thing that people don't know they like it until they try it. And even then, those people that like it, don't know what it is that they like. They would probably just say that it feels more tight or responsive, without knowing why. Low controller latency is a big deal and I think call of duty proves that people care about it -- it's just that no one knows that they care about it. Including the guys in suits that hold all the money saying, 'make it look better than Call of Duty.'

Yep. Precisely.

#36 Posted by SlashDance (1812 posts) -

Some people believed the PS2 was gonna play games at 60fps with no load times... just saying.

#37 Posted by Andorski (5239 posts) -

@believer258 said:

@mattyftm said:

And shiny graphics stuff sells to a far larger audience than high framerates.

I can see where you are coming from with that, but I don't entirely agree with it. Call of Duty, the biggest video game franchise in the world right now, runs at a solid 60 and has for a while now. You might be able to write it off as an exception to the rule, but in this case it's a hell of an exception.

I do agree that developers will often push for better-looking more than they will for 60 frames. Shame, really.

Don't the Treyarch CoD games run at 30 fps?

I agree with the initial assessment. Prettier graphics are easier to sell than higher framerates. Games like IW's Call of Duty and Burnout make 60 fps seem like more of a necessity to their gameplay, but the majority of console games have been designed with 30 fps in mind. No one plays Halo and thinks the gameplay is at a disadvantage running at 30 fps versus 60 fps.

I think this is also why most next generation console games will have sub-1080p native resolution. 720p is almost indistinguishable from 1080p when playing at ~6 ft. on TVs smaller than 46". Better to use those processing resources to make more explosions and put more enemies on screen than to up the resolution.
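The raw pixel math behind that resolution trade-off is simple enough to sanity-check (plain arithmetic, nothing console-specific assumed):

```python
# Pixel counts per frame; fill-rate-bound work (shading, post effects)
# scales roughly with this number.
resolutions = {"720p": (1280, 720), "1080p": (1920, 1080)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

print(pixels["720p"], pixels["1080p"])             # 921600 2073600
print(f"{pixels['1080p'] / pixels['720p']:.2f}x")  # 2.25x
# 1080p shades 2.25x the pixels of 720p -- that's the headroom a
# developer can instead spend on more effects or enemies on screen.
```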

#38 Edited by believer258 (11773 posts) -

@andorski: No, all current-gen CoD games run at 60FPS. I have heard here and there that Black Ops 2 drops occasionally, but not often, and it still aims to maintain 60.

#39 Posted by SathingtonWaltz (2053 posts) -

I always try for 60fps in my PC games, as I find 30fps on PC unplayable. However, on consoles I never really find it bad at all; I'm not sure why.

#40 Posted by JZ (2125 posts) -

Well, those ones are just fancied-up PS3 games.

#41 Posted by Anund (881 posts) -

I'll take 30 fps and improved graphics/AI/whatever over 60fps any day, any time.

#42 Posted by sins_of_mosin (1556 posts) -

60fps was a benchmark for PCs in very heavy graphics tests. It really has no impact on the actual enjoyment of games.

#43 Posted by Hunter5024 (5600 posts) -

People make a much bigger deal out of it than it deserves. There are like two kinds of games where it even matters at all.

#44 Posted by Basm321 (136 posts) -

I would rather have 60fps than say... rat shadows.

#45 Posted by tourgen (4459 posts) -

60fps is a big deal if you are making an arcade game or any other skill game. RPGs, strategy games, sloppy action-adventure titles, yeah, no one cares and rightly so. But for real action games it's 60fps or GTFO. 30fps is 33ms of slop. It also limits how fast you can have things moving on screen without causing nasty jitter or using a godawful amount of motion blur.
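The numbers behind that "33ms of slop" point, and the on-screen motion angle, work out like this (a quick sketch; the object speed is a made-up example):

```python
# At 30 FPS each frame lasts twice as long as at 60 FPS, so a
# fast-moving object jumps twice as far between frames -- that larger
# per-frame step is what reads as jitter without heavy motion blur.
SPEED_PX_PER_S = 2000  # hypothetical: object crossing a 1080p screen in ~1s

for fps in (30, 60):
    frame_ms = 1000.0 / fps
    step_px = SPEED_PX_PER_S * frame_ms / 1000.0
    print(f"{fps} FPS: {frame_ms:.1f} ms/frame, {step_px:.0f} px per frame")
# 30 FPS: 33.3 ms/frame, 67 px per frame
# 60 FPS: 16.7 ms/frame, 33 px per frame
```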

#46 Posted by Mrsignerman44 (1100 posts) -

This topic sounds accusatory. Of course the launch titles won't be great; games will ramp up to 60 fps by the end of the life cycle, I'm sure.

#47 Edited by GERALTITUDE (3179 posts) -

Shocked this thread and any replies even exist... then again, ignorance and complaints do seem to go hand in hand.

#48 Edited by Raven10 (1759 posts) -

@gs_dan said:

60 frames doesn't make trailers more impressive

This is pretty much your answer. As many have said, almost any console in history can play games at 60 fps. Hell, most any console since the NES can play games at 120 or even 240 fps. It's just a matter of graphical quality. For example, when running games in DOSBox you often have to put a framerate limiter on them so they don't run at tens of thousands of frames per second. If you were to put a 70's or 80's era game on a console with no changes, it would run too fast.

On PS3 there are several games that ran at 60 fps, but there were tradeoffs. Rage ran at an almost perfect 60 fps, but the tradeoff was low resolution textures, baked-on lighting, and an often sub-HD resolution. The same thing is true on PS4. The thing could easily run a PS3-quality game at 60 fps, but if the game looks like a Pixar movie then you aren't going to see 60 fps on a next gen console.

Think of it like a PC. I can alter settings to increase or decrease visual fidelity at the cost of framerate. Consoles are the same way. It's just that the developers make the choice for you on consoles, whereas on PC I can decide whether I care more about visual quality or performance.
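A framerate limiter of the sort described is conceptually just a paced loop. Here's a minimal sketch of the general technique (not DOSBox's actual implementation; `update_and_render` is a hypothetical stand-in):

```python
import time

TARGET_FPS = 60
FRAME_S = 1.0 / TARGET_FPS  # ~16.7 ms per frame at 60 FPS

def update_and_render() -> None:
    pass  # hypothetical stand-in for the game's per-frame work

def run_frames(n: int) -> None:
    """Run n frames, sleeping away whatever budget each frame leaves."""
    next_deadline = time.perf_counter()
    for _ in range(n):
        update_and_render()
        next_deadline += FRAME_S
        sleep_for = next_deadline - time.perf_counter()
        if sleep_for > 0:
            time.sleep(sleep_for)  # cap the loop at TARGET_FPS
        # if sleep_for <= 0 the frame ran long; the loop just continues,
        # which shows up as a dip below the target framerate

run_frames(10)
```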

Also, to people saying that this will improve over time: expect the opposite, actually. As time goes on, developers will need to push the consoles more and more to keep impressing players, and that is going to mean lower framerates. As an example from this generation, the earlier Tomb Raider games ran at 30 fps pretty consistently, while the new one drops frames like crazy. Likewise, Far Cry 2 ran pretty smoothly on consoles, but Far Cry 3, which looks much better, runs absolutely horribly. And Criterion ran Burnout Paradise at 60 fps, but to increase visual quality they lowered the framerate to 30 in Need for Speed.

#49 Posted by xyzygy (9935 posts) -

30 FPS is totally fine.

#50 Posted by Rafaelfc (1324 posts) -

The market doesn't really care about 60fps, so of course they will go for bells and whistles over framerate EVERY time; it's been like that for a while now.