
NPfeifer's forum posts

NPfeifer

I'm trying to track down a Series S. I had to pawn my Xbox One and Xbox One X (which I had just set up), and I've been without an Xbox for a couple of years now. 1080p TV, so the Series S is great.

NPfeifer

Playing action games at 30fps feels bad, especially after playing at 60fps or more for a while.

Also, we can link to our YouTube channels now?

To your first point: I disagree entirely, and I know and appreciate the difference between a solid 30fps and 60fps. To each their own, I suppose, but 30fps is still the industry standard and it still works really damn well.

And I wish we could link to YouTube channels. I spend dozens of hours making some of my videos and they articulate my thoughts on a subject far more than text in this message box ever could.

NPfeifer

Ok. Trying to get as many of these as possible. The quote system here is not great!

@efesell said:

Obviously I would buy it on the system that offered a consistent framerate if that situation exists, I don't understand how that would be a question.

I would want to know how inconsistent it is: how frequently it occurs and how severe the differences are.

If the answer ultimately is "wow, they crapped this game up pretty much everywhere," then no, I do not buy the game. There are plenty of other games.

How much does this really matter, though? There was a time, when graphics were more inconsistent between platforms, that the IGNs and GameSpots of the world would run articles where graphics were an actual issue, like the Bethesda titles on PS3. These days, it seems that anything that can be construed as an annoyance and (now) measured gets used, as you say, as a weight for or against purchase. An inconsistent framerate was rarely an actual issue in grading or enjoying games until framerates got high enough. Going between 5-20fps wasn't an issue, but going between 45-60 is somehow a dealbreaker.

@dourin said:

@npfeifer As others here have pointed out, I think the underlying issue here is a matter of personal perception. Framerates matter differently to different people based on their own perceptions of those framerates. This reminds me of friends of mine who would get (I felt) unreasonably upset at people who claimed PC superiority because everything could run at 60 fps. "You can't even see the difference," would be a popular argument. The same debate would crop up when displays started pushing refresh rates into the 120Hz+ range. I had a friend who would argue with me to the point of actual anger that you cannot perceive the difference between 30 fps and 60 fps, and those who championed 60 fps gaming were objectively wrong and just trying to lord over players with what they perceived to be lesser equipment.

The reality is, some people absolutely don't see that difference. To some people there is no perceivable difference between 30 fps and 60 fps. My friend was one of those people. However, on the flip side, there are people to whom there is a stark difference between 30 fps and 60 fps. Once I moved from mainly console gaming to mainly PC gaming, and I was exposed to more 60 fps gaming, I absolutely noticed the difference. My console 30 fps games would often feel sluggish by comparison. 60 fps felt smooth as butter, and became the standard for my play experience going forward. Fast forward to a couple years ago, when I needed to upgrade my main monitor on my PC, and opted to go for a 144Hz display. I'd been playing a ton of Overwatch, and had heard that 144Hz could actually make your aim more accurate, so I decided what better time to test out that claim and went the higher refresh rate route. Visually, at first I couldn't say I noticed a difference between 60 fps and 144 fps. However, I absolutely felt it. Games felt smoother, my aim felt more accurate. As time went on, and ~144 fps has become the norm for how I play most games, I have found that when I play a game that drops into the ~60 fps range (not locked at 60 fps), that sluggish feeling I remember from 30 fps games before has emerged. It's not some elitist point of view, or me trying to say that if you're not playing games at 144Hz you're getting a subpar experience. I'm only saying that from my perspective, based on my own perception of refresh rates, higher, more stable refresh rates make for a better gaming experience.

On a side note, locked 60fps doesn't give me that same sluggish effect. Those games look and feel absolutely fine. It's more so games that will jump around from 55-75 that feel bad to play.

I think we're absolutely running into the limits of human perception here. I know we can interpret input up to 1/235th of a second or something like that, but obviously there are diminishing returns. _I_ can see the difference between 60fps and 30fps, but I'm also at the point where I'd rather play at a higher resolution (4K) than at 60fps, because reducing aliasing means more to me than framerate. Eventually games will do both and it won't be a big deal. And then there's, as we've been talking about throughout this thread, the matter of consistency. As a small-time game developer, I don't hold it against developers who can't always keep it together when pushing the envelope in terms of assets, because it is a complex resource-management task. I'm not saying that games should chug down to 3fps during intense moments, but having it sway between 25-30fps as the action picks up isn't really a big deal.
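To put rough numbers on what a "sway" actually costs you, here's a quick back-of-the-envelope sketch (nothing fancy, just frame time = 1000 / fps; the fps ranges are the ones we've been throwing around in this thread):

```python
# Convert the fps ranges being argued about into frame times,
# since the "wobble" people feel is really a change in frame delivery time.

def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

ranges = {
    "25-30 fps": (25, 30),
    "45-60 fps": (45, 60),
    "55-75 fps": (55, 75),
}

for label, (low, high) in ranges.items():
    slow, fast = frame_time_ms(low), frame_time_ms(high)
    print(f"{label}: {fast:.1f}-{slow:.1f} ms per frame (swing of {slow - fast:.1f} ms)")

# 25-30 fps: 33.3-40.0 ms per frame (swing of 6.7 ms)
# 45-60 fps: 16.7-22.2 ms per frame (swing of 5.6 ms)
# 55-75 fps: 13.3-18.2 ms per frame (swing of 4.8 ms)
```

By that crude measure, the absolute frame-time swing in a 45-60 wobble is actually smaller than in a 25-30 one; whether it feels worse is exactly the perception question we're arguing about.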

Really? They're obviously not diving into FSAA and stuff like that, but they do have plenty of hot takes about frame rates and resolution.

@quantris said:

I could agree some of this stuff is nitpicky and may or may not matter to a given individual. Though you might be taking Jeff a little too literally when he says something is "unplayable".

I didn't say he said something was "unplayable", but I value Jeff's opinion and he was flat-out wrong in his take.

@xeiphyer:

I'm amazed that you're able to play some games at 3 fps without it feeling bad ;)

It actually makes me more concerned about my own hardware. I built this PC for about $1800 four years ago and it runs GTAV at 4K and a steady 30fps, which is great, except that it hitches for a second or two as I move through the world. That's something I was not expecting, and it makes me appreciate the asset-load masking of the version running on 15-year-old hardware even more. It's noticeable, but where else am I going to play it without basically building a new machine? I've read Ars Technica/Tom's Hardware articles where this is treated as just a fact of life given storage seek times on a PC.

@npfeifer: On a big enough TV, of course you see the difference between 900p and 1080p. Same thing with 4K: it's minor, but it's there. Will it make a game unplayable for me? Of course not. But that doesn't mean I should ignore it.

I had friends at the beginning of the PS3 and 360 era arguing about resolutions, basically as soon as HD TVs were a thing, so no, it's not something that came with Digital Foundry. And again, as someone who works on games, even though I don't use technology on that level, it's interesting to follow the why and how of a game achieving better performance, even if the difference is minimal, on different hardware.

It's kinda annoying because your argument comes from a place of completely dismissing others' experiences. You're acting like a noticeable difference should be meaningless; it can be for you and that's fine, but it's not for everyone.

I still played and enjoyed Bloodborne on PS4 even though it was not a consistent 30fps. I wasn't freaking out about it being unplayable (though I know some who were), but if a better version had been available at release, being in the know about it would have informed my choice of purchase.

I remember the resolution arguments of that first HD generation, but I think it goes back to your first sentence: on a big enough TV all differences are noticeable. Beyond a certain combination of TV size and viewing distance, though, resolution differences are imperceptible to a whole lot of people. People started finding out about the Lechner distance when they started seeing things that actually weren't there. I remember when 4K sets started showing up in demo rooms and you either had to be close enough to appreciate the difference or the set had to be large enough to see it at a distance.

I'm not trying to dismiss others' experiences, but coming from a background where people said they saw a difference between a $10 and a $60 HDMI cable, I don't know if anyone would be able to distinguish between 900p and 1080p in a double-blind test. This isn't like the PGR3 controversy of yore, where the game was running at 600p, which was just above standard-definition resolution. Certainly it was a lot harder to tell back in 2013, when it was an actual controversy and TVs were more expensive and smaller than they are now.
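For what it's worth, the viewing-distance side of this is easy to ballpark with basic geometry. Here's a rough sketch, assuming the usual ~1 arcminute of visual acuity for 20/20 vision and a 16:9 panel; the 50" screen size and 8-foot couch distance are just example numbers I picked, not anything measured:

```python
import math

ACUITY_ARCMIN = 1.0  # rule-of-thumb resolving limit for 20/20 vision

def pixel_arcminutes(diag_inches: float, horiz_pixels: int, distance_inches: float) -> float:
    """Angular width of a single pixel, in arcminutes, at a given viewing distance."""
    width_inches = diag_inches * 16 / math.hypot(16, 9)  # 16:9 panel width from diagonal
    pixel_inches = width_inches / horiz_pixels
    return math.degrees(math.atan2(pixel_inches, distance_inches)) * 60

# Example: 50" TV viewed from 8 feet (96 inches), purely illustrative numbers
for label, horiz in [("900p", 1600), ("1080p", 1920)]:
    arcmin = pixel_arcminutes(50, horiz, 96)
    verdict = "above" if arcmin > ACUITY_ARCMIN else "at or below"
    print(f"{label}: {arcmin:.2f} arcmin per pixel ({verdict} the acuity limit)")

# 900p:  ~0.98 arcmin per pixel
# 1080p: ~0.81 arcmin per pixel
```

With those made-up but plausible numbers, both resolutions sit right around the acuity limit, which is why I keep coming back to the double-blind test as the only honest answer.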

@xeiphyer said:

First of all, this seems like baiting people for self-promotion YouTube clicks?

Frankly, your premise that people should just be happy with how games look and run if it meets your self-defined minimum standard of acceptable is kind of ridiculous.

Of course the Switch version of Outer Worlds looks totally fine and is completely playable and recognizable as a videogame, but that doesn't mean everyone should just blindly accept that and be happy with whatever they get. If I am paying $60 for a game and I have the capacity to play it on multiple consoles or the PC, then I absolutely want to get the best version of that game for me.

What is the best version of that game? That is the part that is completely subjective, and also what you are trying to argue about. If you can accept that the switch port is the worst looking version of the game graphically and in terms of performance because you value the portability, or you are able to play it on your TV or whatever, then that's awesome and totally valid. For me personally, I want to play the version where I can get the best framerate because I enjoy the smoothness that 60+fps provides, so the PC version is the best version for me. Also valid.

Discussion about these differences by gamers and websites like Giant Bomb are important because it helps everyone make informed purchasing decisions and protect people from spending a lot of money on something that might not meet their expectations.

Also, while it might be true that the average gamer at large doesn't have a nice gaming PC, it's also important to consider that Giant Bomb absolutely caters towards the more "hardcore" audience that has a lot more invested on average than the typical gamer, so I don't think your assumptions there are really very accurate.

OKAY, and now to talk about what I agree with in what you're trying to say.

Yes, obviously people take the relatively small (sometimes) differences in versions of games and blow them up into huge deals. The games industry and gamers in general are huge fucking babies and do this shit all the time, and it's annoying as all hell. That being said, in my many years of experience, it's generally a vocal minority of gamers, and generally very young gamers, that feel the need to defend and justify their (or their parents') purchase decisions to others online.

I grew up playing games on low settings or enduring really awful framerates, because it didn't really matter to me that much back then and I was still getting most of the experience out of the game. As I'm sure a lot of GB veterans will attest, as I got older and had more money to spend on gaming hardware and in life, my expectations for what was acceptable increased. I still think people who say they can see the difference between 120fps and 144fps are snobs and full of shit, but playing games at 3fps definitely feels bad to me now in some games, especially first-person stuff when I'm trying to aim, so I understand both arguments pretty well.

PS: Good luck with the money situation dude.

If I could present visual evidence for my argument without making it look like I'm spamming for the YouTube channel I make negative dollars off of, I would (but also, I'm biased because there's some other really cool content I worked really hard on over there, and I think most of the internet's self-promotion rules are "letter of the law" as opposed to "spirit of the law", so... that's a whole other ball of yarn).

Someone made the absurd argument on my YouTube video that if the Switch version was going to be so compromised, why make a Switch version at all. Well, why make any version of any game, then, if it's not running with all of its assets in play at the shiniest frame rate, in 4K/8K? All versions of any game are a compromise if they're not running on the absolute most powerful processing device possible - namely, a rack-mounted supercomputer, but for realism, let's say your home desktop. That's just reality. What it comes down to, beyond obvious performance issues, is picking nits. Jeff saying a game that runs at a solid 30fps "doesn't run super well" is the pickiest of nits, and it's the statement he was the most enthusiastic about expressing. Would I pay $40 for a Saints Row: The Third remaster nine years after I bought it, beat it, and made it my game of the year? No, obviously not; even if the game were running at 60fps in 4K, that's completely irrelevant. That thing had better have a whole other campaign's worth of content or some big new mode for that kind of money.

And thanks on the $$$ situation, it'll be great to have more perspectives (and games!) to play again.

NPfeifer

@efesell said:

I think it's funny to bring up the Saints Row DF because it's incredibly positive and glowing about what that remaster accomplishes.

A steady 30 is fine, whatever, you will adjust to that without any trouble so long as you don't immediately compare it to something that is at 60. A "wobbly 45-60" is hot garbage and I deny your every attempt to say that complaining about that is unnecessary.

I can see how a wobbly framerate isn't great (I've certainly played plenty of those), but "hot garbage"? Hmmm. Would you not buy a game because of it? Are you in a situation where you'll buy it on one console versus another just because of it? What if it's wobbly on all systems, which it is: do you just not buy it?

And yes, in preparation for the video I made, I did watch the DF Saints Row video and their glowing analysis of the remaster, which made Jeff's take even stranger.

#5 Edited By NPfeifer
@fear_the_booboo said:

@npfeifer: I can tell you that I definitely see the difference between 45fps and 60fps, or the one between 900p and 1080p. So your guarantee is wrong, sorry.

So I still don't understand what your point is in this.

You can tell the difference? Double-blind test, let's go. No one was even talking about 900p until Digital Foundry wheeled it out; it's not something that came from gamers. I'm not saying that as a diss, I'm saying that that's their business.

My point is that Jeff saying 30fps "kinda sucks", when that's been the standard experience for most gamers for years, and that even a wobbly 45-60 frames per second is "unacceptable" and "doesn't run super well", is the opposite of reality. We don't call Forza Motorsport "great looking" because it runs at a rock-solid 60fps and then say the Forza Horizon games "kinda suck" just because they run at 30fps instead of 60. This kind of hair-splitting is completely unnecessary and doesn't make any sense, and yet people make a living out of turning it into the biggest deal possible.

NPfeifer

If you want to play The Outer Worlds on the go, which is absolutely still a perfectly playable game, do you watch a Digital Foundry video about the Switch version and say "nah, this doesn't look nearly as good as an Xbox One X or high-end PC, I guess I'll pass". No, you don't. That's the absurdity of these kinds of graphical comparisons.

This is an absurdly bad take. I own all three consoles and a PC that can run games pretty well. I often look at which is the best version before buying; hell, they're (often) all the same price, so why wouldn't I? Sometimes I'm fine sacrificing some performance for the Switch version, but it's good to have people going into the nitty-gritty so I can make an informed choice. And The Outer Worlds is actually a bad example, because that game has issues on the Switch that are more than just a few small differences.

Like, I get the hate towards people saying anything under 60fps is unacceptable. When I was younger I was frustrated when critics would act like a game was unplayable on the only console I owned because, most of the time, it was fine. But critics can have access to all the available versions, so why wouldn't they comment on those differences? It's useful for me and plenty of other users too.

The Switch port has issues, but that's kinda to be expected for a down-port from systems that are 50% more powerful than what they designed the game on. If you only played the Switch docked to your TV, but also owned an Xbox One X or a high-end PC (which is what the Digital Foundry video compares it against), then of course you'd play it on there. The Switch, philosophically, has never been about graphical fidelity, and the people who don't get that "this won't run as well on Switch as on other consoles, but it's still perfectly playable" aren't watching Digital Foundry videos anyway.

The thing is, critics DO have access to all the available versions, and since they're all so similar these days, unless it's a glaring performance or technical issue, it's not worth addressing. People complaining about Xbox One v. PS4 1080p/900p at launch, I GUARANTEE, couldn't tell the difference unless someone like Digital Foundry pointed it out to them. It's manufacturing unnecessary outrage for views.

NPfeifer

@npfeifer: By inconsistent I don't mean sub-30. The game on PS4 Pro goes from 45-60 but mainly bounces around the high 40s-50s. It's a high fps and above 30, but if you're someone who always plays games with locked framerates, an inconsistent framerate is noticeable and looks "hitchy", like what DF was talking about. If you don't notice it, then cool, keep enjoying the game, but I'm personally happy to have someone like Jeff offer his opinion on these kinds of things when most of the staff don't notice or even care about technical performance outside of games with bad fps that dip below 30.

But why does that matter? At all? If it's still perfectly acceptable on the low end and not always "glistening with high frame-rate gloss" at the top end, even if it's, like, slightly noticeable, how does that matter in the slightest? What do you gain by knowing that? Do you wait longer to buy the game so they can patch it? Do you entertain other open-world crime games instead? It literally doesn't matter enough to be splitting hairs like that.

NPfeifer

Performance does affect how a game looks as well as how it plays. I find some games unplayable, as well as unpleasant to look at, if they drop frames every time I pan the camera over a particular asset, for instance (something I'm super susceptible to). It's really disruptive to me; it makes my brain hurt for real. When a PC game has that issue, there's a chance of fixing it, and I have gotten used to that luxury. If it's an issue in a console game, it's not always so fixable, even with the handy performance-mode options a lot of games have.

I struggle to enjoy games visually if there are a lot of performance issues that result in tearing or pop-in or other problems that absolutely affect how good a game looks. Some games just look bad, but bad performance makes them look worse.

I haven't played the PS4 version of SR3 myself, so I can't comment on that.

Performance DOES affect how well a game plays, absolutely. But watching Jeff play Saints Row: The Third, the framerate was rock solid and never anything close to "unacceptable" or "doesn't run super well", and nowhere near unplayable. If you watch the Quick Look, you see that it's absolutely, 100%, completely playable.

NPfeifer

Uh, yeah, I don't want to listen to anyone with an "everything is fine" milquetoast point of view.

I don't want to listen to "graphical analysts" splitting hairs when it doesn't matter at all. When we were comparing native PS2 and Xbox games, that was one thing: Xbox games definitively looked better than PS2 games when they took advantage of the hardware. PS3 versus Xbox 360? The graphical differences mattered even less, unless the game barely ran (Bethesda games on PS3). On Xbox One/PS4? You needed a Digital Foundry to point out the differences between versions, which is basically how they make their money. These days? That difference is literally less than nothing. If you want to play The Outer Worlds on the go, which is absolutely still a perfectly playable game, do you watch a Digital Foundry video about the Switch version and say "nah, this doesn't look nearly as good as an Xbox One X or high-end PC, I guess I'll pass". No, you don't. That's the absurdity of these kinds of graphical comparisons.

NPfeifer

@brunothethird: But beauty and performance are different things; a game looking good and a game performing well are not the same. When you're as important as someone like Jeff is, saying stuff like "this framerate is unacceptable" when it is perfectly acceptable has the sway to influence people not to buy games. That's beyond the fact that he's just outright wrong about it.

Didn't know Jeff had to be soft on games with inconsistent framerates, and that Digital Foundry should close up shop, so you can feel better about playing on whatever hardware is available to you. Maybe don't give DF views, then? Even though they came out and praised the SR3 remaster despite the PS4 Pro's inconsistent fps (the One X stayed close to 60). If you enjoy playing on the hardware you have, then why would Jeff's opinions on game performance bother you that much when you're already enjoying games?

Also, you're generalizing the whole staff when some of the crew, like Abby, will only play on consoles.

I don't give DF views, so I'm telling others they probably shouldn't either. Jeff was playing on a PS4 Pro, and at no point in his Quick Look did it display inconsistent fps, and that was at just 30fps. Perhaps I'm generalizing people here, but I think it's more of a west coast/Jeff thing.