What do you think about the graphics of current gen consoles?

liquiddragon

I've been immensely impressed the past two or so years. There were complaints even before the releases of the PS4/XBO about the weakness of the CPU, and some ppl theorized that because of the familiarity of the x86 architecture, developers wouldn't be able to squeeze out much more than what we'd see early in the life-cycle. I don't know if my expectations were set low because of those reasons or because of the ridiculous ways console manufacturers, developers, and publishers hyped next-gen graphics when the 360/PS3 were coming out, but I almost can't believe how good games look these days.

It seemed like many ppl were understandably disappointed by the downgrades of games like Watch_Dogs and The Witcher 3. On the other hand, looking at the current gen "next-gen" demos shown before the launch of PS4/XBO like Epic's Samaritan, Capcom's Deep Down, Quantic Dream's wizard guy, and Square-Enix's Agni's Philosophy, not only were we not lied to but in fact, what we have today surpasses the promises made. Sony's kinda crushing it in this department with titles like Uncharted 4, Horizon, God of War, and Spider-Man. And though I haven't played them, third party games like Resident Evil 7, Batman Arkham Knight, and especially Battlefront 1 look unbelievable as far as I can tell. You also have Red Dead Redemption 2 coming out pretty soon that appears equally handsome.

Anyways, are my standards low? What do you think about the graphics from the current batch of consoles? Are you as impressed as I am?

Justin258

So video game graphics these days are very impressive, especially when you consider the raw amount of manpower and computing power they require. Rendering those assets ain't easy, and it ain't cheap either.

But on a personal level, I kind of don't care anymore how much better-looking games get. As far as my own eyes are concerned, Cyberpunk 2077 looks marginally better than the Witcher 3, a game that will be at least four years old by the time Cyberpunk 2077 comes out. Metro Last Light still looks fantastic, too, and it is five years old now. On the other hand, The Elder Scrolls V: Skyrim received a bit of a graphical overhaul with its Special Edition and that game is still a treat for my eyes, even though my brain knows that it's seven years old now. It took less than half that time for Morrowind and Oblivion to look dated and ugly.

A lot of modern video games look way too freaking busy. This chase for a hyper-realistic look sometimes just winds up blurring everything together. Perhaps we're at the point where video game graphics need to stop thinking about getting as much detail into a scene as possible and start thinking about what detail to include and what detail to leave out, or at least what to highlight and what to minimize. There are times when The Witcher 3 or the aforementioned Metro Last Light make my eyes tired because I'm scanning everything, looking for something important, and all I see is detail everywhere. Arkham Asylum-style "Batman vision" is not the answer to this problem, either; that's just a cheap solution.

What I'm really getting around to seems to be this - on a technical level, I am definitely impressed by what developers can pull off these days. On an aesthetic level, though, video games still have a lot of room to improve.

csl316

I just kind of expect all of them to be good at this point.

liquiddragon

@justin258: I agree with your point about busy-ness. Also, I played Metro Last Light on PS3 and it looked great and ran well. Very impressive, though the character models were its shortcoming.

Justin258

@liquiddragon said:

@justin258: I agree with your point about busy-ness. Also, I played Metro Last Light on PS3 and it looked great and ran well. Very impressive, though the character models were its shortcoming.

Hmm... I played some of it earlier this year and thought the character models were actually pretty great, but that was on a fairly high-end PC (and maybe my memory isn't too great on this part). Have you ever seen it running on a good PC or an Xbox One/PS4? They did port Metro up to the current consoles and, from my understanding, bumped the graphics up a lot.

liquiddragon

@justin258: No, I'm really trying not to double dip, especially these days, so I kinda avoid even looking at re-releases and enhanced versions of games I've already played. I just meant more that everything else looked terrific. I remember characters being wooden and stiff, but the game had tremendous atmosphere, and when you got outside, it was really bleak and striking. Last gen, it seemed like if you were good at environments, your characters weren't as good and vice versa. The Assassin's Creed games really come to mind. This doesn't seem to be a problem this gen. Btw, Exodus looks quite good.

CreepingDeath0

I'm firmly in the position now that the next gen doesn't need to be some big graphical leap. I'd much rather see time and processing power put towards advancing ai and world simulation.

Hell, I'm still waiting for 1080p 60 fps to become standard. Which really shouldn't be asking much at this stage...

nicksmi56

They look great, but I don't find myself caring that much anymore. The older I get, the less I care about how good-looking games are, especially when some retro and indie gems play way better than the AAA blockbusters we get these days.

Justin258

@liquiddragon said:

@justin258: No, I'm really trying not to double dip, especially these days, so I kinda avoid even looking at re-releases and enhanced versions of games I've already played. I just meant more that everything else looked terrific. I remember characters being wooden and stiff, but the game had tremendous atmosphere, and when you got outside, it was really bleak and striking. Last gen, it seemed like if you were good at environments, your characters weren't as good and vice versa. The Assassin's Creed games really come to mind. This doesn't seem to be a problem this gen. Btw, Exodus looks quite good.

Faces and characters in general are kind of the last thing that doesn't always "look great". Even when they look good, there's something off about them.

@creepingdeath0 said:

I'm firmly in the position now that the next gen doesn't need to be some big graphical leap. I'd much rather see time and processing power put towards advancing ai and world simulation.

Hell, I'm still waiting for 1080p 60 fps to become standard. Which really shouldn't be asking much at this stage...

Sadly, I don't think that reality will ever really hit us. By the time consoles are powerful enough to hit 1080p60FPS all the time, every time, 1080p will be as irrelevant as 720p is today. You can just squeeze so much more power out of the GPU and some more out of the CPU, depending on your game, if you limit things to 30FPS.

@nicksmi56 said:

They look great, but I don't find myself caring that much anymore. The older I get, the less I care about how good-looking games are, especially when some retro and indie gems play way better than the AAA blockbusters we get these days.

This is the real problem, though. There are a lot of ways in which AAA games kinda suck these days, but it feels like there's at least one game from an indie dev, or lesser-known dev, every month that's more interesting from a gameplay and aesthetics perspective than anything the AAA industry's putting out. Unless Arkane or CDPR's putting out a game that month.

FacelessVixen

Tough for me to say about consoles specifically since I have the option to turn things up a notch on PC (at least as far as a Haswell i5 and Strix 1060 will allow). But current generation games in general look pretty good. Give or take the performance of my system, there are some games where I'd like higher texture quality, and sometimes a game's depth of field and use of TXAA for anti-aliasing makes things look too blurry when I generally prefer sharpness. But I can't complain for the most part, especially with ENB, ReShade and/or SweetFX being an option in some games. And it'll only get better when real-time ray tracing becomes more of a thing.

MerxWorx01

Uncertain. I mean, few of these current gen games are satisfying my need for large reflective puddles.

xanadu

Honestly I think it's pretty good. But I'm the guy who thinks Yakuza should've never switched to a new engine. 0 and Kiwami running at 60fps is fucking incredible. Going from those games to Yakuza 6 at 30fps just sucked. It doesn't look THAT much better. And before someone feels the need to say "all the other games were 30fps": I know... that's not a legitimate excuse to me.

Bonbonetti

I think it's very good, but graphics have never been a priority for me; there are other things I value more. I do have the option of playing on PC instead of console, but I like the carefree-ness of playing on console, so I often prefer that. [I still play games on PC; I don't ascribe to any particular grouping of gamer, I'm far too eccentric in my taste in games for that :-).]

Back to the topic though. Sure, I have been impressed by the looks of God of War, Gran Turismo Sport and Detroit. However, looks alone are not enough to make me want to play a game. It has to be fun and/or interesting first and foremost; I can easily forgive a unique game that has weaker looks and performance.

I'm happy as long as the game runs smoothly and looks good enough, and most games on the current-gen consoles do that.

On a side-note, I think photo-realism is the wrong [and futile] thing to aim for. It's a short-sighted advantage. Developers should prioritize gameplay and content, is my opinion.

cikame

I don't really care about graphics; I'm more excited by how clean and clear a game is to look at, and by high frame rates, than I am by the density of graphics and effects.
You know how some people who read books enjoy the process of building the image in their heads? Using their imagination? I feel a similar way about graphics in games, to a degree. If the graphics are too good, then all the information is already in the image; there's no wonder or discovery because everything you need is there already. Artists making a PS2 game did the best they could to give the impression of reality, while the hardware restrictions forced them to get creative in finding ways to cram as much detail in as possible. When I look at older graphics I see the creativity of artists and the ways they used what was available to make something unique, but it's getting harder to appreciate modern AAA game graphics as they all approach the same cluttered and brown state of "real" worlds. This extends to characters and face capture: we're not looking at artists' impressions of humans anymore, we're just looking at regular boring ass humans.

meteora3255

I think the current level of graphical fidelity is fine for me personally. I've never been a huge graphics snob anyway; art direction matters more in my mind. Nintendo is the best at this. For the last few console cycles they have had "underpowered" hardware but still manage to make games that look great (Breath of the Wild, Mario Galaxy, etc.). I have a PC with an i7 and a 1070, but most games on medium look fine to me.

colourful_hippie

Games look fucking great now, but people are overlooking other important aspects of gaming besides graphics. These current consoles have been handicapped from the start by their shitty, barely mid-range smartphone-strength CPUs.

Console UIs, especially Xbox's, could be snappier and could load and switch games far more quickly; more importantly, devs would have plenty of headroom to get incredibly creative with more complex AI for enemies and NPCs in games. Open world games could support more complex infrastructures and systems, and we would see more devs take a shot at that outside of Bethesda's Fallout/Elder Scrolls series.

This current gen has still been pretty close to what we saw last gen in terms of gameplay mechanics. The only big differences I've seen are expanded online integration, more games supporting an open world framework or large map, and shinier graphics.

I want more

hippie_genocide

I'm very much an art direction over graphics guy. I'd much rather have the next Jet Set game than another gritty, realistic open world game. I also will take a good framerate over resolution any day.

monkeyking1969

I think we have reached a stage where everything looks good enough, and the processing power, when well used, can sling enough geometry to make nearly any scene with fewer than 200 NPCs moving around on screen look great. I think games like Horizon, Assassin's Creed, GT Sport, God of War, Red Dead 2, etc. look great.

Is there anywhere to go? Sure, I think lighting, even rasterized lighting, can go further. I think animations could get smoother and lip-syncing with voice can be refined.

With that said, I think the next consoles from MS and Sony and the next two video cards from Nvidia and AMD will be considered "merely adequate" and "underpowered" when the history is written. I think we are nearly at the start of the era where we move from rasterized graphics to ray traced. Indeed, I think this switch is going to happen simply because a good ray-traced lighting engine will blow rasterized games out of the water, every time. There won't be an argument in five years that "...rasterized graphics hold up"; people will just admit that when you have the hardware to do real-time RT, it just looks far better.

Mamba219

I kind of don't like them. Everything has this robotic sheen and oversaturated lighting that irks me. Late last-gen was when this trend really started and it hasn't really stopped.

Brendan

Tough to say because I felt this way last gen, and now that I'm used to this gen I look at 360/PS3 games and think they look a bit fuzzy. I'm not excited about graphics any more because the slope is now so gradual and iterative that I truly think you can only really appreciate the differences by looking backwards 7+ years. Once games are all 8K native in 15 years, I'm sure these games will all appear fuzzy.

FrostyRyan

They're immensely impressive right now. Honestly I'm ok if all we work on is performance from here on out.

Could they be a tad better though? Sure

Tyrrael

I'll just preface this by saying that I have very much been primarily a console gamer through all the years I've been playing games. Let me put it this way. I've played more on console in the past year than I've played on PC my whole life.

Visually, games have been great for a long time. The 360/PS3, at least for the time, had some phenomenal looking games that were on par with much of what the PC had to offer, and the same still rings true today. Aside from some of the bells and whistles, like TressFX (I think I got that right), games look largely the same, or at least close enough to where you don't feel like you're getting massively screwed when you pick up a game for a console. Like I said, there are some differences when you crank everything up to ultra, but it's not so much of a difference that I feel cheated.

The problem lies with the ridiculously inconsistent performance on consoles that seems to be getting worse. There's simply no standard at all, with developers making up ridiculous excuses about how they are trying to make the game more "cinematic" (I hate that word), as if that's a reason to have a game at 30fps that constantly drops frames. As soon as there is a standard that is mandated by the console manufacturers, preferably 1080p60, then, hopefully, we will start to see more consistent performance.

I know this is going to annoy some people, but keep in mind that I love my consoles very much, so this is coming from a place of love. If the PS4/Xbox One were rereleased with massively upgraded CPUs, I'm confident that 1080p60 could be hit much more consistently. The biggest problem with the consoles has always been the CPU, which is considerably underpowered when compared to the GPU. And, yes, I know that the PS4 Pro/Xbox One X upgraded the CPU, but it just isn't enough.

One last thing. I'm more than willing to have games take a mild to moderate aesthetic hit on console if it means getting a rock solid 60fps at 1080p. This would also keep the consoles within the $400-$500 sweet spot. I would much rather have consistent performance than the equivalent of having everything at ultra on PC with constantly tanking frame rates.

ninnanuam

The X and the pro look pretty good most of the time.

Some of the first party Sony stuff this year has been very impressive.

That being said, I still pick up a lot of multiplatform games on PC because they look substantially better. So there is still a lot of room for improvement, especially when it comes to getting both performance and visuals.

bybeach

These days, and really for the first time, if I did not want to invest the money in pc gaming, I would easily enough accept present day console graphics. I just bought a video card (1080 ti). But if money got a bit more constrained, one or even 2 consoles would not be a bad way to go. I would still have an inexpensive pc built by me to do internet and media in the living room.

It's what's called 'good enough'. You forgo the high end, but that's acceptable from what I personally see.

nutter

Last gen, A LOT of stuff looked very good. This gen, looking very good is bare minimum. Great looking games are downright gorgeous.

Yeah, there’s always room to grow, and there will always be a next great looker, but we seem to be in diminishing returns turf. Some of the 4K HDR stuff looks amazing. We’re starting to see 60fps more. I think the last game that raised the bar (that I played) was the Xbox One X version of Hellblade. That’s a looker.

I need to try Spiderman, Forza, God of War, and Detroit at some point...

haneybd87

I’m pretty upset that we didn’t get the 4k60 promised on XB1X for most games. It’s almost always 4k30 or 1080p60. I’d gladly take less bells and whistles for a 4k60 option.

Eurobum

@haneybd87 said:

I’m pretty upset that we didn’t get the 4k60 promised on XB1X for most games. It’s almost always 4k30 or 1080p60. I’d gladly take less bells and whistles for a 4k60 option.

So the Xbox One X bragged about being able to do 6 TFLOPS, the overhauled PlayStation (the Pro) 4.2 TFLOPS, and the soon-to-be-released 2080 Ti benchmarked 12.4 TFLOPS in some synthetic tests. So it's pretty much twice or thrice as fast as the consoles respectively, yet it's still the only card that averages barely above 60 FPS at 4K at Ultra settings in games like GTA V.

4K is needed for video quality on gigantic TVs, but for games, rendering in 4K is still a complete hardware-melter on all fronts, from RAM to bandwidth to cooling, with the PC consuming 500 watts of power total. It's just ridiculous to use almost one "horsepower" to play games, and not just because of the noise and the heat. It's also a complete waste, with but the slightest improvement to visual fidelity, which in games gets blurred out by some filter anyway, or by the inherent motion blur of the screen.

4K quadruples the amount of data that needs to be copied; it's not so much compute as bandwidth that holds it back, as well as the GPU waiting for the slow-ish CPU, something that can't be compensated for by ratcheting down anti-aliasing or "bells and whistles." In the early days of 4K, benchmarks just crapped out at 4K; they didn't scale with the resolution numbers but just hit a wall. Consoles do elegantly share memory between CPU and GPU, and they use fast GDDR5, so the CPU probably is the culprit.
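To put rough numbers on the claims above, here is a quick back-of-the-envelope sketch in Python (the TFLOPS figures are the ones quoted in this post; the 4 bytes per pixel and 60 Hz scan-out are illustrative assumptions, not what any specific console actually does):

```python
# Rough arithmetic behind the "4K quadruples the data" and TFLOPS claims above.

pixels_1080p = 1920 * 1080          # 2,073,600 pixels
pixels_4k = 3840 * 2160             # 8,294,400 pixels
print(pixels_4k / pixels_1080p)     # 4.0 -> four times the pixels to shade and copy

# A raw 32-bit framebuffer (4 bytes per pixel) refreshed 60 times a second:
frame_bytes = pixels_4k * 4         # ~33 MB per frame
print(frame_bytes * 60 / 1e9)       # ~2 GB/s just to move the finished image around

# GPU throughput figures quoted in this thread (TFLOPS):
print(12.4 / 6.0)                   # ~2.1x: RTX 2080 Ti vs Xbox One X
print(12.4 / 4.2)                   # ~3.0x: RTX 2080 Ti vs PS4 Pro
```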

Still, 60 FPS at 4K was a stinking bald-faced lie to kind of make consoles fit the 4K TV marketing narrative, which they won't be able to! Even with the next generation, not much is going to change, because performance per watt is barely improving any more. The actual truth is that consoles have always been scaling up graphics to fit a certain quality goal. Insisting on rendering in native resolution has always been just a ploy to sell more and more powerful cards to enthusiasts; also, resolution is more of a concern at desktop viewing distance.

deactivated-5f8ac39b52e76

@justin258 said:

Faces and characters in general are kind of the last thing that doesn't always "look great". Even when they look good, there's something off about them.

You know, I watched Rogue One the other day and the cringe level for CGI Tarkin was unreal, and never mind that horrific dead-eyed wax zombie thing that was supposed to resemble Carrie Fisher at the end there. Even blockbuster movies with the power of Disney and ILM can't pull off convincing face animation for regular old humans.

Games are at least now roughly on the level of that crappy Final Fantasy Movie made almost 20 years ago, which is certainly impressive for real-time rendering. But somehow I doubt that fully convincing animation of human beings will ever happen in games or movies. Not in my lifetime, anyway.

glots

There have been several PS4 exclusives (and Forza 4 on the X) this year with graphics on a level that I wouldn't have a problem with being the standard for the next few years still. Obviously it would be great if they could also hitch up the framerate to 60, but as long as it's a steady 30, I have absolutely no problem with it.

But I would like to see more interesting art direction, for sure. I was playing Shadow of the Tomb Raider on the X and it sure looked pretty technically, but except for a few select scenes, it almost felt like I was just going through leftover ancient ruins and jungles from Uncharted 4. It's kinda why I'm welcoming back Darksiders and its more "cartoony" visuals.

MostlySquares

@justin258: I so very much agree with graphics these days looking way too busy. In real life we have depth perception, which works wonders in regards to separating objects in the real world from each other. In a game, however, all you have is a flattened version of all that, so you need to read the scene in a different way; you have to fill in the missing depth with whatever ability your brain has to do so. For me, a lot of games (particularly forests with lots of foliage) just look like an absolute mess. I am red-green colourblind (alongside 5-10% of the male population), so lush forests are just neon green hellscapes that are hard to focus on to begin with. Goes for real life as well.

The older I get and the shittier my eyes get, the more games with super clean, clear graphics appeal to me. I would MUCH RATHER they spend the time on expanding the gameplay in a huge way than push more polys and stuff. Gameplay trumps graphics in every thinkable way, and games these days skimp on complexity and just replace it with repetitive tasks and gated weapons/gear/abilities/whatnot. Just imagine if they flipped the budget on its ass... Instead of having 100 artists and 20 coders, you have 100 people whose job it is to fill the world with meaningful stuff.

Heavily modded Minecraft is a god damned joy at times, cause it's just jam packed with stuff to do that isn't just "kill 5 lizard men" or "talk to Ivan, he'll give you a cheese, take it to Aldus". Granted, Minecraft is a sandbox, and not all games should be sandboxes.. But Breath of the Wild showed what interacting systems do to a world. It livens it up more than a 10% increase in texture res...

Also, dynamic lighting makes any game look great. I would rather they spend the graphics budget on dynamic stuff than spend it on polishing every pebble and twig to perfection.

haneybd87

@eurobum: I really disagree that 4K is the “slightest improvement to visual fidelity”. It’s not like the slight upgrade you see in video, in games it’s a dramatic improvement. You eliminate a great deal of aliasing without sacrificing any detail and that’s why it looks so much better than in video.

As for disabling effects to pull off 4K resolution you can absolutely do that, I’ve been doing that on my 980ti. Devs keep pushing the newest, bleeding edge tech and we never catch up to that 60fps dream. It seems that visual “pop” trumps all else because it looks flashy and sells more games. What so many people that buy into this marketing don’t realize is just how blurry that 30fps frame rate makes their games look in motion.

Broddity

Firstly, yes, games do look kind of amazing now and are approaching what I think I was imagining when the PS2 / Dreamcast / 360 launch hype was in full flow. I'm old and wise enough now, however, to never think "we're done, can't get better".

But the stuff which most consistently has the ability to genuinely startle me is more in animation than in graphics. The moment a character does something incidental but so completely natural, or affects the environment in some physical way which has no actual bearing other than "immersion", that's the stuff which impresses me.

I think we're getting there with facial animation too. Use of motion capture has done a lot for this. I've been watching the girlfriend play Until Dawn lately and honestly the fidelity on the faces kind of blew me away.

Redhotchilimist

The top of the line graphics on first party PS4 games are probably the prettiest I've seen this gen in terms of technical achievement. All the little details and the beautiful lighting look gorgeous. The animations are very expressive, compared to similar games from the last gen.

Problem for me is, I definitely get what people here are saying about it looking too busy. The high detail density combined with motion blur and 30 fps maximum made stuff like Horizon Zero Dawn really difficult for me to look at for the first hours of that game. It just makes my head hurt whenever I turn the camera and everything turns into a mess for half a second. And there's still this endless chase for "realism" rather than an attempt at something stylistically amazing, like what Nintendo's always going for. Sometimes that works okay, but mostly the face-captured... faces... just look uncanny to me, even as far as we have come. Like, does Peter Parker in Spider-Man have a better face than pretty much anyone on the 360 besides maybe the Metal Gear Solid V characters? Yes. Does it look in any way interchangeable with a real face? Hell no! And even when the faces look amazing, like on whatever engine Capcom is currently using, I don't think it works well for every property. Resident Evil? Yeah, sure. Realistically rendered environments for RE7 combined with realistic faces in a first-person view make that look incredibly scary because you feel more like you're there. Seems to do the same job for the remake of RE2 as well. But when DMC5 showed off its realistic faces on top of old character designs, it felt like a bunch of cosplayers showed up to the party (because they literally did, they scanned some real models and their clothing). I'm afraid they're gonna try it on Street Fighter next.

Anyway, while I appreciate great lighting, detailed textures and some good animation, I don't feel like it's the same kinda leap that last gen was from the ps2/gamecube/xbox, and I think chasing realism has its place, and that's for specific games and in environments. More than anything, I'd like a higher framerate and less motion blur, even if the visuals had to be toned down to make it work. In some ways we haven't moved much forward at all. When Horizon Zero Dawn had to automate the character animation/design process to some extent to get enough NPCs for an RPG, most of them end up looking like weird mannequins with uncanny expressions. Is Horizon as buggy and janky and broken as Dragon Age Inquisition? No, but it also hasn't gotten the NPCs to not look like crap.

So I dunno. In some ways I think this gen is very beautiful, but in others I'm a bit disappointed. Personally I gotta get myself a switch, 'cause HD-ifying Nintendo's usual stylized graphics make them look absolutely amazing, and they don't much go for the busy look with a lot of blur. I wanna see the next Street Fighter with that kinda cartoony approach.

Deathstriker

Graphics are fine, but most games this gen are boring or derivative. There aren't many classics like last gen had with BioShock, Mass Effect, Red Dead, TLoU, etc.

Eurobum

@haneybd87: Fair enough. Aliasing gets reduced as lines get smaller and less jaggy; it's also more noticeable/distracting in movement, since our brain's pattern recognition tends to notice hard contrasts and edges. But you still need anti-aliasing, so it's a marginal improvement. It would matter a lot if we didn't have AA, but with AA the difference is: a little less blur on those edges.

Concerning scaling: what I was saying is that more bottlenecks open up once you enter 4K, from DDR and VRAM bandwidth, to PCIe, to CPU utilization. The dreadful loading times, for instance, are an unintended consequence of this 4K over-ambition. Even TV input lag increases, and HDMI has hard limits.

The other thing is that you actually want the monitor refresh rate to be double your frame rate, just as we want video frame rate to be at least double our optical nerve cycle rate. Hitting a stable 60 frames was/is important with vsync, but actually 60 FPS on a 60 Hz screen is not that great; without sync it can lead to more screen tearing. I suspect as long as TVs are 60 Hz, consoles will target 30 fps. I don't know what the vsync situation is with current gen games; I tend to disable it.

haneybd87

@eurobum: Most TVs these days are at least 120hz displays so that isn’t much of an issue. I don’t know about the low end but my 3 year old 4K tv is a 240hz display (with 60hz input). The 4K input lag is pretty good too and the same goes for most mid to high end 4k TVs these days.

Anyways, most console games have Vsync so the refresh rate isn’t much of an issue in the first place. Only reason you’d want to disable vsync on a PC game is if you have a g/freesync monitor otherwise you’ll end up with screen tearing more likely than not.

All that said I still really have to disagree that 4K is only a marginal improvement in games. It’s night and day.

pompouspizza

I have been a console player my whole life. I recently upgraded to a 4K TV and have both enhanced consoles and I feel like games are constantly blowing me away. I fully expect that trend to continue with Red Dead.

I think it’s very easy to take for granted how good games actually look today.

TheManWithNoPlan

I don't care as much these days as long as it runs well and is fun to play.

MrGreenMan

Personally I think too much emphasis is put on what looks realistic and what looks good in 4K, and we're forgetting about what actually makes games fun. I really do think we have pushed things graphically to the point where so many games play very much the same, and very little is new or fresh anymore; the big AAA developers keep beating the same mechanics into the ground, and it's gotten so repetitive that so many games from just about every AAA publisher are, in one way or another, the same. Even as good as something like Spider-Man is, it's very much rooted in the same type of games we have been playing for the last 10-15 years.

I have no problem with pushing the boundaries and making things look amazing, but I really do feel the consoles especially have focused just on that. In the transition from the PS3/Xbox 360 to the Xbox One and PS4, very little has changed. It's like we're still playing the same games that came out in 2006, just better-looking and with much bigger production values. It reminds me a lot of the film industry, in how it's eating itself and not really realizing how harmful that is, just repeating the same mistakes over and over. Then again, it seems more and more that the industry wants to be more and more like movies.

Nodima

I think games are outstanding looking and this is the first generation where I really don't feel any urgency to get on to another run of consoles. My TV maxes out at 720p anyway, so I'm starting to butt up against the same situation I was in during the second half of the PS3's run where I was still playing on a 24" CRT with component cables until 2012 or so. I remember how baffled I was by Mass Effect 2 once I got the new HDTV from my dad after he upgraded, and I'm sure I'd have a similar experience with my PS4 if I were to get a new TV, or Pro, now.

flatblack

Spider-Man and God of War have blown me away this year. Forza Horizon 4 also looks absolutely incredible, though I'm playing that on PC so it's not strictly this console generation. We're getting to the point this gen where developers really feel like they're hitting their stride. Can't wait to see how good Red Dead Redemption 2 and The Last of Us 2 look when they come out.

Eurobum

@haneybd87 said:

@eurobum: Most TVs these days are at least 120hz displays so that isn’t much of an issue. I don’t know about the low end but my 3 year old 4K tv is a 240hz display (with 60hz input). The 4K input lag is pretty good too and the same goes for most mid to high end 4k TVs these days.

Anyways, most console games have Vsync so the refresh rate isn’t much of an issue in the first place. Only reason you’d want to disable vsync on a PC game is if you have a g/freesync monitor otherwise you’ll end up with screen tearing more likely than not.

All that said I still really have to disagree that 4K is only a marginal improvement in games. It’s night and day.

About that. TVs and Monitors both advertise "Hz" but they are sadly completely different things. And while real 120 Hz TVs do exist, they were only created during the 3DTV days, and they use (fast switching) TN panels that are immediately recognizable as that, being rather dark and having narrow viewing angles. Most LCD TVs refresh at 60 Hz, because it is the best available technical compromise.

A PC monitor cannot fake its picture; it has to wait for a new signal from the frame buffer of your video card every time. A TV, though, can and does totally fake things, because it really doesn't matter if the signal is delayed half a second or two. Say you get an HD 720p30 signal: the TV waits to get a couple of frames, compares two consecutive frames and then inserts additional fake frames using simple algorithms that reduce stutter. If it inserts 1 frame, the signal turns into "60 Hz", 2 frames makes "90 Hz", 3 extra frames makes "120 Hz"... That's the reason why monitor refresh rates can have really weird Hz numbers like 144 and 165, while TVs only advertise Hz in multiples of 25/30. This is also how you get "240 Hz" and higher in TV labels, even though this isn't yet possible or useful with LCD technology.
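A minimal sketch of that interpolation arithmetic (a deliberate oversimplification, as the post itself notes; real TV motion interpolation is proprietary and more involved):

```python
# Motion-interpolating TVs insert N synthetic frames after each real input
# frame, so the advertised panel rate becomes input_rate * (N + 1).

def advertised_hz(input_fps: int, inserted_frames: int) -> int:
    return input_fps * (inserted_frames + 1)

print(advertised_hz(30, 1))  # 60 "Hz"
print(advertised_hz(30, 2))  # 90 "Hz"
print(advertised_hz(30, 3))  # 120 "Hz"

# A PC monitor, by contrast, only shows frames the GPU actually delivered,
# which is why monitor refresh rates can be odd numbers like 144 or 165 Hz.
```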

The gaming mode in TVs does disable all this frame delay filtering and turns a TV into a rather ordinary 60 Hz Monitor (using mostly VA-panels, which offer great contrast).

How exactly TVs fake their fake Hz may be proprietary, so my explanation may be an oversimplification.

Anyhow, if you don't believe my technobabble: HDMI 2.0 only allows 4K@60 Hz, which is definite, incontrovertible proof that all TVs are created equal and created low. Even the pricey OLED! But there is always the next version of HDMI in the future...

haneybd87

@eurobum: I understand what you’re saying and I did mention that the TV is 60hz input. I don’t see what that has to do with screen tearing and that being why games shoot for 30hz. If you have vsync on you can do 60hz no problem. They’re not shooting for 30hz because of screen tearing, it’s because they’re unwilling to compromise any kind of visual fidelity for a better frame rate, even though many of us would consider a low frame rate to be a major compromise in visual fidelity.

Eurobum

@haneybd87: It's a great question, and there is even a life lesson/life hack in it for people once they understand it. I only became aware of it because you asked.

Tearing aside, the reason why we want to sync Hz and FPS is that people want a frame rate that is steady, where every frame is shown for 16.7 ms ideally, or at least no frame takes longer than 50 ms. The other thing is that you want that frame as close to the moment you hit the right stick or a button as possible, so filters, scalers, rendering, and encryption all create a delay and contribute to the overall frame latency. Vsync can delay a single frame for quite a long time, making controls quite unresponsive and movement juttery, so it's best disabled. At the same time, going from a 60 Hz screen to a 120 Hz one shortens frame latency by 8.3 ms on average. So that's 8.3 ms extra you get to react to some guy coming around a corner. Both disabling Vsync and going to 120 Hz give you, say, 16.6 ms on average, or much more in busy situations where the frame rate drops!
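A quick worked version of the frame-time numbers in that paragraph. It reads the 8.3 ms figure as the difference in refresh period, i.e. the worst-case wait for the next scan-out, which seems to be what is meant here:

```python
# Frame and refresh periods behind the latency figures above.

def period_ms(hz: float) -> float:
    return 1000.0 / hz

print(period_ms(60))                   # ~16.7 ms per refresh at 60 Hz (the "ideal" frame time)
print(period_ms(120))                  # ~8.3 ms per refresh at 120 Hz
print(period_ms(60) - period_ms(120))  # ~8.3 ms less waiting for the next refresh, worst case

print(period_ms(30))                   # ~33.3 ms per frame at a steady 30 fps
print(period_ms(20))                   # 50 ms, the "no frame should take longer than this" line
```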

But if you match refresh rate and FPS exactly without Vsync, some weird things start to happen. For one, the screen can miss every refresh by just a little bit, creating a worst-case latency scenario. It can also happen that the refresh always hits the moment the frame buffer is written to, meaning that every frame has a tear showing half of the old and half of the new frame. Yes, Vsync fixes all that, but it comes at a price (added latency), and it even makes some people motion sick.

The actual reason why frame rate needs to be half of refresh rate is really simple, now that I think about it: 30 FPS isn't some "best value" or magic number, it's simply half of 60. The magical, "cinematic" number for our eyes is actually 24 FPS, so if we had 48 Hz monitors, consoles would target 24 FPS. In fact, 50 Hz TVs existed and PAL DVDs used 25 FPS video.

You need double the refresh rate so frames don't run into each other, and so frames get dealt with in reasonable time. Say a friend texts you once a day: the best strategy is to check your phone twice a day, to make sure you answer in a timely fashion and to make sure texts don't pile up. Why that is the best strategy has to do with probabilities and time windows. You simply can't fail to reply within 24 hours to a text message that goes out every day but at different times, if you check every 12 hours.

haneybd87

@eurobum: Vsync latency is quite minimal and not noticeable. Even for someone like me who's extremely sensitive to low frame rates (30fps tends to make me nauseous), the difference is negligible enough so as not to matter. I can see that if you're playing a very frame-specific fighting game it could matter, but even for multiplayer shooters it isn't that big of a deal. In any case, most of these 30fps console games have vsync on anyways, so 60fps with vsync is way better than 30fps with vsync.

Otacon

I feel like the amazing looks of games kind of crept up on us this gen; it's been mainly the PS4 exclusives for me. I was playing God of War and just had this realisation a few hours in: this game looks phenomenal. I'm all for getting better and better, and maybe ray tracing will be the next big step. But I play the Switch a lot and I can get similar reactions to something like Breath of the Wild, albeit a different type of "wow".

Eurobum

@haneybd87 said:

In any case most of these 30fps console games have vsync on anyways so 60fps with vsync is way better than 30fps with vsync.

Even with the different types of Vsync (vanilla, double/triple buffering), they can only sync to 60 Hz: every frame -> 60 fps, every 2nd frame -> 30 fps, every 3rd frame -> 20 fps, every 4th -> 15 fps. The minute you miss the 16.7 ms time window, you drop down to 30 fps. There is kind of a cliff, which is maybe why consoles also dynamically scale down resolution, to stabilize frame rate. Infamously, Zelda: BotW bounced between 30 and 20 frames when it was released. Bouncing between 60 and 30 fps should also be quite noticeable. The Switch could have used a kind of zero-latency Freesync/G-Sync, since it already comes with a screen built in and because it saves battery power, but it doesn't, possibly because of TV mode.
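A small sketch of that cliff: with strict vsync on a 60 Hz panel, a frame that misses the 16.7 ms window waits for the next refresh, so the effective rate can only be 60 divided by a whole number. This ignores adaptive sync and the triple-buffering point raised below:

```python
import math

REFRESH_HZ = 60.0
REFRESH_MS = 1000.0 / REFRESH_HZ    # ~16.7 ms per refresh

def vsynced_fps(render_time_ms: float) -> float:
    """Effective frame rate when every frame must wait for the next 60 Hz refresh."""
    refreshes_needed = math.ceil(render_time_ms / REFRESH_MS)
    return REFRESH_HZ / refreshes_needed

for ms in (15, 17, 34, 40, 60):
    print(f"{ms:>2} ms per frame -> {vsynced_fps(ms):.0f} fps")
# 15 -> 60, 17 -> 30, 34 -> 20, 40 -> 20, 60 -> 15: miss the window and you fall a whole step.
```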

A lot of console-style animation and combat probably didn't come from a different (narrower) field of view or joypad controls, but instead came from latency. Things like "Batman" combat, which is basically just QTEs; also prioritization of animation. Just hit a button and you snap to climb a wall, or swing the Zweihänder for 8 seconds. Games got good at fudging responsiveness.

haneybd87

@eurobum: You won't drop down to 30fps from 60 if you're using triple-buffered vsync; that's just a double-buffering thing. In PC games triple buffering is usually the de facto vsync setting, and if not, there's usually an option for it. The same can be done on consoles, so it's still not a good reason to aim for 30fps over 60.

emprpngn

I play mostly on consoles these days, and honestly if I had the spare $1k-1.5k, I'd probably go with an Xbox One X and a nicer TV before I built a new PC. The performance is good enough for me, and I play enough older games to not really care about having everything cranked to ultra. Diminishing returns and all that.