Did you like the half-step consoles? Why?

liquiddragon

Poll Did you like the half-step consoles? Why? (124 votes)

Yeah 31%
No 54%
Poll 15%

I feel like the half-step consoles go against part of the appeal of dedicated boxes. I just have a base PS4 and it often crosses my mind that I'm not getting the optimal experience, even just within the environment of the platform of my choosing.

Maybe people who have the upgraded systems feel differently?

How do you guys feel about how the half-steps panned out, how do you think it impacted the generation, and how do you see it shaping home consoles going forward?

SethMode

I got an X because my friend wanted to buy my vanilla one, and then a Pro while in Korea because I couldn't bring my vanilla PS4 with me. While I'm happy with both, I can't say I've found the experience all that different. I've used performance mode on things like The Witcher and there is some difference, so that's nice, I guess?

I think honestly the biggest difference so far has been comparing both of my vanilla systems to their half-step counterparts noise-wise. Especially my old PS4. Holy shit, playing God of War on that was nuts. My wife could hear it in the other room if it was a quiet moment.

Nodima

I voted "poll" because, from the perspective of someone who stuck with their base PS4 for the entire generation, I feel like Sony kept their promise. Base users weren't left behind (I mean, I played God of War, Last of Us 2 and Horizon on a 720p TV, so what would I care anyway?), PC-oriented players seemed to get what they wanted, and the few games late in the generation that seemed explicitly built for PC, like Control, sounded like they ran poorly no matter what console you owned.

From a consumer perspective I do worry that it leads to a future where consoles are treated more like phones or TVs than desktop computers running a video game-focused OS, and thus the cycles consistently shorten to the point of irrelevancy. But for this moment in time I felt like both manufacturers did a good job of keeping the more casual, or less tech-forward, audience from feeling like they were no longer a priority. And that's the most important part for me.

csl316

I thought I'd hate it, but it seems sensible to do an upgrade halfway through a generation because technology changes so much. I never thought 4K would become affordable during this gen, but it did, and the pivot was necessary.

Honestly, I'm used to shorter generations anyway from the old days. Saying the PS3 was planned for 10 years seemed nuts at the time, but it happened. A part of me considers these half steps a new generation anyway, just with a single library and ecosystem. Microsoft is looking to blur it even further, and their approach is kind of disappointing to me because I like clean breaks, but it kind of makes sense. We're not seeing leaps like the PS1 to PS2, or even PS2 to PS3, anymore. It's more frame rate and resolution than brand new game concepts (unless the SSD really lives up to what they're saying).

It has me thinking about waiting until the Pro versions of these consoles, but eh, if I can get 3 or 4 years of solid gaming out of a new console I'll be willing to trade and upgrade when they make another significant upgrade (i.e. the jump from the original Xbox One to the One X, but not something like the jump to the One S).

Zeik

I never felt like I was missing out on that much without a Pro. The biggest reason I ever considered buying a Pro was because my original PS4 started having performance issues and if I was going to buy a new console I might as well upgrade. But I didn't actually care that strongly about the Pro exclusive features, and ultimately I never ended up buying one.

So it was fine. I still got to enjoy the games I wanted to play throughout the generation, but the option to upgrade was there for those that really wanted that.

The_Nubster

I was happy with a base PS4 until I bought Yakuza 6, which is frankly unacceptable on a base machine. I had to buy a Pro to handle it, and even then it's a choppy mess that lacks anti-aliasing entirely and is hideous to look at. It severely undercut the massive leap in world generation and fidelity that the Dragon Engine brought, and I actively hated it until the PC release of Kiwami 2.

There are games out there that should not be released on the base machines. While Sony and Microsoft do a very good job of making sure that their games are good experiences on both, there have been a good chunk of third-party games that absolutely needed to be held to a higher standard before being released.

notnert427

I have a launch Xbox One and a One X. I have enjoyed owning both. While the OG 'Bone being "underpowered" was massively overstated, it had no chance of doing what the One X can and does.

This generation was weird for consoles because the display tech really changed at about mid-generation with the adoption of 4K/HDR TV sets. Neither launch console had nearly what it takes to deliver 4K, so there's an argument that the half-step was necessary.

For me, I was already in need of a new TV set anyway, so the timing worked out. I would have bought a quality 4K/HDR set regardless of video gaming, but the One X was obviously more tempting once I already had that. I genuinely felt like I got enough out of my launch Xbox One from playing Titanfall, Forza Horizon 2, the MCC, Rise of the Tomb Raider, etc. to feel fine with upgrading.

And after the upgrade? Well, if I'm being honest, 4K res in games is more of an impressive benchmark to hit than it is some game-changer. Yeah, it looks better, but HDR is the real wow factor. HDR, when done right, is fucking incredible. The One X paid for itself the second I fired up the GOTY version of HITMAN 2016 and saw a game that I had put hundreds of hours in look jaw-droppingly better.

Having Netflix and Amazon Prime shows stream through it in 4K HDR has been awesome as well, and I opted for 4K HDR Blu-rays of a few of my favorite movies. I wish more game devs had taken advantage of the One X's power this gen, and some big games like RDR2 not having HDR was heartbreaking, but Forza Horizon 4 is an undeniably impressive showcase of what can be done on a console.

As for what it means going forward, it's probably nothing good. The mid-gen upgrades made sense this go-round because the display tech changed. Now I expect that, since the ice has been broken, the "cell phone model" will be adopted again next gen, without a need for it.

Honestly, there's arguably not even a need for the next gen yet at all. What are we getting? Ray tracing and faster load times? Okay. I'm sure I'll get one at some point, but the justification is thinner, and I only expect it to become more so if there's another mid-gen upgrade. 8K/120FPS is pure overkill, so it's hard to make that argument if that's the plan.

I just fear that we're approaching a point where we'll be sold some perception of obsolescence when that's not actually the case. There are precious few games that seem to even remotely stress the One X, and we're about to have something roughly twice as powerful? I guess devs having plenty of extra power to play with will be nice for them, but I'm not remotely buying that the industry took full advantage of the One X hardware in particular or is presently limited by it.

I guess I just don't want the focus to be this arms race from a hardware standpoint and benchmark-seeking from the software. Make a game good. Give it proper HDR. We already hit 4K/60 FPS this gen, so there's no excuse not to hit that easily next gen while focusing on making games as good as they can be. If I'm told I'm supposed to want/need 8K/120 three years after these new consoles, I'm gonna tell them to fuck right off.

We're very near diminishing returns, if not there already. I'm already questioning if next gen is coming too soon, and I'm going to get really pissed if we get the "phone model" where some marginal-ass improvement is sold as if it's a must-have, or worse, if consoles adopt Apple's planned obsolescence model where it's forced.

At least the power upgrade is there to theoretically justify the new boxes, but I'm not sure they're entirely necessary as of this moment. The pessimist in me suspects that the industry will only go harder in the direction of trying to sell me on the next thing instead of making the most out of the existing tech, and it might lose me if it does.

That's where I'm at.

ItHas2BeSaidKVO

I bought an X simply because I took forever to upgrade from my 360 (wasn't ever in the right space money wise), so when I finally had the disposable income, I just dropped that extra couple of hundred dollars on the best version that was available. Being able to play games that have the enhanced patch is fun and they do look gorgeous most of the time, but it'd be a lie to say I know enough about the visuals and performance stuff to know whether that extra money was truly spent well.

I mean, when I was living with a friend who had a base PS4, he let me use it and so I played through most of the exclusive stuff I wasn't able to touch being an Xbox user, and let me tell you, I thought Uncharted 4 (and Lost Legacy), Horizon, TLOU remastered (and also Until Dawn) looked goddamn amazing (although that was on a 4K TV, so that might skew the results).

Ryuku_Ryosake

In theory I see the problem with them. But I had to vote yes because the 2 consoles I bought this generation happened to be the PS4 Pro and the second revision Switch.

That was mainly a consequence of waiting so long to buy into this generation. Knowing that I was getting the most out of every game I was playing has been great. There is a bit of a chicken-or-the-egg thing going on with the poor performance of certain games on the base models. Do those games perform poorly because they targeted the higher spec? Or is it the normal late-gen situation where everything plays like garbage because all that matters is getting pretty visuals to hold up for advertisements? Maybe the upgraded models are getting better performance than usual for late gen, and the base models are getting the usual late-gen performance.

The Switch revision is a bit different. Having a longer, more reasonable battery life feels more like you are getting the actual vision of the console as it should be, rather than a compromised-to-ship version. There is probably some truth to that; they likely cut battery life targets to bury the Wii U as fast as possible. Also, oddly enough, with DP2 I seem to get at least solid double-digit framerates, unlike the single digits I had seen elsewhere, but I could be imagining things.

FacelessVixen

Yes, in the sense that I wouldn't downgrade from a PS4 Pro to a regular PS4, in the same vein as me not wanting to go back to my GTX 1060 after having a 2080 Ti for a year, so I appreciate the extra power that comes with both pieces of hardware. But also no, in that I believe one of the advantages of consoles over PCs is that you keep the hardware for an extended amount of time and thus shouldn't have to worry about mid-generation upgrades. That isn't really true anymore, since a PS5 Pro, Xbox Series XX, x1.5 (or whatever the fuck), as opposed to "slim" versions and relatively minor revisions, is within the realm of possibility. I'd rather only think about GPU upgrades, which I'm more accepting of.

inevpatoria

Yes, I bought both. No, I don't like the practice.

Were they worth it? That's going to be so subjective for everyone's respective setup. I play on a 1080p monitor at a desk, so a lot of the bells and whistles of the high-end hardware were lost on me regardless. But someone with a 4K display and the ability to leverage HDR will probably have a different perspective.

I upgraded from a base Xbox One to a One X. My PS4 is the only PlayStation 4 I own. I'll say two things:

1) The power differential between the base Xbox One and the Xbox One X was noticeable. As someone on a 1080p display, framerate and stability were the performance metrics I cared about most (and, for the record, I'm not someone who needs 60 frames-per-second in my games, I just need the framerate to hold steady). I began writing here that the improvements were most visible in third-party games, games like The Witcher and Hitman. But that's actually not true at all—Microsoft's first-party games looked absolutely gorgeous on the One X, and often ran at rock-solid framerates. I loved having access to different visual priority modes that hadn't previously existed on base hardware, and it genuinely improved my gaming experience on the Xbox platform.

2) PlayStation exclusives tend to look great no matter where you play them. I know how this can be misread, so let me assure you that this is less a criticism of the PS4 Pro technology and more a credit to the dedication Sony's in-house studios commit to providing all players the best experience possible. Frankly, Sony does a much better, much more reliable job optimizing their software for their entire line of consoles than Microsoft, where things sometimes feel as if they were built to run on the highest specifications. The only time I ever felt like the Pro saved me from a bad experience was God of War. Its 30 frames-per-second mode looked nice but introduced some degree of input delay otherwise mitigated by a faster framerate.

Like most people here, I'm in favor of a set-it-and-forget-it approach. I don't want to spend multiple hundreds of dollars on multiple hardware revisions over the course of a single generation. And as has already been noted in this thread, this generation launched at an awkward time relative to contemporary PC technology. I don't know enough about CPUs and GPUs to say this definitively, but my impression is that the next crop of consoles will hew a little more closely to the capabilities of modern PCs. Last time it wasn't even close. Even at the start.

Honestly, it feels like a seal that can't be unbroken. So I'm fully prepared to do this stupid dance again. Unfortunately.

colourful_hippie

No. They extended the life span of a mediocre class of hardware in terms of raw power. I get why they only gave near-generational bumps in graphics horsepower (or at least the X mainly did this) instead of beefing up the CPU, because devs would mostly prioritize the common denominator. I just hated how underpowered this generation was coming out of the gate, and their mediocre smartphone-level CPUs aged horribly with each passing year.

Instead of waiting for next gen, we would already have started to see more innovative gameplay through advanced AI instead of shinier graphics that waste horsepower targeting 4K.

ActuallyDeevees

I appreciate that developers weren't as bottlenecked with the half-step consoles, but at the same time, some consoles felt a little too underpowered. I used to own a PS4 Pro, and while the benefits were nice I don't know if it was worth $400. The One X seemed like what a half-step console should be on paper, but I can't really say definitively since I never owned one.

The worst half-step console that people tend to overlook was the New 3DS. The awful laptop nub, the single exclusive game that I genuinely think was meant to trick people into thinking there would be more coming, and how few games really utilized the extra horsepower made me regret purchasing one. The improved 3D was cool, but I never used the feature anyway.

thelingo56

I think it was very necessary for this console generation, mainly because 4K caught on. For 8K, it at least seems like it would be enough to just support running the UI at native res while keeping the game's internal render at a 4K target.

I don't think they should aim for a second console for next-gen if they can avoid it; having one obvious SKU allows for an easier, more consistent target for developers to hit. The HDMI 2.1 spec already supports 8K anyway, so people wouldn't need to swap their consoles for any kind of resolution bump. It's just unfortunate that Xbox seems like they'll be splitting their lineup right out the gate.

SethMode

@actuallydeevees: Don't forget it also plays SNES games! So...here are like, 20 of them and one of them is Earthbound just to get your hopes up and then you'll never hear from us regarding this system again.

God I regret the purchase of the New 3DS.

thelingo56

@colourful_hippie: This generation really felt so hollow simulation-wise in the AAA space. I truly hope it was just because of the CPUs being so terrible that developers couldn't push that tech forward.

Humanity

Nope, this was a terrible idea that I can only imagine was fueled by corporate greed and ambitions to chase the cell phone market's model of the same software running on annually iterative hardware. While the XB1 needed the bump, this splintered console gaming in a way that is completely antithetical to its entire existence. Like Jeff recently said on one of the shows, you shouldn't have to make the compromise between "Performance" and "Quality" modes on a console. The entire idea is that everyone gets the same treatment because you're in a tightly controlled ecosystem: you pop in the disc and it goes. Including graphics toggles in console games is, for lack of a better term... a slippery slope. Why not add a shadow quality slider next and let the gamers decide if they want 3 FPS more or blurry shadows? As we're now on the brink of a new generation of hardware, I can only think, "How long until the next half steps are introduced, and should I just hold out for those instead?"

ll_Exile_ll

@humanity said:

Nope, this was a terrible idea that I can only imagine was fueled by corporate greed and ambitions to chase the cell phone market's model of the same software running on annually iterative hardware. While the XB1 needed the bump, this splintered console gaming in a way that is completely antithetical to its entire existence. Like Jeff recently said on one of the shows, you shouldn't have to make the compromise between "Performance" and "Quality" modes on a console. The entire idea is that everyone gets the same treatment because you're in a tightly controlled ecosystem: you pop in the disc and it goes. Including graphics toggles in console games is, for lack of a better term... a slippery slope. Why not add a shadow quality slider next and let the gamers decide if they want 3 FPS more or blurry shadows? As we're now on the brink of a new generation of hardware, I can only think, "How long until the next half steps are introduced, and should I just hold out for those instead?"

I don't really see how some games having a "lower resolution, 60 FPS" mode and a "higher resolution, 30 FPS" mode is in any way a negative. That's a long way from tweaking anti-aliasing, ambient occlusion, and anisotropic filtering manually. Some players prefer a clean image; some value frame rate above everything. I don't see how having two options is somehow ruining the console philosophy. I'd rather not be stuck at 30 FPS because the developers decided a high resolution was more important than a better framerate. Other players may be frustrated by a blurry image when they find 30 FPS perfectly playable and would prefer a cleaner picture.

Of course, it would be great if games didn't have to sacrifice either for the other, but that's the nature of consoles: standardized hardware meant to minimize retail price will always be limited in some way. Calling this a slippery slope is just silly. Being able to choose between performance and image quality is a nice, player-friendly option, not the precursor to some doomsday where console games have graphics menus with 20 different toggles.

Humanity

@ll_exile_ll: While I would hardly call it a "doomsday," what is stopping console manufacturers from incorporating more graphics menu options? By your logic that would be great, because by the same philosophy gamers would be empowered with even more choices. Why not give me the option to turn off chromatic aberration? Or motion blur, or anything else? I'm sure many console gamers would want all these graphics toggles on their console version as well.

The point I was trying to make was not that choice is bad or that it will bring about anything quite as dramatic as a doomsday. The point is that for years these were closed loops, and developers only worked on one version of their game to make sure it was the best it could be. As a console player you would get a product that would, ostensibly of course, be optimized to run at the optimal resolution and framerate for the experience. You didn't have to worry that your half-step-down version of the console was getting an inferior experience because the game was clearly optimized for the step-up version, like Assassin's Creed Origins or Red Dead Redemption 2, which both featured massive framerate drops on base consoles. The inclusion of choice, while great in theory, more often than not left you with a game compromised in one way or another, or in some situations in both, unless you had the better hardware.

ll_Exile_ll

@humanity: I think the Assassin's Creed and Red Dead examples are poor. Look at the last couple years of the 360/PS3 gen: you had games like Far Cry 3 and Battlefield 3 that ran poorly on console and were being far outclassed by the PC versions, when earlier in the generation the PC experience generally didn't have as much of a gulf over the consoles.

I think it's fallacious to assume that games like Red Dead and Assassin's Creed would be better optimized for base hardware if the enhanced consoles didn't exist. I'd argue it's just as likely, if not more so, that there simply wouldn't be a very good console experience available at all for those games.

Humanity

@ll_exile_ll: That better console experience exists in the form of the half-step consoles. Now, I can only speculate that if the half steps didn't exist, those developers would spend more time making sure their framerate doesn't drop to 18-20 FPS when you ride your horse into every town, because that wouldn't be a very good look for them. Of course, you might absolutely be right that no matter what they did those games would always run poorly. Who knows.

As for the performance/graphics options, I do think it's a very subjective notion. I want to make it clear that I don't think you're wrong. I absolutely agree with you that choice is good. This generation we've seen a lot of great accessibility options introduced for games, and I think that's amazing. Not having to "mash" a button in God of War is a great option to have for me, and I think games are always better for it. For me personally, though, it's never really a choice. I don't know how to phrase this so it doesn't sound incredibly obnoxious, so apologies, but people that actually care about games will likely always choose performance. Someone that loves Souls games is always going to go for a better framerate. It just sucks that I know I'm playing a worse-looking game because of it.

ll_Exile_ll

@humanity: I mean, I get the sentiment that it never feels great to know that you have to choose to sacrifice in one area for the other to be ideal. However, I think the alternative of the developer deciding that all the nice graphical effects they worked hard on are more important than 60 FPS gameplay is worse. I think the history of console game development has shown pretty clear proof that developers are far more likely to sacrifice frame rate for better looking visuals than they are the opposite. Sure, some developers make 60 FPS a priority, but most seem to favor 30 FPS with better visuals.

If some developers choose to provide the option to sacrifice some visual fidelity in favor of a better frame rate, I think that's only a benefit. If that option isn't there and the developer makes that choice for the player, 9 times out of 10 they're going to choose visuals over frame rate.

navster15

I personally really liked the half step consoles, especially now that we can see both Microsoft and Sony moving to multiple SKUs for their next gen launches. It allows them to not compromise the top end to meet a lower price point while still having a lower priced entry point for people that don’t need the extra power. And heck, in Microsoft’s case, they’re still going to support the supposedly obsoleted base X1 even though it’s not even in production anymore. It’s honestly a win-win all around IMO.

Humanity

@ll_exile_ll: Oh definitely, screenshots sell better because you can't sell gameplay as easily, even on video. Although this generation honestly struggled to even provide that steady 30 FPS most of the time. I know people don't like variable framerates but between Medium Detail/60FPS and High Detail/30FPS I would take a variable 30-45 framerate like InFamous Second Son with a decent compromise in between for graphical fidelity. Last of Us 2 looks phenomenal and holds a super steady framerate - dunno how Naughty Dog does it, apart from the obvious blood, sweat and tears, but their games are able to perform in some magical ways.

ToughShed

I think Microsoft has been doing a great job lately, but when they said they were launching with a half-step console, I facepalmed.

If the consoles simply started at the right power level, this would never be necessary. For some reason no one thinks you can charge more than $400 for a console, and everyone shrieks at the thought but then complains about the half step. If we wouldn't balk at $500 to $600, a totally reasonable price for high-end electronics when games on the system are about to cost $70, we could get enough power to last and make more progress. If you compare it to the cost of other things, even in the video game console sphere like games and accessories, it's silly.

I won't be rushing out to buy one either, but they could come down in price over time. And trust me, I am not someone with a lot of spending money to throw around either, but the problems and shitty nature of this system, and how it ruins the purity of console gaming, are clear. It's obviously aimed at milking the consumer for more money in the long run. I say that as mostly a PC gamer who does really like consoles for what they are.

Anyways, I've seen this same thinking from fans and the media. On a Giant Bomb podcast at some point Vinny said the upcoming half-step Xbox Scarlet should launch at $300. For a brand new console!? Come on, man.

@colourful_hippie: Amen.

gkhan

I really liked the half-step consoles, but it might reflect the fact that I could rather easily afford to buy them. I like console games more than PC games for the convenience and for the experience, but it's also nice to have them keep up a bit more graphics-wise with the PC.

piousmartyr

I like them, but I'm not exactly an early adopter of consoles. I typically like to wait a while for the library to build up. In the last few generations there have also been console mini-revisions where the specs stay the same but they improve airflow or something else minor, like the revision to the Switch that improved the power draw of the CPU, which led to better battery life. That's typically when I jump on.

Onemanarmyy

If this is the new normal business model for consoles, I would straight up never buy the first version. I don't need to be there playing the freshest games on day one, nor do I care about the day-one multiplayer scene. And time and time again, we see that the first console year is not all that exciting game-wise. Why not buy in at the halfway point and get a good experience across the board instead of feeling like you're playing a gimped version of the game?

So, to answer the question: no, I don't like the practice, but at the same time, yes, I like being able to step in halfway through a generation and buy a more powerful box. Then again, if the lack of a half step would mean the next generation was accelerated by 2-3 years, I would prefer that. I guess we don't exactly know if the console cycles are being prolonged by all this, but my intuition and business sense say that they are.

I've seen some chatter around 8K resolution, but I doubt we'll see that in a meaningful way in the next console generation. I know they have namedropped the number, but that's natural given that it's the maximum output resolution of the GPU they include. 2017's mid-tier RX 570 GPU, which came with a retail price of $170, also outputs at 8K if you want it to. An impressive spec to mention, but naturally you're not going to have an acceptable gaming experience at that point.

On top of that, you need to sit very close to your TV to benefit from 8K. Let's say you own a beefy 65-inch TV. If you sit 4.3 ft away from the screen at 4K, you're getting the full benefit over a 1080p resolution; that's manageable to achieve. Only at an 8.6 ft distance are you getting no benefit at all over 1080p. But to see the full benefit of an 8K TV, you need to sit 2 ft away from your 65-inch screen. I just don't think many people have that kind of setup, so it wouldn't make much sense. 4K/60 should be the aim. Ray tracing, HDR, global illumination, god rays: that kind of stuff should make the screenshots visually exciting, not an 8K resolution.
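Those viewing distances follow from simple geometry: a 20/20 eye resolves detail down to about one arcminute, so the farthest distance at which a resolution still pays off is wherever one pixel spans one arcminute of your view. A quick Python sketch (the one-arcminute acuity figure and 16:9 aspect ratio are assumptions, not anything from the thread) reproduces the numbers for a 65-inch screen:

```python
import math

def max_benefit_distance_ft(diag_in, horiz_px, aspect=(16, 9)):
    """Farthest distance (feet) at which a 20/20 eye (one arcminute
    per pixel) still resolves individual pixels; sit closer than this
    and you get the full benefit of the resolution."""
    w, h = aspect
    width_in = diag_in * w / math.hypot(w, h)   # screen width from diagonal
    pixel_pitch = width_in / horiz_px           # inches per pixel
    one_arcmin = math.radians(1 / 60)           # ~2.909e-4 rad
    return pixel_pitch / math.tan(one_arcmin) / 12

for name, px in [("1080p", 1920), ("4K", 3840), ("8K", 7680)]:
    print(f"{name}: {max_benefit_distance_ft(65, px):.1f} ft")
```

Running it gives roughly 8.5 ft for 1080p, 4.2 ft for 4K, and 2.1 ft for 8K, which lines up with the distances quoted above.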

Avatar image for bigsocrates
bigsocrates

2276

Forum Posts

42

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

I liked them. I bought a 4K TV in 2016 and I stream video through my Xbox so this meant I could watch 4K video as well as have better gaming.

If console generations are going to be 7 years long, I don't think there's anything wrong with a refresh halfway through. Console generations used to be much shorter. If they're going to be 4-5 years long then they shouldn't do the half-steps, but this way people get value from their early purchases while games still look much better at the end of the generation than they did at the beginning.

7 years is a long time.

Avatar image for giantrobot24
GiantRobot24

129

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

I liked it because I bought my first PC earlier in the year that the PS4/X1 came out and this let me sit back and wait to see which console built up the most appealing library of exclusives. By the time the PS4 Pro came out there were enough games I wanted to play that it made sense to buy one. I won't be getting a console at launch this year either, so if they do the half-step thing again, great, but if not, I'll just get a base one that's probably been revised and costs less.

If I exclusively played on console I'd definitely care more, though.

Avatar image for ben_h
Ben_H

4285

Forum Posts

1618

Wiki Points

0

Followers

Reviews: 1

User Lists: 5

I'm a bit mixed on them. I didn't buy a PS4 until 2017, so I bought a Pro, and it has been great; the extra performance options at 1080p are nice.

However, I also got bit by the downside of these half-step consoles in that I have a near-launch Xbox One (the original model). The extended life of these consoles became quite obvious on that thing, especially as newer games came out. I played Outer Wilds last year since it was on Game Pass, and it was pretty poor from a performance perspective (the game's great, it's just that my Xbox could barely handle portions of it). Several other Game Pass games seemed similar in that they probably ran great on the Xbox One X but not well at all on a launch Xbox One.

Avatar image for giant_gamer
Giant_Gamer

815

Forum Posts

0

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

Redesigns and better engineering, like heat and fan-noise reduction, are welcome!

But telling owners to buy the console again for better graphics and performance is devilish.

Avatar image for ry_ry
Ry_Ry

1796

Forum Posts

153

Wiki Points

0

Followers

Reviews: 0

User Lists: 0

Due to some electrical issues I found myself going from a launch PS4 and Xbox One to a PS4 Pro and Xbox One S. And while I do have a 4K TV, I don't have HDR and visually I can't immediately tell the difference between 1080p and upscaled 4k on my TV given how far I sit from it. So with that in mind, here are my thoughts.

I don't feel like I gained a lot going from the PS4 to the PS4 Pro in terms of enjoyment. The Pro seems louder on average, and I'm sure some games like The Last Guardian ran smoother on it, but I didn't really have frame rate issues on the launch PS4. I thought Bloodborne was fine.

I do feel like the switch from the launch Xbox One to the Xbox One S was a MASSIVE improvement. A smaller box without the large power brick that still runs silently! I do feel like games run better on it vs. the launch box. I still hate the OS on this thing, but I feel like the S is what Microsoft should have originally shipped.

Switch vs. Switch Lite: due to a comedy of errors I wound up with both, and I love my Switch Lite. The regular Switch is now just a Ring Fit/Mario Kart box for my family, and that's okay. The Switch should have just been the Switch Lite with the ability to dock to a TV. The Joy-Con idea, while neat, just isn't worth it in the end.