#51 Edited by myketuna (1679 posts) -

I think a lot of this is interesting, but ultimately dumb until we get the consoles into the hands of people that are super deep into stuff like this. Otherwise, we're going off what this dude said and this dude responded, but then this report leaked, etc.

@seppli said:

In other words, Albert Penello is what is called in colloquial terms, a Spin Doctor!

I think the seminal hit "Two Princes" fits this conversation perfectly if you pretend the XBO and PS4 are the princes.

"Marry him, marry me

I'm the one that loved you baby can't you see?"

#52 Posted by Jimbo (9804 posts) -

@seppli said:

However, I believe there's more of a herd mentality to the broader market, it flocks to whatever the tastemakers flock to. In case of videogame consoles, the tastemakers are core gamers.

This is exactly right, imo. It's not as simple as saying core gamers will buy x and the mainstream audience will buy y. These discrete groups don't really exist, it's just convenient for the industry to put them into neat boxes like that. In reality though, everybody talks to everybody else. If the early adopters lean hard towards PS4 (because of price, performance or whatever they base their decision on), then when the next group comes along and buys 'whatever their friends have' then chances are their friends will have PS4s, and it snowballs from there.

This snowball effect wasn't really a factor before the current gen because online gaming wasn't a big deal. It didn't really matter which console your friends had before but suddenly this gen it did. Sony underestimated how much of an advantage they were giving Microsoft by giving them a year headstart with 360. I think Microsoft is now underestimating how much of an advantage they are giving Sony this gen by giving them a $100 headstart.

Perhaps the Kinect will give the X1 some immunity from what the core audience thinks and allow it to appeal directly to the 'non-core' audience, but at $500 I think that's a tough sell. Wii managed it and then some, but it did so at half that price. $250 seems like a more palatable, family-friendly price for something which you suspect is going to end up in a closet after 2 months; $500 seems like too much for people who might just be interested in the novelty value of the Kinect. If all somebody is interested in is the Kinect stuff for their kids to screw around with or whatever, maybe they'd rather just go buy a 360 Kinect bundle for $250 instead of paying twice as much for an X1.

X1 is too expensive to be the new Wii, but trying to be the new Wii is also ensuring it's too expensive to compete as the next-gen game console of choice. Trying to appeal to both audiences is likely to prevent them appealing to either. How many people truly want both the novelty of the Kinect functionality and the next gen performance? They'd better hope it's a lot, because that crossover is the only market segment where X1 is the best choice available.

#53 Edited by leebmx (2238 posts) -

Excuse my ignorance but is it not possible that once the consoles are released people can open up the boxes and have a look at what is inside? I expect it is more complicated than that - but is some sort of reverse engineering, having a look at the parts, not enough to determine the specs?

As for this Twitter/GAF conversation, I don't think it really tells us anything. From what I understand the PS4 definitely has the edge in power over the One. Doesn't it have better RAM and a faster GPU? However, I don't think this translates into a noticeable difference in gaming performance, apart from, maybe, the console exclusives, which in any case are becoming less and less important.

My impression is that in terms of their raw potential, more was possible on the PS3 than the Xbox, but we never saw the difference in cross-console titles. In fact the PS3 seemed worse a lot of the time (Skyrim for example) because of the quirky nature of its construction. The only difference this time around is the conventional PC set-up of both machines, which might make it easier to use the superior power of the PS4. The problem is that devs are always going to have to design with the weaker machine in mind, so the most significant difference is going to be maybe slightly better framerate, slightly faster loading and textures - nothing that the average punter is going to notice and nothing which will shift consoles. The Xbone would have to absolutely tank before devs started releasing games which had a real difference in quality.

Where we might see a difference is in the exclusives. This generation we had games like Uncharted which, graphically, seemed way superior to anything on the Xbox. This is possible again, but with no direct comparison possible it will not have the detrimental effect a markedly better COD, for example, might have.

In terms of sales I don't think any of this is enough to significantly shift the market place. What will determine console sales is services, price, sentiment (PS4 way ahead at the moment) and lastly, the games. I think people expect similar experiences, games-wise, on their machines - the rest of the experience is what tips the balance.

#54 Edited by jgf (387 posts) -
@syed117 said:

@seppli: the world has changed to the point where core gamers don't necessarily have the influence they used to. If that were the case, the Wii never would have taken off. That console was almost universally hated by core gamers. It was the joke of the industry for an entire generation. That's why we will probably never have another home console success like the ps2 was.

While I agree that the Wii took off due to its appeal to casuals, I don't see how the Xone is going to. If they announced a TV/Skype box with lots of funny, gimmicky games for Kinect and priced it comparable to the Wii - let's say at 249,- - then they would definitely own the next-gen casual market. But to get to that price point they would have to use a much less powerful box, which alienates the hardcore gamers.

Kind of like the Wii U, they are going for both casuals and hardcore, and I'm not sure if this is even possible. Most casuals won't drop 499,- to play some hours of Kinectanimals 2.0 with their family. On the other hand, 499,- is a tough sell to the hardcore crowd if they can get equal (or even better) looking multiplatform games on PS4 for 399,-.

I myself don't care too much about a 20% performance difference; as long as the games look and feel good, I don't care about polygon counts and the number of AA passes. But I do care if someone is trying to bullshit me, like MS PR does. When they talk specs, they should know that casuals don't care and hardcore gamers are going to call them out on their bullshit sooner rather than later. E.g. if they want to convince me that the "power of the cloud" has a significant impact, then they should show me a demo that proves their point - like a split screen of the same demo, one with the cloud enabled and one without. MS, just don't embarrass yourself and us with things like transistor counts.

#55 Posted by Syed117 (387 posts) -

@jgf: I've seen a lot of people saying similar things and all it does is show that people believe what they want to believe. Pre-existing biases never change.

Xbox Live, when first announced and launched, was laughed at. I bought that first headset with the card for Xbox Live and I never thought it would become what it is today. Like every new technology or new way of doing things, it takes time. The inclusion of an Ethernet port and a hard drive was laughed at as well.

People say they want proof but they don't want to wait. The same people blindly believe everything Sony says despite Sony always being a follower when it comes to giant changes that have occurred in the last two generations. Online play, achievements, and using the boxes for things other than gaming all had to start somewhere.

The same people who choose to shit on everything Microsoft is trying to do with the cloud believe GaiKai is completely magical and will work perfectly for everyone. Streaming entire games with little latency and on an infrastructure that does not exist seems more reasonable than games putting non essential parts of gameplay on cloud resources that already exist all over the world. That makes perfect sense.

These biases never change. People want to make excuses for everything.

If there was one thing to take away from that interview with Penello, it was the part about Sony ALWAYS making a big deal out of their hardware. He was absolutely right when he said they talked up the Emotion Engine and the Cell processor. They always do the same thing and the Sony fanbase always falls for it. We don't know if this time will be different, but logic dictates that you don't believe someone blindly when they have misled you twice in the past. Then again, we are talking about gamers. Logic doesn't really apply.

#56 Posted by JasonR86 (9659 posts) -

@seppli said:

@jasonr86 said:

Specs don't matter unless the difference is very big. Such as Wii U compared to X1 and PS4. If the difference is big enough that you see a resolution difference, framerate difference, texture quality difference, etc., then the specs matter an awful lot. I just don't know what this means until games get out there and reviewers start reviewing products.

As soon as the hardware is in a different league, sure. Wii U won't be able to run most of the games built specifically for PS4 & Xbox One & Gaming PCs. If the Wii U tried to run a game like The Witcher 3, it'd crash and burn at 3 frames per second - there's just no way.

I don't believe minor differences make that much of a difference, as long as the hardware plays in the same league, seeing how the PlayStation 3 managed to sell just as well as the 360, despite most games running noticeably worse on it. I think it will be similar this generation with PS4 and Xbox One, just with reversed roles.

I don't think we can know anything as consumers until a year into these consoles. In the second and third year is when we'll see noticeable differences if there are any.

#57 Posted by jgf (387 posts) -

@syed117: I'm not sure if I fully understand the point you are trying to make. But here's my answer nevertheless ;)

I think there is a difference between your examples of new technology that was laughed at and, for example, the "power of the cloud" - your examples are all far more specific. They could say that they make it easy for developers to use dedicated servers, with guaranteed availability and low pings around the world. They could also hint at some use cases, like offloading long-running background tasks to the cloud. But I don't see how one could magically turn a 30fps game into a 60fps one with the power of the cloud. Still, they seem to market this feature as some sort of magic bullet.

Compare that to Gaikai - it's far more specific. I don't believe that it will work flawlessly for everyone. I also somehow can't believe that they can reduce the cost of PS3 emulation with Gaikai to a level that makes sense to the consumer (you will still need some form of PS3-esque hardware for every connected player). But I tried OnLive before, so I have some idea about what can be achieved with this technology. And what they say is at least technologically possible.

I see that many people are already very biased aka fan boys. I try to stay neutral and I wish both consoles to succeed. A 50:50 split would be good for every gamer. Giving them enough money to work with while still having the need to stay humble and cater to the customer.

But as I said before, I'm allergic to marketing bullshit. You say your system is faster because you have 50% more shader units and faster memory? I believe you, because it makes sense to me. You tell me your graphics card is 30% slower than the competition, but it won't affect anything because of the power of the cloud? Then you'd better have something to back that up, because I can't make sense of it on my own.

#58 Posted by JayEH (525 posts) -

I have an industry secret I'll break here. All the multiplatform games will look the same. Really it doesn't matter at all, games will look great on both and some will look better on another.

#59 Posted by Darji (5294 posts) -

@jayeh said:

I have an industry secret I'll break here. All the multiplatform games will look the same. Really it doesn't matter at all, games will look great on both and some will look better on another.

That was not even the case with PS3/360 - there were always some differences - but now the PS4 seems to be more powerful, so we will definitely see it in terms of resolution and performance. Look at how Microsoft is already using cheap effects, like in Forza, where the reflection is just a copy of what the player is seeing instead of a realtime reflection - you can even see the HUD info that's shown over the car in it.

#60 Edited by big_jon (5723 posts) -

Can't wait to play some new VIDEOGAMES! November 22nd here I come!

I think it's going to be an overload for me because I'm going to get a Wii U then too, with Mario World and DKC.

#61 Posted by AlexGlass (688 posts) -

Not sure if it's to the benefit or detriment of Giant Bomb that closed topics on GAF end up migrating here.

Semi on-topic: Does the casual/doesn't-care-about-specs crowd outnumber the hardcore/foaming-at-the-mouth-about-specs crowd? My theory is that the latter group is in the minority and that specs don't matter as much as what games come out for which console.

Well, I don't think it got closed because of the topic so much as the way the discussion unfolded, and so far GB seems to have managed a more mature discussion without that board's vitriol towards Albert on a personal level.

So I would consider it a benefit? Or, better said, Giant Bomb is a benefit to the discussion? Of course that could also be because the discussion here is not really on topic anymore ;)

@seppli said:

In other words, Albert Penello is what is called in colloquial terms, a Spin Doctor!

Well that's what we need to know because he's committed to a pretty bold statement, with numbers, so that's kind of hard to spin at this point. I should have posted the quote in the OP but here's the specific quote:

We have Albert saying this on Reddit, in a thread that was closed:

Originally Posted by Albert Penello

I’m not dismissing raw performance. I’m stating – as I have stated from the beginning – that the performance delta between the two platforms is not as great as the raw numbers lead the average consumer to believe. There are things about our system architecture not fully understood, and there are things about theirs as well, that bring the two systems into balance.

People DO understand that Microsoft has some of the smartest graphics programmers IN THE WORLD. We CREATED DirectX, the standard API’s that everyone programs against. So while people laude Sony for their HW skills, do you really think we don’t know how to build a system optimized for maximizing graphics for programmers? Seriously? There is no way we’re giving up a 30%+ advantage to Sony. And ANYONE who has seen both systems running could say there are great looking games on both systems. If there was really huge performance difference – it would be obvious.

Along with Major Nelson seemingly backing him up:

"As I said above, Albert is one of the most amazing people I work with - that's why I invited him on my podcast a few weeks ago. I jab him a bit about posting on 'GAF (fact: they would not approve my account of there) but he's smart and driven. He's also right: We have some of smartest programmers in the world working on Xbox One. I am very much looking forward to the next few months (and beyond) as the truth comes out." http://www.reddit.com/r/xboxone/comments/1lt48f/albert_penello_there_is_no_way_were_giving_up_a/cc2nezu

Now Albert, like Larry, is a spokesperson for MS, so it seems to me he committed to this statement publicly. And then we have Adrian saying what he heard from developers in the OP, though Adrian is very vague about what exactly that's referencing, or when. And Adrian didn't clarify it further when he posted his "clarification" statement.

That's the issue at hand.

#62 Posted by jgf (387 posts) -

I believe Adrian more than Albert and the Major, for two simple reasons.

1. Adrian is not employed by Sony or MS, so it's more likely that his view is less biased than Albert's or the Major's. Also, he has a background as a developer, which leads me to believe that he knows what he's talking about and that he has the connections to other developers and can get in on some real talk with them.

2. Looking at the leaked specs, it makes sense. Faster memory, a better graphics card and so on. This makes it easy to believe that the PS4 has a significant advantage in raw power.

Of course I may be wrong, because there's some secret sauce included in the Xone we don't know about. But if they want me to believe it, they need a better argument than "we have smart people, therefore our console must be faster". Like I said in a previous post, when you want to convince me of something that does not seem logical at first glance, you'd better have a compelling argument to back it up. Otherwise I just assume it's some marketing-speak mumbo jumbo.

#63 Edited by AlexGlass (688 posts) -

@jgf said:

I believe Adrian more than Albert and the Major, for two simple reasons.

1. Adrian is not employed by Sony or MS, so it's more likely that his view is less biased than Albert's or the Major's. Also, he has a background as a developer, which leads me to believe that he knows what he's talking about and that he has the connections to other developers and can get in on some real talk with them.

2. Looking at the leaked specs, it makes sense. Faster memory, a better graphics card and so on. This makes it easy to believe that the PS4 has a significant advantage in raw power.

Of course I may be wrong, because there's some secret sauce included in the Xone we don't know about. But if they want me to believe it, they need a better argument than "we have smart people, therefore our console must be faster". Like I said in a previous post, when you want to convince me of something that does not seem logical at first glance, you'd better have a compelling argument to back it up. Otherwise I just assume it's some marketing-speak mumbo jumbo.

Well, the main reason I can't believe Adrian either is that, for one, based on the facts we know so far, it's not a 50% faster console overall. We already know that to be wrong, since the only area with a 50% advantage was the GPU, and after the upclock on the GPU it's around 40%. Then the X1 got a boost in the CPU, and of course a few other small areas. For another, it's just a vague statement in terms of comparing tech. And finally, he's a former employee of a developer who worked on an Xbox exclusive.

Not that I think Albert is telling the entire story either, but he's at least a bit more descriptive, as he recently posted in this thread:

I see my statements the other day caused more of a stir than I had intended. I saw threads locking down as fast as they pop up, so I apologize for the delayed response.

I was hoping my comments would lead the discussion to be more about the games (and the fact that games on both systems look great) as a sign of my point about performance, but unfortunately I saw more discussion of my credibility.

So I thought I would add more detail to what I said the other day, that perhaps people can debate those individual merits instead of making personal attacks. This should hopefully dismiss the notion I'm simply creating FUD or spin.

I do want to be super clear: I'm not disparaging Sony. I'm not trying to diminish them, or their launch or what they have said. But I do need to draw comparisons since I am trying to explain that the way people are calculating the differences between the two machines isn't completely accurate. I think I've been upfront I have nothing but respect for those guys, but I'm not a fan of the mis-information about our performance.

So, here are couple of points about some of the individual parts for people to consider:

• 18 CU's vs. 12 CU's =/= 50% more performance. Multi-core processors have inherent inefficiency with more CU's, so it's simply incorrect to say 50% more GPU.

• Adding to that, each of our CU's is running 6% faster. It's not simply a 6% clock speed increase overall.

• We have more memory bandwidth. 176gb/sec is peak on paper for GDDR5. Our peak on paper is 272gb/sec. (68gb/sec DDR3 + 204gb/sec on ESRAM). ESRAM can do read/write cycles simultaneously so I see this number mis-quoted.

• We have at least 10% more CPU. Not only a faster processor, but a better audio chip also offloading CPU cycles.

• We understand GPGPU and its importance very well. Microsoft invented Direct Compute, and have been using GPGPU in a shipping product since 2010 - it's called Kinect.

• Speaking of GPGPU - we have 3X the coherent bandwidth for GPGPU at 30gb/sec which significantly improves our ability for the CPU to efficiently read data generated by the GPU.

Hopefully with some of those more specific points people will understand where we have reduced bottlenecks in the system. I'm sure this will get debated endlessly but at least you can see I'm backing up my points.

I still I believe that we get little credit for the fact that, as a SW company, the people designing our system are some of the smartest graphics engineers around – they understand how to architect and balance a system for graphics performance. Each company has their strengths, and I feel that our strength is overlooked when evaluating both boxes.

Given this continued belief of a significant gap, we're working with our most senior graphics and silicon engineers to get into more depth on this topic. They will be more credible then I am, and can talk in detail about some of the benchmarking we've done and how we balanced our system.

Thanks again for letting my participate. Hope this gives people more background on my claims.

http://www.neogaf.com/forum/showpost.php?p=80951633&postcount=195

At Microsoft, we have a position called a "Technical Fellow" These are engineers across disciplines at Microsoft that are basically at the highest stage of technical knowledge. There are very few across the company, so it's a rare and respected position.

We are lucky to have a small handful working on Xbox.

I've spent several hours over the last few weeks with the Technical Fellow working on our graphics engines. He was also one of the guys that worked most closely with the silicon team developing the actual architecture of our machine, and knows how and why it works better than anyone.

So while I appreciate the technical acumen of folks on this board - you should know that every single thing I posted, I reviewed with him for accuracy. I wanted to make sure I was stating things factually, and accurately.

So if you're saying you can't add bandwidth - you can. If you want to dispute that ESRAM has simultaneous read/write cycles - it does.

I know this forum demands accuracy, which is why I fact checked my points with a guy who helped design the machine.

This is the same guy, by the way, that jumps on a plane when developers want more detail and hands-on review of code and how to extract the maximum performance from our box. He has heard first-hand from developers exactly how our boxes compare, which has only proven our belief that they are nearly the same in real-world situations. If he wasn't coming back smiling, I certainly wouldn't be so bullish dismissing these claims.

I'm going to take his word (we just spoke this AM, so his data is about as fresh as possible) versus statements by developers speaking anonymously, and also potentially from several months ago before we had stable drivers and development environments.

http://www.neogaf.com/forum/showpost.php?p=80962073&postcount=426

There are obviously some peak numbers in there that sound great and benefit from creative addition, but at least his numbers match up more with what everyone else has come up with after the recent upgrades. Adrian's don't. In his clarification, Adrian also failed to clarify his original Twitter number, or to provide any sort of proof for his hearsay or a time frame for where he got it from. Seemed to me more like he was backpedalling.

What I do understand from all of this is that Albert doesn't appear to be technically savvy enough to be having these discussions, and Adrian is posting Twitter statements that would only work on people who have absolutely no knowledge of hardware comparisons. It might work on a 13-year-old if you start talking about bits. Which goes back to my original point that it's ridiculous what these companies are doing, and they should just lay their cards out on the table rather than divulging this information in this manner.
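
For anyone wondering where the "creative addition" comes in, here is a quick back-of-envelope sketch in Python using only the peak figures quoted in this thread (the all-reads case at the end is an illustrative assumption, not something either side has published):

```python
# Paper-peak arithmetic behind the bandwidth figures quoted above.
DDR3_BW = 68        # GB/s, Xbox One DDR3 main memory
ESRAM_READ = 102    # GB/s
ESRAM_WRITE = 102   # GB/s
GDDR5_BW = 176      # GB/s, PS4 GDDR5

# The 272 GB/s figure is DDR3 plus BOTH eSRAM directions at once,
# i.e. it assumes perfectly balanced, simultaneous read and write traffic.
print(DDR3_BW + ESRAM_READ + ESRAM_WRITE)    # 272

# If eSRAM traffic is all one direction, the paper ceiling drops below the PS4's.
print(DDR3_BW + ESRAM_READ, "vs", GDDR5_BW)  # 170 vs 176
```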

#64 Posted by Istealdreams (148 posts) -

@alexglass: if you aren't on Microsoft's payroll, you should be. I have never seen someone so fervently defend the system of their choice against random Internet people.

#66 Posted by AlexGlass (688 posts) -

@istealdreams said:

@alexglass: if you aren't on Microsoft's payroll, you should be. I have never seen someone so fervently defend the system of their choice against random Internet people.

You haven't been on the internet for very long, then. I would hardly call this "fervent". I don't think me discussing the opposite point of view is any different from anyone else doing it. But perhaps you could point out what about my posts makes them "fervent"?

#67 Posted by subyman (606 posts) -

Well, PS4 has 50% more shaders. I'm surprised this is overlooked so often. 50% more shaders translates almost 1:1 to compute speed. More shaders does not mean higher resolution textures, but it means better real-time lighting, post processing, water effects, and such. Combine that with GDDR5 and these claims are not far-fetched at all.

#68 Edited by AlexGlass (688 posts) -

@subyman said:

Well, PS4 has 50% more shaders. I'm surprised this is overlooked so often. 50% more shaders translates almost 1:1 to compute speed. More shaders does not mean higher resolution textures, but it means better real-time lighting, post processing, water effects, and such. Combine that with GDDR5 and these claims are not far-fetched at all.

Sure but we know the X1's GPU/CPU had clock boosts, and the eSRAM ended up having more bandwidth, and a console isn't a GPU. It's made up of many components. And Adrian is making a blanket statement about the console as a whole. I mean, CPUs, bandwidth, sound chips, architecture,....last time I checked, they're important and add up to the total computational capabilities of each box. And even if you go off of total TFLOPs, you come around 40%. So how is he coming up with the numbers? Or the developers he spoke to? And when was it? Prior to the GPU/eSRAM/CPU upgrades, after?

Something doesn't add up about his comment either.
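
For reference, the "around 40%" comes out of simple paper math. The CU counts and clocks are the ones quoted in this thread; the 64 ALUs per CU and 2 FLOPs per ALU per clock are standard GCN bookkeeping rather than numbers from the thread itself - a rough sketch:

```python
# Rough paper math behind the "around 40%" figure above.
ALUS_PER_CU = 64
FLOPS_PER_ALU_PER_CLOCK = 2

ps4_tflops = 18 * ALUS_PER_CU * FLOPS_PER_ALU_PER_CLOCK * 0.800e9 / 1e12  # ~1.84
x1_tflops = 12 * ALUS_PER_CU * FLOPS_PER_ALU_PER_CLOCK * 0.853e9 / 1e12   # ~1.31

print(f"{(ps4_tflops / x1_tflops - 1) * 100:.0f}% more peak GPU compute")  # ~41%
```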

#69 Edited by Seppli (10251 posts) -

@alexglass said:

@subyman said:

Well, PS4 has 50% more shaders. I'm surprised this is overlooked so often. 50% more shaders translates almost 1:1 to compute speed. More shaders does not mean higher resolution textures, but it means better real-time lighting, post processing, water effects, and such. Combine that with GDDR5 and these claims are not far-fetched at all.

Sure but we know the X1's GPU/CPU had clock boosts, and the eSRAM ended up having more bandwidth, and a console isn't a GPU. It's made up of many components. And Adrian is making a blanket statement about the console as a whole. I mean, CPUs, bandwidth, sound chips, architecture,....last time I checked, they're important and add up to the total computational capabilities of each box. And even if you go off of total TFLOPs, you come around 40%. So how is he coming up with the numbers? Or the developers he spoke to? And when was it? Prior to the GPU/eSRAM/CPU upgrades, after?

Something doesn't add up about his comment either.

All Microsoft is doing is overclocking Xbox One, which is detrimental to their chip yield, making the box more expensive in production, and will lead statistically to more hardware failures overall (which still likely will be absolutely negligible). The tweaking increases Xbox One's raw power by a few measly percent. What it's been mostly about is Spin. And you're lapping it up wholesale.

#70 Edited by Seppli (10251 posts) -

@jasonr86 said:

@seppli said:

@jasonr86 said:

Specs don't matter unless the difference is very big. Such as Wii U compared to X1 and PS4. If the difference is big enough that you see a resolution difference, framerate difference, texture quality difference, etc., then the specs matter an awful lot. I just don't know what this means until games get out there and reviewers start reviewing products.

As soon as the hardware is in a different league, sure. Wii U won't be able to run most of the games built specifically for PS4 & Xbox One & Gaming PCs. If the Wii U tried to run a game like The Witcher 3, it'd crash and burn at 3 frames per second - there's just no way.

I don't believe minor differences make that much of a difference, as long as the hardware plays in the same league, seeing how the PlayStation 3 managed to sell just as well as the 360, despite most games running noticeably worse on it. I think it will be similar this generation with PS4 and Xbox One, just with reversed roles.

I don't think we can know anything as consumers until a year into these consoles. In the second and third year is when we'll see noticeable differences if there are any.

Nah, it will be apparent day 1, unless publishers put an effort into keeping parity amongst platforms, rather than making the best version for each. Other than deliberately holding back, there's no conceivable reason whatsoever for games not to run better on PS4 out of the gate.

#71 Edited by AlexGlass (688 posts) -

@seppli said:

@alexglass said:

@subyman said:

Well, PS4 has 50% more shaders. I'm surprised this is overlooked so often. 50% more shaders translates almost 1:1 to compute speed. More shaders does not mean higher resolution textures, but it means better real-time lighting, post processing, water effects, and such. Combine that with GDDR5 and these claims are not far-fetched at all.

Sure but we know the X1's GPU/CPU had clock boosts, and the eSRAM ended up having more bandwidth, and a console isn't a GPU. It's made up of many components. And Adrian is making a blanket statement about the console as a whole. I mean, CPUs, bandwidth, sound chips, architecture,....last time I checked, they're important and add up to the total computational capabilities of each box. And even if you go off of total TFLOPs, you come around 40%. So how is he coming up with the numbers? Or the developers he spoke to? And when was it? Prior to the GPU/eSRAM/CPU upgrades, after?

Something doesn't add up about his comment either.

All Microsoft is doing is overclocking Xbox One, which is both detrimental to their chip yield, making the box more expensive, and will lead to more hardware failures overall. The tweaking increases Xbox One's raw power by a few measly percent. What it's been mostly about is Spin. And you're lapping it up wholesale.

Speak facts. I'm not lapping up anything, but it seems to me you are, and you're listing a whole bunch of assumptions with nothing to back them up. The increases add actual performance, and yeah, it's a small percentage, but a percentage nonetheless. The numbers we do know just don't add up to 50%, regardless of whether or not you even take that into consideration. It's not just spin.

Turning a GPU CU % difference into a blanket statement that refers to the entire console... now that's lapping it up wholesale. A GPU =/= a console. That's a fact. MS doesn't need to spin anything there. You just need some basic common sense to know that people going around spreading that line are in fact spreading FUD.

#72 Posted by JasonR86 (9659 posts) -

@seppli said:

@jasonr86 said:

@seppli said:

@jasonr86 said:

Specs don't matter unless the difference is very big. Such as Wii U compared to X1 and PS4. If the difference is big enough that you see a resolution difference, framerate difference, texture quality difference, etc., then the specs matter an awful lot. I just don't know what this means until games get out there and reviewers start reviewing products.

As soon as the hardware is in a different league, sure. Wii U won't be able to run most of the games built specifically for PS4 & Xbox One & Gaming PCs. If the Wii U tried to run a game like The Witcher 3, it'd crash and burn at 3 frames per second - there's just no way.

I don't believe minor differences make that much of a difference, as long as the hardware plays in the same league, seeing how the PlayStation 3 managed to sell just as well as the 360, despite most games running noticeably worse on it. I think it will be similar this generation with PS4 and Xbox One, just with reversed roles.

I don't think we can know anything as consumers until a year into these consoles. In the second and third year is when we'll see noticeable differences if there are any.

Nah, it will be apparent day 1, unless publishers put an effort into keeping parity amongst platforms, rather than making the best version for each. Other than deliberately holding back, there's no conceivable reason whatsoever for games not to run better on PS4 out of the gate.

Unless they want to keep the budget down and make the games run similarly until a clear console leader is known. Again, this is all conjecture cause none of us here know jack-shit.

#73 Edited by RoarImaDinosaur (191 posts) -

The problem I have with Albert's argument is he is saying the 6% up-clock is more significant than the GPU advantage the PS4 has over the Xbox One. I feel like he is picking and choosing what benefits the console's performance without really knowing or understanding what he is saying. He is just the messenger, after all, and I think he should hold off on comments comparing the specs and just show off how great the launch games look. It will go much further for them than wasting time trying to dispel the disparity between the two consoles' performance.

Beauty is in the eye of the beholder.

#74 Posted by AlexGlass (688 posts) -

@roarimadinosaur said:

The problem I have with Albert's argument is he is saying the 6% up-clock is more significant than the GPU advantage the PS4 has over the Xbox One. I feel like he is picking and choosing what benefits the console's performance without really knowing or understanding what he is saying. He is just the messenger, after all, and I think he should hold off on comments comparing the specs and just show off how great the launch games look. It will go much further for them than wasting time trying to dispel the disparity between the two consoles' performance.

Beauty is in the eye of the beholder.

I don't think he is saying it's more significant than the PS4's GPU advantage. I think he was trying to say it's more significant than people give it credit for but I don't think he understands that argument doesn't make a whole lot of sense.

#75 Posted by Istealdreams (148 posts) -

@alexglass: http://i.word.com/idictionary/fervent

Maybe you should take a look at the large majority of your posts on Giant Bomb and get back to me after you've slept it off.

And just as a point of fact before it gets brought up, I have an Xbox One day one edition reserved.

#76 Edited by AlexGlass (688 posts) -

@istealdreams said:

@alexglass: http://i.word.com/idictionary/fervent

Maybe you should take a look at the large majority of your posts on Giant Bomb and get back to me after you've slept it off.

And just as a point of fact before it gets brought up, I have an Xbox One day one edition reserved.

I know very well what the word means. Perhaps you should study it a bit more and see if it applies before you start throwing words around at people for no apparent reason when they don't actually fit - especially in this thread, and especially when it comes to hardware. Hot-glowing, zealous, explosive... point it out. Or stop throwing around words just to sound cool and put people down or demean them for whatever reason you have. It's pretty obvious to anyone that's been following this discussion that it's been nothing but laid back, fact-driven, calm and respectful. So once again, care to point it out?

#77 Posted by Humanity (9062 posts) -

All this talk about performance and specs is really hyping up my expectations of next gen consoles. At this point, after having witnessed this ongoing dick measuring contest between PS4 and XBO fans in terms of hardware, if the nextgen titles don't blow me away with their graphics I'll be horribly disappointed.

Still excited though.

#78 Posted by subyman (606 posts) -

@seppli said:

@alexglass said:

@subyman said:

Well, PS4 has 50% more shaders. I'm surprised this is overlooked so often. 50% more shaders translates almost 1:1 to compute speed. More shaders does not mean higher resolution textures, but it means better real-time lighting, post processing, water effects, and such. Combine that with GDDR5 and these claims are not far-fetched at all.

Sure but we know the X1's GPU/CPU had clock boosts, and the eSRAM ended up having more bandwidth, and a console isn't a GPU. It's made up of many components. And Adrian is making a blanket statement about the console as a whole. I mean, CPUs, bandwidth, sound chips, architecture,....last time I checked, they're important and add up to the total computational capabilities of each box. And even if you go off of total TFLOPs, you come around 40%. So how is he coming up with the numbers? Or the developers he spoke to? And when was it? Prior to the GPU/eSRAM/CPU upgrades, after?

Something doesn't add up about his comment either.

All Microsoft is doing is overclocking Xbox One, which is both detrimental to their chip yield, making the box more expensive, and will lead to more hardware failures overall. The tweaking increases Xbox One's raw power by a few measly percent. What it's been mostly about is Spin. And you're lapping it up wholesale.

Speak facts. I'm not lapping up anything, but it seems to me you are, and you're listing a whole bunch of assumptions with nothing to back them up. The increases add actual performance, and yeah, it's a small percentage, but a percentage nonetheless. The numbers we do know just don't add up to 50%, regardless of whether or not you even take that into consideration. It's not just spin.

Turning a GPU CU % difference into a blanket statement that refers to the entire console... now that's lapping it up wholesale. A GPU =/= a console. That's a fact. MS doesn't need to spin anything there. You just need some basic common sense to know that people going around spreading that line are in fact spreading FUD.

Actually, a console pretty much is a GPU these days, especially with devs off loading more tasks to the GPU. The big push will be OpenCL for physics this generation, which blazes on AMD GPUs. Sony having a 50% increase in GPU compute will be very important. A low end Intel i3 runs with high end CPUs in most PC games other than sims like Civ 5 and Shogun or poorly threaded games in which IPC is king (neither of these consoles have good IPC compared to current Intel processors anyway) while, say, jumping from a 7850 to a 7970 is a massive increase in speed.

The clock speed increase is thermally controlled. It'll boost into a higher bin if the console has the thermal headroom otherwise it will down clock. It is a firmware feature that Sony can add at anytime. It may just be a few cores that boost, much like Intel's Turbo feature.

50% more shaders is the difference between 40FPS and 60FPS in a graphically intensive game.
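
That claim assumes the game is completely shader-bound and that frame rate scales linearly with shader count, which real games only approximate. Under that best-case assumption, the arithmetic is just:

```python
# Best-case illustration of the 40 -> 60 fps claim above.
base_fps = 40
shader_ratio = 18 / 12          # 50% more CUs/shaders, as quoted in the thread
print(base_fps * shader_ratio)  # 60.0
```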

#79 Edited by Griffinmills (149 posts) -

http://i.word.com/idictionary/fervent

This link doesn't work for me. It just returns an advertisement page about being able to search on an iPhone. Regardless, I don't think fervent applies to the posts from Alex in this thread, which I would describe more as "measured." Keep up the good effort, Alex.

Once it got rolling, there has been lots of good discussion in this thread, especially compared to that GAF thread linked in the OP. I too will be interested when the consoles launch and we can get some concrete answers. My own POV lines up well with @jimbo's take in post #52: that the enthusiast and hardcore crowd tend to have more pull than their simple numbers would indicate, as they influence their friends and family who look to them for advice and guidance in these matters. This influence isn't absolute, and sometimes the VHS beats the Betamax!

#80 Edited by jgf (387 posts) -

@alexglass: First off, I want to make sure that we are talking about the same statement:

So one developer said on Twitter that he spoke to several other developers, and they told him that in their tests the PS4 performs significantly (up to 50%) better than the Xone.

As it is not specified further, I'll assume they were talking about in-game 3D engine (framerate) performance, as that would make the most sense. I hope we can agree on that.

So let's talk about the more specific points the MS guy made. Some of them are quite interesting. For me it boils down as follows.

  1. 18 CUs vs 12 CUs is not 50% faster, because of some multithreading/communication overhead. He fails to state how much faster it is, then. From my experience as a programmer I would say that multithreading on graphics cards scales quite well, so perhaps we're talking 45% faster. He explicitly avoids addressing that, though.
  2. A single Xone CU is 6% faster than a PS4 CU, so we've got something like 18 vs 12.7, which puts us at 41-42% faster; minus the unspecified multithreading overhead, let's say 35-38% faster. For graphics card performance alone, that's a number I could believe.
  3. 10% faster CPU + "better" audio chip. Although I don't like the unspecific "better" audio part, I accept that the CPU is faster by about 10-15%, depending on how much that audio thing is really relevant. From PC gaming I know that 3D games rarely max out the CPU and instead put a heavy load on the GPU.
  4. The memory bandwidth "on paper". We have 176gb/sec vs 272gb/sec, which he derives from 68gb/sec DDR3 + 102gb/sec ESRAM read + 102gb/sec ESRAM write. I believe him that the theoretical bandwidth is 272gb/sec because of the simultaneous read/write on ESRAM. However, he does not address how that translates to game performance, because I see 2 bottlenecks here:
    1. Those 102gb/sec read/write speeds are only available for a small portion of the memory (32MB vs 8GB), and at the 102gb/sec write speed that buffer is full in roughly a third of a millisecond; after that you'll have to go through the 68gb/sec DDR3. So simply adding numbers here is very theoretical.
    2. The biggest issue I have is that he adds up the simultaneous read/write speed of the ESRAM. So in order to max that bandwidth out, you'll need to perform exactly as many read as write operations to ESRAM at any given point in time during the game. That seems like a tough restriction to pull off in a real-world scenario.
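
Running the numbers from points 2 and 4.1 quickly in Python (a rough sketch using only the paper figures quoted in this thread; real scaling losses are anyone's guess):

```python
# Quick check of the numbers in the list above.
cu_ratio = 18 / (12 * 1.06)                # PS4 CUs vs clock-adjusted Xone CUs
print(f"{(cu_ratio - 1) * 100:.0f}%")      # ~41-42%, before any scaling losses

esram_size_gb = 32 / 1024                  # 32 MB expressed in GB
fill_time_ms = esram_size_gb / 102 * 1000  # at the quoted 102 GB/s write rate
print(f"{fill_time_ms:.2f} ms")            # ~0.31 ms to fill the whole eSRAM once
```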

So here is my conclusion. I still believe that developers see a significant performance advantage for the PS4 in their current tests, because some of the Xone's special features (especially the on-paper memory bandwidth advantage) seem hard to utilize in practice. However, when some of the undoubtedly smart guys at MS figure that stuff out and provide devs with easy-to-use library support, things may change and the performance gap could narrow, but someone has to put in extra work to figure that stuff out. Kind of like with the PS3.

But it seems reasonable that when you as a developer simply take your current engine and port it to PS4/Xone without any further special optimizations, you are going to see a significant performance gap. Therefore I still believe that Adrian just told the truth about how things are in their current state. But the MS guy also had some fair points: somewhere down the road there is untapped optimization potential. He failed to draw conclusions for a real-world scenario, though.

#81 Edited by AlexGlass (688 posts) -

@subyman said:

Actually, a console pretty much is a GPU these days, especially with devs off loading more tasks to the GPU. The big push will be OpenCL for physics this generation, which blazes on AMD GPUs. Sony having a 50% increase in GPU compute will be very important. A low end Intel i3 runs with high end CPUs in most PC games other than sims like Civ 5 and Shogun or poorly threaded games in which IPC is king (neither of these consoles have good IPC compared to current Intel processors anyway) while, say, jumping from a 7850 to a 7970 is a massive increase in speed.

The clock speed increase is thermally controlled. It'll boost into a higher bin if the console has the thermal headroom otherwise it will down clock. It is a firmware feature that Sony can add at anytime. It may just be a few cores that boost, much like Intel's Turbo feature.

50% more shaders is the difference between 40FPS and 60FPS in a graphically intensive game.

GPU is no doubt the most important component but to sit here and say it's so important it makes everything else irrelevant is just a fallacy. To extrapolate that CU difference out to your entire system is just a big fat lie.

Bandwidth, CPU, memory, architecture still play a big role. And if you're going to start using GPGPU to do compute functions traditionally reserved for the CPU, then obviously you're not going to have that GPU advantage anymore. It's going to take away from that. On top of that, there's still a lot of code better suited to run on the CPU, so dumping it all off on the GPGPU isn't feasible. Code better suited for GPGPU has to be highly parallel. Whatever CPU code you're running on GPGPU you should still be able to run on the CPU but it doesn't work as nicely the other way around.

Also, having 50% more CUs in this case still doesn't equal a 50% advantage even in that one area, since they're now running at a slower speed, 800MHz vs the X1's 853MHz. And the other issue is, the more CUs you have, the more parallel work you need in order to keep them fed efficiently every cycle. For example, it's a lot easier for developers to keep 9 fed, run them at more cycles per second and reach their peak theoretical numbers than 18. Your code would need to be twice as wide in parallel calculations to get the same efficiency from having 18, so efficiency will definitely play a role.

#82 Posted by AlexGlass (688 posts) -

@jgf said:

@alexglass: First off, I want to make sure that we are talking about the same statement:

So one developer said on Twitter that he spoke to several other developers, and they told him that in their tests the PS4 performs significantly (up to 50%) better than the Xone.

As it is not specified further, I'll assume they were talking about in-game 3D engine (framerate) performance, as that would make the most sense. I hope we can agree on that.

So let's talk about the more specific points the MS guy made. Some of them are quite interesting. For me it boils down as follows.

  1. 18 CUs vs 12 CUs is not 50% faster, because of some multithreading/communication overhead. He fails to state how much faster it is, then. From my experience as a programmer I would say that multithreading on graphics cards scales quite well, so perhaps we're talking 45% faster. He explicitly avoids addressing that, though.
  2. A single Xone CU is 6% faster than a PS4 CU, so we've got something like 18 vs 12.7, which puts us at 41-42% faster; minus the unspecified multithreading overhead, let's say 35-38% faster. For graphics card performance alone, that's a number I could believe.
  3. 10% faster CPU + "better" audio chip. Although I don't like the unspecific "better" audio part, I accept that the CPU is faster by about 10-15%, depending on how much that audio thing is really relevant. From PC gaming I know that 3D games rarely max out the CPU and instead put a heavy load on the GPU.
  4. The memory bandwidth "on paper". We have 176gb/sec vs 272gb/sec, which he derives from 68gb/sec DDR3 + 102gb/sec ESRAM read + 102gb/sec ESRAM write. I believe him that the theoretical bandwidth is 272gb/sec because of the simultaneous read/write on ESRAM. However, he does not address how that translates to game performance, because I see 2 bottlenecks here:
    1. Those 102gb/sec read/write speeds are only available for a small portion of the memory (32MB vs 8GB), and at the 102gb/sec write speed that buffer is full in roughly a third of a millisecond; after that you'll have to go through the 68gb/sec DDR3. So simply adding numbers here is very theoretical.
    2. The biggest issue I have is that he adds up the simultaneous read/write speed of the ESRAM. So in order to max that bandwidth out, you'll need to perform exactly as many read as write operations to ESRAM at any given point in time during the game. That seems like a tough restriction to pull off in a real-world scenario.

So here is my conclusion. I still believe that developers see a significant performance advantage for the PS4 in their current tests, because some of the Xone's special features (especially the on-paper memory bandwidth advantage) seem hard to utilize in practice. However, when some of the undoubtedly smart guys at MS figure that stuff out and provide devs with easy-to-use library support, things may change and the performance gap could narrow, but someone has to put in extra work to figure that stuff out. Kind of like with the PS3.

But it seems reasonable that when you as a developer simply take your current engine and port it to PS4/Xone without any further special optimizations, you are going to see a significant performance gap. Therefore I still believe that Adrian just told the truth about how things are in their current state. But the MS guy also had some fair points: somewhere down the road there is untapped optimization potential. He failed to draw conclusions for a real-world scenario, though.

I agree with your numbers. That's what I have as well, and that's why I say, not even under the most ideal of comparisons in favor of the PS4 can you come up with a 50% difference... unless you specifically measured an engine running heavy GPGPU compute prior to the clock upgrade. Then yeah... but his Twitter post is certainly nowhere near that specific, and he clearly comes back and states "The PS4 is faster." Meaning as a whole.

As far as the eSRAM part, sure, it's obvious those are theoretical peak numbers, but I think the reason they're doing that is because the PS4's bandwidth is also peak and theoretical. It's true that the X1's peak bandwidth applies to a fraction of RAM while the PS4's bandwidth applies to the entire GDDR5 RAM pool, so it should still have a significant advantage. But the truth is we have no idea just how much of that bandwidth things such as framebuffer ops will chew up this gen, and that's why I think there is some truth to them adding that bandwidth up. The eSRAM will be very handy for things like that. That eSRAM will likely be filled and utilized close to 100% at all times, while the peak bandwidth to main memory at 176GB/s is a pipe dream under normal operating conditions. Especially if you start taking into consideration latency, which will also play a role.

So yeah, them adding that up, and the need to find simultaneous read/write operations to keep that eSRAM busy and get its peak bandwidth, is similar in a lot of ways to needing to write and find parallel code to keep 18 CUs busy at all times vs 12 to get your 40% difference in compute. Both are probably unrealistic.
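
A toy way to picture that simultaneous read/write requirement (my own simplification, using the per-direction figure quoted earlier in the thread, not a published memory model):

```python
# Each eSRAM direction caps at 102 GB/s, so the combined 204 GB/s only
# shows up when traffic is split 50/50 between reads and writes.
def esram_peak(read_share):
    """Max eSRAM throughput (GB/s) when `read_share` of its traffic is reads."""
    per_direction = 102
    return per_direction / max(read_share, 1 - read_share)

for r in (1.0, 0.75, 0.5):
    total = 68 + esram_peak(r)  # DDR3 + eSRAM paper peak, GB/s
    print(f"{int(r * 100)}% reads -> {total:.0f} GB/s total")
# 100% reads -> 170, 75% reads -> 204, 50% reads -> 272
```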

Speaking of latency and GPU compute units, how will the lower latency and eSRAM affect this area? This is another topic not really discussed. If you do plan on utilizing GPU compute, then bandwidth isn't going to be nearly as important as latency.

#83 Edited by The_Laughing_Man (13629 posts) -

@jgf: so are you saying that the Xboxone is the "cell" of this generation?

#84 Posted by jgf (387 posts) -

@the_laughing_man: I would say that the cell was even worse in terms of complexity, but in a sense yes! I'll say it: maxing out esram performance on xone makes it the "cell" of next gen gaming ;)

#85 Posted by The_Laughing_Man (13629 posts) -

@jgf: I still wanna know why they are having a developer show the day after the AMD stuff.

#86 Posted by jgf (387 posts) -

@alexglass said:

As far as the eSRAM part, sure, it's obvious those are theoretical peak numbers, but I think the reason they're doing that is because the PS4's bandwidth is also peak and theoretical. It's true that the X1's peak bandwidth applies to a fraction of RAM while the PS4's bandwidth applies to the entire GDDR5 RAM pool, so it should still have a significant advantage. But the truth is we have no idea just how much of that bandwidth things such as framebuffer ops will chew up this gen, and that's why I think there is some truth to them adding that bandwidth up. The eSRAM will be very handy for things like that. That eSRAM will likely be filled and utilized close to 100% at all times, while the peak bandwidth to main memory at 176GB/s is a pipe dream under normal operating conditions. Especially if you start taking into consideration latency, which will also play a role.

So yeah, them adding that up, and the need to find simultaneous read/write operations to keep that eSRAM busy and get its peak bandwidth, is similar in a lot of ways to needing to write and find parallel code to keep 18 CUs busy at all times vs 12 to get your 40% difference in compute. Both are probably unrealistic.

Speaking of latency and GPU compute units, how will the lower latency and eSRAM affect this area? This is another topic not really discussed. If you do plan on utilizing GPU compute, then bandwidth isn't going to be nearly as important as latency.

Of course both bandwidth numbers are theoretical, but I would still argue that the 176gb/sec of the PS4 is far less theoretical than the 272gb/sec of the Xone, because of those additional restrictions (size and simultaneous read/write) that have to be met. I think there is a reason why all PC GPUs go with GDDR5 instead of the DDR3/ESRAM combo.

The added latency is most certainly a drawback for CPU-intensive tasks with many small read/write operations; how that translates to GPGPU performance - I have no clue if latency trumps bandwidth for high memory volumes there. I was just arguing that due to the simpler architecture and bigger GPU, I can see why the PS4 performs significantly better in early 3D-heavy tests. How things change with more refined algorithms/optimizations on the Xone side will be very interesting to see.

#87 Posted by MonkeyKing1969 (2691 posts) -

If you ask me, that whole conversation on GAF seemed murky at best. We don't know the full specs of either machine, so until we do I'd take this sort of discussion with a grain of salt. I'm not even a huge fan of MS hardware, but it seems very unlikely there is that much of a difference. It could be that the CPU/GPU with the faster RAM in combination boosts the raw advantages for the PS4... but 50% seems far-fetched.

It's just a guess, but I think the games will look pretty much the same; it might just come down to Digital Foundry tests showing the slight variations or the trickery going on. In the end, does it matter? This generation will be dominated by indie games, or so it seems, and the last time I looked both systems can handle some FEZ.

#88 Edited by AlexGlass (688 posts) -

@jgf said:

@alexglass said:

As far as the eSRAM part, sure, it's obvious those are theoretical peak numbers, but I think the reason they're doing that is because the PS4's bandwidth is also peak and theoretical. It's true that the X1's peak bandwidth applies to a fraction of RAM while the PS4's bandwidth applies to the entire GDDR5 RAM pool, so it should still have a significant advantage. But the truth is we have no idea just how much of that bandwidth things such as framebuffer ops will chew up this gen, and that's why I think there is some truth to them adding that bandwidth up. The eSRAM will be very handy for things like that. That eSRAM will likely be filled and utilized close to 100% at all times, while the peak bandwidth to main memory at 176GB/s is a pipe dream under normal operating conditions. Especially if you start taking into consideration latency, which will also play a role.

So yeah, them adding that up, and the need to find simultaneous read/write operations to keep that eSRAM busy and get its peak bandwidth, is similar in a lot of ways to needing to write and find parallel code to keep 18 CUs busy at all times vs 12 to get your 40% difference in compute. Both are probably unrealistic.

Speaking of latency and GPU compute units, how will the lower latency and eSRAM affect this area? This is another topic not really discussed. If you do plan on utilizing GPU compute, then bandwidth isn't going to be nearly as important as latency.

Of course both bandwidth numbers are theoretical, but I would still argue that the 176GB/s of the PS4 is far less theoretical than the 272GB/s of the Xone, because of those additional restrictions (size and simultaneous read/write) that have to be met. I think there is a reason why all PC GPUs go with GDDR5 instead of the DDR3/eSRAM combo.

The added latency is most certainly a drawback for CPU-intensive tasks with many small read/write operations; as for how that translates to GPGPU performance, I have no clue whether latency trumps bandwidth for large memory volumes there. I was just arguing that, due to the simpler architecture and bigger GPU, I can see why the PS4 performs significantly better in early 3D-heavy tests. How things change with more refined algorithms/optimizations on the Xone side will be very interesting to see.

Yeah, I'm with you on that. You should easily be able to get more than 68GB/s out of that 176GB/s to main RAM even at 50% efficiency.

As far as latency goes, I don't think GDDR5 will be much of a disadvantage in the PS4's case for typical main RAM access, simply because it has so much bandwidth available. But rather, I wonder whether having the eSRAM on-chip will help with GPGPU on the X1. Of course you'd lose the ability to fill the eSRAM for other uses, but I'd be interested to know if it's flexible enough to help in that area. Could it serve a similar purpose for GPGPU compute units as a slower L2 or L3 cache does for a CPU?

I don't think I've heard anyone ever touch on this point, but it's interesting to think about.
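
For what it's worth, the way a small fast pool like that usually gets exploited is by explicitly tiling the working set through it, like a software-managed cache. Here's a purely conceptual sketch - this is not the actual X1 API, and the only real number in it is the assumed 32MB capacity:

```python
# Conceptual sketch of using a small on-chip pool (eSRAM-sized) as a
# software-managed scratchpad for a GPGPU-style job. Illustration only.
ESRAM_BUDGET = 32 * 1024 * 1024  # assumed 32 MB of fast on-chip memory

def process_in_tiles(data: bytes, tile_bytes: int = ESRAM_BUDGET) -> bytes:
    """Split a large buffer into tiles that fit the fast pool, work on each
    tile while it is 'resident', then write the results back out."""
    results = []
    for offset in range(0, len(data), tile_bytes):
        tile = data[offset:offset + tile_bytes]        # "upload" tile to fast memory
        results.append(bytes(b ^ 0xFF for b in tile))  # stand-in for the compute step
    return b"".join(results)                           # results back to main RAM

# The potential win is that the compute step hits low-latency memory; the cost
# is the explicit tiling, plus losing the eSRAM for anything else meanwhile.
```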

#89 Posted by Istealdreams (148 posts) -

@alexglass: It also means impassioned; and when a majority of your posts - by a large margin - are championing/defending/explaining and postulating on the actual function of anything the Xbox One can do, then yes, I would say that you fervently defend every position of the system. Even to the point of calling out speculation about the PS4 while speculating on the functionality and uses of the Xbox One, the Cloud, and Kinect.

--And fervently is a commonly used word; it wasn't used to impress or look cool, just to get my point across. Unless you're a developer or work for Microsoft or Sony, everything people post with regard to use and function is conjecture. No matter how many times they post about it.

#90 Posted by MonetaryDread (2020 posts) -

Here is one thing that most people on the forums keep forgetting to bring up in conversation. One of the main reasons why game studios preferred working on Microsoft's hardware this generation is the tool sets that were available to them. Microsoft is a software company first and knows how to create dev tools that let creatives express themselves more easily. Sure, this generation involves a large number of upgrades to the hardware, but I am certain that the biggest difference between this generation and the last is the software dev tools. Think of multiplayer on the 360 vs the PS3. The main reason why Xbox Live was so much more reliable is that Live was a software package that took a lot of the redundant labour out of the development cycle, whereas the PS3's network software was in such infancy at launch that developers had to basically code everything from scratch.

I remember listening to Cliffy B on Joe Rogan's podcast a few years ago, and he was talking about the differences between the versions of Unreal Engine over the years. He brought up how the new version (what would eventually become Unreal 4, I assume) was designed to streamline the whole process of creating games. Epic was doing things like creating a massive library of models so 3D modellers never had to make another table or door frame again. Then you have to imagine features that streamline actor pathfinding, or improved physics-based animation systems (think GTA and how the character would react if you walked too close to a moving car). I assume that the same kind of process is going on at Microsoft.

#91 Posted by The_Laughing_Man (13629 posts) -

@jgf said:

@alexglass said:

As far as the eSRAM part, sure, it's obvious those are theoretical peak numbers, but I think the reason they're doing that is because the PS4's bandwidth figure is also a theoretical peak. It's true that the X1's peak bandwidth applies only to a fraction of its RAM while the PS4's bandwidth applies to the entire GDDR5 RAM pool, so the PS4 should still have a significant advantage. But the truth is we have no idea just how much of that bandwidth things such as framebuffer ops will chew up this gen, and that's why I think there is some truth to them adding that bandwidth up. The eSRAM will be very handy for things like that. That eSRAM will likely be filled and utilized close to 100% at all times, while the peak bandwidth to main memory at 176GB/s is a pipe dream under normal operating conditions, especially once you start taking latency into consideration, which will also play a role.

So yeah, them adding that up - and the need to find simultaneous read/write operations to keep that eSRAM busy and hit its peak bandwidth - is similar in a lot of ways to needing to write parallel code that keeps all 18 CUs busy at all times versus 12 to get your 40% difference in compute. Both are probably unrealistic.

Speaking of latency and GPU compute units, how will the lower latency and eSRAM affect this area? This is another topic not really discussed. If you do plan on utilizing GPU compute, then bandwidth isn't going to be nearly as important as latency.

Of course both bandwidth numbers are theoretical, but I would still argue that the 176GB/s of the PS4 is far less theoretical than the 272GB/s of the Xone, because of those additional restrictions (size and simultaneous read/write) that have to be met. I think there is a reason why all PC GPUs go with GDDR5 instead of the DDR3/eSRAM combo.

The added latency is most certainly a drawback for CPU-intensive tasks with many small read/write operations; as for how that translates to GPGPU performance, I have no clue whether latency trumps bandwidth for large memory volumes there. I was just arguing that, due to the simpler architecture and bigger GPU, I can see why the PS4 performs significantly better in early 3D-heavy tests. How things change with more refined algorithms/optimizations on the Xone side will be very interesting to see.

Yeah, I'm with you on that. You should easily be able to get more than 68GB/s out of that 176GB/s to main RAM even at 50% efficiency.

As far as latency goes, I don't think GDDR5 will be much of a disadvantage in the PS4's case for typical main RAM access, simply because it has so much bandwidth available. But rather, I wonder whether having the eSRAM on-chip will help with GPGPU on the X1. Of course you'd lose the ability to fill the eSRAM for other uses, but I'd be interested to know if it's flexible enough to help in that area. Could it serve a similar purpose for GPGPU compute units as a slower L2 or L3 cache does for a CPU?

I don't think I've heard anyone ever touch on this point, but it's interesting to think about.

So do we need to worry about the X1 being 50% weaker?

#92 Edited by Seppli (10251 posts) -

@alexglass said:

@seppli said:

@alexglass said:

@subyman said:

Well, PS4 has 50% more shaders. I'm surprised this is overlooked so often. 50% more shaders translates almost 1:1 to compute speed. More shaders does not mean higher resolution textures, but it means better real-time lighting, post processing, water effects, and such. Combine that with GDDR5 and these claims are not far-fetched at all.

Sure, but we know the X1's GPU/CPU had clock boosts, and the eSRAM ended up having more bandwidth, and a console isn't a GPU. It's made up of many components. And Adrian is making a blanket statement about the console as a whole. I mean, CPUs, bandwidth, sound chips, architecture... last time I checked, they're important and add up to the total computational capabilities of each box. And even if you go off total TFLOPS, you come out at around 40%. So how is he coming up with the numbers? Or the developers he spoke to? And when was it? Prior to the GPU/eSRAM/CPU upgrades, or after?

Something doesn't add up about his comment either.

All Microsoft is doing is overclocking the Xbox One, which is detrimental to their chip yields, makes the box more expensive, and will lead to more hardware failures overall. The tweaking increases the Xbox One's raw power by a few measly percent. What it's been mostly about is Spin. And you're lapping it up wholesale.

Speak facts. I'm not lapping up anything, but it seems to me you are, and you're listing a whole bunch of assumptions with nothing to back them up. The increases add actual performance, and yeah, it's a small percentage, but a percentage nonetheless. The facts based on the numbers we do know just don't add up to 50%, regardless of whether or not you even take that into consideration. It's not just spin.

Turning a GPU CU % difference into a blanket statement that refers to the entire console... now that's lapping it up wholesale. A GPU != a console. That's a fact. MS doesn't need to spin anything there. You just need some basic common sense to know that people going around spreading that line are in fact spreading FUD.

What assumption am I making here exactly?

  • That PS4 has a significantly more powerful GPU? Nope, that's a fact.
  • That PS4 has significantly faster RAM? Nope, that's a fact.
  • That the Xbox One team is increasing the clock speed for an insignificant speed boost (single-digit, percentage-wise), which inherently increases both the cost of production and the risk of operation (albeit likely just as marginally as the resulting computing power increase)? Nope, that's a fact.

To me, it is clear that increasing the clock speed this late in the game is primarily spin. It's about winning back some of the mindshare of core consumers. The Xbox One may only be a few percent faster than before, but Microsoft is hoping to win back 10% mindshare. Sure - you can say this is pure conjecture, but really, the corporate tailspin caused by post-E3 preorder numbers, and the resulting backpedaling and constant ongoing spin on the Xbox One and all of its features, especially hardware-power-wise, is painfully obvious.

We don't know how much more powerful the PS4 is. All we do know is that it is built with significantly better components that are more elegantly assembled, and that we will pay $100 less for it all. That's really all I need to know right now.

#93 Edited by Seppli (10251 posts) -

@syed117 said:

@seppli: The world has changed to the point where core gamers don't necessarily have the influence they used to. If they still did, the Wii never would have taken off. That console was almost universally hated by core gamers. It was the joke of the industry for an entire generation. That's why we will probably never have another home console success like the PS2 was.

I'm not sure how sales or preorders affect me, you, or anyone else. Does it really make you feel better about your purchase if other people also buy the same product? That is high-school cool-kid mentality at its finest.

Do you remember what happened last time? Everyone is so quick to point out how terrible the preorders for Xbox One are. Last time, the PS3 launched a year later with what Sony claimed was a much more advanced console, and sales were abysmal. They were terrible for a long time.

The PS3 was a financial disaster for Sony, but at the end of the generation they did just fine sales-wise. I think they've even pulled ahead by a few hundred thousand units. We are talking about a console that suffers from a lack of core features and is still getting inferior ports.

People writing off the Xbox One because of what it might sell in the first year are ridiculous. Microsoft knows that this isn't a sprint. If it were, Sony would have collapsed within a year of the PS3 launching and the Xbox 360 would never have been made.

If all hell breaks loose and one company has to go under, who would it be? One of the largest companies in the world with billions in revenue? Or Sony, a company that is slashing, burning, and selling off massive buildings just to return to profitability?

Dude, you're grasping at straws here. Most of the things you seem to read into my posts weren't even implied.

  1. Core consumers were immensely hyped for the Wii. It was not universally hated. Core gamers were universally disappointed way after it mattered, but they certainly did put units in front of family and friends early in the game.
  2. Are we here to discuss how preorders affect you? We're here playing at being market analysts. Preorders are certainly indicative, especially when pretty much every decision in the gaming industry builds on this metric. 2:1 preorders in favor of the PS4 mean that retailers are keen to put twice as many PS4s on their shelves later down the line.
  3. What has the PS3 got to do with anything? It's yesterday's news. And let me tell you, my consumer experience with PS3 & PS+ has been way better than 360 & Live Gold ever was. I never was somebody who gave a rat's ass about party chat, and now Sony's got all that too, and much more on top of it.
  4. Nobody's writing off Xbox One in the long term. Strictly speaking about *Phase 1 (Launch Window)*, however, a majority obviously has, myself included.
  5. Who's talking doom and gloom anyway?

#94 Edited by BIGJEFFREY (4984 posts) -

I play console games so I don't have to give a shit about this stuff. All I know is Halo will burn my eyes because of how fucking fantastic it looks.

#95 Posted by jgf (387 posts) -

So do we need to worry about the X1 being 50% weaker?

I would say that the performance difference is nothing a gamer should be worried about. Games will look good on both systems. Obviously I can't tell you an absolute number; given the information we have, my wild guess would be a 20-30% in-game performance lead for the PS4.

#96 Edited by Seppli (10251 posts) -
@jgf said:

@the_laughing_man said:

So do we need to worry about the X1 being 50% weaker?

I would say that the performance difference is nothing a gamer should be worried about. Games will look good on both systems. Obviously I can't tell you an absolute number; given the information we have, my wild guess would be a 20-30% in-game performance lead for the PS4.

30% is plenty. Hypothetically, a game locked at 60 FPS on PS4 might only manage 40-ish on Xbox One, which would likely have it end up being locked at 30 FPS. The difference between Call of Duty on PS3 and its 360 versions is about what to expect from all multiplatform games on PS4/Xbox One, if things go as predicted. The 360 version of Call of Duty was always pretty much a rock-steady 60 FPS, while the PS3 version looked more grungy and washed out and stuttered along at an erratic 25-50 FPS.
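
Rough math on that, assuming a flat 30% performance gap and nothing else changing (obviously a big simplification, but it lands in the same ballpark as the 40-ish guess):

```python
# Simplified frame-time math for a hypothetical 30% performance gap.
ps4_fps = 60.0
gap = 0.30                                       # assumed PS4 advantage
xb1_frame_ms = (1000.0 / ps4_fps) * (1.0 + gap)  # ~21.7 ms per frame
xb1_fps = 1000.0 / xb1_frame_ms                  # ~46 FPS unlocked
print(f"{xb1_fps:.0f} FPS")
# Can't hold a v-synced 60, so in practice it gets capped at the next
# stable target - 30 FPS - which is exactly the scenario described above.
```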

People who care about Call of Duty care about their frames. With an alleged performance gap like that, everybody should really care. With the PS4 supposedly being that far ahead performance-wise, and without any other mitigating factors, every multiplatform game that's pushing the envelope will be facing similar issues, and we will have to deal with the disparity in frames and such.

It's a big enough potential disparity to see problems arising pretty soon. Imagine CD Projekt RED manages to squeeze The Witcher 3 onto PS4 rendering natively at 1080p and locking it at a smooth 30 FPS. What will they do to make it run well enough on the 30% weaker system? Reduce rendering resolution? Draw distance? Just go with a rocky 25 FPS? Or have it be downright broken like all of Bethesda's open-world games were on PS3 (admittedly that was likely due to the PS3 having less overall system memory; there's no such disparity with this new generation)?

Why wouldn't we be worried about these things? If multiplats are consistently running better on PS4 (as they well should, comparing the spec sheets), and there are no other mitigating factors in favor of Xbox One, the PS4 should be your mainstay for multiplatform gaming for sure, unless you've already got a very capable PC. While it's too soon to tell for sure, I feel pretty confident that the known specs and the word coming down through the grapevine from various devs are a clear indication that we've got a significant disparity on our hands.

I'll only buy one of the new-gen boxes, at least for the next year or two. It's really no contest which box to pick, at least based on performance and price. PlayStation 4 it is for me.

#97 Edited by jgf (387 posts) -

@seppli: I think there are enough tricks they can pull. Just look at Ryse or Forza. Those don't look too shabby.

We don't know how big the difference will be; it's just a somewhat educated guess. But if you are mostly concerned with the performance of multiplatform titles and want to decide on PS4 vs Xone right now, in my opinion the PS4 is the safe bet. It has a simpler architecture that very likely performs great without too much time spent on optimization. On the other hand, the Xone is weaker than the PS4 - not even MS is debating that - it's only a question of how big the gap will be once all those special features are fully utilized.

#98 Edited by Seppli (10251 posts) -

@jgf said:

@seppli: I think there are enough tricks they can pull. Just look at Ryse or Forza. Those don't look too shabby.

We don't know how big the difference will be; it's just a somewhat educated guess. But if you are mostly concerned with the performance of multiplatform titles and want to decide on PS4 vs Xone right now, in my opinion the PS4 is the safe bet. It has a simpler architecture that very likely performs great without too much time spent on optimization. On the other hand, the Xone is weaker than the PS4 - not even MS is debating that - it's only a question of how big the gap will be once all those special features are fully utilized.

Agreed.

By the way, isn't AMD's hUMA - which the PS4 allegedly supports - doing approximately the same thing as Xbox One's eSRAM? You seem knowledgeable about these kinds of things. While we are already on the topic of *trickery*.

#99 Posted by jgf (387 posts) -

@seppli: I'm more of a software than a hardware guy (I work in the field of compiler/program analysis), so please take everything I say with a grain of salt. To my knowledge, hUMA describes more or less the ability to efficiently access the same memory from the CPU as well as from the GPU. So it's not the same thing as the eSRAM. Also, I read that both systems will support some sort of hUMA-like behavior.

So before hUMA, you took your memory and decided beforehand which part you would dedicate to the CPU and which to the GPU. In the PS3's case this wasn't even up to you to decide; it was fixed at an even 256MB/256MB split - which turned out to be quite annoying. Now, when you want to compute stuff, there are tasks that are better suited to (aka faster on) the CPU and others that are faster on the GPU. What you would do is take your data, copy it to the CPU part of the memory, and let the CPU do its job for task 1. Then, as task 2 is better suited to the GPU, you would need to copy your whole data set from the CPU part of the memory to the GPU part of the memory before the GPU can start doing its job. With hUMA you don't need to do this anymore. You can leave the data wherever it is in memory and simply tell the CPU to get lost and the GPU to take over, or vice versa. This essentially saves you the time and additional memory you would need to copy the data back and forth. It's obviously a nice thing to have, but I have no idea how much impact hUMA is going to have on the performance of real-world tasks.
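
To make that concrete, here's a toy sketch of the difference. It's conceptual only - the "pools", copies, and function names are made up for illustration and have nothing to do with any real console API:

```python
data = [float(i) for i in range(1_000_000)]   # some working set in memory

# Split-pool model (pre-hUMA style): the GPU only sees its own pool, so
# handing work over means copying the data across - and copying results back.
def split_pool(data):
    cpu_out = [x * 2.0 for x in data]     # task 1, done "on the CPU"
    gpu_in = list(cpu_out)                # explicit copy into the "GPU pool"
    gpu_out = [x * x for x in gpu_in]     # task 2, done "on the GPU"
    return list(gpu_out)                  # copy the result back to the "CPU pool"

# Unified model (hUMA-style): both processors address the same allocation,
# so the hand-off is just an ownership change - no pool-to-pool copies.
def unified_pool(data):
    for i, x in enumerate(data):
        data[i] = (x * 2.0) ** 2          # tasks 1 and 2 touch the same buffer
    return data
```

Same result either way; the unified version just skips the two copies (and the extra memory they temporarily occupy), which is the saving described above.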

#100 Posted by AlexGlass (688 posts) -

@alexglass: It also means impassioned; and when a majority of your posts - by a large margin - are championing/defending/explaining and postulating on the actual function of anything the Xbox One can do, then yes, I would say that you fervently defend every position of the system. Even to the point of calling out speculation about the PS4 while speculating on the functionality and uses of the Xbox One, the Cloud, and Kinect.

--And fervently is a commonly used word; it wasn't used to impress or look cool, just to get my point across. Unless you're a developer or work for Microsoft or Sony, everything people post with regard to use and function is conjecture. No matter how many times they post about it.

You realize you are the only person in this thread taking a technical discussion and trying to turn it into something personal for no apparent reason? You don't even make an accurate point in your argument, nor is your point in any way relevant to the discussion. I'm pretty sure I'm calling out speculation on both. And this new post is just a continuation of your original attempt to smear, discredit, and dilute; you're trying to generalize and rewrite my history based on whatever skewed perception you have of me, and you feel the need to come in here and splurge about it. You came in here with nothing to add, other than to make the argument personal. And you continue to do so. What does that say about you? Stop projecting your feelings onto me or this thread. PM me if you have an issue.