Selfish tears

I'm crying because I just learned that Ryan "Taswell" Davis died last week. I didn't know him outside of Giantbomb. I came to giantbomb because I learned of Jeff from his firing drama, and thought "there's a man with some integrity", and I fucking respect that.

Started listening to the bombcast. In the beginning I couldn't even tell Jeff's voice from Ryan's, which just seems idiotic today.

Whatever. Ryan was always the perfect host, for the bombcast, and for the various on-camera shows. I honestly don't even see how there can be a giant bomb without him. I love giant bomb, but Ryan was the heart and soul of this machine.

I'm crying because I know I'll never hear "Hey everyone, it's TUUUEEsday June the 25th twenty-thirteen, and you're listening to the giant bombcast, I'm Ryan Davis..." again. I will never hear the TNT intro again. I will never hear him talk about Saints Row 4. I will never see him in the studio for the first PS4 and XBone shows. I will never hear about his continued adventures at California Extreme. I will never hear about his post-marriage adventures, like getting some alone time from his GF to sit on a cake. I'll never hear him rip some assholes a deserved one, or put down Will Smith, or ...

I'm crying for selfish reasons, and I'm angry. Fuck Ryan Davis.


XBone eSRAM/GPU Performance Rumors - Scenarios

(In a previous blog post I ranted about the latest rumor regarding MS having yield problems fabbing their large APU for the XBone, and promised a follow-up with some possible implications...)

So, let's look at some #potentialscenarios ASSUMING that the rumors are true.

(Remember all that cloud-computing nonsense? Makes sense they'd trot that out if they knew there were... erh... dark clouds ahead. "Don't worry, the out-of-the-box performance doesn't matter when we have... the cloud!" Anyhoo...)

FAB scenarios

Microsoft could just go ahead with what they've got. Low yields don't necessarily mean no yields. Eat the cost and test for the 'golden samples' that hit the specification they want, or maybe even push out HW that will fail down the line, but prepare to eat that cost red-ring style (warning sign: generous warranties). Long term, MS could still be fine if they can get to the next process shrink; they just have to suffer through the pain for now.

This could mean very low availability and/or very high cost at introduction. Want an XBone under the Christmas tree? Better hope you're early in the pre-order queue. Microsoft are rich, which is good because this could be a very costly adventure.

Second option: Microsoft could downgrade; specify a lower performance target, low enough that they get the yield they need while not crippling the APU too much. The raw GPU performance on the XBone was already believed to be quite a bit weaker than the PS4's, and this would only make it worse. Could even make it much worse. We're talking a noticeable difference here. 1st-party Sony devs would be able to outperform MS exclusives easily. Multi-platform games would have better frame rates on the PS4, or maybe even the same frame rate at higher resolution. The XBone version would lose a layer of shader effects.

This is the scenario that seems most likely at this point in time, based on what I've read about this rumor. The exact 'cost' is of course WILD speculation, but we're talking a significant performance delta here, not the stalemate we had this generation between the Xbox 360 and PS3 and their respective exclusives.

Third option: they could delay, and work with their fabs till the process is good enough to get the yield they require. This would potentially, though of course no dates have been announced, mean giving up the holidays to Sony. If you're Microsoft, that probably doesn't seem like a very good idea.

Developer scenarios

One reaction to this is that it doesn't matter. The XBone is good enough that people will be happy. This is true; I don't expect anyone who doesn't directly compare against the competition to decry its performance.

You can also say that this makes the XBone the lowest common denominator, and games are written for the LCD. This has been true, but I hope it will change. If you need a picture in three different resolutions, you don't start with a 100x100px image and scale it up to 150x150px and 300x300px. You start with the larger image and sample down, because this yields the best results.
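To make the down-sampling point concrete, here's a minimal sketch using Pillow (my choice of library; the filenames and sizes are made up, not from any real asset pipeline):

```python
# Author the asset once at the largest size, then generate smaller variants
# by down-sampling. Filenames and sizes are hypothetical.
from PIL import Image

source = Image.open("icon_300x300.png")  # the largest, highest-quality original
for size in (150, 100):
    variant = source.resize((size, size), Image.LANCZOS)  # high-quality down-sample
    variant.save(f"icon_{size}x{size}.png")
```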

It's true that in this current generation games were ported from the 360 to PS3 and PC, which never made sense to me. I hope and believe that games will now instead be ported from the PC to the XBone and PS4 (half because I want it to be so, half because they're now closer in architecture to the PC than ever before). If this happens, the LCD factor won't hide the performance difference.

Exclusives may simply look and feel much better, graphics-wise, on the PS4. They will have more room to play with resolution vs graphical fidelity vs frame rate.

Lifetime scenarios

If the XBone is downgraded, that may mean that we must subtract from its lifetime, meaning this next cycle will actually be a short one. Granted, our current console cycle has been pretty long (or at least it feels that way), but IMO these consoles are already far behind the state of the art. In performance I mean, not price/performance, but consoles have fixed performance, while PCs do not.

Furthermore, that they're now all on x86-64 means it'll be easier to roll out new hardware while maintaining a high degree of compatibility, maybe even full backward-compatibility. That would allow the platform holders to more readily push out new hardware down the line. They don't WANT to of course, but they may have to.

(Another thing I believe may point to a shorter cycle is that these devices aren't designed for "4k" and beyond. I believe that "4k" will have a significant market share in three to four years, maybe even sooner on the PC side. With this evolution come even faster graphics cards on the PC side, leaving the consoles even further in the dust.)

E3 Scenarios

These events are always a lot of "show" and very little "tell", but if this rumor is correct, this may be even more true this time. Even if you see something running on a devkit, you can't know if that devkit is running final spec. Granted, the software will probably be so early that it won't matter anyway, frame rates always improve at the end of the project. That said, there might be games that have to lose a rendering pass or some effect just to make their target (whether 30 or 60fps). (This is very flimsy, I know)

Equal Opportunity Scenarios

There's still plenty of time for Sony to fuck things up. Don't believe for a minute that just because I've ranted a lot about MS here I believe Sony to be some knight in shining armor. They've said all the right things on the technology side so far. There's no doubt in my mind they'll fuck it up service side.


XBone eSRAM/GPU Performance Problems Rumors

There's a rumor that Microsoft are having problems with the XBone. There have been rumors before that they're behind on software, but these latest rumors pertain specifically to the hardware and the combination of eSRAM and GPU. The rumor says they're having yield problems, specifically related to the eSRAM.

I will now go on and speculate wildly. You have been warned.

It is claimed that the XBone APU is a one-die CPU+GPU+eSRAM solution, meaning that it's very large (~410mm^2). The only thing MS really said at their reveal was "5 billion transistors". This is interesting, and some correctly pointed out that it probably meant they couldn't compete with the PS4 on specs, because if they could, they'd be more specific.

The primary risk with large dies of course is low yields, resulting in higher costs to get enough dies that meet your specification. Now, Microsoft was not building a gaming machine first. They wanted to load a familiar operating system on there. They wanted more focus on applications that historically have not been heavily optimized for one platform. A relatively fat OS, and applications from developers who are not used to the scant resources typically awarded a gaming console, lead to one design conclusion: "we need lots of memory!"

So they design for 8GB, from the start. Again, they put applications (and cost) above gaming, and go for a well-known, cheap, plentiful technology in DDR3. Back then 8GB of GDDR5 would have looked too expensive, almost insane, and DDR4 wasn't on the immediate horizon.

To compensate on the gaming side, they now need some fast RAM, hence 32MB of eSRAM on-die. The XBone eSRAM is said to be a 6T type, meaning that each bit requires six transistors. 32MB gives us 32*1024*1024*8 bits times six, which equals 1,610,612,736 transistors. That's 1.6 billion, roughly a third of the total 5 billion!
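As a quick sanity check of that figure (just the arithmetic above in code, assuming a plain 6T cell and taking the "5 billion transistors" headline at face value):

```python
# Back-of-the-envelope eSRAM transistor count, assuming 32MB of 6T SRAM.
ESRAM_BYTES = 32 * 1024 * 1024         # 32MB
TRANSISTORS_PER_BIT = 6                # 6T SRAM cell
TOTAL_APU_TRANSISTORS = 5_000_000_000  # Microsoft's headline figure

esram = ESRAM_BYTES * 8 * TRANSISTORS_PER_BIT
print(f"eSRAM transistors: {esram:,}")                            # 1,610,612,736
print(f"Share of the APU:  {esram / TOTAL_APU_TRANSISTORS:.0%}")  # ~32%
```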

They do desperately need this very fast cache-like memory to fix the fact that they're using "slow" DDR3, which they settled on because they wanted to have a lot of it, for apps -- not games.

Recap: the decision to not focus on games results in a design with a lot of memory, which is "slow" for cost reasons, which is compensated for with cache. Result: a large die, with a large percentage dedicated to what turns out to be a possibly problematic memory to fab. Because the cache uses so much die area, they have to scale down the GPU.

They're reportedly roughly 2 billion transistors larger than the APU used by Sony in the PS4, which means worse thermals, which means lower clocks. The rumor says the yields on this APU, due to the eSRAM, are so low that they may have to cut the clocks on the eSRAM+GPU even further than what they've previously communicated to developers under NDA. This would make the XBone even weaker on the GPU side vs the PS4, which was already believed to be quite a bit stronger.

MS probably thought for the longest time, like PS4 devs did, that the PS4 would only have 4GB of GDDR5. Microsoft could work with that; they'd have the fast cache (which no doubt can be awesome when properly used -- I'm a huge cache nerd) and more total memory. They designed their system around being the platform with More Memory.

Cerny meanwhile had bet on GDDR5, and his bet was that availability would increase and cost decrease at such a pace that, even if 2GB seemed like an ample amount back when they first started thinking about the PS4, 4GB would become viable as time went on... and then, at the last second... tick. Out of nowhere Sony steps up on stage and announces that they'll be using 8GB of GDDR5, twice the amount insiders thought for sure.

And they probably did this to match Microsoft, which has now lost its one advantage in specification. Microsoft, for their part, are left having to pay the price of designing for apps before games, and that price just went up if this rumor is true.

There are so many "#potentialscenario" implications here, which I'll write about in a later post.


PhysX Benchmarks and the Future Of

Pierre Terdiman, a developer on nVidia's PhysX, has put up a blog post called "The evolution of PhysX" with benchmarks over several PhysX versions and Bullet, an open source physics engine.

"I felt compelled to also include the last version of Bullet (2.81), to provide an objective external reference. The thing is, there are many people out there who still think that “PhysX is not optimized for CPU”, “PhysX does not use SIMD”, “PhysX is crippled on purpose”, and so on."

Let me first make it clear that I'm far from an expert in physics engines, and I'm not even that up to date with the latest in the business. I'm pretty sure though that PhysX is still CUDA only as far as GPUs go, meaning it's nVidia hardware only. The rest of my argument rests on that basis.

Though the benchmarks clearly prove that each version of PhysX is faster on the CPU, they don't adequately disprove the idea that nVidia have engaged in 'crippling of PhysX' in games.

The problem here is that you probably don't want to run your physics on the CPU to begin with, and as far as I know, there is no publicly available OpenCL backend for PhysX, though I'm sure they have one internally. When I've heard PhysX critiqued, this is the point being made. It's all very well that PhysX is faster than the competition on the CPU, but what matters to gamers is scenarios where you have both CPU and GPU available.

In gaming, nVidia and AMD have about 85% of the market between them, with both losing ground to Intel recently. Simply put, if you belong to the roughly 33% who run an AMD GPU, and your physics runs on the CPU while nVidia owners get to run it on the GPU, it's certainly valid to say that PhysX is crippled in this respect, especially if the competition will run on the GPU on both nVidia and AMD.

The benchmarks as presented only show that PhysX is fast on the CPU in the best case, but they don't show that it can't be made to be slow, for instance by artificially resource-starving it. In a "TWIMTBP" game, for example, nVidia may have instructed the physics to use a maximum of one or two extra threads on the CPU, even if more were readily available. In fact, being fast on the CPU could encourage them to cripple an integration just to make sure the CPU path can't catch up with the GPU path even on the fastest of machines.
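To illustrate what that kind of resource starving looks like in principle, here's a generic toy example in Python (not PhysX code, and not any real middleware API): a perfectly parallel workload capped at two worker threads versus one allowed to use eight.

```python
# Toy illustration of artificially capping worker threads on a parallel workload.
# The "physics" here is just a sleep standing in for independent chunks of work.
import time
from concurrent.futures import ThreadPoolExecutor

def simulate_chunk(_):
    time.sleep(0.01)  # stand-in for one independent island of physics work

def run(max_workers, chunks=64):
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        list(pool.map(simulate_chunk, range(chunks)))
    return time.perf_counter() - start

print(f"capped at 2 threads: {run(2):.2f}s")  # artificially starved
print(f"allowed 8 threads:   {run(8):.2f}s")  # what the hardware could actually do
```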

"Some early investigations into PhysX performance showed that the library uses only a single thread when it runs on a CPU. This is a shocker for two reasons. First, the workload is highly parallelizable, so there's no technical reason for it not to use as many threads as possible; and second, it uses hundreds of threads when it runs on an NVIDIA GPU. So the fact that it runs single-threaded on the CPU is evidence of neglect on NVIDIA's part at the very least, and possibly malign neglect at that." -- Ars Technica, 2010

It would take someone posting game and/or driver disassembly AND benchmarks from a recent title to prove nVidia in the wrong. Again.

PhysX on Next-Gen

It's worth thinking about this in the context of the "next-generation of consoles" rolling out later this year. We know with a fair amount of certainty that they will both use AMD APU parts, so PhysX will almost certainly NOT run on the GPU on them as long as nVidia insists that PhysX should be CUDA only.

I can see three scenarios.

  1. nVidia gives up on the physics engine market.
  2. nVidia provides an OpenCL backend for all platforms.
  3. nVidia special-licenses a GPU version for the consoles.

(1) is unlikely, (2) would be great for everyone involved, but nVidia being nVidia I wouldn't be surprised to see them take the third option; crippling PhysX on the PC where they (think they) can use it for marketing, but giving in where required to be able to compete at all.

Actually, a (4) where everyone comes together around a common API would be the best outcome, but it's my view that only MS could push something like that through, and then it'll be some DirectX-specific hackery that'll be useless for cross-platform, and then what was the point?

On these things and more, the future will tell.

EDIT: @Ken_Addison linked to this press release: "NVIDIA Announces PhysX and APEX Support for Sony PlayStation 4". From the wording it appears as though PhysX will run on the CPU only on the PS4. I should have listed this as an option since it's the most obvious one. Let's call it option 0.


Review of the Kotaku Review of the IGN Review of Bioshock: Infinite

The review is almost unreadable. Graphics: the choice to use a small image instead of industry-standard text is very odd. It's also very short; at less than 150 words (estimated) you can read it in well under one minute. On the upside, loading times are great. Sound design is non-existent, and the silent music track gets repetitive after a while.

I give it a ten out of ten.


Way - Without Words

There's a growing trend with the type of anonymous online gaming found in games like Journey and Demon/Dark Souls. I just recently found out about this cool little game called "Way" through the Let's Play by Deceased Crab (part 1, part 2), though it seems to have been around since at least 2011.

"Way" is a (forced) cooperative platforming'n'puzzle game where you can only make it through the game by getting help from a stranger on the internet. The trick is that you don't know who your partner is, and the game only allow you to communicate through avatar gestures, similar to those in LittleBigPlanet, where you can change your facial expression and point your arms independently of each other.

I recommend watching those videos, UNLESS you're the kind of person who likes going in blind of course.

Way homepage where you can download the alpha, or read more on the project page at CMU.

Is there a good word for this kind of multiplayer? Suggestions!


Omgpop developer decides against joining Zynga after buyout.

Did one man alone resist when Zynga bought out Draw Something developer Omgpop? His reasoning is one I share, especially with regard to being able to fully own and control projects I make on my own time.

"Zynga has an Austin studio, where several good friends of mine work. Yet I had never applied to Zynga. Why? Because the company's values are completely opposed to my own values, professionally and creatively. Because I believe that developers are at the front lines of game development and deserve to be treated well, and I didn't trust Zynga to do so. All this was still true -- except that their complete unwillingness to negotiate with me only confirmed my concerns. Why on earth was I even considering joining?" 
Full story at Gamasutra: "Turning down Zynga: Why I opted out of the $210M Omgpop buy"

"Appearance of impropriety"

Something just came across my screen which made me feel both sad and a bit angry.

Morgan Webb, host on G4TV and reviewer for X-Play, tweeted:

https://twitter.com/#!/MorganWebb/status/98484038662242305
"The best cake ever!!!!!!!!! Thanks Id and Bethesda! !!! I can't bear to cut it! http://t.co/zLq9Vs8"

I realize people in the business receive all sorts of crazy promotional items; drinks, figurines, apparel, pieces of meat... but something about this particular tweet just rubbed me the wrong way. I think it's the exclamation marks. The excitement. The tangy smell of 'quid pro quo' lingering in the air.

It just feels wrong.
 
It has the appearance of impropriety.

Let's Make It Better: Mass Effect 2

I've been meaning to write about this since I first completed Mass Effect 2 early this year, but with the new Assassin's Creed: Brotherhood video demo out (which basically showcases the idea, though probably a bit less ambitiously), I'll cut short my ambitions for a larger article and settle for this blog post.

CENTRAL ARGUMENT: Mass Effect 3 needs to have a team away-mission management component.
 

No more hang arounds

 
I found it quite ridiculous that while you and two members were out fighting the good fight, the rest of the team, which is quite large, just sat around isolated in their own chambers.

Being able to send the members you're not personally taking on a mission out on side missions would resolve many issues and open up a whole new dynamic in the game. This is how you could weave more of the choices you made earlier back into the game. Instead of just emails, missions!
 

Core mechanics

 
Here's how I picture it working. From your ship (base) you have a team management terminal. Here you can see the status of your whole party; their stats, weapons, skills and fighting status. You'll also have a mission list. These missions would be the sort of side missions you had in ME1 where you were driving around in the Mako. Shorter missions, in and out. You should get a set of missions which expands and contracts dynamically as you play through the game. You should have your pick, but also the option to skip.

A mission briefing will explain the setting and purpose of the mission. This includes intel, like enemies and dangers, and rewards (money, equipment, resources). A risk analysis might be provided, but the player must also think about this. It's up to the player to compose a mission team of one to three companions, selected such that the mission success probability is maximised. The right team for the right job.
 
The core mechanic is this: sending people on these missions could level them up, provide income and items, and even create allies or enemies. The trade-off is that missions take time, so whoever you send will not be available for the missions you take on yourself. I think time here would be counted in some measure of main mission time, so maybe assign mission time units to the regular missions and have the side missions take up a number of those (example: a side mission may complete after you play one long mission or after two shorter ones; in essence, player missions drive the side mission clock).
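Here's a hypothetical sketch of that side-mission clock, just to make the idea concrete; all names and numbers are made up, nothing from an actual BioWare system:

```python
# Hypothetical sketch: side missions advance only when the player finishes
# main missions, rather than on a real-time clock.
from dataclasses import dataclass, field
from typing import List

@dataclass
class SideMission:
    name: str
    cost_in_units: int            # e.g. a long main mission = 2 units, a short one = 1
    team: List[str] = field(default_factory=list)  # companions tied up until it resolves
    remaining: int = 0

class MissionClock:
    def __init__(self):
        self.active: List[SideMission] = []

    def launch(self, mission: SideMission, team: List[str]):
        mission.team = team
        mission.remaining = mission.cost_in_units
        self.active.append(mission)

    def on_main_mission_complete(self, units: int = 1) -> List[SideMission]:
        """Call after each main mission; returns the side missions that just resolved."""
        for m in self.active:
            m.remaining -= units
        finished = [m for m in self.active if m.remaining <= 0]
        self.active = [m for m in self.active if m.remaining > 0]
        return finished  # resolve rewards, injuries, morale, etc. here
```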
 
I would initially not consider it possible for you to actually go on these missions yourself. This is for the obvious development-resource reasons, but on the surface, it's because it's time for your team to shine.
 

Risk vs Reward


The most important thing here is in the dynamics of reward versus risk. Let's talk risks.

There'll certainly be the possibility of missions failing to some degree or another, which could impact you because now you have team members who are injured and will need time to recuperate (invest in the medic bay!). I do not, however, believe it would be in the best interest of gameplay to make it possible for team members to die on these missions. Not unless the game is very explicit and basically telling you that, for this particular mission, sending this group makes it all but a suicide mission.

You could feed mission outcome back into team morale. Now you'll have to consider if the side mission is worth a possible hit to your 'rep' with a certain character. Certain members might oppose some missions, even though they'd be the best ones to do them. Do you push or give in? On the other hand, we could have bonding between team members because they had mission successes together.

With side missions, why would you need mining? The way I see it, mining is the cop-out placeholder for a proper resource management system, and here is one that I can't see failing.

On a more meta level I like this idea because the uncertainty of the side mission outcome overlaps your regular missions. This means that if a side mission fails you'll have to really consider whether it's worth reloading, because you haven't been sitting around waiting for the outcome, you've been out accomplishing things yourself.
 

The Final Piece

 
I readily admit I'm in love with this idea. I see it in my mind, and it fits so perfectly. This sort of mechanism opens so many doors, and it's just the thing ME3 needs to be the kind of leap from ME2 that ME2 was from ME1.
 
I really hope it's a central piece of ME3 already. It's likely too late to add it now if it isn't!