
My Games of 2020 - Definitely NOT GOTY 2020

I don't have a conventional 2020 list this year. Sure, I played a few games actually released in 2020, but for the most part I played older games.

I went in and cleaned up the first two games in The Bard's Tale Trilogy, and got started on Thief of Fate, a game I originally played in my early teens. Somehow I have 70+ hours in, with most of BT3 left to play through. This 'remastered' trilogy is a great way to either relive the '80s CRPG dungeon crawl or, if you never played one, discover them for the first time.

I played more Slay the Spire, a game whose design I adore and which I also love watching other people play. Late this year Kevin "Purge" Godec got into it, so I've been watching that on the youtube.

Speaking of... my record of playing zero hours of DOTA2 continues, but I've watched probably a hundred hours, mostly of Purge, but also Ted "Pyrion Flax" Forsyth. Get back on it, Brad!

I played a bit of Baldur's Gate 3 in early access. I never had any real intention of going deep into the game; I want to save most of it (even the first chapter) for the real thing, and knowing how Larian patches their games I should probably hold off for the "definitive" edition in 2023.

Desperados III is a pure 2020 release. I liked it well enough, but after a few missions I sort of fell off. There are some design decisions I just don't agree with and couldn't get over. Posted some of that in the OT.

I'm pretty sure I own a PS4 copy of Horizon Zero Dawn that I never played, since the PS4 was put away after going unused for a year, but the PC version was a great positive surprise. After some initial talk of performance issues, I picked the game up in the current sale, and it runs real well even on my 2015 PC build (i7-6700K/1080 Ti/64GB). It's good. Unlike Jeff I'm enjoying mixing up the combat, and unlike Brad I'm playing on a PC and don't really understand his camera complaints. I'm about 20 hours plus change in, and I will finish it.

Finally...

I don't know how many times I restarted Pillars of Eternity II after getting several hours into it, but it's probably three or four. I played it day one, but it really needed patches, and so here we are in 2020: the game is fully patched up, all the DLC is out, and it's fucking fantastic. I'd go so far as to say it's probably the best Baldur's Gate-like we've seen yet. I could go on and on about how I love most everything about this game, to the point that I don't really want to finish it, which last happened with Planescape: Torment.

That is why Pillars of Eternity II: Deadfire is my game of the year.


"Selling violent video game solutions: A look inside the APA's internal notes"

As much as I hate linking to Elsevier, this paper seems interesting.

Selling violent video game solutions: A look inside the APA's internal notes leading to the creation of the APA's 2005 resolution on violence in video games and interactive media

Abstract

For decades politicians, parent groups, researchers, media outlets, professionals in various fields, and laymen have debated the effects playing violent video games have on children and adolescents. In academia, there also exists a divide as to whether violent video games cause children and adolescents to be aggressive, violent, and even engage in criminal behavior. Given inconsistencies in the data, it may be important to understand the ways and the reasons why professional organizations take a stance on the violent video game effects debate which may reflect greater expressed certitude than data can support. This piece focuses on the American Psychological Association's internal communications leading to the creation of their 2005 Resolution on Violence in Video Games and Interactive Media. These communications reveal that in this case, the APA attempted to “sell” itself as a solution to the perceived violent video game problem. The actions leading to the 2005 resolution are then compared to the actions of the APA's 2013–2015 Task Force on Violent Media. The implications and problems associated with the APA's actions regarding violent video games are addressed and discussed below.

Keywords

  • Violent video games and aggression;
  • Violent media;
  • Media & crime;
  • Media & politics;
  • Task forces

Excerpts:

[Two image excerpts from the paper, not reproduced here.]

These via https://twitter.com/DegenRolf/status/960384526940868608


Technical Notes on the Level Format of Puzznic for MS-DOS

This post contains some technical details regarding the file format used by the MS-DOS version of the Taito puzzle game Puzznic.

Puzznic PC/MS-DOS options.

Compression

Headerless LZW with variable length code words ranging from 9 to 12 bits. The string table can contain a maximum of 2^12 (4096) entries.

The initial table for codes 0 through 255 contains the corresponding octet, as expected.

Code 256 is reserved for a clear (or reset) symbol.

Code 257 is reserved for EOF.

The first free code is thus 258.

The clear code resets the state to the initial one, with 9-bit codes and the next available code being 258. The compressor emits a clear code when it runs out of space in the string table.

The current code width is increased once the last code we can express has been written to the table, i.e. once the 512th string is written the width is expanded from 9 to 10 bits, and then again at 1024 to 11 bits and at 2048 to 12 bits. Since 12 bits is the upper limit, it's a decoding error if we ever have to expand to 13 bits and the next symbol is not the clear symbol.

The files start with a reset symbol, and end properly with the EOF symbol.

LSB-first packing is used. For instance, if a file starts with the octets 00, 05 and 08, after shifting in 16 bits we're holding the value 0x500. Masking off the lowest 9 bits (0x500 & 0x1FF) we get 0x100, the reset symbol (256). After shifting out the previous code word, we're holding 2. Having read 16 bits and shifted out 9 bits, we shift in the next octet at bit position 7, holding 2 + (8 << 7) or 0x0402; masking with 0x1FF gives the code word 2, and so on.
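To make that concrete, here's a minimal decoder sketch in Python following the description above (headerless LZW, LSB-first packing, 9- to 12-bit codes, 256 = clear, 257 = EOF). It's untested against the actual game files, and all names are my own:

  CLEAR, EOF_CODE, FIRST_FREE = 256, 257, 258

  def lzw_decompress(data: bytes) -> bytes:
      out = bytearray()
      bitbuf = bitcount = pos = 0
      width, table, prev = 9, [], None   # table holds strings for codes >= 258

      while True:
          # Refill the bit buffer LSB-first until one code word is available.
          while bitcount < width:
              if pos >= len(data):
                  return bytes(out)
              bitbuf |= data[pos] << bitcount
              pos += 1
              bitcount += 8
          code = bitbuf & ((1 << width) - 1)
          bitbuf >>= width
          bitcount -= width

          if code == CLEAR:               # reset to 9-bit codes, empty table
              width, table, prev = 9, [], None
              continue
          if code == EOF_CODE:
              return bytes(out)

          if code < 256:
              entry = bytes([code])
          elif code - FIRST_FREE < len(table):
              entry = table[code - FIRST_FREE]
          else:                           # the KwKwK case
              entry = prev + prev[:1]

          out += entry
          if prev is not None:
              table.append(prev + entry[:1])
              # Widen once the last expressible code has been assigned.
              if FIRST_FREE + len(table) == (1 << width) and width < 12:
                  width += 1
          prev = entry

Fed the worked example above (octets 00, 05, 08, ...), the first two codes it pulls out are 256 (reset) and 2, as expected.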

While I have not tested it, it's possible that the LZW code from a GIF decompressor would work, if properly initialized.

Reference data

The level files I have decompressed are:

$ sha256sum LEVELMP1.MAP EXTRA.PUZ

8b57ee1e373c926182e47afd3d97477c07f98ad6dde076cdf3c3f703f250d46c LEVELMP1.MAP

f397e1e1b58d02ca6f469c8af0f5e50620621f267f48cb71af545f77d550607a EXTRA.PUZ

They are 5596 and 6227 bytes respectively, and decompress to 18432 bytes.

Level format

Each file contains a number of playfields, each occupying 128 bytes: 10x12 bytes for the blocks, plus 8 bytes of unknown data. The unknown data seems to have high entropy, and is likely just garbage padding. While each playfield has a par time, it seems to be derived from the level number and not stored with the playfield data.

The MS-DOS version of Puzznic comes with a built-in level editor.

The game divides the playfields into levels and problems, such that there are eight levels, and level N contains N sets of problems, where each set has four playfields. This gives the total number of playfields as A046092(8), i.e. four times the 8th triangular number, 2*8*(8+1) = 144, or unrolled: (1*4) + (2*4) + (3*4) + (4*4) + (5*4) + (6*4) + (7*4) + (8*4) = 144.

We can go from a level, set and problem number to the linear playfield number via 2*(level-1)*level + (set-1)*4 + (field-1).

The playfield number for level 3, problem 2-3 is 2*(3-1)*3 + (2-1)*4 + (3-1) = 18, which means the data for that playfield is at offset 18*128 = 2304 in the file.
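As a sanity check, the indexing can be expressed as a small helper (hypothetical name, same Python as the decoder sketch above):

  def playfield_offset(level: int, set_no: int, field: int) -> int:
      # Linear playfield index, 128 bytes per playfield.
      index = 2 * (level - 1) * level + (set_no - 1) * 4 + (field - 1)
      return index * 128

  assert playfield_offset(3, 2, 3) == 18 * 128 == 2304   # level 3, problem 2-3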

Level 3, Problem 2-3 playfield data. Seen also in editor shot above.

The octets map to blocks as follows (a small decoding sketch follows the list):

  • 0x00 metal
  • 0x01 blank
  • 0x02 brick_wall
  • 0x03 lift-updown
  • 0x04 lift-leftright
  • 0x14 red-circle
  • 0x15 diamond
  • 0x16 pink-cube
  • 0x17 triangle
  • 0x18 gray-pentagon
  • 0x19 green-bar
  • 0x1a blue-pyramid
  • 0x1b checkered-box
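Assuming the 10x12 block bytes are stored row by row (I haven't verified the orientation), a playfield can be dumped with something like this sketch; the mapping table and helper name are my own:

  BLOCKS = {
      0x00: "metal", 0x01: "blank", 0x02: "brick_wall",
      0x03: "lift-updown", 0x04: "lift-leftright",
      0x14: "red-circle", 0x15: "diamond", 0x16: "pink-cube",
      0x17: "triangle", 0x18: "gray-pentagon", 0x19: "green-bar",
      0x1a: "blue-pyramid", 0x1b: "checkered-box",
  }

  def dump_playfield(data: bytes, index: int) -> None:
      # 128 bytes per playfield: 10x12 block octets followed by 8 unknown bytes.
      field = data[index * 128 : index * 128 + 10 * 12]
      for row in range(12):
          line = field[row * 10 : (row + 1) * 10]
          print(" ".join(BLOCKS.get(b, "0x%02x" % b) for b in line))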

eof


Abbie Heppe left Respawn

Jay Frechette just announced joining Respawn as community manager.

This presumably means Abbie Heppe is no longer there. There's been no announcement of her leaving that I can find, but I hope she's well. From twitter it seems she doesn't have anything lined up or at least isn't talking about it.

Sooo... this means my selfish dream of Heppe at Giantbomb is alive again.

EDIT: Just for the record, it seems she left sometime this summer, so she's been out a while.


The Scorpio and the Osborne Effect

If you don't recognize the term The Osborne Effect, here's the description from wikipedia:

"The Osborne effect is a term referring to the unintended consequences of a company pre-announcement made either unaware of the risks involved or when the timing is misjudged, which ends up having a negative impact on the sales of the current product. This is often the case when a product is announced too long before its actual availability."

One company which now faces this problem is Microsoft, having announced a new powerful console -- code named Scorpio -- to be released in time for Holiday 2017.

So how do you escape the Osborne Effect?

One way is to escape it by sheer luck: you somehow manage to make an announcement that reaches the specific part of the audience you want to hit, say investors, but is never widely noticed by consumers. The Scorpio press conference announcement was widely reported in mainstream news, so this seems unlikely.

Another approach could be to sow confusion. I present as evidence the following article: "Spencer: Scorpio won't do anything for you if you don't have a 4K TV" referencing a Eurogamer interview.

It's pretty obvious that this is a lie. Or a misunderstanding, if you're charitable. Unless Phil Spencer and his team are totally inept, the only reason for this message is to cause confusion and get people to not focus on the future device. They're saying (paraphrasing) "Please buy the One S, the Scorpio isn't for you". Walking back these statements at a later time by providing more correct (but conflicting) information only serves to further increase confusion for a while.

As I thought about this, I realized something very cynical. I realized that it is now in Microsoft's interest to have consumers confusing the Scorpio with the conveniently available "One S". If they can get the mainstream consumer to buy the "One S" thinking they're getting the new updated much faster console they heard about, they're golden.

When I consider all this, the news that the "One S" is faster than the existing Xbox One takes on a different light. It doesn't matter whether this is technically true or not (the difference would have to be negligible, or they'd have used it in the marketing for the "One S"), and while I don't think there's an actual conspiracy here, it does play into their hands by creating a false connection ("faster console") between the "One S" and the Scorpio.

If you think "no one would confuse these two clearly different consoles announced at the same time, both described as newer, better, faster with 4k support", I can tell you I have already seen the confusion on gaming forums.

Time will tell how this plays out, but I think Microsoft have reason to worry, especially if the hype for Scorpio grows and the release slips. We're roughly 18 months out, and they probably need people to continue to buy the boxes they've already built.


Why I think Sony should officially support the PC with PlaystationVR

Officially supporting the PSVR on the PC, via something like OpenVR, would be a win for everyone involved.

Sony, presumably making a profit off sold units (plus cameras and controllers), would expand their potential market beyond that of any one of their direct competitors, ultimately driving down costs through scale and driving up profits through sales.

Developers too enjoy a bigger market, and would have an obvious reason to port PSVR games to the PC. As it stands, VR is best suited for 'smaller experiences', and a smaller developer can't reasonably disregard the PC market anyway.

Gamers with only a PS4 are unaffected.

Gamers with only a PC will have the currently cheapest desktop VR solution available to them.

Gamers with a PC and PS4 will get better value out of their investment.

The counter-argument would be that Sony could lose software sales as people with both a PC and a PS4 will opt for the PC version. I find this argument weak, because how likely is someone with a VR-capable PC to buy a PS4-only VR solution in the first place? Long-term, what matters more is that the market gets established. Expanding market share and brand recognition while making a profit selling units you wouldn't have sold otherwise should be more than enough to counter-balance this 'issue'.

As someone with both a PC and a PS4, the prospect of spending ~$450 on something which will only work on the PS4 is wholly unappealing. If I were able to leverage that investment on the PC as well, 'splitting the cost' as it were, it'd instead be a no-brainer.

The reality is that, as it stands, the right choice for me is to not invest in PSVR at all.


Selfish tears

I'm crying because I just learned that Ryan "Taswell" Davis died last week. I didn't know him outside of Giantbomb. I came to giantbomb because I learned of Jeff from his firing drama, and thought "there's a man with some integrity", and I fucking respect that.

Started listening to the bombcast. In the beginning I couldn't even separate the voice of Jeff from Ryan, which just seems idiotic today.

Whatever. Ryan was always the perfect host, for the bombcast, and for the various on-camera shows. I honestly don't even see how there can be a giant bomb without him. I love giant bomb, but Ryan was the heart and soul of this machine.

I'm crying because I know I'll never hear "Hey everyone, it's TUUUEEsday June the 25:th twenty-thirteen, and you're listening to the giant bombcast, I'm Ryan Davis..." again. I will never hear the TNT intro again. I will never hear him talk about Saints Row 4. I will never see him in the studio for the first PS4 and XBone shows. I will never hear about his continued adventures at California Extreme. I will never hear about his post-marriage adventures, like getting some alone time from his GF to sit on a cake. I'll never hear him rip some assholes a deserved one, or put down Will Smith, or ...

I'm crying for selfish reasons, and I'm angry. Fuck Ryan Davis.


XBone eSRAM/GPU Performance Rumors - Scenarios

(In a previous blog post I ranted about the latest rumor regarding MS having yield problems fabbing their large APU for the XBone, and promised a follow up with some possible implications...)

So, let's look at some #potentialscenarios ASSUMING that the rumors are true.

(Remember all that cloud-computing nonsense? Makes sense they'd trot that out if they knew there were... erh... dark clouds ahead. "Don't worry, the out-of-the-box performance doesn't matter when we have... the cloud!" Anyhoo...)

FAB scenarios

Microsoft could just go ahead with what they've got. Low yields don't necessarily mean no yields. Eat the cost and test for the 'golden samples' that hit the specification they want, or maybe even push out HW that will fail down the line, but prepare to eat that cost red-ring style (warning sign: generous warranties). Long term, MS could still be fine if they can get to the next process shrink; they just have to suffer through the pain for now.

This could mean very low availability and/or very high cost at introduction. Want an XBone under the Christmas tree? Better hope you're early in the pre-order queue. Microsoft are rich, which is good because this could be a very costly adventure.

Second option: Microsoft could downgrade; specify a lower performance target, low enough that they get the yield they need, while not crippling the APU too much. The raw GPU performance of the XBone was already believed to be quite a bit weaker than the PS4's, and this would only make it worse. It could even make it much worse. We're talking a noticeable difference here. First-party Sony devs would be able to outperform MS exclusives easily. Multi-platform games would have better frame rates on the PS4, or maybe even the same frame rate at a higher resolution, and the XBone version might lose a layer of shader effects.

This is the scenario that seems most likely at this point in time, based on what I've read about this rumor. The exact 'cost' is of course WILD speculation, but we're talking a significant performance delta here, not the stalemate we had between the Xbox 360 and PS3 and their respective exclusives.

Third option: they could delay, and work with their fabs until the process is good enough to get the yield they require. This would potentially, though of course no dates have been announced, mean giving up the holidays to Sony. If you're Microsoft, that probably doesn't seem like a very good idea.

Developer scenarios

One reaction to this is that it doesn't matter. The XBone is good enough that people will be happy. This is true; I don't expect anyone who doesn't directly compare it against the competition to decry its performance.

You can also say that this makes the XBone the lowest common denominator, and games are written for the LCD. This has been true, but I hope this will change. If you need a picture in three different resolutions, you don't start with a 100x100px image and scale it up to 150x150px and 300x300px. You start with the larger image and sample down, because this yields the best results.

It's true that in the current generation games were ported from the 360 to the PS3 and PC, which never made sense to me. I hope and believe that games will now instead be ported from the PC to the XBone and PS4 (half because I want it to be so, half because they're now closer in architecture than ever before). If this happens, the LCD factor won't hide the performance difference.

Exclusives may simply look and feel much better, graphics wise, on the PS4. They will have more room to play with resolution vs graphical fidelity vs frame-rate.

Lifetime scenarios

If the XBone is downgraded, that may mean we must subtract from its lifetime, meaning this next cycle will actually be a short one. Granted, our current console cycle has been pretty long (or at least it feels like it), but IMO these consoles are already far behind the state of the art. In performance I mean, not price/performance; consoles have fixed performance, while PCs do not.

Furthermore, that they're now all on x86-64 means it'll be easier to roll out new hardware while maintaining a high degree of compatibility, maybe even full backward-compatibility. That would allow the platform holders to more readily push out new hardware down the line. They don't WANT to of course, but they may have to.

(Another thing I believe may point to a shorter cycle is that these devices aren't designed for "4k" and beyond. I believe that "4k" will have a significant market share in three-four years, maybe even sooner on the PC side. With this evolution comes even faster graphics cards on the PC side, leaving the consoles even further in the dust.)

E3 Scenarios

These events are always a lot of "show" and very little "tell", but if this rumor is correct, this may be even more true this time. Even if you see something running on a devkit, you can't know if that devkit is running final spec. Granted, the software will probably be so early that it won't matter anyway, frame rates always improve at the end of the project. That said, there might be games that have to lose a rendering pass or some effect just to make their target (whether 30 or 60fps). (This is very flimsy, I know)

Equal Opportunity Scenarios

There's still plenty of time for Sony to fuck things up. Don't believe for a minute that just because I've ranted a lot about MS here I believe Sony to be some knight in shining armor. They've said all the right things on the technology side so far. There's no doubt in my mind they'll fuck it up service side.


XBone eSRAM/GPU Performance Problems Rumors

There's a rumor that Microsoft are having problems with the XBone. There have been rumors before that they're behind on software, but these latest rumors pertain specifically to the hardware and the combination of eSRAM and GPU. The rumor says that they're having yield problems, specifically related to the eSRAM.

I will now go on and speculate widely. You have been warned.

It is claimed that the XBone APU is a one-die CPU+GPU+eSRAM solution, meaning that it's very large (~410mm^2). The only thing MS really said at their reveal was "5 billion transistors". This is interesting, and some correctly pointed out that it probably meant they couldn't compete with the PS4 on specs, because if they could, they'd be more specific.

The primary risk with large dies is of course low yields, resulting in higher costs to get enough dies that meet your specification. Now, Microsoft was not building a gaming machine first. They wanted to load a familiar operating system on there. They wanted more focus on applications that historically have not been heavily optimized for one platform. A relatively fat OS, and applications from developers who are not used to the scant resources typically awarded a gaming console, lead to one design conclusion: "we need lots of memory!"

So they design for 8GB, from the start. Again, they put applications (and cost) above gaming, and go for a well-known, cheap, plentiful technology in DDR3. Back then 8GB of GDDR5 would have looked too expensive, almost insane, and DDR4 wasn't on the immediate horizon.

To compensate on the gaming side, they now need some fast RAM, hence 32MB of eSRAM on-die. The XBone eSRAM is said to be a 6T type, meaning that each bit requires six transistors. 32MB gives us 32*1024*1024*8 bits times six, which equals 1,610,612,736 transistors. That's 1.6 billion, a huge chunk out of the total 5 billion!

They do desperately need this very fast cache-like memory to fix the fact that they're using "slow" DDR3, which they settled on because they wanted to have a lot of it, for apps -- not games.

Recap: The decision to not focus on games results in a design with a lot of memory, which is "slow" for cost reasons, which is compensated for with cache. Result: a large die, with a large percentage dedicated to what turns out to be a possibly problematic memory to fab. Because the cache uses so much die area, they have to scale down the GPU.

They're reportedly roughly 2 billion transistors larger than the APU used by Sony in the PS4, which means worse thermals, which means lower clocks. The rumor says that the yields on this APU, due to the eSRAM, are so low that they may have to cut the clocks on the eSRAM+GPU even further than what they've previously communicated to developers under NDA. This would make the XBone even weaker on the GPU side vs the PS4, which was already believed to be quite a bit stronger.

MS probably thought for the longest time, like PS4 devs did, that the PS4 would only have 4GB of GDDR5. Microsoft could work with that, they'd have the fast cache (which no doubt can be awesome when properly used -- I'm a huge cache nerd) and more total memory. They designed their system around being the platform with More Memory.

Cerny meanwhile had bet on GDDR5, betting that availability would increase and cost decrease at such a pace that, even though 2GB seemed like an ample amount back when they first started thinking about the PS4, 4GB would become viable as time went on... and then, at the last second... tick. Out of nowhere Sony steps up on stage and announces that they'll be using 8GB of GDDR5, twice the amount insiders thought was certain.

And they probably did this to match Microsoft, which has now lost its one advantage in specification. Microsoft, for their part, are left having to pay the price of designing for apps before games, and that price just went up if this rumor is true.

There are so many "#potentialscenario" implications here, which I'll write about in a later post.


PhysX Benchmarks and the Future Of

Pierre Terdiman, a developer on nVidia's PhysX, has put up a blog post called "The evolution of PhysX" with benchmarks over several PhysX versions and Bullet, an open source physics engine.

"I felt compelled to also include the last version of Bullet (2.81), to provide an objective external reference. The thing is, there are many people out there who still think that “PhysX is not optimized for CPU”, “PhysX does not use SIMD”, “PhysX is crippled on purpose”, and so on."

Let me first make it clear that I'm far from an expert in physics engines, and I'm not even that up to date with the latest in the business. I'm pretty sure though that PhysX is still CUDA only as far as GPUs go, meaning it's nVidia hardware only. The rest of my argument rests on that basis.

Though the benchmarks clearly show that each version of PhysX is faster on the CPU than the last, they don't adequately disprove the idea that nVidia have engaged in 'crippling of PhysX' in games.

The problem here is that you probably don't want to run your physics on the CPU to begin with, and as far as I know, there is no publicly available OpenCL backend for PhysX, though I'm sure they have one internally. When I've heard PhysX critiqued, this is the point being made. It's all very well that PhysX is faster than the competition on the CPU, but what matters to gamers is scenarios where both the CPU and GPU are available.

In gaming, nVidia and AMD have about 85% of the market between them, both losing ground to Intel recently. Simply put, if you belong to the 33% that run an AMD GPU and have your physics run on the CPU while nVidia owners get to run it on the GPU, it's certainly valid to say that PhysX is crippled in this respect, especially if the competition will run on the GPU on both nVidia and AMD.

The benchmarks as presented only show that PhysX is fast on the CPU in the best case; they don't show that it can't be made to be slow, for instance by artificially resource-starving it. In a "TWIMTBP" game, nVidia may have instructed the physics to use a maximum of one or two extra threads on the CPU, even if more were readily available. In fact, that they're fast on the CPU could encourage them to cripple an integration just to make sure the CPU path can't catch up with the GPU path even on the fastest of machines.

"Some early investigations into PhysX performance showed that the library uses only a single thread when it runs on a CPU. This is a shocker for two reasons. First, the workload is highly parallelizable, so there's no technical reason for it not to use as many threads as possible; and second, it uses hundreds of threads when it runs on an NVIDIA GPU. So the fact that it runs single-threaded on the CPU is evidence of neglect on NVIDIA's part at the very least, and possibly malign neglect at that." -- Ars Technica, 2010

It would take someone posting game and/or driver disassembly AND benchmarks from a recent title to prove nVidia is in the wrong. Again.

PhysX on Next-Gen

It's worth thinking about this in the context of the "next-generation of consoles" rolling out later this year. We know with a fair amount of certainty that they will both use AMD APU parts, so PhysX will almost certainly NOT run on the GPU on them as long as nVidia insists that PhysX should be CUDA only.

I can see three scenarios.

  1. nVidia gives up on the physics engine market.
  2. nVidia provides an OpenCL backend for all platforms.
  3. nVidia special-licenses a GPU version for the consoles.

(1) is unlikely, (2) would be great for everyone involved, but nVidia being nVidia, I wouldn't be surprised to see them take the third option: crippling PhysX on the PC where they (think they) can use it for marketing, but giving in where required to be able to compete at all.

Actually, a (4) where everyone comes together around a common API would be the best outcome, but it's my view that only MS could push something like that through, and then it'd be some DirectX-specific hackery that's useless for cross-platform, and then what was the point?

On these things and more, the future will tell.

EDIT: @Ken_Addison linked to this press release: "NVIDIA Announces PhysX and APEX Support for Sony PlayStation 4". From the wording it appears as though PhysX will run on the CPU only on the PS4. I should have listed this as an option since it's the most obvious one. Let's call it option 0.
