Eurobum's forum posts

#1 Posted by Eurobum (349 posts) -

@horseman6 said:

Rogue One - First viewing - I liked this more than Force Awakens, mostly because the cinematography was good. However, the writing and story were really, really bad. You never got any real motivation or understanding of why the characters were doing anything they did, nor why we should care about them. The film also jumped around way too much; it probably could have used an extra 30 minutes. And lastly, the biggest issue: Felicity Jones's character was horribly dull; I just didn't care about her at all. 2 out of 5.

It should have been called Security Flaw: The Movie. Which would have been fine if it weren't for the constant fan service and people resigning themselves to their fate / martyrdom. I liked that it was a whistleblower, or rather a reluctant collaborator, who made it all happen, not a chosen hero. I think the people working on this franchise are trying to tell us something, which is great, so I'll give 'em an extra star. 2/5

A hilarious satire about Rogue One and #Nerdculture podcasts. https://www.youtube.com/watch?v=sExTt4j69zI

#2 Posted by Eurobum (349 posts) -

Nintendo needs to keep prices down for their system to be viable as a toy, and they need to split that modest sum between screen, battery, chip and gimmick. That forces them to buy Nvidia leftovers: an old architecture on a two-generations-old process node. This thing is made on 20 nm, while phones, CPUs and GPUs have moved on to 14 nm/16FF and Samsung is already rolling out 10 nm for phones. Keeping up with technology is a piracy protection of sorts, just like providing a better experience / easy access is.

Granted, not a lot of people will have a highly clocked desktop PC with a ton of RAM to do the emulation, but that emulator is about to cost Nintendo tens of millions, given that Cemu is ready out of the gate and that it's impossible to buy a Switch at retail price. The much-lauded system seller may become an unfortunate comparison once people see it running in glorious resolutions, with working physics and all that grass which serves to hide the ugly ground textures. Granted, seeing mobile graphics rendered natively and blown up on a big screen isn't always flattering.

If Cemu comes together and its dev isn't hired by Nintendo in the coming weeks, then we'll see emulation hitting the mainstream for the first time.

#3 Edited by Eurobum (349 posts) -

@zurv: To put it in the simplest terms, HDR video has little to do with either colors or their accuracy. What it changes is luminosity/brightness, which changes color in a way. Because it's a kind of post-processing, it's also faked (or at least can be); photo HDR is fakery and is something quite different from the advertised HDR feature.

HDR started with HDR photography; some smartphones have this feature. In an extreme example, you take pictures of a city throughout the day from the same point of view, then an algorithm replaces the parts in the shadow of buildings with their brighter versions, as if the picture were lit by as many suns as pictures were taken. This creates a kind of super-picture which is very well lit, but actually not realistic at all. And since sunlight's color spectrum also changes toward sunset, you can get shifting colors as well.


An extreme example (Source) of HDR photography, which shows how not to do it and encourages restraint.
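Very roughly, the merging works something like the sketch below - just a simplified exposure-fusion weighting to illustrate the idea, not any particular camera's algorithm; the file names and the mid-grey weighting are made up for the example:

```python
# Rough sketch of the "merge several exposures" idea described above.
# Assumes three already-aligned photos of the same scene taken at
# different exposures (file names are invented for this example).
import numpy as np
from PIL import Image

exposures = [np.asarray(Image.open(name), dtype=np.float64)
             for name in ("dark.jpg", "medium.jpg", "bright.jpg")]

def weight(img):
    # Weight each pixel by how close it is to mid-grey (128), so blown-out
    # highlights and crushed shadows contribute less to the merged picture.
    return np.exp(-((img - 128.0) ** 2) / (2 * 50.0 ** 2)) + 1e-6

weights = [weight(img) for img in exposures]
merged = sum(w * img for w, img in zip(weights, exposures)) / sum(weights)

Image.fromarray(merged.astype(np.uint8)).save("merged_hdr_look.jpg")
```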

Anyway, back to televisions. In TVs, HDR is not about colors - they still can't make more colors by just combining 3 colored sub-pixels anyway - it's about brightness. Televisions have to have eye-searing brightness to work even during daylight and across a living room. OLEDs got both brighter and darker, so they need a bigger scale/range for luminosity, and thus was born high dynamic range video.

I guess you could either have an HDR source, or the TV could interpolate finer degrees of brightness by comparing subsequent frames (which isn't advisable for games). This makes some sense for really bright things, which would just be max brightness on a normal screen, or for really dark stuff, which is even more important.

Anyway, I'm just trying to make sense of it; with these things it's always hard to tell where technology ends and marketing BS begins. And mislabeling this feature "HDR" is certainly an attempt to invoke something common and recognizable. Gamma depth/range might be a more appropriate name, which would also be familiar to anyone who played games on a Doom engine.
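For what it's worth, that Doom-style "gamma" setting is just a power curve applied to brightness; here's a toy illustration (the 2.2 value is only a common convention, not anything specific to these TVs):

```python
# Gamma correction in its simplest form: dark values get lifted the most,
# bright values barely change. Input values here are arbitrary examples.
def apply_gamma(value, gamma=2.2):
    """Map a linear brightness value in [0, 1] through a gamma curve."""
    return value ** (1.0 / gamma)

for v in (0.05, 0.25, 0.5, 1.0):
    print(f"linear {v:.2f} -> displayed {apply_gamma(v):.2f}")
```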

I believe Jeff might have referenced this (fluff) article: http://www.polygon.com/2016/9/9/12863078/ps4-pro-hdr-development

Avatar image for eurobum
#4 Posted by Eurobum (349 posts) -

@zurv: Seriously, why should color accuracy matter for games of all things? I'm pretty sure the latest GPUs use, or have started using, chroma subsampling (4:2:?) to cut down texture sizes and free up memory bandwidth, which is basically the biggest bottleneck in high-resolution rendering.

Wikipedia: To calculate the required bandwidth factor relative to 4:4:4 (or 4:4:4:4), one needs to sum all the factors and divide the result by 12 (or 16, if alpha is present).

So let's compare: (4+4+4)/12 = 1 versus (4+2+0)/12 = 0.5. We can halve the bandwidth without a noticeable difference; done.
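The quoted formula in code form, in case anyone wants to check other schemes (this just restates the arithmetic above, nothing more):

```python
# Bandwidth factor of a J:a:b(:alpha) subsampling scheme relative to
# 4:4:4 (or 4:4:4:4), per the Wikipedia formula quoted above.
def bandwidth_factor(j, a, b, alpha=None):
    parts = [j, a, b] + ([alpha] if alpha is not None else [])
    return sum(parts) / (16 if alpha is not None else 12)

print(bandwidth_factor(4, 4, 4))  # 1.0  -> full chroma, no savings
print(bandwidth_factor(4, 2, 2))  # ~0.67
print(bandwidth_factor(4, 2, 0))  # 0.5  -> half the bandwidth
```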

Just like with cameras, we seem to be heading for more pixels and more compression, rather than "lossless" accuracy. More pixels = less aliasing >> color accuracy.

In moving pictures, a screen pixel spends only a small percentage of the time showing the intended color; the rest of the time it spends transitioning (changing color).

Besides, isn't the HDR feature of TVs just a filter that compares several frames and merges them to create a picture with slightly higher color contrast, just as Jeff described on the Bombcast? And let's not forget that all digital TV broadcasts and VOD streams are mercilessly compressed. Even if a TV could display a 4:4:4 signal, where would you get one?

#5 Posted by Eurobum (349 posts) -

I don't know much about current OLEDs other than that they use more power and burn out rather quickly (lose their brightness and shift color), but seeing how VR uses OLED panels, this is surely the way forward. However, I would find out why there are no OLED PC monitors before buying anything.

#6 Posted by Eurobum (349 posts) -

@ravingham91 said:

I read that the higher the resolution, the less of a bottleneck the CPU becomes and the more important the GPU becomes. If this is indeed true, do you think I will be fine with an i5 7500?

A less frequent bottleneck, rather. There are many things that hold back performance, and they all take turns more or less: waiting for memory to fetch data, waiting for the GPU to finish calculating, waiting for the CPU, waiting for the network.
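If it helps, here's a toy model of that "taking turns" idea, assuming for simplicity that a frame takes as long as its slowest stage; all the millisecond numbers are invented for illustration:

```python
# Toy model: whichever stage is slowest on a given frame is the bottleneck
# for that frame. Real pipelines are more complicated; this only shows how
# the bottleneck shifts from frame to frame.
frames = [
    {"cpu": 6.0, "gpu": 14.0, "io": 1.0},   # GPU-bound (e.g. high resolution)
    {"cpu": 12.0, "gpu": 7.0, "io": 1.0},   # CPU-bound (lots of AI / draw calls)
    {"cpu": 5.0, "gpu": 6.0, "io": 20.0},   # waiting on disk or network
]

for i, stages in enumerate(frames, 1):
    frame_time = max(stages.values())              # slowest stage, in ms
    bottleneck = max(stages, key=stages.get)
    print(f"frame {i}: {1000 / frame_time:.0f} fps, bottleneck: {bottleneck}")
```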

AMD released their Ryzen CPUs recently, and they offer much better performance, value and future-proofing. The only place where Intel has kind of managed to hold on to their crown is with their highly clocked quad cores. However, you can't really take advantage of that if you don't overclock them, which you said you don't want to do. Those chips also benefit from highly clocked RAM.

Buying an i5 is a misstep in this day and age, because of the increasing importance of multi-threading. I think you are the perfect case for AMD, because those CPUs overclock themselves and don't really support very highly clocked RAM. *sigh* Their 8-core Ryzen R7 1700 sounds like it goes a bit over budget, but they will release 6- and 4-core parts in April.

If you are set on Intel, you should at least consider the higher-clocked SKUs. Only the top SKU in each segment (i3/i5/i7) manages to maintain a decent resale value, which is what I use to gauge their worthiness. ^^

#7 Posted by Eurobum (349 posts) -

Let me drive this point home.

Zelda: Skyward Sword is released in November 2011. Fanboy Patrick K. isn't exactly smitten.

Skyrim is released on November 11th, 2011, and goes on to sell more than 20 million copies across systems. The next Zelda is open world and looks a lot like Skyrim. Do you think it is a coincidence that they showed someone playing Skyrim on the Nintendo Switch? Nintendo is pandering to the Skyrim audience so hard that acting like they invented the wheel is ludicrous. That GDC talk (shortcut) about how they went back to the freedom of Zelda 1 was some rather high-level, superficial BS, though still interesting.

The Zelda buzz isn't nearly as huge as the cultural phenomenon of 2011, because of the Wii U install base and the fact that it's impossible to buy a Switch even if you are willing to shell out 400 bucks, which isn't a given.

For people who try to milk these games for all of their content, all open-world games end up a disappointment. We are not there yet in the cycle! And there will be a lot of milking and disappointment coming up, given it's a new system with few games.

#8 Edited by Eurobum (349 posts) -

@picky_bugger said:

I get why you'd say that, but this game to me felt nothing like a Bethesda game. Yeah, they have open worlds like this, but the interactions you have are so totally different that I don't see a correlation past them both having big maps.

Bethesda games target mature audiences; I don't expect a Nintendo game to open with a beheading anytime soon. But Bethesda were the first to create a truly open "open world" by making enemies scale with the player's level, which sadly feels very fake and forced and weird once you become aware of it, and which earned them a lot of criticism with Oblivion. Same for Skyrim, except some reviewers mistakenly insisted that this wasn't the case any more. Not sure how this works in Zelda; I hear tell that at least the enemy weapon drops scale with progression.

Cooking and alchemy are another obvious similarity, along with the awkward UI and inventory, slowing down time while aiming a bow, hunting and gathering, awkward horseback combat, the chosen-one "hero's journey" story, magical Fus Ro Dah powers, stealth, crappy puzzles, the choice between stamina and health (i.e. completely simplified RPG progression), the snowy / changing landscape, and Dwemer-style automata/drones. The success of Skyrim versus that other Zelda game, as well as the timing, look and philosophy, make the influence seem completely obvious. There is one big difference: Bethesda games rely heavily on voiced story quests, whereas Zelda seems to rely on puzzles and its physics sandbox. In Skyrim the physics sucked, but you could put cauldrons on people's heads and roll cheese wheels down a mountain.

What I'd like to know is: which game first let enemies and NPCs pick up weapons? I know STALKER did that, and it was completely hilarious.

Anyway, doing away with a terrible video game story is a solid choice, because seeing genre movies reenacted by dirty polygonal faces and jerky animation is just depressing, even though you get to ride the dragon or run from an explosion.

#10 Edited by Eurobum (349 posts) -

@slag: Let's not forget that Dark Souls also turned the repetitive and rigid re-spawning of enemies into a game mechanic itself, a Groundhog Day of sorts, by not even bothering to create a lived-in, dynamic world. Comparing open-world games to Dark Souls isn't really flattering.

Dragon's Dogma's Bitterblack Isle expansion was very much their take on Dark Souls, and it was less compelling than the open-world first part, IMO. Although there is something to be said for repeating stuff over and over again until you get really good at it, or good at running past it. Bitterblack Isle also failed because they started throwing bosses at you over and over, thus cheapening the experience, something Dark Souls wisely avoided.

Come to think of it, Zelda appears to be, correct me if I'm wrong, very rigid and scripted. I also suspect that the sheer size of BotW serves to geographically isolate mobs, so they don't run into each other much, without visible leashing.