The_A_Drain's forum posts

#1 Posted by The_A_Drain (4073 posts) -

Ewwww wow, I thought Doom 3 was butt-ugly when it came out, let alone now.

Easily the game that's aged the best for me where 3D graphics are concerned (ok ok, so it's kinda cheating) is Resident Evil on the Gamecube. That game looked phenomenal back then and still looks really REALLY good now. In terms of visual coherency it looks much better than 99% of things that come out now, and was probably the best set of graphics to ever grace standard definition. With Ninja Gaiden on original Xbox coming a close second.

In terms of 2D graphics I think a lot of games have aged incredibly well, old Zelda games, old Mario games, Contra 3 looks incredible still, so does Super Metroid, etc. A lot of the classic 2D SNES-era games still look amazing now imo.

#2 Edited by The_A_Drain (4073 posts) -

Namco X Capcom.

Frank West references The Prisoner/the Iron Maiden song The Prisoner (doesn't matter which one specifically, I'm a huge fan of both)

It's like the game was made specifically for me.

#3 Posted by The_A_Drain (4073 posts) -

Kotaku's been blocked on my router for 2 years now.

I'm torn and confused about this. Sites can change along with the people who write for them; I mean, just look at why we're all here instead of Gamespot. But I gave Kotaku a second chance when Totilo joined, and it wasn't worth it then and, upsettingly, I don't think it'll be worth it now. Patrick's content was one of my favorite parts of this site, but I don't think I can justify giving Kotaku clicks even for Patrick, sadly.

I hope he's hugely successful there.

#4 Posted by The_A_Drain (4073 posts) -


No one is going to top Mario

Super Meat Boy is the best 2D platformer ever made, in my opinion, topping Mario.

I'd kill my favorite genre, Survival Horror, because 9 out of every 10 'survival horror' games are nothing of the sort: they confuse thematic genre for gameplay genre because it's convenient at the moment, and offer none of the characteristic features the genre is known for. Instead they are mostly action or puzzle games with a horror theme. Perhaps if it finally died it could be reborn anew one day, like fighting games were in 2009.

#5 Posted by The_A_Drain (4073 posts) -

I wasn't even looking forward to it, but even then I found Evil Within disappointing. What a heap of garbage.

Pretty much every game this year that's been high profile/high budget has been creatively bankrupt, committee/focus-test-designed rubbish, or re-releases. Alien Isolation looks a lot better than I originally thought it would, though.

There's been some decent indie stuff though.

#6 Edited by The_A_Drain (4073 posts) -

This is an unanswerable question. Are movies better at telling stories than books? What about songs? Nobody can say definitively.

They all have their strengths and weaknesses and are better at doing certain kinds of things.

Personally, I feel games' strengths are currently being squandered in an effort to be more like movies and television, only shittier. Once games get out of this phase of believing they need to be like that to be taken seriously, we'll see what games can truly do. Believe me, we've not seen anything yet; where games are at right now is just the beginning. The medium is in its teenage years right now, constantly aping its idols in order to appear more mature. It'll grow out of that, eventually.

For me right now games like The Last of Us represent that perfectly. Trying to emulate film and TV too much is holding the medium back.

#7 Posted by The_A_Drain (4073 posts) -

It's a pretty short and focused list: Fallout 1 and Baldur's Gate 1, at least once a year, although I'll often start, play halfway through, then give up and restart a few months later. I just really like playing the early portions of those games but don't have time for multiple complete playthroughs.

Then there's Resident Evil 2, which I'll play whenever; it's an easy chill-out game for me. At one point I was playing it so much and so often that I was competitive for a world-record speedrun time.

That's pretty much it, I replay so few games nowadays. I'll tackle Final Fantasy 7 once every few years but I usually get to disc 2 and call it quits.

#8 Edited by The_A_Drain (4073 posts) -

Personally I've always felt that if you're looking to a game's ending for satisfaction, you're doing it wrong. The satisfaction comes from the journey itself; that's the point of the whole 'interactivity' thing.

That said, I've found very few games to have truly satisfying closing moments. I'm talking the whole package: a satisfying end boss/sequence, a satisfying end to the story (or unsatisfying, if that was the deliberate intent), and a game that neither ended too early nor outstayed its welcome.

Personally, there's just a literal handful of games that fall into that category: Fallout 1 and 2, Resident Evil 2, Baldur's Gate 2, Deus Ex, Grand Theft Auto: Vice City and Grand Theft Auto 5 are about the only ones I can think of right now that did that for me. And Demon's Souls/Dark Souls, but those don't really 'end' so much as give you an excuse to NG+; I found the prospect of testing myself and my newfound abilities against harder versions of the earlier bosses so enticing that I hardly cared how it ended.

That said, a huge number of my favorite games of all time would fall into the 'disappointing ending' category for most people, way worse than a lot of today's modern game ending sequences. System Shock 2 for a start: absolutely amazing game that I will replay many times, but that ending sequence? Yeah, fuck that sequence, it was garbage. But nobody remembers how bad it was because A) most people didn't finish it, the game was really hard, and B) people remember the bits they did enjoy, which was the entirety of the journey between the opening and the ending.

That's not to say I don't think story is important in games, it absolutely is very important, but I think disappointing endings being such a big thing nowadays is symptomatic of just how little 'freedom' players are given in the modern AAA video game.

Edit: World of Goo and Little Inferno. Seriously people, play those games and play them to the END.

#9 Edited by The_A_Drain (4073 posts) -

There's a really really REALLY long answer, and there's a short answer.

The short answer is because video games are HARD FUCKING WORK.

For programmers in the field, it's one of the most demanding jobs you can do AND it's one of the lowest paid.

Add to that the general need to keep programmers on staff permanently, versus artists who can be let go as soon as their assets are completed (that's right folks, 3-month rolling contracts are standard in this industry and have been for a long time), and you end up with a lot of managers and designers deciding that using lots and lots of middleware and keeping as few staff programmers as possible is a good idea.

It's important to remember something: a lot of people in this thread don't really understand how different "a game is a large, complex piece of software!" is today compared to, say, 10 years ago.

10 years ago, if your piece of software was a buggy piece of crap, that quite often meant it didn't work. Period. Everything was often home-made and fragile as balls, so it was easy to make a mistake and then BOOM, black screen, no game. And if a game made it to ship date like that, it didn't ship.

Nowadays, everything is so crammed full of middleware that a lot of programmers' time is spent just trying to get those things to play together nicely (and not crumble under the strain of all the special effects and shaders the artists/designers want to use).

Generally this means that games are, on the whole, pretty robust. But the individual pieces that make them up have gotten so much bigger and more complex than they ever were before that just trying to get them to play nice together is a struggle. A serious, serious struggle.

A large, complex piece of video-game software 10+ years ago meant a tight group of developers working together, often no more than 10; even 50 people was considered a very large team. Nowadays? It's hundreds of people collaborating from all over the world, working with tons of different kinds of software and tools. A lot of programmers' time nowadays is spent just getting all that shit to play nicely and not crap itself, all while another 300+ people are making changes to the game every day.

So a lot of the time, if the AI system interacts fine with the pathing system 9 times out of 10, and the 1 in 10 results in the AI glitching through the floor instead of crashing the game completely? That's a fucking win as far as the dev team is concerned.

Games used to be small enough that one programmer could take responsibility for one system (audio, graphics, AI, whatever) and would know it inside and out. Nowadays, there's no possible way a single person can know the complete ins and outs of even a single system; there's just so much code flying around.

Yeah, you can call it laziness or whatever you want, but the fact of the matter is they have to make do with a budget, and a lot of game developers work 50+ hour weeks (I personally know a guy who was often doing 16-hour days and sleeping in the office for the last few MONTHS of a major title that came out recently, which I won't name), so laziness isn't really the problem.

Fact is, there's a certain amount of bugs and glitches that are considered 'acceptable' considering the alternative (which is the game not working at all), and that's fine.

People can stamp their feet and shout "But it was sixty dorrah!" all they want; no game is ever gonna be glitch-free. It's very rare we see a legitimate game-breaking bug these days, just a lot of smaller interactions that go haywire sometimes.

Lack of content is another discussion entirely, but I'll say this, the smaller the scope the more bug-free games tend to be. You want more of something? Quality's gotta give. Just like anything in life.

The long answer is literally a thesis.

Also, as a final note of caution, people really like to forget just how many shitty games there were back in the day. Sure, games might not have been as visibly glitchy (especially if we're talking about stuff like the Atari 2600, which has its collision detection built into the hardware, so it's kinda hard to fuck stuff like that up) as they are now, but a huge chunk of them sure as hell fucking sucked. A terrible jump arc in a platformer, for example, might not be considered a 'glitch', but you can bet your bottom dollar it's there for the same reasons. Back when game engines didn't do gravity and all that fancy stuff for you automatically, programmers had to be very clever about things like gravity, velocity and anything other than standard box collisions. Making a game feel nice was really hard back then. Today it's a lot easier, but getting all the background components to play nice is hard.
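To illustrate the kind of thing that had to be hand-rolled before engines did it for you, here's a minimal sketch (Python, purely illustrative; the constants are made-up tuning values, not from any real game) of the per-frame velocity/gravity update behind a platformer jump arc:

```python
# Hand-rolled jump arc: per-frame Euler integration of vertical
# velocity and position, the kind of code engines now hide from you.
# Get GRAVITY or JUMP_VELOCITY slightly wrong and the jump "feels bad"
# without ever being a visible glitch.

GRAVITY = 0.5         # downward acceleration per frame (made-up units)
JUMP_VELOCITY = -10.0 # negative = up, in screen coordinates
GROUND_Y = 0.0

def simulate_jump():
    """Return the list of y positions for one full jump, frame by frame."""
    y, vy = GROUND_Y, JUMP_VELOCITY
    arc = []
    while True:
        vy += GRAVITY        # gravity pulls velocity back toward the floor
        y += vy              # integrate velocity into position
        if y >= GROUND_Y:    # landed: clamp to the floor and stop
            arc.append(GROUND_Y)
            break
        arc.append(y)
    return arc

arc = simulate_jump()
print(len(arc), min(arc))  # frames spent in the air, and the apex height
```

Tuning those two constants against frame rate, input handling and collision response, all by hand, is exactly where the "terrible jump arc" games of that era went wrong.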

#10 Edited by The_A_Drain (4073 posts) -

My advice in the current economic climate is to stay as far away from the menial minimum-wage applications as possible. I've found there's more competition for them than for much better, much higher-paid jobs, because companies are squeezing the shit out of people with crap like zero-hour contracts and other garbage. There seem to be a high number of people working multiple minimum-wage jobs, and employers can just cut back the hours whenever they feel like it, so they just aren't hiring. I guess it doesn't help that I live in a student city, too. There's so much competition for the low-end jobs.

Last time I job-searched (3 months ago, after a 2-month period of unemployment in which most of my time was spent clearing up after spending 2 years self-employed), I applied to a TON of stuff like McDonald's, Burger King, Subway, every burger joint, sandwich joint and supermarket within miles, because I needed a job ASAP to start paying debts.

Not a single one of them got back to me.

The programming job I was 100% sure I wasn't qualified for? Got back to me within an hour of sending my CV, interviewed within a week and made an offer within an hour of the interview.

Bonkers. I'd say concentrate on looking for things that are around what you're looking for. For example, I spent 2 years doing games programming, and I moved into InfoSec with my current job because they were more than happy to teach me new skills. So you don't have to limit yourself to what you're strictly qualified for right now; a lot of companies are happy to take on people who know very little about their specific area if they're willing and enthusiastic to learn.

Apart from the whole not-having-any-money thing, though, I loved unemployment; I absolutely hate working. 9-5, 5 days a week, with a 110-minute each-way commute? Yeah, once my debts are all paid, I'm out. Fuuuuck being a grown up.