The_A_Drain's forum posts

#1 Posted by The_A_Drain (3973 posts) -

I wasn't even looking forward to it, but even with those low expectations I found The Evil Within disappointing. What a heap of garbage.

Pretty much every high-profile, high-budget game this year has been creatively bankrupt, committee-driven, focus-tested rubbish, or a re-release. Alien: Isolation looks a lot better than I originally thought it would, though.

There's been some decent indie stuff though.

#2 Edited by The_A_Drain (3973 posts) -

This is an unanswerable question. Are movies better at telling stories than books? What about songs? Nobody can say definitively.

They all have their strengths and weaknesses and are better at doing certain kinds of things.

Personally, I feel games' strengths are currently being squandered in an effort to be more like movies and television, only shittier. Once games get out of this phase of believing they need to be like that to be taken seriously, we'll see what games can truly do. Believe me, we've not seen anything yet; where games are at right now is just the beginning. The medium is in its teenage years, constantly aping its idols in order to appear more mature. It'll grow out of that, eventually.

For me, games like The Last of Us represent that perfectly right now. Trying to emulate film and TV too much is holding the medium back.

#3 Posted by The_A_Drain (3973 posts) -

It's a pretty short and focused list: Fallout 1 and Baldur's Gate 1, at least once a year each, although I'll often play halfway through, give up, and restart a few months later. I just really like the early portions of those games but don't have time for multiple complete playthroughs.

Then there's Resident Evil 2, which I'll play whenever; it's an easy chill-out game for me. At one point I was playing it so much and so often that I was competitive for a world-record speedrun time.

That's pretty much it; I replay so few games nowadays. I'll tackle Final Fantasy 7 once every few years, but I usually get to disc 2 and call it quits.

#4 Edited by The_A_Drain (3973 posts) -

Personally I've always felt that if you're looking to a game's ending for satisfaction, you're doing it wrong. The satisfaction comes from the journey itself; that's the whole point of the 'interactivity' thing.

That said, I've found very few games to have truly satisfying closing moments. I'm talking the whole package: a satisfying end boss/sequence, a satisfying end to the story (or an unsatisfying one, if that was the deliberate intent), and a game that neither ended too early nor outstayed its welcome.

Personally, there's just a literal handful of games that fall into that category: Fallout 1 and 2, Resident Evil 2, Baldur's Gate 2, Deus Ex, Grand Theft Auto: Vice City and Grand Theft Auto 5 are about the only ones I can think of right now that did that for me. And Demon's Souls/Dark Souls, but those don't really 'end' so much as give you an excuse to NG+; I found the prospect of testing myself and my newfound abilities against harder versions of the earlier bosses so enticing that I hardly cared how it ended.

That said, a huge number of my favorite games of all time would fall into the 'disappointing ending' category for most people. Way worse than a lot of today's modern game ending sequences. System Shock 2, for a start: an absolutely amazing game that I will replay many times, but that ending sequence? Yeah, fuck that sequence, it was garbage. But nobody remembers how bad it was because A) most people didn't finish it (the game was really hard), and B) people remember the bits they did enjoy, which was the entirety of the journey between the opening and the ending.

That's not to say I don't think story is important in games; it absolutely is. But I think disappointing endings being such a big thing nowadays is symptomatic of just how little 'freedom' players are given in the modern AAA video game.

Edit: World of Goo and Little Inferno. Seriously people, play those games and play them to the END.

#5 Edited by The_A_Drain (3973 posts) -

There's a really really REALLY long answer, and there's a short answer.

The short answer is because video games are HARD FUCKING WORK.

For programmers in the field, it's one of the most demanding jobs you can do AND it's one of the lowest paid.

Add to that the general need to keep programmers on staff permanently, whereas artists can be let go as soon as their assets are completed (that's right folks, 3-month rolling contracts are standard in this industry and have been for a long time), and you end up with a lot of managers and designers deciding that using lots and lots of middleware and keeping as few staff programmers as possible is a good idea.

It's important to remember something: a lot of people in this thread don't really understand what "a game is a large, complex piece of software!" means today compared to, say, 10 years ago.

10 years ago, if your piece of software was a buggy piece of crap, that quite often meant it didn't work. Period. Everything was often home-made and fragile as balls, so it was easy to make a mistake and then BOOM, black screen, no game. And if a game made it to ship date like that, it didn't ship.

Nowadays, everything is so crammed full of middleware that a lot of programmers' time is spent just trying to get those things to play together nicely (and not crumble under the strain of all the special effects and shaders the artists/designers want to use).

Generally this means that games are, on the whole, pretty robust. But the individual pieces that make them up have gotten so much bigger and more complex than ever before that just trying to get them to play nice together is a struggle. A serious, serious struggle.
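
To give a rough idea of what that glue work looks like, here's a minimal sketch of the adapter layer teams typically write around middleware so the rest of the codebase never touches it directly. Everything here is invented for illustration; vendor_physics stands in for whatever real library a team might license, not any actual API:

    #include <cstdio>

    // Stand-in for a licensed physics middleware. The names are made up
    // for illustration; no real library's API is being quoted here.
    namespace vendor_physics {
        struct World { float time = 0.0f; };
        World* CreateWorld()            { return new World(); }
        void   Step(World* w, float dt) { w->time += dt; }
        void   Destroy(World* w)        { delete w; }
    }

    // The engine's own interface. Gameplay code depends on this, never on
    // vendor_physics directly, so a middleware upgrade that breaks things
    // only touches the adapter below, not everyone else's code.
    class IPhysicsSystem {
    public:
        virtual ~IPhysicsSystem() = default;
        virtual void Update(float dt) = 0;
    };

    class VendorPhysicsAdapter : public IPhysicsSystem {
    public:
        VendorPhysicsAdapter() : world_(vendor_physics::CreateWorld()) {}
        ~VendorPhysicsAdapter() override { vendor_physics::Destroy(world_); }
        void Update(float dt) override { vendor_physics::Step(world_, dt); }
    private:
        vendor_physics::World* world_;
    };

    int main() {
        VendorPhysicsAdapter physics;
        for (int frame = 0; frame < 3; ++frame)
            physics.Update(1.0f / 60.0f);  // fixed 60 Hz timestep
        std::puts("simulated 3 frames");
    }

Multiply that wrapper by a physics library, an audio library, an animation library, a UI library and a networking library, all updating on their own schedules, and you can see where the time goes.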

A large, complex piece of video-game software 10+ years ago meant a tight group of developers working together, often no more than 10, and teams of 50 were considered very large. Nowadays? It's hundreds of people collaborating from all over the world, working with tons of different kinds of software and tools. A lot of programmers' time nowadays is spent just getting all that shit to play nicely and not crap itself, all while another 300+ people are making changes to the game every day.

So a lot of the time, if the AI system interacts fine with the pathing system 9 times out of 10, and the 1 in 10 results in the AI glitching through the floor instead of crashing the game completely? That's a fucking win as far as the dev team is concerned.
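
To make that concrete, here's a minimal sketch of that defensive mindset in code; FindPath and UpdateAgent are hypothetical names for illustration, not any real engine's API:

    #include <cstdio>
    #include <optional>
    #include <vector>

    struct Vec2 { float x, y; };

    // Hypothetical path query. Real pathing fails for all sorts of
    // reasons: dynamic obstacles, broken navmesh data, and so on.
    std::optional<std::vector<Vec2>> FindPath(Vec2 from, Vec2 to) {
        if (to.x < 0.0f) return std::nullopt;  // simulated 1-in-10 failure
        return std::vector<Vec2>{from, to};
    }

    // The defensive pattern: if the query fails, the agent idles in place
    // instead of dereferencing a path that isn't there. The player sees a
    // confused enemy, not a crash to desktop.
    void UpdateAgent(Vec2 pos, Vec2 target) {
        if (auto path = FindPath(pos, target)) {
            std::printf("following %zu waypoints\n", path->size());
        } else {
            std::printf("no path this frame, idling\n");  // graceful fallback
        }
    }

    int main() {
        UpdateAgent({0, 0}, {10, 5});  // happy path
        UpdateAgent({0, 0}, {-1, 5});  // failure path, handled
    }

Multiply that pattern across hundreds of system boundaries and you get exactly what we have: games that glitch visibly but rarely hard-crash.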

Games used to be small enough that one programmer could take responsibility for one system (audio, graphics, AI, whatever) and would know it inside and out. Nowadays, there's no possible way a single person can know the complete ins and outs of even a single system; there's just so much code flying around.

Yeah, you can call it laziness or whatever you want, but the fact of the matter is they have to make do with a budget, and a lot of game developers work 50+ hour weeks (I personally know a guy who was often doing 16-hour days and sleeping in the office for the last few MONTHS of a major title that came out recently, which I won't name), so laziness isn't really the problem.

Fact is, there's a certain amount of bugs and glitches considered 'acceptable' given the alternative (which is the game not working at all), and that's fine.

People can stamp their feet and shout "But it was sixty dorrah!" all they want; no game is ever gonna be glitch-free. It's very rare we see a legitimate game-breaking bug these days, just a lot of smaller interactions that go haywire sometimes.

Lack of content is another discussion entirely, but I'll say this: the smaller the scope, the more bug-free games tend to be. You want more of something? Quality's gotta give. Just like anything in life.

The long answer is literally a thesis.

Also, as a final note of caution, people really like to forget just how many shitty games there were back in the day. Sure, games might not have been as visibly glitchy as they are now (especially if we're talking about stuff like the Atari 2600, which has its collision detection built into the hardware, so it's kinda hard to fuck stuff like that up), but a huge chunk of them sure as hell fucking sucked.

A terrible jump arc in a platformer, for example, might not be considered a 'glitch', but you can bet your bottom dollar it's there for the same reasons. Back when game engines didn't do gravity and all that fancy stuff for you automatically, programmers had to be very clever about gravity, velocity, and anything other than standard box collisions. Making a game feel nice was really hard back then. Today that part is a lot easier, but getting all the background components to play nice is hard.
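
For anyone who's never hand-rolled this stuff, here's a minimal sketch of the per-frame gravity/velocity integration the post is talking about. The constants are invented for illustration; tuning numbers like these is exactly where "making a jump feel nice" lived:

    #include <cstdio>

    // Hand-rolled jump physics: no engine, just velocity and gravity
    // integrated once per frame. All constants are made up for illustration.
    int main() {
        const float dt      = 1.0f / 60.0f;  // fixed frame time
        const float gravity = -30.0f;        // units per second squared
        float y  = 0.0f;                     // player height above the floor
        float vy = 12.0f;                    // initial jump velocity
        int frames = 0;

        while (true) {
            vy += gravity * dt;              // integrate acceleration
            y  += vy * dt;                   // integrate velocity
            ++frames;
            if (y <= 0.0f) {                 // crude floor "collision"
                y = 0.0f;
                break;
            }
        }
        std::printf("jump arc lasted %d frames\n", frames);
    }

Change gravity or the initial velocity and the whole arc changes character; get either wrong and you've got that "terrible jump" without a single line of code being what anyone would call a bug.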

#6 Edited by The_A_Drain (3973 posts) -

My advice in the current economic climate is to stay as far away from the menial minimum-wage applications as possible. I've found there's more competition for them than for much better, much higher-paid jobs, because companies are squeezing the shit out of people with crap like zero-hour contracts and other garbage; a lot of people seem to be working multiple minimum-wage jobs whose employers can just cut their hours back whenever they feel like it, so those places just aren't hiring. I guess it doesn't help that I live in a student city, too. There's so much competition for the low-end jobs.

Last time I job-searched (3 months ago, after a 2-month period of unemployment in which most of my time was spent clearing up after 2 years of self-employment), I applied to a TON of stuff like McDonald's, Burger King, Subway, and every burger joint, sandwich joint and supermarket within miles, because I needed a job ASAP to start paying debts.

Not a single one of them got back to me.

The programming job I was 100% sure I wasn't qualified for? They got back to me within an hour of my sending my CV, interviewed me within a week, and made an offer within an hour of the interview.

Bonkers. I'd say concentrate on looking for things close to what you actually want to do. For example, I spent 2 years doing games programming, and I moved into InfoSec with my current job because they were more than happy to teach me new skills. So you don't have to limit yourself to what you're strictly qualified for right now; a lot of companies are happy to take on people who know very little about their specific area if they're willing and enthusiastic to learn.

Apart from the whole not-having-any-money thing, though, I loved unemployment; I absolutely hate working. 9-5, 5 days a week, with a 110-minute each-way commute? Yeah, once my debts are all paid, I'm out. Fuuuuck being a grown-up.

#7 Edited by The_A_Drain (3973 posts) -

BioForge.

#8 Posted by The_A_Drain (3973 posts) -

@the_a_drain said:

@stonyman65: He owns the IP but not the rights to Wasteland 1. EA still own those, and he'd have to negotiate that with them. It's not impossible, but they'd have to be involved. They approved a GOG re-release though, so...

Which might even qualify as a 'remaster', depending on what you want from a remaster, as it was modified to contain text the original didn't have (a lot of text was originally printed in the game's manual, and the player was given the number of a paragraph to read).

Oh that's right!

I wouldn't mind a whole new game, to be honest. The first Wasteland is pretty rough by today's standards. Some would argue Wasteland 2 is too, but that still has enough late-'90s CRPG elements to feel kinda recent. The original Wasteland is ancient by comparison!

Yeah, you're not wrong. I'm usually pretty good at revisiting older games without having too much of a problem with their age; I can deal with awkward controls and cluttered interfaces and so on, but sheeeeet, Wasteland barely even has enough resolution to display your equipment without abbreviations and endless scrolling. So it was incredibly difficult for me to play. I did try, though, and I enjoyed what I played, but I only got like 4 hours in before just giving up. (Yeah, I'd never played the original when I backed Wasteland 2.)

I'd be all over a version that just expands the interface out a little so I can read it all properly, but personally I don't need remakes to go beyond that kind of extreme basic usability stuff. I do get that for other people the bar is a lot higher; I mean, I know people who won't play FF7 because "the graphics are too old". So, to each their own.

#9 Posted by The_A_Drain (3973 posts) -

Influential personally?

Well, I'm alive right now because of two of the people I was fortunate enough to have met and become friends with in my life, one of whom is very sadly no longer with us. I owe both of those people a great debt; not sure anyone else can be more influential than that. So yeah.

Maybe the Fallout team, who together created the game that first showed me games could be more than simple timewasters/devices for escapism (other games that would have done that existed way before Fallout, but I didn't have a games-capable PC until Fallout was already quite old, and it happened to be the one I picked out of the bargain bin alongside Baldur's Gate) and shaped my life toward getting into making videogames. A career I've been relatively unsuccessful in thus far, but whatever.

#10 Edited by The_A_Drain (3973 posts) -

@stonyman65: He owns the IP but not the rights to Wasteland 1. EA still own those, and he'd have to negotiate that with them. It's not impossible, but they'd have to be involved. They approved a GOG re-release though, so...

Which might even qualify as a 'remaster', depending on what you want from a remaster, as it was modified to contain text the original didn't have (a lot of text was originally printed in the game's manual, and the player was given the number of a paragraph to read).