There's a really really REALLY long answer, and there's a short answer.
The short answer is because video games are HARD FUCKING WORK.
For programmers in the field, it's one of the most demanding jobs you can do AND it's one of the lowest paid.
Add to that the general need to keep programmers on staff permanently, while artists can be let go as soon as their assets are completed (that's right folks, 3 month rolling contracts are a standard in this industry and have been for a long time), and you end up with a lot of managers and designers deciding that using lots and lots of middleware and keeping as few staff programmers as possible is a good idea.
It's important to remember something: a lot of people in this thread don't really understand how different "a game is a large complex piece of software!" is today compared to, say, 10 years ago.
10 years ago, if your piece of software was a buggy piece of crap, that quite often meant it didn't work. Period. Everything was often home-made and fragile as balls, so it was easy to make a mistake and then BOOM: black screen, no game. And if a game made it to ship date like that, it didn't ship.
Nowadays, everything is so crammed full of middleware that a lot of programmers' time is spent just trying to get those things to play together nicely (and not crumble under the strain of all the special effects and shaders the artists/designers want to use).
Generally this means that games are, on the whole, pretty robust. But the individual pieces that make them up have gotten so much bigger and more complex than they ever were before that just trying to get them to play nice together is a struggle. A serious, serious struggle.
A large, complex piece of video-game software 10+ years ago meant a tight group of developers working together: often no more than 10, and teams of 50 were considered very large. Nowadays? It's hundreds of people collaborating from all over the world, working with tons of different kinds of software and tools. A lot of programmers' time is spent just getting all that shit to play nicely and not crap itself, all while another 300+ people are making changes to the game every day.
So a lot of the time, if the AI system interacts fine with the pathing system 9 times out of 10, and the 1 in 10 results in the AI glitching through the floor instead of crashing the game completely? That's a fucking win as far as the dev team is concerned.
Games used to be small enough that one programmer could take responsibility for one system (audio, graphics, AI, whatever) and would know it inside and out. Nowadays, there's no possible way a single person can know the complete ins and outs of even a single system; there's just so much code flying around.
Yeah, you can call it laziness or whatever you want, but the fact of the matter is they have to make do with a budget, and a lot of game developers work 50+ hour weeks (I personally know a guy who was often doing 16-hour days and sleeping in the office for the last few MONTHS of a major title that came out recently, which I won't name), so laziness isn't really the problem.
Fact is, there's a certain amount of bugs and glitches that are considered 'acceptable' considering the alternative (which is the game doesn't work at all), and that's fine.
People can stamp their feet and shout "But it was sixty dorrah!" all they want; no game is ever gonna be glitch-free. It's very rare we see a legitimate game-breaking bug these days, just a lot of smaller interactions that go haywire sometimes.
Lack of content is another discussion entirely, but I'll say this: the smaller the scope, the more bug-free games tend to be. You want more of something? Quality's gotta give. Just like anything in life.
The long answer is literally a thesis.
Also, as a final note of caution, people really like to forget just how many shitty games there were back in the day. Sure, games might not have been as visibly glitchy as they are now (especially if we're talking about stuff like the Atari 2600, which has its collision detection built into the hardware, so it's kinda hard to fuck stuff like that up), but a huge chunk of them sure as hell fucking sucked. A terrible jump arc in a platformer, for example, might not be considered a 'glitch', but you can bet your bottom dollar it's there for the same reasons. Back when game engines didn't do gravity and all that fancy stuff for you automatically, programmers had to be very clever about gravity, velocity and anything beyond standard box collisions. Making a game feel nice was really hard back then. Today that part is a lot easier, but getting all the background components to play nice is hard.
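Since "being clever about gravity and velocity" sounds vague: here's a minimal sketch of the kind of hand-rolled, per-frame jump physics those programmers had to tune entirely by feel. The numbers are made up for illustration, not from any real game; get either constant slightly wrong and the jump arc feels floaty or brick-like, which is exactly the "not a glitch, but it sucks" territory I'm talking about.

```python
# Hypothetical hand-tuned platformer jump, stepped once per frame.
# GRAVITY and JUMP_SPEED are made-up numbers a programmer would
# tweak until the jump "felt right" -- no engine doing it for you.

GRAVITY = 0.5      # downward acceleration per frame
JUMP_SPEED = 8.0   # upward velocity applied on the jump frame

def simulate_jump():
    """Step a jump frame by frame until the character lands again."""
    y, vy = 0.0, JUMP_SPEED
    apex = 0.0
    frames = 0
    while True:
        vy -= GRAVITY           # gravity eats into velocity each frame
        y += vy                 # simple per-frame (Euler) integration
        frames += 1
        apex = max(apex, y)
        if y <= 0.0:            # back at ground level: jump is over
            return apex, frames

apex, frames = simulate_jump()
```

With these numbers the character peaks at 60 units and is airborne for 31 frames; halve GRAVITY and the same jump suddenly floats twice as long. That tuning loop was the whole job back then.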