
devise22


Games With An Active Development Life Cycle: Good or Bad?

You’d be hard pressed not to know what a game having an active life cycle is by now. What used to be limited to online-only games, most notably MMOs, has permeated pretty much every genre. Everything from sandbox survival games like Minecraft to shooters both first and third person, such as Destiny and The Division, has featured an active life cycle form of development.

For those who don’t know, in summary these are games that have no predefined beginning and end date when it comes to the production of content. Yes, these games often stop being worked on, or shut down, but for the duration of their existence everything from in-game mechanics to additional content can be tweaked, changed, or sometimes even removed. Obviously even more traditionally limited games do things like this now in the form of patches. On top of the day-one patch issues we see, it’s also not uncommon to see games add gigs upon gigs of additional fixes, content, and changes months after launch.

So my question for you all is, where do you stand on this subject? Obviously it’s a very subjective question, and it’ll also probably depend on which game in particular is doing it, and how well they do it. You need look no further than Hitman for a game that had an active development life cycle and managed to release content frequently enough to keep interest, but also sparingly enough to allow what they released to really hit the mark for players. But not every game handles it that way. Destiny launched with content all but missing, and the developers treated that as if it weren’t a huge deal because they could patch and add content into the game later.

Looking outside of specifics, has the push for more active life cycles in game development had any drawbacks? Obviously. More and more games are released incomplete, as developers know they have leeway to make the game right in the months post release. Of course this could also be blamed on publisher pressure to meet shipping dates, but you’d figure a conversation has to happen internally that balances a good ship date with the right amount of development time. Even so, releasing unfinished games is pretty much standard, especially in a world where early access has also caught on.

While there has been a lot of reaction, both positive and negative, what I think I’ve found most fascinating is how it’s evolved the way people play games. Years ago I would never have thought that a person would not only come back to an active game years later, but wait until a game is later into its life cycle to even begin playing it at all. Skipping games at launch and waiting for sales has become the norm not only because of the cost of games, but because most players recognize that if a game they like launches broken, even slightly, it will likely have been fixed by the time a sale rolls around. And hell, it could even have added content.

Still, in the end I think I side with the view that it has hurt the industry more than it has helped. An industry as big as gaming is already divided enough, and now people are divided not just by when they experience a game but by what type of experience they have. Someone who invested 100 hours in Terraria the week it launched will have had a completely different experience than someone who puts 100 hours into it now, for example. It just leads to more and more divisive game experiences across the board. But like many things in modern society, it remains a fascinating study in how, as things grow larger, they also often grow more organic in nature. Especially with humans at the helm. What are some other thoughts on this?
