#2 Edited by Seppli (9766 posts) -

Just listened to the latest 8-4 Play podcast, and they briefly indulged in the topic of the upcoming console generation's hardware specs. One opinion voiced in that discussion was that "hardware specs don't matter anymore". It's a popular opinion among gaming-enthusiast writers, and ultimately a fallacy.

Saying that hardware and games have already plateaued is pure insanity. Proof of a complete lack of imagination. That kind of talk would be acceptable if I were getting a mindblowing virtual blowjob from an illegal celebrity whore-bot in a tricked-out high-end cyber-brothel right now. I'm not, and won't be anytime in the foreseeable future. We are so very far from being there, it's painful to see people believe otherwise.

2007's Crysis alone should unmask such opinions as fallacy, seeing how no other game since has been as densely simulated and *physical* - for lack of a better word. It's only now, six years later, that the mass market will attempt to catch up to Crytek's ambitions - and we aren't even capable of seriously dreaming of getting a good sloppy cyber blowjob from our virtual celebrity vixen du jour yet - or whatever deviancy we'd indulge in, given such liberating circumstances. We do not live in the future yet. I'm sorry to break the news.

#3 Posted by wewantsthering (1460 posts) -

How is this even a question? Hardware will always need to push forward as the demand for better graphics, physics, lighting, numbers of characters on screen, animation, etc. becomes more advanced. There may be a point someday where it doesn't matter, but we're not even close to it.

#4 Posted by Funkydupe (3293 posts) -

Meaningless? I... No.

#5 Posted by SharkEthic (945 posts) -

From what I've heard, 8-4 dabbles mostly in Japanese gaming, where specs aren't the end-all, be-all like they are with most Western games, so maybe that's what brought that on?

Either way, saying hardware specs don't matter anymore is just plain wrong.

#6 Posted by mellotronrules (1170 posts) -

it isn't that hardware specs don't MATTER per se - it's that they don't drive sales anymore. the days of buying pc graphics cards on a regular basis to play show-stopper releases are behind us. crysis, in addition to being proof positive that there is much further to go in terms of fidelity, is also a relic of that time. there's a reason it hasn't really been dethroned from its lofty position as the de facto "game-to-kick-your-compy's-ass." sure, there are plenty of games (witcher 2, far cry 3, battlefield) that will push your pc to its limits - but none are as absurd and regarded as crysis is/was.

we'll see what the new consoles do, but most seem to predict them to be fairly mid-tier to keep costs down in a tough economy.

tl;dr- games will continue to look better and better as gpu technology progresses. but in a world filled with angry birds, farmville, super meat boy and even CoD for that matter (it's been running on the same core engine for YEARS)- it isn't nearly as important as it used to be. and the success (or lack thereof) of this next batch of consoles will determine a lot- if they falter (for whatever reason), the justification for highly specialized graphics machines will evaporate.

#7 Posted by Giantstalker (1447 posts) -

Saying hardware specs don't matter is just plain wrong.

#8 Posted by Branthog (7332 posts) -

Yep. Hardware is totally irrelevant, now that multiplayer games are as big as anyone could ever imagine. Up to 12-18 players on console! And AI? Pshaw! AI is so advanced. Why, character interactions and behavior are mindblowingly realistic! And the graphics? Frankly, I can't tell the difference between a model on my 360 and someone in real life. I mean, besides the horrible animation, facial movements, and the giant square legs and clothes that are just textured onto models so that they stretch in ridiculous ways. And who could possibly want bigger game worlds where you can potentially enter every building, instead of only one in every fifty? And who could ever want an increase in world complexity and depth that improves how a player perceives the reality of the environment and people surrounding him? Four different car horns in a game and three different types of animals is so much variety! And who needs more than three shirt colors to share among hundreds of NPCs in an open-world game? And geez, what kind of spoiled baby needs more than 720 Ps? Christ, in my day, we didn't even have that many Ps! And 30FPS should be enough for anyone, ever!

Seriously, who puts forth these ideas? Who actively dislikes technological advance? Who doesn't want more power to do more things and have their mind blown, more frequently? Hell, my PC has $1,500 in video cards stuffed into it and I can't wait until this rig is obsolete. I want more power. More graphics. More complexity. More AI. Bigger everything. I want a mission to Mars. I want a space elevator. I want sexbots. I want my brain transplanted into an android. I want my trip to low earth orbit. I'm not going to live that many more decades and I don't want to wait for this shit. Anyone who doesn't want awesome stuff faster must be cynical. I can't imagine why else they'd ever say "nyah... I think we're good".

#9 Posted by Icemael (6271 posts) -

Hardware power has always been important and always will be.

#10 Edited by ArtisanBreads (3599 posts) -

You're listening to 8-4 play.

They're big fans of Nintendo. It's a different angle on gaming, and that's fine, but it's an angle where they aren't focused on tech. They will forgive a lot if a game hits whatever aspects they are looking for. Just listen to the Wii U reactions on their podcast vs. Giant Bomb's... it kind of shows you the differences.

To me, and others, tech is a big deal. I do like nice graphics but power can lead to new gameplay and scale.

#11 Posted by Hunter5024 (5180 posts) -

I certainly wouldn't say they're meaningless yet, but sooner or later we will reach a point where increasing hardware specs doesn't noticeably improve a game in a meaningful way, and I think we are getting closer and closer to that point. You need only look at the diminishing technological jumps between console generations to reach that conclusion.

#12 Edited by ArtisanBreads (3599 posts) -

@Hunter5024 said:

I certainly wouldn't say they're meaningless yet, but sooner or later we will reach a point where increasing hardware specs doesn't noticeably improve a game in a meaningful way, and I think we are getting closer and closer to that point. You need only look at the diminishing technological jumps between console generations to reach that conclusion.

I think we are very far from this being the case.

Physics, scale, AI... they are at tiny levels of what they can be.

That's in addition to the graphics themselves. Power means a lot.

This generation was a big leap from the last to me. We will see how the next systems seem in comparison but there is still much we can do to expand our games. Sure, some games are not going to change much. We don't need more power for our 2D platformers. But genres like open world games... added power will be huge for them.

#13 Posted by Branthog (7332 posts) -

@ArtisanBreads said:

You're listening to 8-4 play.

They're big fans of Nintendo. It's a different angle on gaming, and that's fine, but it's an angle where they aren't focused on tech. They will forgive a lot if a game hits whatever aspects they are looking for. Just listen to the Wii U reactions on their podcast vs. Giant Bomb's... it kind of shows you the differences.

To me, and others, tech is a big deal. I do like nice graphics but power can lead to new gameplay and scale.

There needs to be both available. For instance, I want machines with a ton of mind blowing power, but I'm okay with something on the scale of a DSi, if the catalog is super deep. Unfortunately, it rarely is. As an example, I never really played games growing up. So I missed out on a lot of great RPGs and other things in the 80s and 90s. Sure would love to go back and play all of the big franchises . . . and the DSi sure has the power to do it. But, nope, I can only get -- say -- game 3, 4, 7, and 9 from a series on it. Bleh. Give me something low powered that has a very focused intent and that's okay. But from what I've seen, these platforms are just too schizophrenic and shallow.

#14 Posted by M_Shini (548 posts) -

Meaningless for some specific games totally, but not for everything.

#15 Posted by Branthog (7332 posts) -

@Hunter5024 said:

I certainly wouldn't say they're meaningless yet, but sooner or later we will reach a point where increasing hardware specs doesn't noticeably improve a game in a meaningful way, and I think we are getting closer and closer to that point. You need only look at the diminishing technological jumps between console generations to reach that conclusion.

We are an incredibly long way from that point being reached. Just think of some of the most popular games out there. Think of how stale and lifeless they are compared to their potential. On top of that, you have AI, which (if we ever finally start focusing on it) requires a lot of computing power. And on top of all that, you have a rapidly approaching increase in resolutions. We're sadly looking at small incremental improvements for the next two generations, at least -- while trying to keep up reasonable FPS (30-60) at standard resolutions (which will also scale over the next couple of generations). If resolution were going to stay fixed for at least twenty years, then I think we could look at a lot more gain . . . but we're going to burn so much maintaining what we have at higher resolutions, while trying to keep these stupid boxes under $300 (because god forbid people spend more than $300 on something they'll have for a decade and that costs less than five games for the system will). :/

In short, we will have reached Kurzweil's Singularity by the time games have stopped making noticeable improvements that require significant power gains. Either that, or people will stop playing video games (which I think is far more likely to happen).

#16 Edited by Hunter5024 (5180 posts) -

@ArtisanBreads: Well, I think this generation was a big leap too, but not as big a leap as PS1 to PS2, and certainly not as big as SNES to PS1, and if current spec rumors are even close to accurate, it sounds like the jump from PS3 to PS4 will be even less significant than this one was. I do think that we have many years to come where we will be pushing the boundaries of AI, physics, and scale, but I have my doubts about whether these improvements will continue to fundamentally improve gameplay. Also, I think the more advanced the specs become, the more expensive and time-consuming they will be to develop for. So we will see a lot of game studios struggle to keep up as a result, and what fun would amazing specs be if only companies like EA and Activision could afford to actually stress the technology?

@Branthog: I suppose I just believe audiences aren't particularly perceptive. Marketing a new box on its incredible graphics is easy, but far fewer people will be blown away by sophisticated AI. Because of that, I don't really see companies investing resources into these things, because they won't see increased revenue from them. So in that sense, specs will become less meaningful. I think the time of technology being the driving force of innovation in video games is fading away, and that we're going to start to see a lot of the more noticeable improvements come instead from narrative and design.

#17 Posted by ArtisanBreads (3599 posts) -

@Branthog said:

@ArtisanBreads said:

You're listening to 8-4 play.

They're big fans of Nintendo. It's a different angle on gaming, and that's fine, but it's an angle where they aren't focused on tech. They will forgive a lot if a game hits whatever aspects they are looking for. Just listen to the Wii U reactions on their podcast vs. Giant Bomb's... it kind of shows you the differences.

To me, and others, tech is a big deal. I do like nice graphics but power can lead to new gameplay and scale.

There needs to be both available. For instance, I want machines with a ton of mind blowing power, but I'm okay with something on the scale of a DSi, if the catalog is super deep. Unfortunately, it rarely is. As an example, I never really played games growing up. So I missed out on a lot of great RPGs and other things in the 80s and 90s. Sure would love to go back and play all of the big franchises . . . and the DSi sure has the power to do it. But, nope, I can only get -- say -- game 3, 4, 7, and 9 from a series on it. Bleh. Give me something low powered that has a very focused intent and that's okay. But from what I've seen, these platforms are just too schizophrenic and shallow.

These new Android systems with an emulator on them sound like they might be your answer.

But yeah, I agree both have their own spots to exist.

I'm just a big believer in power because there is so much more games can do with it. Rockstar's use of NaturalMotion, for example, is one of my favorite elements this gen. I hope we see more of that kind of thing.

#18 Posted by Nictel (2312 posts) -

As long as in-game graphics < CGI < real life, there are still ways to go graphics-wise.

#19 Posted by NoelVeiga (1044 posts) -

The question isn't whether hardware specs matter or not, that's moot.

The thing you want to keep an eye out for regarding how much better games can look is the price of pushing graphics further versus the price of games versus the popularity of gaming.

THAT has a hard ceiling, unless triple A gaming starts opening up and embracing the mass market. I hate to break it to you guys, but Dark Souls isn't going to sell enough copies to be able to afford looking like The Hobbit. Ever.

Now, the cost of making things look good doesn't climb steadily. Tech appears that makes graphics cheaper as well as better looking, but it does become more expensive to make something look better every time hardware leaps forward. Even with that process slowing down, there's a point where you start seeing graphical limitations based on budget rather than hardware. We are already there in some cases, with Xbox and PS3 games sometimes lacking animation or detail that the hardware could definitely afford, because the studio couldn't put in the time to build them into the game.

This is the big issue in a gaming industry in which the hardcore become more radical and will give good games crap for being accessible (see DmC), the mainstream will game on any readily available device with little regard for quality (see iOS), and the triple A industry is often too scared to try to go beyond the niche market. This is a bad combination. The kind of combination that led the comic book industry to become a haven for nerds and nobody else, with just one genre and one business model. It kind of doesn't matter what the hardware can do if your available audience can't or won't give you enough money to fund it.

#20 Edited by Seppli (9766 posts) -

@NoelVeiga said:

The question isn't whether hardware specs matter or not, that's moot.

The thing you want to keep an eye out for regarding how much better games can look is the price of pushing graphics further versus the price of games versus the popularity of gaming.

THAT has a hard ceiling, unless triple A gaming starts opening up and embracing the mass market. I hate to break it to you guys, but Dark Souls isn't going to sell enough copies to be able to afford looking like The Hobbit. Ever.

Now, the cost of making things look good doesn't climb steadily. Tech appears that makes graphics cheaper as well as better looking, but it does become more expensive to make something look better every time hardware leaps forward. Even with that process slowing down, there's a point where you start seeing graphical limitations based on budget rather than hardware. We are already there in some cases, with Xbox and PS3 games sometimes lacking animation or detail that the hardware could definitely afford, because the studio couldn't put in the time to build them into the game.

This is the big issue in a gaming industry in which the hardcore become more radical and will give good games crap for being accessible (see DmC), the mainstream will game on any readily available device with little regard for quality (see iOS), and the triple A industry is often too scared to try to go beyond the niche market. This is a bad combination. The kind of combination that led the comic book industry to become a haven for nerds and nobody else, with just one genre and one business model. It kind of doesn't matter what the hardware can do if your available audience can't or won't give you enough money to fund it.

It's not inconceivable that the gaming industry will free itself from consumer hardware dependency and stream processing power to the end consumer. The proof of concept has already happened with services like OnLive and Gaikai, and whilst thus far unsuccessful in the marketplace, it's a thing that's inevitably going to happen. Interactive entertainment will be broadcast to the masses. To every TV screen, and every handheld device. Opening up an immense market that's currently being tapped only by the simplest of mobile and social games, and only barely.

You also underestimate how much more efficient game development can become through new development techniques enabled by more powerful hardware and ever-smarter software engineering, such as realtime iteration. Procedural content generation, too, will over time immensely reduce production costs for things like facial animations.

Another factor you're not considering is that best-selling videogames already generate more immediate income than movies. Call of Duty is bigger than The Hobbit. In the long run, with movies being monetized in multiple phases - from the silver screen to the home theater to broadcasting and streaming - movies might still make more money overall, but unlike games, they've no untapped potential left (other than fully adopting digital distribution for a global audience, and alternative business models, such as services similar to Spotify for video-based media).

#21 Posted by Gamer_152 (13976 posts) -

Yeah, it's silly. There's definitely less of a weighting on hardware specs when it comes to assessing the overall quality of a console, but they're still a huge part of determining the worth of a system.

#22 Posted by RazielCuts (2717 posts) -

I think there are more bullet points on the box these days - Online, Streaming, HDD, DLC, controller inputs, etc. But yeah, as everyone's said, hardware specs are still a thing; otherwise we wouldn't be going forward, we'd just be staying stagnant.

#23 Edited by Hilbert (347 posts) -

I'm not buying the next generation of consoles so I can play Zuma in UHDTV.

Of course I want better graphics so the next Elder Scrolls can look and sound even better, with more variety in missions, shorter loading times and bigger worlds. More power also means better A.I., which affects the gameplay considerably. Why would you even buy a new console if it had the same capabilities?

Edit: I have to add this.

This is one of the dumbest "arrogant" questions I've ever read. It's like someone randomly connected some dots from watching too many Bonus Rounds on Gametrailers.com with analyst Michael Pachter. This question has nothing to do with gamers. You're basically asking if game developers are finally at the point where they can get away with just selling us Angry Birds till we die.

#24 Posted by NoelVeiga (1044 posts) -

@Seppli: Streaming doesn't fix the issue. You still need hardware to render the graphics (server side this time, so you also need to subsidize it with money somehow) and customers to pay for the development of the game. If not enough people buy the game (or the service, or whatever you're doing to charge for this), you can't make games.

That's not broadcasting, by the way. In broadcasting, one version of the media is pushed out to customer-owned devices. On streaming platforms, each connected user carries an incremental cost, because each of them needs a virtual machine to render their game. Basically, you need to make it so that you can serve every customer, but also so that you are making enough money per customer to put a virtual machine at their disposal, which costs money. The more people watch a TV show, the better, because 1 viewer costs the same as 100 viewers or 10,000 viewers. Costs scale with users for streaming, so 10 players are more expensive than 1. So no, it's not an immediate solution.
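The cost asymmetry described above can be sketched as a toy model: broadcast cost is flat regardless of audience size, while streaming cost grows with every concurrent player. All dollar figures here are made-up placeholders for illustration, not real OnLive or Gaikai numbers.

```python
# Toy model of the broadcast-vs-streaming cost asymmetry.
# All dollar figures are made-up placeholders, not real service data.

def broadcast_cost(viewers: int, fixed_cost: float = 100_000.0) -> float:
    """One encoded feed serves every viewer, so cost ignores audience size."""
    return fixed_cost

def streaming_cost(concurrent_players: int, vm_cost_per_hour: float = 1.0,
                   hours: float = 1.0) -> float:
    """Each concurrent player needs a server-side machine rendering their game."""
    return concurrent_players * vm_cost_per_hour * hours

for n in (1, 100, 10_000):
    print(f"{n:>6} users: broadcast ${broadcast_cost(n):,.0f}, "
          f"streaming ${streaming_cost(n):,.0f}/hour")
```

The point of the sketch is the shape of the curves, not the numbers: the broadcast line is horizontal, the streaming line climbs linearly with concurrency.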

I also don't underestimate how costly it is to ramp up in visuals. Like I said, yes, a lot of stuff is cheaper to do with better tech, but the truth is that, historically, each generation of consoles has been more expensive to develop for than the last (unless you count mobile phones). Procedurally generated facial animation is a cool idea, but the fact is that facial motion capture is what works, because any animation system is worse than a good actor. As it turns out, hiring an actor, building a mocap rig and a soundstage, and shooting 60 minutes of cinematics is more expensive than one guy in 3D Studio working for a couple of months. It looks great, but it's expensive.

Same with other areas. Sure, building a forest is cheaper now if you just get some SpeedTree going, right? But a forest is not a game; now you have a huge expanse that you need to populate and make playable. People do that, and that costs money.

I love your last point, though, because it's so frequently quoted by the media and it's. Just. So. Wrong.

See, videogames are and will always be a one-window business. That's why the industry is so desperate to find games-as-a-service avenues for monetization. Every film made is money forever. Every time a new format comes out, you get cash from re-purchases. Every time it's shown on TV, you get paid. Netflix is doing wonders for old media libraries. A single piece of video (a piece of video! think about that!) gets paid for over and over, through single-viewing payments, subscription payments, one-package-endless-viewings purchases AND ad revenue.

A game is a game is a game. For all the mileage this gen got out of "HD remakes", those are a cultural artifact. You won't re-purchase Shadow of the Colossus 50 years from now. I do own a Blu-Ray copy of Casablanca, though.

But also, movies get rated by the audience they draw, ten bucks at a time, of which they share a good chunk with the guy who owns the theatre. Games are judged on purchases of sixty bucks at a time. Sell five million worldwide and you're a videogame hit. Double digits? You're a blockbuster and a guaranteed yearly franchise.

You know which movies gathered 10 million people in theatres? Well, I'd like to tell you, but Box Office Mojo stops counting at 500 movies, and those all made more than 200 million, which means at least 20 million people in theatres. I can tell you Garfield: The Movie is in that list.

Yeah.

20 million people paid an average of ten bucks to watch Garfield: The Movie. Not counting DVD sales, TV sales, streaming, merchandising or any other revenue source. That's as many people as bought Modern Warfare 3. Which was the best-selling game of 2011. Worldwide.
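The ticket-versus-box-copy arithmetic above runs like this; the ticket price, game price, and the theatre's share are the rough round numbers implied by the post, not audited industry figures.

```python
# Back-of-envelope take on the theatre-vs-game revenue comparison above.
# Prices and the theatre's share are rough round numbers, not industry data.

TICKET_PRICE = 10.0       # average cinema ticket, per the post
GAME_PRICE = 60.0         # full-price retail game
THEATRE_SHARE = 0.5       # rough chunk kept by the theatre owner

movie_viewers = 20_000_000  # a ~$200M box-office film
game_buyers = 5_000_000     # "sell five million worldwide and you're a hit"

movie_gross = movie_viewers * TICKET_PRICE       # box-office total
studio_take = movie_gross * (1 - THEATRE_SHARE)  # before home video, TV, streaming
game_gross = game_buyers * GAME_PRICE            # one window, full price

print(f"movie box office: ${movie_gross:,.0f} (studio keeps ~${studio_take:,.0f})")
print(f"game launch gross: ${game_gross:,.0f}")
```

Note how this supports both sides of the thread: a five-million-unit game out-grosses the film's opening window, but the film keeps earning through later windows while the game's revenue mostly stops at launch.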

Triple A gaming is niche. Triple A gaming makes money once and never again. Immediate income is just that, day one stuff. Lifetime income is what keeps film studios going. If your budget is going to be 200 million for anything, then it needs to be The Avengers-like, not Garfield-like. Again, Dark Souls is never going to do that, so Dark Souls can never cost 200 million to make. That's your ceiling, not technology.

#25 Posted by nintendoeats (5975 posts) -

When I'm being charitable, I interpret the "hardware doesn't matter" point as not being about graphics but about gameplay. It is true to some extent that we can make pretty much any kind of game that we want (albeit with some sacrifices), where previously that was not true. You couldn't make Bayonetta for a GameCube; it just can't handle that much action. I'm not sure there are game ideas that are unexplorable on a 360.

#26 Edited by Seppli (9766 posts) -

@NoelVeiga:

Streaming may not be a cure-all for the triple A gaming industry, but it does essentially reduce the barrier of entry to *big experiences* gaming to owning an internet-enabled TV or handheld gadget. That's a huge step forward market-potential-wise, compared to the current dependency on customer-owned dedicated gaming hardware.

About movies and games, aren't you just repeating what I was saying? It's just that I believe games have the potential to monetize beyond the initial sale, and you don't. Services like PSN+, which is essentially a gaming flat rate, will allow content producers to monetize older titles indefinitely. Old assets can be retooled for the free2play business model, like Battlefield Play4Free did with assets from Battlefield 2 and Bad Company 2. Once streaming games become a more viable business model, another stream of revenue will open up. Microtransactions, subscriptions, expansions, season passes - successful games do bring in revenue beyond the initial sale, and content producers are getting better at catering to that fact. As far as I can tell, the post-launch revenue stream is an ever-growing segment of the business, with lots of growth potential left to work with.

Sure, the air is thin for big triple A games, but it's a lucrative business that's only going to become more lucrative. Content producers will do fewer big-budget games, and will have to learn to spread the risk among many more, much smaller productions that put gameplay over presentation and mass appeal. I'm certain smaller and smaller teams will be enabled to do greater and greater things as software engineering and processing power progress down the line. Just wait and see - soon we'll get the first wave of games made with engines capable of realtime iteration, and it will show.

#27 Posted by believer258 (11058 posts) -

@Seppli said:

Another factor you're not considering is that best-selling videogames already generate more immediate income than movies. Call of Duty is bigger than The Hobbit. In the long run, with movies being monetized in multiple phases - from the silver screen to the home theater to broadcasting and streaming - movies might still make more money overall, but unlike games, they've no untapped potential left (other than fully adopting digital distribution for a global audience, and alternative business models, such as services similar to Spotify for video-based media).

Yes. Call of Duty is the most popular video game franchise of all time. It has been using the same engine since 2007, and that engine has bits of the Quake 3 engine in it.

Quake. Fucking. Three. Let that sink in for a minute. The game that makes the most money isn't necessarily the one that looks the best anymore, so companies like EA and Activision aren't going to pour tons and tons of money into a game that looks better than anything else out there and stresses computers harshly. We live in a time where Metro 2033, a game that came out nearly three years ago (early 2010; it's early 2013 now), is still one of the most common benchmarks for graphics cards (yes, I know it's badly optimized). All of this means that the market doesn't care as much about graphics and technology as it used to.

Am I saying that technology isn't going to progress, or that it has plateaued? Absolutely not. Watch Dogs and Star Wars 1313 are proof of that, and Crysis 3 seems like it's going to look pretty good as well. But - and please note carefully what I'm saying here - it is no longer a major reason why consoles and/or games sell so well, and so we will not have the same technological jumps that we had before. It's going to slow down. Not stop, but slow down.

Something else:

The proof of concept has already happened with services like OnLive and Gaikai

Yes, it has been proven that it's technically feasible to stream games from a server, but unfortunately it's not really practical at all. It costs a whole lot of money to make happen and, well, not many people have that kind of internet yet. Plus, the better your graphics, the more bandwidth and the lower latency you're going to need to do it properly, and even then it could never look as good as a local game running on a local machine connected directly to your monitor. I really doubt that this route is going to be the one where technological leaps are made, and I also think that companies are going to be less interested in it when

whilst thus far unsuccessful in the marketplace

This still holds true. Remember, we're in a world where the vast majority of consumers still go to the store to buy their games, and audio CDs still sell enough to take up racks on shelves. Do you honestly think streaming video games is ready for mass-market appeal when most people aren't even used to the idea of having a game on a hard drive and not on a physical disc?
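The "better graphics need more bandwidth" point above can be put in rough numbers. The bits-per-pixel figure below is a ballpark assumption for H.264-class compressed video, not any streaming service's actual spec.

```python
# Rough bitrate estimate for a compressed game stream.
# bits_per_pixel is a ballpark assumption for H.264-class compression.

def stream_bitrate_mbps(width: int, height: int, fps: int,
                        bits_per_pixel: float = 0.1) -> float:
    """Approximate megabits per second needed to stream rendered frames."""
    return width * height * fps * bits_per_pixel / 1_000_000

print(round(stream_bitrate_mbps(1280, 720, 30), 1))   # 720p at 30 FPS
print(round(stream_bitrate_mbps(1920, 1080, 60), 1))  # 1080p at 60 FPS
```

Even under this crude model, going from 720p30 to 1080p60 roughly quadruples the required bandwidth, which is the thread's point about streaming scaling badly with fidelity.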

#28 Posted by Seppli (9766 posts) -

@believer258:

Clearly I'm talking about the foreseeable future, not the immediate future. Especially when talking about streaming interactive entertainment.

Technical stunners definitely do sell units; it's only a matter of hardware pricing. If people could have played Crysis maxed out on a new $399 console back in 2007, that system would have sold like hotcakes for breakfast. However, such hardware wasn't easily affordable then. Now, if rumours are true, we'll get dedicated gaming machines that are effectively more powerful than current high-end PCs, for what I'm assuming will be an acceptable price. The gaming industry will showcase mindblowing games at E3 that will awaken the sense of wonder and awe that Crysis did in 2007, and you'll have the perfect storm of supply and demand.

Shock and awe - baby! You'll see.

#29 Posted by Icemo (626 posts) -

If humans had been satisfied with what they had, we would still be living in mud huts. So yeah, I want better hardware, and I'm not satisfied with my games until I'm experiencing them in virtual reality or something equally awesome.

#30 Posted by believer258 (11058 posts) -

@Seppli said:

Now, if rumours are true, we'll get dedicated gaming machines that are effectively more powerful than current high-end PCs, for what I'm assuming will be an acceptable price.

I doubt that, but we shall see.

#31 Posted by NoelVeiga (1044 posts) -

@Seppli said:

@NoelVeiga:

Streaming may not be a cure-all for the triple A gaming industry, but it does essentially reduce the barrier of entry to *big experiences* gaming to owning an internet enabled TV or handheld gadget. That's a huge step forward market-potential-wise, in comparison to the current dependency on customer-owned dedicated gaming hardware.

You misplace the barrier to entry, though. It's not the cash the hardware costs. Everybody has an iPad, and that's more expensive than a 360. The barrier to entry is the controller, the tone of the games (believe it or not, the mainstream doesn't really find "fantasy land full of people with huge swords and big-boobed ladies with impractical breastplates" an appealing proposition) and the intricacy of the mechanics on display. If Mr. Joe Averagepants, not to mention Ms. Averagepants, had the slightest interest in playing Darksiders, they probably already own at least one or two devices that can run Darksiders or Darksiders-like experiences. They're probably playing Angry Birds, though.

About movies and games, aren't you just repeating what I was saying? It's just that I believe games have the potential to monetize beyond the initial sale, and you don't. Services like PSN+, which is essentially a gaming flat rate, will allow content producers to monetize older titles indefinitely. Old assets can be retooled for the free2play business model, like Battlefield Play4Free did with assets from Battlefield 2 and Bad Company 2. Once streaming games become a more viable business model, another stream of revenue will open up. Microtransactions, subscriptions, expansions, season passes - successful games do bring in revenue beyond the initial sale, and content producers are getting better at catering to that fact. As far as I can tell, the post-launch revenue stream is an ever-growing segment of the business, with lots of growth potential left to work with.

PSN+ is really interesting, but I honestly have no idea of how much money anybody is making out of it and how that money flows from the customers to Sony to the publishers. I wouldn't be surprised if that ended up working as some manner of second window, though.

That being said, you're mixing two things. In films, the theatre, the disc and the TV broadcast provide different experiences. You can't watch IMAX at home and you can't watch the TV broadcast whenever you like. Those things don't compete among each other. Streaming does compete with discs and broadcasts, and you can see how disruptive that is for the market.

Games are always games. The game you get on a box is the same game you get on PSN+. Those two models drain money from the same pool. Microtransactions, subscriptions and DLC do not, those generate additional revenue... but only for people that buy into the main game in the first place. That's not the same as the TV broadcast expanding the audience of a film. Plus, a few of those significantly alter how the game is designed, which doesn't happen for movies (unless you count ad breaks on TV).

Steam has been the closest to developing a multi-window model for gaming. Pre-order at a small discount, buy at launch full price, wait for a sale for a big discount. Even that model understands that post-release revenue is marginal, though. If anything, it's brave enough to understand this to the point where any sales are better than no sales, which is what you normally get. Everybody is doing this to some extent these days.

Sure, the air is thin for big triple A games, but it's a lucrative business that's only going to become more lucrative. Content producers will do fewer big-budget games and will have to learn to spread the risk across many smaller productions that put gameplay over presentation and mass appeal. I'm certain smaller and smaller teams will be enabled to do greater and greater things as software engineering and processing power progress down the line. Just wait and see - soon we'll get the first wave of games made with engines capable of realtime iteration, and it will show.

I fully agree. 100%.

That's not what you stated originally, though. The argument you were making is that there is no ceiling for technology making games look better. My argument is that it doesn't matter whether there is a tech ceiling, because we are actually closer to the budget ceiling. That doesn't mean visuals and quality won't improve, they will, but it does mean that models need to change and expand to account for the new styles of production that will be required. Gaming needs to get the mainstream to buy into triple-A games or risk ending up like superhero comic books, stuck in decades of jokes about cheetos and basements.

#32 Posted by SmilingPig (1337 posts) -

Yes! If you only play Facebook games. Besides, coins are expensive - who can afford new hardware?

#33 Posted by Icemo (626 posts) -

@believer258 said:

@Seppli said:

Now however, if rumours are true, we'll get dedicated gaming machines that are effectively more powerful than current high end PCs, for what I'm assuming to be an acceptable price.

I doubt that, but we shall see.

Let's see here: the cheapest GeForce GTX 680 I found with a quick search costs $459.99 right now, and that is only one part of a PC. Your statement would be true if next-gen consoles were to launch in 2014 and console manufacturers were only starting development now. I imagine that Microsoft and Sony have already developed their new console hardware with their partners, and I can assure you that Intel, AMD, and Nvidia haven't secretly been developing console hardware that will suddenly be more powerful than what they have been developing for PCs all along.

#34 Posted by Azteck (7447 posts) -

Anyone who says that must have no clue how computers and game development works at all.

#35 Posted by timlump (152 posts) -

To anyone here who can tolerate a technical paper - read "Dark Silicon and the End of Multicore Scaling" ftp://ftp.cs.utexas.edu/pub/dburger/papers/ISCA11.pdf

The gist of the paper is that we have hit a brick wall in terms of what silicon transistors can do, and while we can stretch graphics cards a little further thanks to the embarrassingly parallel nature of rendering, we are in trouble overall. We really hit the wall back in 2005, when single-core scaling had to stop; any speedups since have come from squeezing out what little parallelism our applications have and from improved algorithms.
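The limit described above is often illustrated with Amdahl's law: no matter how many cores you add, the serial fraction of a program caps the total speedup. A minimal sketch (the 10% serial fraction is an assumed example figure, not one taken from the paper):

```python
def amdahl_speedup(serial_fraction, cores):
    """Upper bound on speedup for a program whose serial_fraction
    cannot be parallelized, run on the given number of cores."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# Assume 10% of the work is inherently serial.
for cores in (2, 8, 64, 1024):
    print(cores, round(amdahl_speedup(0.10, cores), 2))
# Even with 1024 cores the speedup approaches, but never reaches,
# the 1 / 0.10 = 10x ceiling set by the serial part.
```

This is why "just add more cores" stops helping once the parallel portion is already saturated, which is the situation the paper argues general-purpose code is in.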

#36 Edited by Seppli (9766 posts) -

@NoelVeiga:

You make good points. I've never considered this thing you call *budget ceiling*. Certainly food for thought. Thanks for the input.

#37 Edited by MonkeyKing1969 (2283 posts) -

Meaningless?  No, the specs of hardware are important in many respects.  However, those respects are just resolution and framerate at this time.  It used to be you could not even make a game on another system if you didn't have some sort of hardware that did smooth sprite scrolling or generate textured polygons.  Today nearly any system can make nearly any game as long as texture detail, resolution, and frame rate are allowed to be slightly variable.
 
The last time hardware REALLY mattered was in the PSX, N64, and Saturn generation. If you look at the chipsets in those systems, you will notice that what could be done with those chips, and how easily, was very different. And that even makes sense from the outside; that was the generation where 3D was new. People may not know this, but HOW those three systems rendered polygon graphics was very different. Today a PS3, a 360, and a Wii work nearly the same - how polygons are rendered is the same - and that was not the case in 1995.
 
There is an excellent book that describes that era and how the first PlayStation came about; a few of the chapters talk about how Sega thought about 3D and what hardware they had, and it discusses how the partnership of Nintendo and Silicon Graphics brought the N64/Ultra64 to market. The book is called "Revolutionaries at Sony: The Making of the Sony Playstation and the Visionaries Who Conquered the World of Video Games" by Reiji Asakura. If you are interested in video game hardware, you should read it.

#38 Posted by Spoonman671 (4378 posts) -

As far as graphics are concerned, you can only hire so many artists and pay them for so long before there's no longer a profit to be made on a product. There's plenty of room for improvement, but there is a practical limit to the level-of-detail we can achieve in a commercial product with the way development works now. New techniques and technologies can change that, but they are unpredictable, so you can only take them into account so much. You can also improve graphical performance while maintaining the current level-of-detail with things such as draw distance and framerate, which don't require additional art assets to be created. Greater scale can always be achieved, but eventually the return on that investment will diminish as well.

And of course, a system's specs can be used to improve things other than graphics. Personally, I'm hoping for some better AI systems.

#39 Edited by Seppli (9766 posts) -

@NoelVeiga said:

@Seppli said:

@NoelVeiga:

Streaming may not be a cure-all for the triple A gaming industry, but it does essentially reduce the barrier to entry for *big experiences* gaming to owning an internet-enabled TV or handheld gadget. That's a huge step forward market-potential-wise, in comparison to the current dependency on customer-owned dedicated gaming hardware.

You misplace the barrier to entry, though. It's not in the cash the hardware costs. Everybody has an iPad, and that's more expensive than a 360. The barrier to entry is in the controller, in the tone of the games (believe it or not, the mainstream doesn't really find "fantasy land full of people with huge swords and big boobed ladies with impractical breastplates" to be an appealing proposition) and in the intricacy of the mechanics on display. If Mr. Joe Averagepants, not to mention Ms. Averagepants, had the slightest interest in playing Darksiders, they probably own at least one or two devices that can run Darksiders or Darksiders-like experiences already. They're probably playing Angry Birds, though.

About movies and games, don't you just repeat what I was saying? It's just that I believe games have the potential to monetize beyond the initial sales, and you don't. Services like PSN+, which is essentially a gaming flat rate, will allow content producers to monetize older titles indefinitely. Old assets can be retooled for the free2play business model, like Battlefield Play4Free did with assets from Battlefield 2 and Bad Company 2. Once streaming becomes a more viable business model, another stream of revenue will open up. Microtransactions, subscriptions, expansions, season passes - successful games do bring revenue beyond the initial sale, and content producers are getting better at catering to that fact. As far as I can tell, the post-launch revenue stream is an ever-growing segment of the business, with lots of growth potential left to work with.

PSN+ is really interesting, but I honestly have no idea of how much money anybody is making out of it and how that money flows from the customers to Sony to the publishers. I wouldn't be surprised if that ended up working as some manner of second window, though.

That being said, you're mixing two things. In films, the theatre, the disc and the TV broadcast provide different experiences. You can't watch IMAX at home and you can't watch the TV broadcast whenever you like. Those things don't compete among each other. Streaming does compete with discs and broadcasts, and you can see how disruptive that is for the market.

Games are always games. The game you get on a box is the same game you get on PSN+. Those two models drain money from the same pool. Microtransactions, subscriptions and DLC do not, those generate additional revenue... but only for people that buy into the main game in the first place. That's not the same as the TV broadcast expanding the audience of a film. Plus, a few of those significantly alter how the game is designed, which doesn't happen for movies (unless you count ad breaks on TV).

Steam has been the closest to developing a multi-window model for gaming. Pre-order at a small discount, buy at launch full price, wait for a sale for a big discount. Even that model understands that post-release revenue is marginal, though. If anything, it's brave enough to understand this to the point where any sales are better than no sales, which is what you normally get. Everybody is doing this to some extent these days.

The lowering of the barrier of entry is significant, because it increases the market potential enormously. Your fallacy is to assume that the industry wouldn't try to develop products for this broader market. I could imagine interactive crime or romance novels being a big hit with stay-at-home audiences, offering interactivity perfectly controllable by the lowliest of TV remotes.

As for multi-window sales, just like movies, it's all about timing. A product trickles down from release price to budget price to flat-rate and streaming services. For streaming services, it's imaginable that there will be a Spotify-like free tier that's advertisement-supported and comes in lower fidelity, as well as multi-tiered subscription models for more and better access to a comprehensive library of games. Surely there will also always be a market for dedicated gaming hardware, with experience-enhancing devices like the rumored omni viewer for the PS4 or the projection device for the next Xbox - for the customer base we are part of, the enthusiast gaming market.

#40 Posted by MordeaniisChaos (5730 posts) -

@SharkEthic said:

From what I've heard, 8-4 dabbles mostly in Japanese gaming where specs aren't the end all be all like with most western games, so maybe that's what brought that on?

Either way, saying hardware specs doesn't matter anymore is just plain wrong.

So what you're saying is they just make the same like... three games over and over?

#41 Posted by Jack268 (3387 posts) -

No, but I think consoles will never be on par with PCs anyway, so complaining about the fact that the PS4 or 720 won't contain an HD8970 or equivalent is just silly. Consoles have never been on par with or stronger than PCs, so if you want to masturbate over graphics you might as well just stick to your PC.
 
Stronger hardware would mean a lot more if developers actually used it to improve gameplay, but usually it just means "LET'S DIVERT MORE RESOURCES TOWARDS MAKING THIS LOOK GOOD", and I think that's kind of unnecessary. Like, I would rather have the slightly worse graphics in MGR with double the framerate and the free cutting over a 30 FPS lock and the unnecessary warping-world gimmick in DmC.

#42 Edited by Seppli (9766 posts) -

@Spoonman671 said:

As far as graphics are concerned, you can only hire so many artists and pay them for so long before there's no longer a profit to be made on a product. There's plenty of room for improvement, but there is a practical limit to the level-of-detail we can achieve in a commercial product with the way development works now. New techniques and technologies can change that, but they are unpredictable, so you can only take them into account so much. You can also improve graphical performance while maintaining the current level-of-detail with things such as draw distance and framerate, which don't require additional art assets to be created. Greater scale can always be achieved, but eventually the return on that investment will diminish as well.

And of course, a system's specs can be used to improve things other than graphics. Personally, I'm hoping for some better AI systems.

What about procedural asset generation? What about outsourcing asset creation to countries with cheaper labor? And god knows what else. The industry will figure out ways to generate more assets at higher fidelity and lower cost - or it wouldn't be an industry at all. It sure is a problem, but certainly not an unsolvable one.
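As a toy illustration of the procedural route mentioned above: a few lines of code can stamp out endless asset variations from a handful of hand-made parts. This is a hypothetical sketch in the spirit of loot-driven games, not any shipping pipeline:

```python
import random

def generate_gun(seed):
    """Deterministically derive a gun's name and stats from a single seed,
    the way loot-driven games combine a few hand-made parts into
    thousands of variants."""
    rng = random.Random(seed)
    prefixes = ["Rusty", "Tactical", "Vicious", "Pearlescent"]
    bodies = ["Pistol", "SMG", "Shotgun", "Rifle"]
    return {
        "name": f"{rng.choice(prefixes)} {rng.choice(bodies)}",
        "damage": rng.randint(10, 100),
        "fire_rate": round(rng.uniform(1.0, 12.0), 1),
    }

# The same seed always yields the same gun, so a save file or network
# message only needs to store the seed, not the whole item.
print(generate_gun(42))
```

The catch is exactly the one raised later in the thread: the artists' effort goes into the part library and the combination rules, and the output quality is capped by how good those hand-made parts are.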

#43 Posted by 9cupsoftea (652 posts) -

For me, hardware specs are meaningless. I honestly don't care about graphical fidelity - I think well-done creative art direction trumps that every time. I think it's actually a big problem for gaming that everyone is so obsessed with specs. I enjoy Persona 4 as much as Mass Effect, Hotline Miami as much as Borderlands, Day of the Tentacle as much as The Walking Dead, Half Life 2 as much as Far Cry 3. Great games can be made with low-spec tech, and what with studio budgets going out of control this gen I'd rather we held back.

#44 Posted by Fredchuckdave (4493 posts) -

We're looking at a marginal increase, so it doesn't really matter in the context of consoles; if consoles become solely mobile devices in the future, that will deteriorate even further. Right now there are precisely two games worth having a high-end PC for: Crysis and The Witcher 2. Every other game can be run at or near top settings on an older computer. It's simply another stupid consumerist thing that people like to waste money on. Now, if there were tons of games that were difficult to run, and a continual push for newer, better graphics like there used to be, then sure, hardware would matter; but that period seems to be dying out. Of course graphics could get better, but they'd also get much more expensive, and the only company that is continually successful at present is Activision/Blizzard - guys who like to make the same game over and over with little or no changes.

#45 Posted by Spoonman671 (4378 posts) -

@Seppli said:

@Spoonman671 said:

As far as graphics are concerned, you can only hire so many artists and pay them for so long before there's no longer a profit to be made on a product. There's plenty of room for improvement, but there is a practical limit to the level-of-detail we can achieve in a commercial product with the way development works now. New techniques and technologies can change that, but they are unpredictable, so you can only take them into account so much. You can also improve graphical performance while maintaining the current level-of-detail with things such as draw distance and framerate, which don't require additional art assets to be created. Greater scale can always be achieved, but eventually the return on that investment will diminish as well.

And of course, a system's specs can be used to improve things other than graphics. Personally, I'm hoping for some better AI systems.

What about procedural asset generation? What about asset generation outsourcing to countries with cheap labor? And god knows what else. The industry will figure out ways to generate more and higher fidelity assets at lower cost - or it wouldn't be an industry at all.

I'm not saying we have reached the point of diminishing returns yet, simply that I believe that point exists. Procedural systems cannot make something out of nothing, and the quality of what those systems produce is never as high as that of custom-made work. Compare any single gun in the Borderlands games to those of the Killzone games, or the environments of Dark Cloud to those of its contemporary Final Fantasy games. It's entirely possible, in fact probable, that new technologies will solve these problems, but we don't know when/if this will happen, so it is simply conjecture on our part.

Cheap labor will certainly provide publishers with wiggle room in their budget, but it will never be free labor and game sales will never be infinite, so there is still a limit to the investment a company can make in art assets and remain profitable.

I agree that game makers are going to find ways to improve efficiency, but I don't know when these innovations are going to happen or how drastic they will be.

#46 Posted by iam3green (14388 posts) -

I would say kind of; I was expecting this to be more about computer hardware. Anyway, it's kind of important. PC ports are sometimes so horrible that you need a supercomputer to play them, so it would be good to have decent computer hardware.

Graphics are getting better and better just about every year.

#47 Posted by Seppli (9766 posts) -

@Spoonman671:

Good point.

#48 Posted by SharkEthic (945 posts) -

@MordeaniisChaos said:

@SharkEthic said:

From what I've heard, 8-4 dabbles mostly in Japanese gaming where specs aren't the end all be all like with most western games, so maybe that's what brought that on?

Either way, saying hardware specs doesn't matter anymore is just plain wrong.

So what you're saying is they just make the same like... three games over and over?

That's not really what I'm saying at all.

#49 Edited by NoelVeiga (1044 posts) -

@Seppli said:

The lowering of the barrier of entry is significant, because it increases the market potential enormously. Your fallacy is to assume that the industry wouldn't try to develop products for this broader market. I could imagine interactive crime or romance novels being a big hit with stay-at-home audiences, offering interactivity perfectly controllable by the lowliest of TV remotes.

You are mixing two things here again. I agree, lowering the entry point in price AND design brings more people over... to Angry Birds and Wii Sports.

Those games are already far below the technical capabilities of the hardware we are using today, so no concern on hardware specs or development cost there.

But that money that goes into the industry at large doesn't go into games that push the visual and gameplay envelope. The barrier there isn't tech or price, it's design, and if you design triple A hardcore games for the mainstream, you'll likely lose both. The point I was making is that there isn't a version of Darksiders that appeals to the mainstream. Even for free that won't happen.

As for multi-window sales, just like movies, it's all about timing. A product trickles down from release price to budget price to flat-rate and streaming services. For streaming services, it's imaginable that there will be a Spotify-like free tier that's advertisement-supported and comes in lower fidelity, as well as multi-tiered subscription models for more and better access to a comprehensive library of games. Surely there will also always be a market for dedicated gaming hardware, with experience-enhancing devices like the rumored omni viewer for the PS4 or the projection device for the next Xbox - for the customer base we are part of, the enthusiast gaming market.

I already mentioned this, but the reason this doesn't work is that you are getting the same experience from all of those potential windows. A guy that watches a film in a theatre is MORE likely, not less, to buy the DVD or Blu-ray. A person that purchases a game at retail... already owns the game. If you want to monetize that guy further, you have to create new content, and then you're back to spending money instead of putting your movie on a disc and calling it a day. So the problem with the idea of that Spotify-like service is that whoever signs up for it will stop buying games at retail, and whoever buys all their games at retail will have no need to sign up for any of those other things.

In the end, those models are treading water, because the hard cap of the industry isn't the number of people in the world, it's the number of people interested in the experience you're providing. Triple A games are enjoyed by a tiny number of people. Being generous, maybe the 150-odd million that purchased a PS2 at any point in history. That's less than the number of people that went to watch The Avengers, and you're lucky if you get 5 to 10% of that to buy your game in any format. In case I'm not being clear here, this means that if you convinced every moviegoer that went to see any given action film this summer to also buy one videogame, that game would outsell Modern Warfare by a ratio of anywhere between 5 to 1 and 20 to 1.

Yeah. The cap here is not technology or money, the cap is... well, appeal. You can deliver the content however you want, but with the amount of people willing to play the triple A games the industry is making the math just doesn't work. We need more people interested in games. If games got the penetration of the iPad (which is more expensive than any available console) this would be a different conversation. Apple sold 75 million iOS devices in three months. That's more than the lifetime sales of the 360, at a higher price point.

#50 Posted by Colourful_Hippie (4282 posts) -

@Seppli said:

Saying that hardware and games have already plateau'd is pure insanity. Proof of a complete lack of imagination.

This. Just because people are blown away with what we already have doesn't mean that we've peaked. We can always keep moving forward. It may not be the leap just yet like 2D to polygonal was but over time we will look back at this generation and be like "holy shit, look how far we've come."