#1 Edited by Darji (5294 posts) -

Ok now we have the absolute truth^^

For the full presentation slides from Guerrilla, click here.

So now, with a lot more RAM, I can't wait to see what they'll be able to do. It will be really interesting. Hopefully we will see some of it already at E3.

#3 Edited by TrafalgarLaw (1132 posts) -

Crysis who? Battlefield what? Get out of the way, or fall into Killzone's shadow.

Get hyped!

#4 Posted by 6n00bkilla9 (153 posts) -

The slides are actually really awesome, more PS4 info than ever! So excited for next gen!

#5 Posted by Demoskinos (14882 posts) -

I'm all on team Sony this gen. Unless Microsoft can really hit everyone with a haymaker next Tuesday at the Xbox unveil.

#6 Posted by Darji (5294 posts) -

Here it is a bit clearer. They used about 5GB including all the system stuff.

Also noteworthy is that they used over 500MB for sound, so there's a ton more optimization in there. There's also some kind of ray tracing for the reflections, which is very impressive.

#7 Edited by Colourful_Hippie (4372 posts) -

@darji: That's pretty cool, and it's crazy that the AA used was only FXAA. Can't wait to see how much further things can be pushed now.

#8 Posted by 6n00bkilla9 (153 posts) -

I would love to have Brad Muir talk about this on a Thursday or Friday; that would be my dream. I love technical things like this, whether it's old consoles (LOVE THIS) or new ones. The whole programming process is the reason I come to this site, besides Jeff, who is the best ever!

#9 Posted by Vertrucio (148 posts) -

Maybe they'll actually write a story worthy of Killzone 1 now that they have gotten all these graphical shenanigans out of the way.

Stop randomly killing off our favorite characters, or making us hate the ones that are left.

#10 Posted by Zero_ (1973 posts) -

I remember reading an article somewhere that Sony was pushing for 1080p/60fps to all developers - was the current 1080p/30fps Killzone demo done before the supposed 8GB surprise announcement? From what I heard, the 8GB was an amount that surprised even developers - but Guerrilla is part of Sony, so wouldn't they have known?

#11 Posted by MachoFantastico (4719 posts) -

The one thing that really does excite me is the fact that developers won't have to cut so many corners when putting their games on Xbox One and PS4. We've seen some great looking PC games having to be cut here and there technically just so they can run on gaming consoles. I'm sure there'll still be some factors, but it's still exciting.

Has anyone recently seen some of the 60fps gameplay footage Eurogamer have been posting? It still makes me secretly hope we see more 60fps games in the new gen. They look so damn good. I still love the Uncharted 3 video they released in 60 frames per second. Link

#12 Posted by MachoFantastico (4719 posts) -

@zero_ said:

I remember reading an article somewhere that Sony was pushing for 1080p/60fps to all developers - was the current 1080p/30fps Killzone demo done before the supposed 8GB surprise announcement? From what I heard, the 8GB was an amount that surprised even developers - but Guerrilla is part of Sony, so wouldn't they have known?

Not exactly; after all, they were using prototype hardware, so it's possible Guerrilla was working off base hardware that wasn't quite as powerful at the time, and Sony decided to up the RAM later. It could also be a time issue: maybe they just didn't have enough time to develop that demo for the reveal. Plus it's important to remember that the PS4 will have other stuff working in the background that will most likely require some RAM.

#13 Posted by Vertrucio (148 posts) -

Hardware specs always change up to the point of manufacturing.

It's pretty clear that Sony originally planned for 4GB of memory, using GDDR5 to make it faster. However, after MS's announcement of 8GB, and the problems this generation with the PS3's funky memory spec and its ports, they decided to bite the bullet and jump up to 8GB of GDDR5 rather than split the memory into two banks of different speeds. Anything to make it easier on developers.

One thing to remember is that up until earlier this year, all the PS4 devs must have been working against a 4GB, slightly lower spec. It's going to be awesome to see what developers do with the full ~7GB, with 1GB being reserved for the OS, I think.

Let's hope they make bigger, less linear levels, and good AI that can use them. I miss the AI from F.E.A.R. and its great combat level design. Killzone's AI was terrible; however, they put in a lot of cool bits that made the enemies seem interesting and do fun things.

Also, if you noticed, the main character is wearing powered armor. This would be a nice excuse to give him survivability closer to that of, say, Master Chief on normal difficulty. The problem with a lot of "modern" shooters, or modern-like shooters, is that everything is so deadly and accurate that you can't really throw caution to the wind and do something action-hero-like. Killzone 1 got around this by having braindead PS2-era AI, a low framerate, and everything moving slowly. KZ2 and 3 got around it by giving you first-person cover and lots of chest-high walls. All three games so far have also been marred by hardware limitations: in all three Killzones, the action couldn't be that fast, since it stretched what the hardware was capable of. In KZ2, if you turned around quickly, you could literally see it unloading and reloading all the textures behind you.

I think a clear example of them trying to make it more action-movie-hero-like is the scene with the drop-down kill, where you then throw the knife at another target. Let alone hanging on to a flying shuttle.

#15 Posted by I_NeverAskedForThis (2 posts) -

I'm excited for this game; it looks fantastic.

#16 Edited by Red12b (9084 posts) -

I have no idea what any of that means,

it looks nice?

#17 Edited by Badass_Master_Blaster (29 posts) -

Can't wait to see this generation's games grow. In a few years this could just be the standard PSN game... hopefully. (Need to sort out my avatar soon.)

#18 Posted by THRICE_604 (210 posts) -

Would really like for there to be some hands-on soon with Killzone. It looking as good as it does is great, and seeing that it's ridiculously optimized is even better. But the series has always had some control issues. Do they forgo the weighty character movement to preserve the fluid controls expected of an FPS, for example? Do they adopt a more traditional control scheme as well (no more ADS on R3, Guerrilla, it's not 2004)?

And as a side note, hopefully the story plays a more central role. Anyone who has actually read the backstory fiction of the universe knows that it is really interesting, complex, and actually quite unique compared to most shooters - none of which Guerrilla puts into the actual games. Like, how many people realize that the Helghast are actually the good guys, oppressed by the Earth government for generations, who only attacked as a last resort? Probably not a lot of people.

#19 Posted by mrfluke (5199 posts) -

And as a side note, hopefully the story plays a more central role. Anyone who has actually read the backstory fiction of the universe knows that it is really interesting, complex, and actually quite unique compared to most shooters - none of which Guerrilla puts into the actual games. Like, how many people realize that the Helghast are actually the good guys, oppressed by the Earth government for generations, who only attacked as a last resort? Probably not a lot of people.

Yep, you win.

That never came through in the games at all, and it sounds like a really interesting angle.

It also sounds like a VERY interesting twist they could hopefully pull on players now.

#20 Posted by Godlyawesomeguy (6398 posts) -

Crysis who? Battlefield what? Get out of the way, or fall into Killzone's shadow.

Get hyped!

lol

#21 Edited by Krakn3Dfx (2492 posts) -

Don't expect KZ:SF to be mind-blowing, but I'll be pleasantly surprised if it is. Loved KZ2; it had some of the best enemy AI in an FPS this generation.

Waiting for next year to see what developers produce when they're not under the gun of a console launch deadline.

#23 Posted by Syed117 (387 posts) -

Looks incredible. Can't wait for a better developer to get some real time in with the console. Can't wait to see what a Naughty Dog game will look like on PS4.

The Killzone franchise started out horrendously and has gotten pretty good, but it's far from being great. Hopefully they will address some of their issues with this game.

#24 Edited by Silver-Streak (1363 posts) -

@vertrucio: The problem with your theory is that unless I'm missing an entire event, Sony announced their hardware specs months before Microsoft did.

The XBO had no official specs until its first announcement, and that announcement/press conference came months after the Sony PS4 announcement, where Sony gave detailed specs, including the 8GB of RAM.

Edit: @vertrucio My apologies, I did not realize I responded to a month old post. Regardless, info needed to be stated.

Double Edit: For further clarity, the PS4's specs were announced during the February 20, 2013 PlayStation event. During the event they stated it had 8GB of GDDR5.

The XBO's specs were given (although some numbers were left vague or omitted entirely) at Microsoft's May 21, 2013 event. This is where they announced the 8GB of DDR3 RAM.

Meaning that Sony had their specs officially announced before Microsoft was even willing to confirm that a next-gen Xbox existed.

#25 Posted by The_Laughing_Man (13629 posts) -

@eujin: MS has not given exact specs. And now rumors are going around about a RAM increase or a GPU upclock.

#26 Posted by Vertrucio (148 posts) -

Sony knew about some of MS's specs well ahead of time; they were tipped off by developers who wanted Sony to at least have parity with the XB1's 8GB.

Don't confuse announce dates for when these corporations actually got information about the other side. You can bet both MS and Sony had various bits of information about each other's systems in advance of making an announcement.

And, every article and interview says that developers, even first party, were surprised at the jump to 8GB GDDR5.

#27 Edited by Silver-Streak (1363 posts) -

@the_laughing_man: Correct, which is why I said some numbers were left vague or omitted entirely. Microsoft did, however, put out a spec sheet during the first conference as one of the slides. That is what included the RAM.

That said, there have been no confirmed numbers for CPU/GPU speeds yet on Microsoft's side, and the rumors about upping the GPU clock could mean either that the XBO GPU is currently running slower than the PS4's and they want to nullify that advantage, or that they're trying to make it run faster than the PS4's GPU. However, unless it's a dramatic increase, the higher clock rate does not (fully) counter the fact that it only has 768 GPU cores vs. the 1152 of the PS4. (Core counts via Digital Foundry: http://www.eurogamer.net/articles/digitalfoundry-spec-analysis-xbox-one.)

Double Edit: I really hope the rumors end up true. I'd really like both systems to be on as even a playing field as possible, so multiplatform games have literally no sway toward either system.

#28 Posted by tourgen (4516 posts) -

Awesome, thank you for the article. I thought it was really interesting that they got such a big speed increase by going from 16-bit RGBA buffers to R11G11B10 buffers. It's basic stuff like this, where they can optimize for one piece of video hardware, that gives them such an advantage over PC.

I'm super interested in seeing what they can do with general compute on the GPU with such a wide memory bus. Things like marching cubes algorithms, and maybe even ray marching with density functions/textures, could produce some pretty cool stuff not possible on 1-2GB PC video cards.
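For a rough sense of why that buffer-format switch matters, here's a back-of-the-envelope comparison of per-target sizes at 1080p (the resolution and the byte math are my own illustration, not numbers from the slides):

```python
# Per-pixel cost of a 16-bit-per-channel RGBA render target vs. a
# packed R11G11B10 float target (both hold HDR data; the latter drops alpha).
RGBA16_BYTES_PER_PIXEL = 4 * 16 // 8             # 4 channels x 16 bits = 8 bytes
R11G11B10_BYTES_PER_PIXEL = (11 + 11 + 10) // 8  # 32 bits packed = 4 bytes

def target_size_bytes(width, height, bytes_per_pixel):
    """Raw memory footprint of one render target."""
    return width * height * bytes_per_pixel

wide = target_size_bytes(1920, 1080, RGBA16_BYTES_PER_PIXEL)
packed = target_size_bytes(1920, 1080, R11G11B10_BYTES_PER_PIXEL)
print(wide / 2**20, packed / 2**20)  # ~15.8 MB vs ~7.9 MB per 1080p target
```

Half the memory per target, and more importantly half the bandwidth every time the buffer is read or written, which is presumably where the speedup comes from.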

#29 Posted by Krakn3Dfx (2492 posts) -

@syed117 said:

Looks incredible. Can't wait for a better developer to get some real time in with the console. Can't wait to see what a Naughty Dog game will look like on PS4.

The Killzone franchise started out horrendously and has gotten pretty good, but it's far from being great. Hopefully they will address some of their issues with this game.

KZ2 was a pretty awesome game, with some of the best enemy AI this generation.

KZ3 was a step down, but still had a pretty great SP campaign and more than serviceable MP.

#30 Edited by The_Laughing_Man (13629 posts) -

@eujin: The entire rumor comes from the dev kits running super cool because of the big-ass fan. Also, the dev from the AMA on Reddit said that while on paper the PS4 has better specs, the X1 games look like stuff right off a top-end PC. Not to mention that MS is about software. There is a chance the X1 is super optimized.

#31 Edited by Silver-Streak (1363 posts) -

@the_laughing_man: Agreed on the optimization; I hope MS pulls it off. Note, however, that if the rumored 3GB of RAM being used by the main OSes is true, that's a good example of bad optimization.

Also, I'm not aware of the Reddit AMA you refer to, but everything that's been said, and how basic hardware works, makes it exceedingly improbable for this to just be a "better on paper" thing.

Even ignoring the memory bandwidth differences, both consoles use the exact same architecture (and even the same chip; both are AMD Jaguar-based), meaning that code optimized for one CPU/GPU works the same on the other. Except that one GPU is the equivalent of a 5-lane highway (PS4, at 1152 GPU cores), and the other is the equivalent of a 3-lane highway (XBO, at 768 GPU cores).

If both are packed full of cars (code) driving to the same destination at 60 mph, the 5-lane highway is going to get way more cars to the destination than the 3-lane highway; the 3-lane highway's traffic would need to be moving at 100 mph to deliver as many cars in the same amount of time.

Then, even if Microsoft does increase the GPU clock speed: if the RAM isn't increased enough to make at least 7GB available to games, like on the PS4 (supposedly), then even with all of the cars (code) operating at double speed, the final destination for the 3-lane highway can't hold as many cars as the 5-lane highway's.

Man. I should never write word problems for textbooks.
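The lane arithmetic in that highway analogy can be written out directly. This is an idealized toy model (real GPU throughput doesn't scale this cleanly, and the 800 MHz baseline clock is just an assumed figure for illustration):

```python
# Idealized throughput model: work per second ~ cores x clock.
PS4_CORES = 1152   # the "5-lane highway"
XBO_CORES = 768    # the "3-lane highway"

def relative_throughput(cores, clock_mhz):
    # Toy model only: assumes perfect scaling with core count and clock.
    return cores * clock_mhz

ratio = PS4_CORES / XBO_CORES        # 1.5x advantage at equal clocks
baseline_mhz = 800                   # assumed clock, for illustration only
needed_mhz = baseline_mhz * ratio    # clock XBO would need just to break even
print(ratio, needed_mhz)             # 1.5 1200.0
```

In other words, at the cited core counts the XBO would need roughly a 50% clock advantage just to match raw throughput on paper, before memory differences are even considered.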

#32 Edited by The_Laughing_Man (13629 posts) -

@eujin: I've seen a few things on GAF about the X1 RAM being better because of latency or something. Wasn't sure.

When I get home I will find the AMA. And there was another rumor that the clock speed of the X1 was boosted by 88% for release.

#34 Edited by Silver-Streak (1363 posts) -

@the_laughing_man: I'd definitely be interested in that, so thanks if you can post it/PM it.

However, I think the 88% you're thinking of is from the DF post about the 32MB of ESRAM being able to handle code designed to read and write at the same time. However, this would only affect code that can actually do both at once (which would limit it to specially coded shaders or AA routines). While it could potentially also affect shadow maps, there is only 32MB of ESRAM, so you're actually going to introduce latency, as shadow maps (which can run larger than 32MB) would have to be loaded in chunks. In addition, the 88% boost would only bring the ESRAM (not the DDR3) in line with the PS4's memory bandwidth.
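A quick sanity check on that last claim, using the figures Digital Foundry was reporting at the time (102.4 GB/s baseline ESRAM bandwidth, 176 GB/s for the PS4's GDDR5); treat both numbers as assumptions:

```python
# "Up to 88% faster" ESRAM vs. the PS4's main-memory bandwidth.
ESRAM_BASELINE_GBPS = 102.4   # DF's reported XBO ESRAM figure (assumed)
PS4_GDDR5_GBPS = 176.0        # PS4 main-memory bandwidth (assumed)

esram_peak = ESRAM_BASELINE_GBPS * 1.88  # the "up to 88%" case, peak not sustained
print(round(esram_peak, 1))              # 192.5 -> roughly PS4-class, on paper
```

And even then, that peak applies only to whatever fits in the 32MB; the DDR3 pool is unaffected.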

Then there's the chance that NeoGAF has discovered that the math in the DF writer's article is potentially quite a bit off, if you want to read this thread: http://www.neogaf.com/forum/showthread.php?t=606971&page=67

Edit: To elaborate, since that GAF thread seemed to degrade into credibility flaming at the end: the way the math worked out in the article suggests that while the ESRAM itself may be able to perform better if a piece of code can do reads and writes simultaneously, the numbers it puts out make it sound like the clock speed may have still been downclocked. Shortly after this is discovered in the thread, people start to get... attacky with one another.

#35 Posted by The_Laughing_Man (13629 posts) -

@eujin: I wonder if it is possible to upclock this close to release. And I wonder if MS is willing to pay for it.

#36 Posted by Lego_My_Eggo (1056 posts) -

@the_laughing_man: Is this the Reddit post? He is basically saying, yeah, the PS4 is more powerful on paper (because it really is), but our games don't look like crap either. And I've seen a few GAF rumors that Sony has better software tools for devs right now, and with Microsoft being as disorganized as they appear, I can see that being true. And from what people are saying, any upgrade to the clock speed or RAM still won't close the performance gap with the PS4.

But until they're both out, we don't know how big of a gap there will be in games, whether devs will truly take advantage of the extra power, or whether it will be like previous generations, where they develop for the lowest-spec console if it sells better.

#37 Edited by Krakn3Dfx (2492 posts) -

@the_laughing_man said:

@eujin: I've seen a few things on GAF about the X1 RAM being better because of latency or something. Wasn't sure.

When I get home I will find the AMA. And there was another rumor that the clock speed of the X1 was boosted by 88% for release.

This is from someone I know on EFnet who develops for a game company out of Toronto. A lot of it goes over my head, but I did check the links he gave me, and the numbers check out:

GDDR5 having much higher latency than DDR3 is a myth that's been constantly perpetuated with no source to back it up. Absolute latency has always been roughly the same, around 10ns, and has been since DDR1. Since data rates have been increasing, the latency in clock cycles has increased, but the absolute latency has stayed about the same.

From Wikipedia: DDR3 PC3-12800 @ IO frequency 800MHz has typical CAS latency 8. This means the absolute latency is 10ns. DDR2 PC2-6400 runs at IO frequency 400MHz, with CAS latency 4. This is also 10 ns.

Here's a typical GDDR5 chip datasheet: http://www.hynix.com/datasheet/pdf/graphics/H5GQ1H24AFR(Rev1.0).pdf

Here is the table showing CAS latency vs frequency: http://i.imgur.com/dnHldht.png (page 43)

The data rates are a factor of 4x the memory clock. So at a typical 5.0Gbps output data rate, the memory runs at 1.25GHz (source: http://i.imgur.com/FEhkHNm.png page 6) and supports a CAS latency of 15. This is 15/(1.25GHz) = 12ns.

The rumor about the 88% boost was from this DF article and refers to a possible increase for the 32MB ESRAM cache that is "up to 88%" faster - peak, not sustained - if the rumors are to be believed.
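The latency arithmetic in the quoted figures checks out; redoing the division explicitly:

```python
# Absolute latency in ns = CAS cycles / command clock (MHz -> ns).
def absolute_latency_ns(cas_cycles, clock_mhz):
    return cas_cycles * 1000.0 / clock_mhz

ddr2 = absolute_latency_ns(4, 400)     # DDR2 PC2-6400:  CL4  @ 400 MHz
ddr3 = absolute_latency_ns(8, 800)     # DDR3 PC3-12800: CL8  @ 800 MHz
gddr5 = absolute_latency_ns(15, 1250)  # GDDR5 @ 5 Gbps: CL15 @ 1.25 GHz
print(ddr2, ddr3, gddr5)               # 10.0 10.0 12.0
```

So GDDR5's absolute latency is within a couple of nanoseconds of DDR2/DDR3, exactly as the quoted post argues.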

#38 Posted by Silver-Streak (1363 posts) -

@the_laughing_man: Rumor was that the DRM change decision only happened due to preorder sluggishness (which would be worrying, as supposedly the PS4 sold through its day-one presales faster than the XBO, even though there were way more PS4 presales available).

If Microsoft is willing to throw away an entire set of engineering systems and bring back on-disc DRM this late in the game, I wouldn't be surprised to see them attempt a hardware change, if developers who have now gotten up to speed on the PS4 have said they are unhappy with the XBO.

To bring this back on topic to the OP: with how easy all of the major first- and third-party developers have said it is to port code to the PS4, indie devs hating Microsoft's policies, and how easy it has been for first parties like Guerrilla Games to bring Killzone demos up to snuff, I could very easily see Microsoft throwing cash around to even the playing field.

I really hope they do. Equal footing is better for all of us.

#39 Posted by Syed117 (387 posts) -

I think I've accepted that the PS4 will be more powerful. There's nothing wrong with that. Technically, the PS3 is more powerful than the Xbox 360. Difficulty in creating games is a different matter entirely.

The development side seems like it will be much better this time around for Sony. The real question is how third-party games will run. Will they be significantly better on the PS4? It seems much more likely that they might have a slight edge. Any differences will become apparent after some time.

At the end of the day, there will be great games for both consoles, the same as it has always been. I'm sure Microsoft exclusives will look great on the XB1, and PS4 exclusives might look better overall, the way they do on the 360 and PS3.

As much as hardware might matter, it really doesn't until someone makes a game to force us to believe that it does. I love amazing visuals as much as the next guy, but they don't make a game. They just make it look better.

#40 Edited by Silver-Streak (1363 posts) -

@syed117: Agreed. As far as first/second-party games go, I hope KZ focuses on a much more involving story this time (which seems likely based on the premise given so far). I'm also interested in Project Spark for XBO, and potentially in The Order for PS4 (I hope it's a story-driven game and not a Left 4 Dead-style game, although that could be cool too).

Multiplats look pretty good right now too, looking at Destiny and Watch_Dogs.

#41 Posted by The_Laughing_Man (13629 posts) -

@the_laughing_man said:

@eujin: I've seen a few things on GAF about the X1 RAM being better because of latency or something. Wasn't sure.

When I get home I will find the AMA. And there was another rumor that the clock speed of the X1 was boosted by 88% for release.

This is from someone I know on EFnet who develops for a game company out of Toronto. A lot of it goes over my head, but I did check the links he gave me, and the numbers check out:

GDDR5 having much higher latency than DDR3 is a myth that's been constantly perpetuated with no source to back it up. Absolute latency has always been roughly the same, around 10ns, and has been since DDR1. Since data rates have been increasing, the latency in clock cycles has increased, but the absolute latency has stayed about the same.

From Wikipedia: DDR3 PC3-12800 @ IO frequency 800MHz has typical CAS latency 8. This means the absolute latency is 10ns. DDR2 PC2-6400 runs at IO frequency 400MHz, with CAS latency 4. This is also 10 ns.

Here's a typical GDDR5 chip datasheet: http://www.hynix.com/datasheet/pdf/graphics/H5GQ1H24AFR(Rev1.0).pdf

Here is the table showing CAS latency vs frequency: http://i.imgur.com/dnHldht.png (page 43)

The data rates are a factor of 4x the memory clock. So at a typical 5.0Gbps output data rate, the memory runs at 1.25GHz (source: http://i.imgur.com/FEhkHNm.png page 6) and supports a CAS latency of 15. This is 15/(1.25GHz) = 12ns.

The rumor about the 88% boost was from this DF article and refers to a possible increase for the 32MB ESRAM cache that is "up to 88%" faster - peak, not sustained - if the rumors are to be believed.

We've got to keep in mind that comparing this to normal computer hardware won't quite work, since chances are it's all custom-tweaked.

#42 Posted by Vertrucio (148 posts) -

To be frank, both systems have enough "power" to handle these next gen games well. The big thing is that they both have 8GB of RAM, regardless of the type. That's a huge thing. I think you guys are all in for a surprise this generation once studios get over the initial shock of being able to do a bunch of graphical stuff with it. The real benefit is gameplay, and a lot of games will benefit from being able to keep a lot more data in memory.

As for the rumors of the XB1's specs being upped, they were thoroughly debunked by insider XB1 developers over on GAF. It's not happening, and the original rumor came out of some wishful thinking while looking at the dev kit specs. So no more on that topic, please.

As much as I admit I am a PS fanboy, the reality is we've all seen what kind of stupidity any corporation can get up to if it's left to dominate a console generation. We saw it with the PS3, and now the XB1. This competition is funding a new generation of games out of the pockets of these hardware manufacturers.

#43 Posted by LiquidPrince (15969 posts) -

That's awesome. With more optimization they can probably make that even less, and then they also have access to like 5 more GB of RAM. Crazy.

#44 Edited by Marcsman (3209 posts) -

Day One for me.

#45 Edited by TheHBK (5488 posts) -

And here we go. It starts again with the fucken rookies who start these threads, the kind some of you may remember from last generation.

"Oh this dev says they are only using 60% of the power of the PS3"

"Hey guys, how much is Gears of War pushing the Xbox 360?"

"Uncharted 2 maxes out the PS3!"

Like, seriously, I love tech, I like reading about how it all works together, but as you can see, it is fucking stupid to get hyped up about a game because the graphics use up a certain amount of resources. Killzone 2 sucked ass. The aiming was stupid. "Oh, but it's more realistic." Oh yeah? In a game about space Nazis that take 3 headshots to kill? All the characters in that game were fucking terrible. Yeah, this new game looks pretty as hell and it says something about the potential of the system, but I am more impressed when devs can do more with less and get creative on the console later in its life cycle. Right now they are brute-forcing the graphics, so we don't know how good the games can look; but again, I am more impressed by what it means for the complexity of games like a GTA or an Elder Scrolls. Point being, these pretty graphics and resource statistics don't mean it will be a good game. If only Naughty Dog were putting out a game instead.

#46 Edited by Krakn3Dfx (2492 posts) -

@thehbk said:

Killzone 2 sucked ass.

KZ2 has a 91 on Metacritic, had a great story, some of the best enemy AI in a game this generation, and great multiplayer to boot.

I don't even know what game you're talking about, but no.

#47 Edited by Turbyne (98 posts) -

@thehbk said:

And here we go. It starts again with the fucken rookies who start these threads, the kind some of you may remember from last generation.

"Oh this dev says they are only using 60% of the power of the PS3"

"Hey guys, how much is Gears of War pushing the Xbox 360?"

"Uncharted 2 maxes out the PS3!"

Like, seriously, I love tech, I like reading about how it all works together, but as you can see, it is fucking stupid to get hyped up about a game because the graphics use up a certain amount of resources. Killzone 2 sucked ass. The aiming was stupid. "Oh, but it's more realistic." Oh yeah? In a game about space Nazis that take 3 headshots to kill? All the characters in that game were fucking terrible. Yeah, this new game looks pretty as hell and it says something about the potential of the system, but I am more impressed when devs can do more with less and get creative on the console later in its life cycle. Right now they are brute-forcing the graphics, so we don't know how good the games can look; but again, I am more impressed by what it means for the complexity of games like a GTA or an Elder Scrolls. Point being, these pretty graphics and resource statistics don't mean it will be a good game. If only Naughty Dog were putting out a game instead.

What a dickhead.

Yeah homie GTA IV's gameplay was definitely not dogshit. Elder Scrolls games definitely don't have bad combat.

#48 Posted by troll93 (388 posts) -

@turbyne said:

@thehbk said:

And here we go. It starts again with the fucken rookies who start these threads, the kind some of you may remember from last generation.

"Oh this dev says they are only using 60% of the power of the PS3"

"Hey guys, how much is Gears of War pushing the Xbox 360?"

"Uncharted 2 maxes out the PS3!"

Like, seriously, I love tech, I like reading about how it all works together, but as you can see, it is fucking stupid to get hyped up about a game because the graphics use up a certain amount of resources. Killzone 2 sucked ass. The aiming was stupid. "Oh, but it's more realistic." Oh yeah? In a game about space Nazis that take 3 headshots to kill? All the characters in that game were fucking terrible. Yeah, this new game looks pretty as hell and it says something about the potential of the system, but I am more impressed when devs can do more with less and get creative on the console later in its life cycle. Right now they are brute-forcing the graphics, so we don't know how good the games can look; but again, I am more impressed by what it means for the complexity of games like a GTA or an Elder Scrolls. Point being, these pretty graphics and resource statistics don't mean it will be a good game. If only Naughty Dog were putting out a game instead.

What a dickhead.

Yeah homie GTA IV's gameplay was definitely not dogshit. Elder Scrolls games definitely don't have bad combat.

Dude, I swung that sword at that bandit's face like 20 times and then he died; way better than those fucking space Nazis that took 3 headshots.

#49 Posted by Turbyne (98 posts) -

@troll93 said:

@turbyne said:

@thehbk said:

And here we go. It starts again with the fucken rookies who start these threads, the kind some of you may remember from last generation.

"Oh this dev says they are only using 60% of the power of the PS3"

"Hey guys, how much is Gears of War pushing the Xbox 360?"

"Uncharted 2 maxes out the PS3!"

Like, seriously, I love tech, I like reading about how it all works together, but as you can see, it is fucking stupid to get hyped up about a game because the graphics use up a certain amount of resources. Killzone 2 sucked ass. The aiming was stupid. "Oh, but it's more realistic." Oh yeah? In a game about space Nazis that take 3 headshots to kill? All the characters in that game were fucking terrible. Yeah, this new game looks pretty as hell and it says something about the potential of the system, but I am more impressed when devs can do more with less and get creative on the console later in its life cycle. Right now they are brute-forcing the graphics, so we don't know how good the games can look; but again, I am more impressed by what it means for the complexity of games like a GTA or an Elder Scrolls. Point being, these pretty graphics and resource statistics don't mean it will be a good game. If only Naughty Dog were putting out a game instead.

What a dickhead.

Yeah homie GTA IV's gameplay was definitely not dogshit. Elder Scrolls games definitely don't have bad combat.

Dude, I swung that sword at that bandit's face like 20 times and he then died, way better than those fucking space nazis that took 3 headshots.

GOAT BATTLE SYSTEM MAN FUCK DARK SOULS

Hit reaction is for bitches who can't swing fast enuf

#50 Posted by TheManWithNoPlan (5595 posts) -

Next Gen will have all the graphics!