Is it just me or does Cry Engine 3 look worse than 2?

deactivated-5e0e0ee2ea170

http://www.gametrailers.com/player/47150.html

I mean it looks nowhere near as good, and that makes no sense to me for an engine named 3, which should look better than 2. From the trailer it appears they haven't even heard of HDR or AA... The effects also look identical every time he fires the grenade.

I dunno. It's a console engine, so I'm guessing they optimized it for consoles, but I just don't get why they'd name it 3.

#2  Edited By LiveOrDie1212

They want to integrate it into the Xbox 360, that's why it looks so crappy. It's like they downgraded the engine to fit on the Xbox (just my opinion, don't flame me).


#3  Edited By Diamond

1. It's very early, wait until they make a game on it.
2. There's a new video that shows off parts better.  The outdoor areas look pretty good, and the redwood-like forest with a waterfall is great looking.
3. Supposedly the engine has new graphical features that weren't in CE2.
4. They named it 3 because it's coming after 2.  It's also designed for future consoles, which will probably have graphics in all games that blow Crysis away.


#4  Edited By tekmojo
Diamond said:
"1. It's very early, wait until they make a game on it.
2. There's a new video that shows off parts better.  The outdoor areas look pretty good, and the redwood-like forest with a waterfall is great looking.
3. Supposedly the engine has new graphical features that weren't in CE2.
4. They named it 3 because it's coming after 2.  It's also designed for future consoles, which will probably have graphics in all games that blow Crysis away."
All speculation of course.
Avatar image for deactivated-5e0e0ee2ea170
deactivated-5e0e0ee2ea170

870

Forum Posts

6

Wiki Points

0

Followers

Reviews: 55

User Lists: 0

What new graphical features? Cry Engine 2 uses DX10 and Cry Engine 3 is made for consoles, which use DX8... I dunno, I just don't see why they named it 3 when it looks nowhere near as good as 2.


#6  Edited By Diamond
PapaLazarou said:
"What new graphical features? Cry Engine 2 uses DX10 and Cry Engine 3 is made for consoles whic use DX8..... I dunno I just don't see why they named it 3 when it looks nowhere near as good as 2."
Ummm no.  You should read up more on console tech.  Both 360 and PS3 have the capability of using many graphical features not supported in DX9.  360 has a GPU that has features that go beyond DX10 but also lack some DX10 features.  PS3 has the Cell which is utilized heavily for graphics in games like Killzone 2, which can't really be defined by DirectX standards at all.

For example, the 360 and PS3 versions of Tomb Raider Underworld have graphical features not available in the PC version.  One of the developers explained on Beyond3D that DX9 shaders weren't long enough, and that enabling that effect in DX9 would cut the framerate to less than a quarter.

#7  Edited By NinjaMunkey

It's because all the footage we've seen is from the PS3 and 360, which will never produce high-end Crysis graphics.

Apparently they will show the PC version soon, which will kick ass.


#8  Edited By Sticky_Pennies
NinjaMunkey said:
"Its becuasa all the footage weve seen is for the ps3 and 360, which will never produce high end crysis graphics.Apparently they will show the PC version soon, which will kick ass."
He speaks the truth. I was not expecting to be blown away by CryEngine 3 on the PS3 and 360. They did say, however, that it's "next-gen ready", which one might assume means that this engine will look far better than CryEngine 2 does come the PS4/Xbox 3 or whatever.

#9  Edited By Hairydutchman

More games should use the Killzone 2 engine.


#10  Edited By randiolo

The PS3 version looked a little sharper than the 360's, but the whole thing looks average... Killzone looked far better.


#11  Edited By BigBoss1911
randiolo said:
"Ps3 version looked a little sharper than the 360, but thoe hole thing looks average.. killzone looked far better "
It does.

#12  Edited By RIP_Icewood
PapaLazarou said:
"What new graphical features? Cry Engine 2 uses DX10 and Cry Engine 3 is made for consoles whic use DX8..... I dunno I just don't see why they named it 3 when it looks nowhere near as good as 2."
Just because it's named "3" doesn't have to mean it's better than "2"; I could give a ton of movie and video game examples if you want. The 3 in the title signifies it's the next step in their line of graphics engines, and again, that doesn't mean it's supposed to be better. Not to mention it's for consoles, and since you're a PC gamer (I figured you were the moment I read the post, then checked just to make sure I was correct), you'll automatically say "CryEngine 2 (PC only) looks better than CE3 (console)".

Basically, I LOL at fanboys....

#13  Edited By BiggerBomb

Yeah...well, we have rumble!


#14  Edited By DeadlyPain

i agree - it shouldn't be 3 but rather 1.5 perhaps
and anyway, in comparison to the PC engine... *commence laughter mode*
the jaggies alone! ouch!!

i thought CryEngine 2 was there to broadcast: Hey, look at us! Graphical beauty at its finest!
now all I hear is: Hey, look at us! we're taking a step back in graphical technology because PS3/360 fanboys are complaining that they want Crysis on consoles, to prove that consoles can play Crysis just as well as, or even better than, PCs!!
*notice - laughter mode still active!*

¬_¬ - At the end of the day Crytek knew Crysis would run just fine on consoles and it was inevitable - but PLEASE, console fanboys, PLEASE for the love of god don't start doing comparison videos/pictures X_x - just play and enjoy the fucking game when it comes out!


#15  Edited By Hamz

Personally I find CryEngine 3 to look a little less spectacular compared to CryEngine 2. However, I was under the impression CE3 was meant to have a better balance between visual quality and overall performance for both the PC and consoles. After all, Crytek did say in early 2008 that they were hoping to branch into developing games on consoles as well as the PC, so it makes sense for them to produce a new engine that can work on all platforms.

What I'm more interested in isn't so much the visual quality but the overall performance. Crytek have already proven they can make the best-looking games ever, but they have yet to prove they can make games that perform decently and don't require an expensive PC to run them. Crysis: Warhead was a step in the right direction, but they still have a long way to go before they can say they create stable-running games with the best visuals on the market.


#16  Edited By Jayge_

Honestly, I think it's just you. Lighting, physics interactions, draw distance detail and organic density seem to be improved (if just slightly) to me.


#17  Edited By Black_Raven

It does look worse, but that's a demo showing off what to expect on consoles. Crysis on PC running on this engine will probably look slightly better than the original/Warhead and probably run better too. It looks great considering it's running on consoles, but they still have a long way to go - the framerate looked to be bordering on unplayable in a lot of those scenes.

deactivated-5e0e0ee2ea170

Jayge said:
"Honestly, I think it's just you. Lighting, physics interactions, draw distance detail and organic density seem to be improved (if just slightly) to me."

What are you on about?

- The lighting isn't as natural
- The physics, especially when he shoots the trees, are way over the top
- The draw distance is worse and there's less in the backgrounds


:\ I mean when I saw that level from Crysis it looked WAYYYYYYYYYY worse.
deactivated-5e0e0ee2ea170

OK, I read up on it and it's basically a console-optimized Cry Engine 2, so that's why it looks worse. But I have a problem with them naming it 3, and now if developers use this for consoles... we're gonna have a lot of shit-looking ports when we could have had Cry Engine 2 games on PC :(


#20  Edited By Al3xand3r
Lol.

As for people seeing "jaggies", well duh, if you run it on PC obviously you'll run it at high res with AA, so uh, lol. Whine whine whine, elitists.

#21  Edited By Diamond
PapaLazarou said:
"Ok I read up on it and it is basically a console optimized Cry Engine 2 so thats why it looks worse but I have a problem with them naming it 3 and now if developers use this for console......... we're gonna have alot of shit looking ports when we coulda had Cry Engine 2 games on PC :( "
I understand that you feel threatened by CryEngine 3 and you're being defensive, but read this IGN interview: the engine WILL have new graphical features, and it will be more optimized on PC as a side benefit of them optimizing for consoles.  Once again, this tech demo was probably thrown together quickly before GDC just to get investor interest.

#22  Edited By Glowing
Diamond said:
"PapaLazarou said:
"What new graphical features? Cry Engine 2 uses DX10 and Cry Engine 3 is made for consoles whic use DX8..... I dunno I just don't see why they named it 3 when it looks nowhere near as good as 2."
Ummm no.  You should read up more on console tech.  Both 360 and PS3 have the capability of using many graphical features not supported in DX9.  360 has a GPU that has features that go beyond DX10 but also lack some DX10 features.  PS3 has the Cell which is utilized heavily for graphics in games like Killzone 2, which can't really be defined by DirectX standards at all.For example, the 360 and PS3 versions of Tomb Raider Underworld have graphical featuers not available in the PC version.  One of the developers explained on Beyond3D that DX9 shaders weren't long enough, and to enable that effect in DX9 would divide the framerate to less than 1/4."
Yes, well, you should read up on tech in general because you're way off here, mate. You might be right that both the 360 and PS3 have features that aren't in DX10, but in no way are they way beyond it. Why? Because they use a modified version of DX9 and OpenGL respectively. And the Cell, well, that flopped big time, didn't it? It was meant to be a software renderer, but it was nowhere near powerful enough, so they had to slap on a graphics card at the last minute. Oh, and about the shaders, I highly doubt that's what it said. You will be hard pressed to hit the instruction limits of Shader Model 3.0, and if you do make such a monster shader you should really consider a change of profession.

As for the new engine, it most likely has a lot of new bells and whistles. The console version looks like it suffers from severe LODs and bad lighting and shadow techniques; they probably couldn't show off its true potential with that little horsepower. But don't count it out yet - the PC version will hopefully be able to produce graphics beyond their last engine, which the console version could not.

#23  Edited By jakob187

Because graphics are what makes a game worthwhile...


#24  Edited By Diamond
Glowing said:
"Diamond said:
"PapaLazarou said:
"What new graphical features? Cry Engine 2 uses DX10 and Cry Engine 3 is made for consoles whic use DX8..... I dunno I just don't see why they named it 3 when it looks nowhere near as good as 2."
Ummm no.  You should read up more on console tech.  Both 360 and PS3 have the capability of using many graphical features not supported in DX9.  360 has a GPU that has features that go beyond DX10 but also lack some DX10 features.  PS3 has the Cell which is utilized heavily for graphics in games like Killzone 2, which can't really be defined by DirectX standards at all.For example, the 360 and PS3 versions of Tomb Raider Underworld have graphical featuers not available in the PC version.  One of the developers explained on Beyond3D that DX9 shaders weren't long enough, and to enable that effect in DX9 would divide the framerate to less than 1/4."
Yes well, you should read up about tech in general because your way of here mate.. You might be right that both the 360&ps3 has features that aren't in dx10, but in no way are they way beyond, why? Because they use a modified version of dx9 and opengl respectively. And the cell, well that flopped big time didn't it? It was meant to be a software renderer, but it was nowhere near powerful enough so they had to slap on a gfxcard at the last minute. Oh and about the shaders i highly doubt that's what it said. You will be hard pressed to hit the instruction limits of Shader model 3.0, and if you do make such a master/monster shader you should really consider a change of profession.

And the new engine, it most likely have alot of new bells and whistles. The console version looks like it suffers from severe lod's and bad lightning and shadow techniques, they probably couldn't show off it's true potential with that lite horsepower. But don't count it out yet, the pc version will hopefully be able to produce graphics beyond there last engine, which the console version could not.
"
I didn't say console GPUs were way beyond DX10; the 360 GPU's functions that go beyond DX10 specs are minimal and probably won't ever even be used.  I don't think Cell has flopped.  It's too hard for most developers to use, so it's not very practical, but when used effectively you get something like Killzone 2, which couldn't be done with the PS3's GPU alone.  They originally intended to use it as a software renderer, but that would have been a dumb design; there is no CPU that could have done that.

You highly doubt what I said?  Then you know nothing about tech.  Here's the developer talking about what was cut from the PC version because of DX9 limitations.  It's not the only case of 360 having shaders that couldn't be done in DX9.  You can criticize all you want, but DX9 has limits, and consoles are already surpassing them.

Bad lighting and shadow techniques?  Biggest shadow fault I saw in the video was shadow buffers not being updated every frame.  Don't count out the console version either, if you knew anything about development, you'd know how much polish and time go into a game to make it look good.  Wait for a product.

#25  Edited By Jayge_
PapaLazarou said:
"Jayge said:
"Honestly, I think it's just you. Lighting, physics interactions, draw distance detail and organic density seem to be improved (if just slightly) to me."
What are you on about?
-The Lighting isn't as nature
- The Physics, specially when he shoots the trees are way over the top
- The Draw Distance is worse and theres less in the backgrounds
:\ I mean when I saw that level from Crysis it looked WAYYYYYYYYYY Worse."
Looked much more optimized to me. Which I consider to be "better". I have no idea how you define it.

#26  Edited By Glowing
Diamond said:
I didn't say console GPUs were way beyond DX10, the 360's GPU's functions that go beyond DX10 specs are minimal and probably won't ever even be used.  I don't think Cell has flopped.  It's too hard for most developers to use, so it's not very practical, but when used effectively, you get something like Killzone 2, which couldn't be done with the PS3's GPU alone.  They originally intended to use it as a software renderer, but that would have been a dumb design, there is no CPU that could have done that.

You highly doubt what I said?  Then you know nothing about tech.  Here's the developer talking about what was cut from the PC version because of DX9 limitations.  It's not the only case of 360 having shaders that couldn't be done in DX9.  You can criticize all you want, but DX9 has limits, and consoles are already surpassing them.Bad lighting and shadow techniques?  Biggest shadow fault I saw in the video was shadow buffers not being updated every frame.  Don't count out the console version either, if you knew anything about development, you'd know how much polish and time go into a game to make it look good.  Wait for a product."
Well, obviously Killzone 2 could not have been done with the GPU alone... Anyway, why would a software renderer have been stupid? Because it doesn't have the limitations GPUs have? Because with a software renderer you aren't stuck with the 20-or-so-year-old model of triangle rendering?

So I know nothing about "tech", do I? I am going to assume you have never written a shader before, because "not long enough" just makes no sense.  And the post that you linked to, assuming he/she was part of the porting team: what he said was extremely vague, and nothing about a drop to 1/4 fps. Either they write extremely inefficient shaders, or they have too many that make no sense and don't show up on screen at all. "Shadow buffers"? (Please explain how a presumably hardware buffer fails to update.) If shadow buffers somehow refer to the depth buffer not being able to save the z-buffering into a texture, making the shadows seem like they aren't rendered that frame, then no, I can't say I saw that. I would have expected at least SSAO from Cry Engine 3.

And "per-instance motion blur" (who in their right mind would motion-blur instances anyway? per-pixel, sure, but instances?) and "per-instance illumination models like the anisotropic surface and transmissive lighting" (starting to think he's mistaking instances for pixels) are both very much possible on DX9c. Don't assume that just because they ran out of instructions in the shader you can't replicate the effect.


#27  Edited By Diamond
Glowing said:
"Diamond said:
I didn't say console GPUs were way beyond DX10, the 360's GPU's functions that go beyond DX10 specs are minimal and probably won't ever even be used.  I don't think Cell has flopped.  It's too hard for most developers to use, so it's not very practical, but when used effectively, you get something like Killzone 2, which couldn't be done with the PS3's GPU alone.  They originally intended to use it as a software renderer, but that would have been a dumb design, there is no CPU that could have done that.

You highly doubt what I said?  Then you know nothing about tech.  Here's the developer talking about what was cut from the PC version because of DX9 limitations.  It's not the only case of 360 having shaders that couldn't be done in DX9.  You can criticize all you want, but DX9 has limits, and consoles are already surpassing them.Bad lighting and shadow techniques?  Biggest shadow fault I saw in the video was shadow buffers not being updated every frame.  Don't count out the console version either, if you knew anything about development, you'd know how much polish and time go into a game to make it look good.  Wait for a product."
Well obviously Killzone 2 could not have been done with the gpu alone... Anyway why would a software renderer been stupid? Because it doesn't have the limitations gpu's have? Because with a software renderer you aren't stuck with the 20 years old or so, model for triangle rendering?So I know nothing about "tech" do I? I am going to assume you have never written a shader before, because "not long enough" just makes no sense.  And the post that you linked to, assuming he/she was part of the porting team. What he said was extremely vague and nothing of a 1/4 fps drop.. And either they write extremely inefficient shaders or they have to many that makes no sense and doesn't show up on screen at all. "shadow buffers"? (please explain how a presumably hardware buffer fails to update) If shadow buffers somehow correlate to the depth buffer not being able to save the z-buffering into a texture and making the shadows seem like they aren't rendered that frame then no i can't say that i saw that.. I would have expected atleast ssao from Cry Engine 3..And "per-instance motion-blur"  (who in there right mind would motion blur instances anyway..per-pixel sure but instances?) and "per-instance illumination models like the anisotropic surface and transmissive lighting" (starting to think he's mistaking instances for pixels..) are both very much possible on dx9c, don't assume just because they ran out of instructions in the shader that you can't replicate the effect. "
Cell for software rendering would be stupid because it wouldn't have the power to render good graphics.  There are parts of GPUs that are specialized in a very good way, and developers now are more used to making graphics on a GPU, not a CPU.  They would have had to use two Cells, and it would have been an even bigger nightmare for developers.

I think we may move away from rasterisation pretty soon with streaming voxels and raytracing, but with the technology available when Cell was in development it would have been a much bigger mistake than the Blu-ray / Cell direction was for business.  It might have brought some interesting things, but there'd probably be very few good looking games, as developers wouldn't be used to the system.

Actually, I have worked on shaders (never made one from scratch though), messed around with real-time tech demos (always wanted to get ATI tech demos running on Nvidia GPUs, had some success), and also worked on some graphical fixes for the PC Halo 1 port.  I've messed with graphics in suites like Blender, and some higher-end stuff.  Shaders have a maximum code length; DX9 is more limited than the 360's GPU in this regard, and DX10 is even less limited.  It makes perfect sense if you understand how shaders work.
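To put rough numbers on that, here's a quick Python sketch. The figures are the Direct3D 9 instruction-slot caps as documented (ps_3_0's real ceiling is the per-device cap MaxPixelShader30InstructionSlots, anywhere from 512 up to 32768); the `fits` helper and the 600-instruction effect are made up just for illustration:

```python
# Approximate pixel shader instruction-slot limits under Direct3D 9.
# ps_3_0's actual ceiling is reported per device via the
# MaxPixelShader30InstructionSlots cap: at least 512, at most 32768.
PIXEL_SHADER_SLOT_LIMITS = {
    "ps_2_0": 96,         # 64 arithmetic + 32 texture instructions
    "ps_2_x": 512,        # extended 2.0 profile, with the right caps bits
    "ps_3_0_min": 512,    # guaranteed minimum for any SM3.0 device
    "ps_3_0_max": 32768,  # largest slot count a device may report
}

def fits(profile: str, instruction_count: int) -> bool:
    """Made-up helper: would a shader of this length fit the profile's limit?"""
    return instruction_count <= PIXEL_SHADER_SLOT_LIMITS[profile]

# A hypothetical 600-instruction effect: too long for the baseline 2.0
# profile, fine on a 3.0 device that reports a large slot count.
print(fits("ps_2_0", 600))      # False
print(fits("ps_3_0_max", 600))  # True
```

So "not long enough" isn't crazy talk: the gap between a baseline DX9 profile and what a fixed console GPU exposes is real.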

The man was NOT part of any porting team, the game was NOT ported, they even say so in the thread.  The game was developed for all platforms simultaneously.  Once again, DX9 has limits and stop assuming you know more than you do.

Shadow buffering, also known as shadow mapping, look it up.  It's a very common technique, as common as normal mapping these days.  I'm surprised you've never heard of it because most laymen in computer graphics even know of the technique.  You can simply choose not to update the shadow buffers every frame.  They are stored in video RAM.
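And "not updated every frame" is a deliberate trick, not a bug: keep the maps cached in VRAM and refresh only one per frame. A rough Python sketch of the scheduling idea (all names here are made up for illustration; this models the bookkeeping, not real rendering):

```python
# Minimal sketch of amortized shadow-map updates: the buffers persist
# across frames, and only one cascade is re-rendered each frame,
# round-robin, while the rest are sampled as-is (slightly stale).
class ShadowCascades:
    def __init__(self, num_cascades: int):
        # Each entry stands in for a shadow depth texture living in VRAM.
        self.maps = [{"last_updated": -1} for _ in range(num_cascades)]
        self.next = 0  # which cascade gets refreshed this frame

    def render_frame(self, frame: int):
        # Re-render just one cascade per frame.
        self.maps[self.next]["last_updated"] = frame
        self.next = (self.next + 1) % len(self.maps)

cascades = ShadowCascades(4)
for frame in range(8):
    cascades.render_frame(frame)

print([m["last_updated"] for m in cascades.maps])  # [4, 5, 6, 7]
```

After eight frames every cascade has been refreshed twice, but each individual frame only paid for one shadow-map render instead of four - exactly the kind of trade a console engine makes.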

CE3 is supposed to have SSAO AFAIK, CE2 had it, and many other games do too.

I'll admit you know something about technology, but I get frustrated with people who know some things but are incredibly ignorant and arrogant about what they know.   Where did you learn so much about computer graphics while staying ignorant of shadow buffers, anyway?  Picked it up in bits and pieces from forums or something?  Give it a break.  I'll grant that you know more than most people about computer graphics - a hell of a lot more than the OP of this thread, who said consoles used DX8 hardware.

The instances he speaks of might just be a typo, but stop assuming you know more than he does, because it's obvious to anyone that DX9 has limits.  Sure, they could waste cycles trying to do the same effect as in DX10, but what would be the point?  The complexity of shader usage in games has skyrocketed, and you talk like someone who's been on an island since 2002 or something.

Please stop taking my comments out of context, or get some reading comprehension before you respond to me again.  If you still don't know what I'm talking about, look at the software renderer assumption you made above.

#28  Edited By atejas
Hairydutchman said:
"More games should use the Killzone 2 engine. "
A PS3-specific engine would be optimised for OpenGL instead of DirectX.
Basically, remember the Orange Box on PS3? Imagine that on 360 and/or PC.

#29  Edited By Glowing
Diamond said:
A LOT OF TEXT..
Nah, I didn't mean to sound so harsh, just a late night with a physics test in the morning.  I know that "shadow buffer" is commonly used as a layman's term for basically all texture-based shadow techniques; what I meant was that you didn't say anything about the specific technique used, just about all techniques that have their roots in shadow mapping. Oh, and you forgot to say why the "shadow buffer" would fail to update a frame, which was your biggest complaint about the shadows. Anyway, I was criticizing the technique itself (PSSM or whatever they used), not their implementation of it. I know CE2 had SSAO; that's why I expected CE3 to have it as well, I just didn't see it in the video. Then again, it was quite a low-quality video. But you have to agree the outside scene seemed to suffer from an enormous number of LOD levels.

"Flopped" might have been a bit harsh, but they didn't realize their original goal, which is sad. As you might have noticed, I eagerly await the return of software renderers, and software renderers are nothing new - remember the original Unreal, all software there. About the two Cells: I thought the Cell architecture was a glorified task manager, basically putting the load on the SPUs, so wouldn't more SPUs and a better threading system in one Cell be a better option? And on the topic of software renderers, Intel's new line seems really promising and a step in the right direction.

I have written many a shader from scratch, and DX9 does have limits (I never said it doesn't) - a set number of instructions executed per cycle, to be more precise. I would think they would hit that limit before they hit the limit on "length", 32000 lines of code or whatever that was (which is a limit in the hardware, the graphics cards); I just found it hard to believe they would hit it in Tomb Raider of all games. And as a fun note, the 360 uses a modified version of SM 3, I guess you could say SM 3.5, so assuming they use a standard graphics card, it really doesn't make sense to say that they hit the length limit, unless they have found a way around it that I don't know about.

"Picked it up in bits and pieces from forums or something?" I admit I deserved that; I shouldn't have jumped to assumptions. Where I learned it? I have studied programming for several years now (in a school), so I know a fair bit. But just as you assumed about me, I assumed you were the ordinary teenage console gamer with zero knowledge outside of their mom's apartment.

So what shader language do you use? Porting the shaders from ATI to Nvidia - not sure what you mean here; do you mean demos that use ATI-only functions, and somehow trying to do something similar on an Nvidia card? Regardless, looking at other people's shaders and "messing them up" has to be the best way to learn, props for that. Most people just jump in head first, get frustrated and quit.

The argument about CE3 is starting to spiral out of control into personal attacks, so I think we should just put it to rest and say that neither of us will know until they have released it - but do feel free to answer my other points in the post. And yes, saying that today's consoles use DX8 is ridiculous (don't take it personally, OP).
 

#30  Edited By Red

No. The consoles don't have as much power as the PC, but if you were to get CE3 running on a PC, you'd be getting the most razzling, bazzling, and bedazzing game the world has ever seen.


#31  Edited By pause422

Yes it does, and notice how we've only seen it on PS3 and 360 hardware in their trailers... so that's to be expected. If it doesn't even look better than CE2 on the PC, they have no right to call this a new engine.


#33  Edited By Jayge_
pause422 said:
"Yes it does, and notice how we've only seen it on the PS3 and 360 hardware in their trailers..so thats to be expected. If it doesn't even look better than CE2 on the PC, they have no right to call this a new engine."
What kind of stupid crap is that? No right to call it a new engine? If they want to call it a new engine, it's obviously qualified to be called a new engine, given that it's their engine.

#34  Edited By Diamond
Glowing said:
"Diamond said:
A LOT OF TEXT..
Nah didn't mean to sound so harsh, just a late night with a physics test in the morning..  I know that a "shadow buffer" is commonly used as a layman term for basically all texture shadow techniques, what I meant was that you didn't say anything about the technique used, more about all techniques that has it's roots in shadow mapping. oh and you forgot to say why the "shadow buffer" would fail to update a frame, which was your biggest complain about the shadows. Anyway I was critisizing the technique initself (pssm or whatever they used) not there implementation of it. I know ce2 had ssao, that's why i expected ce3 to have it as well, just didn't see it in the video, then again it was quite a low quality video. But you have to agree the outside scene seemed to suffer from an enormous amount of lod "levels".

Flopped might have been a bit harsh, but they didn't realize their original goal, which is sad. As you might have noticed, I eagerly await the return of software renderers, and software renderers are nothing new; remember the original Unreal, all software there. About the two Cells: I thought the Cell architecture was a glorified task manager, basically putting the load on the SPUs, so wouldn't more SPUs and a better threading system in one Cell be a better option? And on the topic of software renderers, Intel's new line seems really promising and a step in the right direction.

I have written many a shader from scratch, and DX9 does have limits (never said it doesn't), a set number of instructions executed per cycle to be more precise. I would think they would hit that limit before they hit the limit on length, 32,000 lines of code or whatever it was (which is a limit in the hardware, the graphics cards); I just found it hard to believe they would show up in Tomb Raider of all games. And as a fun note, the 360 uses a modified version of SM3, guess you could say SM3.5, so assuming they use a standard graphics card it really doesn't make sense to say that they hit the length limit, unless they have found a way around that which I don't know about.

"Picked it up in bits and pieces from forums or something?" I admit I deserved that, I shoulden't have jumped to assumptions. Where I learned it? I have studied programming several years now (in a school) , so i know a fair bit. But as you assumed now, I assumed you where the ordinary teenage console gamer with zero knowledge outside of there moms apartment.

So what shader language do you use? Porting the shaders from ATI to Nvidia, not sure what you mean here; do you mean demos that use ATI-only functions, and somehow trying to do something similar on an Nvidia card? Regardless, looking at other people's shaders and "messing them up" has to be the best way to learn, props for that. Most people just jump in head first, get frustrated and quit.

The argument about CE3 is starting to spiral out of control into personal attacks, so I think we should just put it to rest and say that neither of us will know till they have released it, but do feel free to answer my other points in the post. And yes, the idea that today's consoles would use DX8 is ridiculous (don't take it personally, OP).
 
"
Shadow buffer really isn't a layman's term. There actually isn't a whole lot of variation in the technique. You can cascade the shadow maps so shadows closer to the camera are higher resolution, and you can selectively use higher-resolution shadow maps in certain places, like KZ2 and RE5 do. You can combine them with lightmaps, but there isn't even as much variation in shadow-mapping techniques as in, say, shadow volumes. That's what it sounded like you were talking about, and while shadow mapping was defined in the 70's, it's only been used extensively in games for a few years.
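The cascading mentioned above mostly comes down to picking split distances along the view frustum. A common recipe (the "practical split scheme" from the parallel-split shadow map literature) blends a uniform split with a logarithmic one. Here's a rough sketch in Python; the function name and default blend weight are illustrative choices, not anything from CryEngine:

```python
def cascade_splits(near, far, num_cascades, blend=0.5):
    """Split distances for cascaded shadow maps.

    Blends a uniform split with a logarithmic split ('practical split
    scheme'). blend=0 gives purely uniform splits, blend=1 purely
    logarithmic ones.
    """
    splits = []
    for i in range(1, num_cascades + 1):
        f = i / num_cascades
        uniform = near + (far - near) * f          # evenly spaced split
        log = near * (far / near) ** f             # geometrically spaced split
        splits.append(blend * log + (1.0 - blend) * uniform)
    return splits
```

With `cascade_splits(1.0, 1000.0, 4)` the near cascades stay short, so close-up shadows get most of the shadow-map resolution, while the far cascades stretch out toward the far plane.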

It's not that the shadow buffer fails to update; it's that they choose to skip frames for performance reasons. It frees up performance for other tasks. Not many games do this (can't think of any that are already out, actually).
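That frame-skipping can be sketched as a simple staggered scheduler: each shadow map is refreshed only every few frames, offset so the cost spreads evenly. This is a hypothetical scheme for illustration, not how CE3 actually schedules its shadow buffers:

```python
def lights_to_update(frame, num_lights, interval):
    """Indices of the shadow maps to re-render this frame.

    Each shadow map is refreshed every `interval` frames, staggered by
    light index so roughly num_lights / interval maps are redrawn per
    frame instead of all of them.
    """
    return [i for i in range(num_lights) if (frame + i) % interval == 0]
```

Over any window of `interval` consecutive frames every shadow map gets refreshed exactly once, which is why a falling tree's shadow can visibly lag a frame or two behind the geometry.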

I don't think you can ever expect to actually see SSAO in a compressed video, and for all you know, in many cases it may just be lightmaps; you'd have to actually play it to know for sure (or find out otherwise).

I don't think we'll go back to all-software. I think there will be more opportunities for non-traditional rasterization, and more work will be offloaded to the CPU (probably a CPU/GPU hybrid in the future, for cost reasons). Part of Cell is a task manager, but other limitations, like the small local store of each SPU, are a bigger issue. Upping the number of SPUs when the PS3 was released would have dramatically hurt manufacturing yields and the cost of the entire system. The only thing about Larrabee I really wonder is whether it'll have enough support if it's not picked up for a next (really next) gen console. With how long it'll take for the next generation of consoles to come out, I doubt the first Larrabee will be in one, but what PC developers will support Larrabee properly (take advantage of it) otherwise?

On consoles like the 360, there are real reasons to go ahead and take advantage of hardware features, like you say with a sort of SM3.5. It's not a standard graphics card; these chips are made for the consoles, and developers should exploit them, because every single 360 is going to be able to run that code. They're running custom APIs as well; they aren't just running DirectX 9 or 10, it's a custom DirectX for the 360. You just sounded like someone who still ran Windows XP and so didn't want DX10 to be successful. But anyway, yes, it can be hard to take advantage of long shaders / lots of shaders with limited performance. Tomb Raider: Underworld renders at sub-720p, for example. Personally I think these sub-720p games are fine; I run an HD CRT and I can barely see the difference in most games unless they're rendering at a really low resolution. I'd like to see what Crytek could do if they ported Crysis to 360 on CE3 but rendered at, say, 640p. It's too bad tiling is so inconvenient.
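The appeal of sub-720p rendering is easy to put a number on, since per-pixel shading cost scales with the pixel count. A quick check, using Halo 3's 1152x640 framebuffer as the comparison point:

```python
def pixel_savings(w1, h1, w2, h2):
    """Fraction of per-pixel shading work saved by rendering at
    (w2, h2) instead of (w1, h1)."""
    return 1.0 - (w2 * h2) / (w1 * h1)

# 1152x640 vs full 1280x720: exactly 20% fewer pixels to shade.
saving = pixel_savings(1280, 720, 1152, 640)
```

That 20% comes off every fill-rate-bound pass, which is why a modest resolution drop buys noticeable headroom for longer shaders.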

It's been a while since I messed around with the ATI tech demos. There was a guy on Beyond3D who was absolutely masterful; he got Ruby running on Nvidia cards (he had a 5900 at the time if I remember right, then got a 6800). Anyway, I talked with him and optimized some for my own use. Made my own wrapper and custom shaders so the thing would run on my video card. I really think the Ruby demos were neat, and stuff like the Animusic 9700 demo (I think it was that one) was just a joy to watch. I gave up when ATI moved to 10-10-10-2 HDR, because that would have been either too much work or absolutely impossible without some kind of real reverse engineering. I still don't have an Nvidia card that does 10-10-10-2. Honestly, I think it was just C-inspired HLSL, but it's been so long since I worked on those shaders I couldn't tell you. The stuff I could never get to work right in the Ruby and other ATI demos was always interesting. The things I did for Halo 1 were more about making subtle changes to wrappers as well. I still believe that Gearbox intentionally locked Nvidia cards out of rendering the frame-buffer refraction effects, and at the time I believed there was some sort of conspiracy going on, because Gearbox had ties to Valve, and Valve had ties to ATI.

I absolutely agree that CE3 seems less than impressive on 360 and PS3. The city parts were just bad, even by console standards. I just find that a lot of PC fanboys who aren't really interested in tech beyond graphics feel bad about Crytek supporting consoles, so they make up crap. Most console fanboys then just take it and accept whatever the PC fanboys say.

I apologize for losing patience with you at some points. Looking forward to any comments you wish to make.
#35  Edited By get2sammyb

I honestly don't know why everyone's going crazy over it. It looks nowhere near as good as Killzone in my opinion.

I'll reserve judgement until I see it running an enjoyable game though.

#36  Edited By Gunner

I think what kills it the most is the resolution, jaggies out the ass. Can be fixed in a later version though.

#37  Edited By get2sammyb
Tarsier said:
"they downgraded it to fit the ps3

cause the ps3 can't handle good grafX you know?

so dey had to make it look wurse

u kno?

da 360 fotage loks bettr

dats jus my opiniun
"
lol. Decent attempt.
#38  Edited By crunchUK
LiveOrDie1212 said:
"they want to integrate it into the xbox360, thats why it looks so crappy. its like they downgraded the engine to fit into the xbox (just my opinion, don't flame me)"
Fanboy post

Diamond said:
"1. It's very early, wait until they make a game on it.2. There's a new video that shows off parts better.  The outdoor areas look pretty good, and the redwood-like forest with a waterfall is great looking.3. Supposedly the engine has new graphical features that weren't in CE2.4. They named it 3 because it's coming after 2.  It's also designed for future consoles, which will probably have graphics in all games that blow Crysis away."

Sensible post
#39  Edited By Glowing
Diamond said:
I won't make you scroll through all this (he didn't actually say that..).
Yeah, you're right, there probably aren't that many "techniques" per se. Guess I was thinking more about different ways to do the projections and filtering and such (uniform, LiSPSM, PCF and so on), and we tend to make names even for the smallest changes, like "soft shadows", where we just sample the shadow texture multiple times. When someone says shadow buffers I immediately think of shadow mapping rather than, say, stencil shadows. Might be a fault on my part. 1978 or somewhere around there, I think (don't quote me on that), shadow mapping was introduced by Lance Williams. I sure can remember useless stuff..
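The multi-sample "soft shadows" trick mentioned here is percentage-closer filtering: instead of one binary in-shadow test, you average the test over a small neighbourhood of the shadow map. A toy sketch with a plain 2D list standing in for the depth texture; the function name, default radius, and bias value are illustrative, not from any real engine:

```python
def pcf_shadow(shadow_map, x, y, receiver_depth, radius=1, bias=0.005):
    """Percentage-closer filtering.

    shadow_map is a 2D list of depths as seen from the light. Rather
    than a single lit/shadowed test, the binary comparison is averaged
    over a (2*radius+1)^2 neighbourhood, giving a soft shadow factor
    in [0, 1] where 1.0 means fully lit. The bias fights shadow acne.
    """
    h, w = len(shadow_map), len(shadow_map[0])
    lit, taps = 0, 0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            sx = min(max(x + dx, 0), w - 1)   # clamp to texture edge
            sy = min(max(y + dy, 0), h - 1)
            if receiver_depth - bias <= shadow_map[sy][sx]:
                lit += 1
            taps += 1
    return lit / taps
```

Pixels near a shadow edge end up partially lit (say 5 of 9 taps pass), which is exactly the penumbra-like softening being described.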

I have never heard of anyone skipping frames for shadow rendering, but it probably would save some FPS; it might look a little weird, though. That sounded a lot more reasonable than the render loop going all wonky, which is what I thought you meant at first. True, spotting SSAO can be hard, especially when you don't see much of the animated characters, but if you really look at the scene around the one-minute mark, at the characters, it doesn't really look like they utilize SSAO, or they have really toned it down since Crysis.

We will have to see what CUDA and other more programmable GPUs/inventions will bring. I don't think Nvidia and ATI are going to lose their market overnight, but I just can't see the future in the rigid model we use in today's GPUs, unless they adapt (ATI/AMD partnership, anyone?). Larrabee will be an interesting first step by Intel; I don't think the market for it will boom instantly, but if a console maker decides to pick it up I can see a lot of developers jumping on it, seeing as they apparently aren't that afraid of change (the PS3's Cell). Intel recently bought up Project Offset, so they might have some interesting things going on there.

I hold no special feelings for XP or DX9; it's just that I haven't really made the jump to DX10 yet, programming-wise (wrong word? sorry, English is not my native language), so I can't really say much about it. Yes, the 360 uses some kind of variation of DX9, but seeing as DX10 is a complete rewrite and newer, I don't think they have that much in common. But they probably optimized it considerably, so the limits of DX9 might very well be different there. I too would like to see a Crysis on consoles; I do think they would have to remove some of the more "fancy" features for it to happen, though, regardless of resolution. And 640p is just fine by me; with the prices of HDTVs, I could just never justify/afford the purchase, at least not in my country.

Here the age difference shows (old man, j/k): the first computer I bought myself had an ATI 9600 when that was new. Way back then I had just fiddled with C++ for maybe a year or so, nothing to do with DX/OGL or any real-time rendering. So all the demos you talk about, I have no clue; actually, I never really looked into any of their demos much. Though it is funny that most people think HDR is some new tech even though it has been around since at least the ATI 9700. Valve have/had ties with ATI, and were anti-Nvidia? I always got the impression "everyone" tried to make ATI the underdog; guess it goes both ways, then.

Yes, I agree; this "allegiance" some people feel to companies or consoles I never really understood. I hold no grudge against consoles; in fact I own several (nine, to be more precise, though none of this generation, too pricey). And the PC fanboys who make up stuff really shouldn't look at consoles as pure PCs, they are a different beast. I just think this tech demo wasn't up to it, by either console or PC standards.

Sorry if I sounded like a jack*ss. Not really an excuse, but the multitude of ignorant PC/console fanboys and barely any sleep thanks to school just doesn't mix well.
#40  Edited By warxsnake

they messed up when they called this cryengine 3 instead of 2.5

#41  Edited By mracoon

It'll obviously look better on PC, this was just a demonstration for consoles. Also it's still early in development so they've got a long way to go.

#42  Edited By Diamond
Glowing said:
text snip
Yeah, the filtering and processing techniques for shadow maps do vary; it's interesting to see the differences between games. Street Fighter 4 softens its shadow maps by sampling twice on 360, but only samples once on PS3. Then there are the crazy dithered (is that the right term in this case?) shadow maps of GTA4. I only say shadow mapping to distinguish it from stencil shadows / shadow volumes in the case of the CE3 video. I can tell it's shadow mapping, but I couldn't tell you anything about the filtering or sample resolution just from the video.

Did you know Toy Story used shadow mapping?  I personally really like the technique, despite the artifacts it often causes.

I can clearly see the skipped shadow-map frames in the CE3 video (the second one): they shoot down a tree and you can see the shadow updating less often than the geometry. There might actually be more games that use it than I think. Most of the time in games, most objects aren't moving anyway. If GTA4 didn't use deferred rendering, it'd probably be good to create shadow maps that update rarely for the environment and update cars, humans, and other objects more often. Then again, that may be less efficient anyway, but you get the point.

Good point about characters, though; I remember how well SSAO stood out in MLB The Show 08 (I don't know about 09, but I assume it has it too). Good point about CUDA too; I think following that path will diversify rendering down the road. I'm really interested to see how Larrabee performs in today's games, because I think it'll really hurt adoption if it's not at least competitive with traditional GPUs.
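For what SSAO is actually measuring: in screen space you estimate, per pixel, how many nearby depth samples sit in front of the pixel, and darken the ambient term by that fraction. A deliberately crude single-pixel sketch; real implementations sample a 3D hemisphere, weight by distance, and blur the result, and everything here (names, defaults) is illustrative:

```python
def ssao_factor(depth, x, y, radius=2, strength=1.0):
    """Toy screen-space ambient occlusion for one pixel.

    depth is a 2D list (a depth buffer); the returned value in [0, 1]
    scales the ambient light: a pixel whose neighbours are mostly
    closer to the camera is 'buried' and gets darkened.
    """
    h, w = len(depth), len(depth[0])
    centre = depth[y][x]
    occluders, taps = 0, 0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            if dx == 0 and dy == 0:
                continue
            sx = min(max(x + dx, 0), w - 1)   # clamp to buffer edge
            sy = min(max(y + dy, 0), h - 1)
            if depth[sy][sx] < centre:        # neighbour in front of us
                occluders += 1
            taps += 1
    return max(0.0, 1.0 - strength * occluders / taps)
```

On a flat wall nothing occludes anything (factor 1.0), while a pixel at the bottom of a crease gets darkened, which is why SSAO stands out so clearly in the folds of character models.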

I don't think DX9/10 and the 360's DX have much in common either. It'd be terribly inefficient if it wasn't completely different from DX9 and DX10; they are writing for a single make of GPU/CPU. When Crysis was first coming out, Crytek was citing the consoles' RAM as the biggest limitation. For as linear as Crysis's gameplay was, maybe they could simplify distant assets or something? Instead of using lower-LOD trees in the distance, maybe simple meshes? I think they could make something that looked really nice, but I don't know if it'd be worth it for Crytek, EA, or 360 users, really. Probably better to make a whole new game.

Wow. My first IBM had a Trident SVGA card, and my first computer was from the 70's. I would have been better off with an Amiga or Commodore 64; I regret missing that age of computers.

You should at least check out the ATI demo called 'Animusic'; it's a real-time version of one of a series of 80's or 90's CG videos. The ATI real-time version holds up EXTREMELY well and is REALLY fun to watch.

  


HDR is very new compared to most graphics techniques. Most lighting techniques are many decades old, while real-time HDR rendering was first demonstrated less than a decade ago, out of SIGGRAPH work (I think). It's kind of a different effect too, because it's more 'visual' and less 'mathy', if you know what I mean. EDIT - one interesting point I forgot: HDRI was originally a photography technique, and that type of 'HDR' is actually old too. Just the computer side of HDR is relatively new.
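The 'visual not mathy' part of HDR is mostly the tone-mapping step, where an unbounded scene luminance gets squeezed into the display's range. The classic Reinhard global operator is a one-liner; shown here in Python purely for illustration:

```python
def reinhard(luminance):
    """Reinhard global tone-mapping operator.

    Maps an unbounded HDR luminance into [0, 1) for an LDR display:
    dark values pass through nearly linearly, while bright values are
    compressed toward (but never reach) white.
    """
    return luminance / (1.0 + luminance)
```

Mid-grey behaviour makes the compression obvious: `reinhard(1.0)` is 0.5, while a luminance of 1000 still lands just below 1.0, which is what keeps bright skies from clipping to flat white.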

Valve seemed (to me) to have strong ties to ATI because the Nvidia FX 5x00 series had a half-assed implementation of Pixel Shader 2.0 (if I remember right). So whenever they talked about the Source engine they were always praising the ATI cards out at the time. I tried one of those ATI cards, but there was a conflict with my motherboard, and I didn't have the money to build a whole new computer just for that video card. AMD (ATI) seems to REALLY be the underdog now. For a while AMD caught up, but now neither AMD CPUs nor GPUs (ATI) are selling so well.

When I was really into PC gaming, it never seemed like my fellow PC gamers were as ignorant about tech as many seem today. Maybe it's just how easy PC gaming has become over time: some kid can get a Dell bought by his parents and use it without ever having to learn anything. Obviously there are still some really brilliant PC gamers, but also a lot of ignorant ones who think all console gamers are dumb. Here in the US it's cheaper to buy the consoles than an equivalent PC, so it's been a while since I upgraded. There are other reasons too, but for someone like me who used to be a HUGE PC gamer and now mainly games on consoles, these 'new' PC gamers are annoying.

I'm glad we've come to a better understanding.  It's been fun chatting.
#43  Edited By Alexander
  


For those that could run Crysis maxed out, it's quite something and still the best looking game.
#44  Edited By Valkyr

CryEngine 2 was/is able to run on a wide range of hardware: if you play on max it's beautiful, and if you play at the minimum configuration it's ugly as hell. So those 360/PS3 videos were surely running CryEngine 3 at something comparable to medium settings on a mid-range PC. I'm sure that CryEngine 3 on the proper PC hardware will blow us away.