#1 Edited by FaPaThY (135 posts) -

http://www.cinemablend.com/games/Xbox-One-Games-E3-Were-Running-Windows-7-With-Nvidia-GTX-Cards-56737.html

"You know how EA's COO Peter Moore told Geoff Keighley during the post-conference interview at E3 that the games they were showing off on stage were running on comparable dev kit specs to the actual home consoles? Well...that's not really true. What is true, however, is that to get the best performance on home console games, Microsoft made sure that they were running on the most stable system specs available on the market and that happened to be an HP powered, Windows 7 system with Nvidia's 700 series GTX GPU."

Saw this posted elsewhere and didn't see it here yet. Demoing console games on PCs at these events is par for the course, but everything else about this is just hilarious.

#2 Edited by Bismarck (432 posts) -

I don't know if there is anyone out there who doesn't know that most, if not all, of the demos shown at the conferences are on a high-end PC. Even when they showed the multiplayer demo and the camera went past the dudes playing it, you could see they were using mouse and keyboard, and some were using Xbone controllers.

Edit: I don't think that even with next-gen consoles a lot of companies will go up there and do the demos on the consoles themselves. Probably games that weren't coming to PC at all also got shown on a PC.

#3 Posted by impartialgecko (1624 posts) -

I expected nearly everything to be running on PCs, but I would have hoped that the specs would be comparable to the actual hardware. Otherwise it all seems a bit like snake oil to me.

#4 Posted by TruthTellah (9311 posts) -

Well. Of course they were.

Now, the real question is, were PS4 games running on Win7, 8, or XP? :|

#5 Posted by bitcloud (646 posts) -

@truthtellah: Actual PS4 devkits confirmed by Infamous devs and Jonathan Blow.

#6 Posted by Bollard (5663 posts) -

HP? Stable? Ahahahahah

#7 Edited by FaPaThY (135 posts) -

I can tell some of you guys read the article/post. (Not.) Hint: console games running on PCs isn't the punchline here.

#8 Posted by TruthTellah (9311 posts) -

@bitcloud said:

@truthtellah: Actual PS4 devkits confirmed by Infamous devs and Jonathan Blow.

Oh yeah? For on stage? Because I thought those were all PC except for the ones specifically stated to be running on dev hardware. I know things were a bit different on the floor.

@fapathy Then your summary up top and headline were misleading. Having read the article, I just came away with another instance of someone mentioning how console games were being shown off on a PC. I definitely get the "joke" that they were on HP hardware and running W7 instead of W8, but considering the hundreds of times I see posts about console games being shown on PC, you can understand why some people might respond as they have. I don't see anything particularly surprising here.

I'm glad you got some kind of a laugh out of it though. Maybe some others will.

#9 Edited by Zero_ (1976 posts) -

@truthtellah: Not sure about anything else, but all the Knack machines were on proper dev kits. There's a pic floating around of Knack crashing to the PlayStation 4 OS.

#10 Posted by TruthTellah (9311 posts) -

@zero_ said:

@truthtellah: Not sure about anything else, but all the Knack machines were on proper dev kits. There's a pic floating around of Knack crashing to the PlayStation 4 OS.

Oh, I can believe they had dev kits out on the show floor. I was just thinking it was unlikely for up on stage. The developers are probably using whatever they've got for show floor demos. I'd definitely believe that they had one crash like that. And I could understand people having their own personal setups for the demos, too. It would depend on the developer.

#11 Edited by FaPaThY (135 posts) -

@truthtellah: Not sure how the headline or the quote of the first paragraph (it's not a summary) was misleading. It's exactly what's printed on that page, minus some shortening of the headline to fit the 60-character limit. Windows 7 and an Nvidia card, which are in both the headline and the paragraph, are relevant to the joke.

Next time read the article before randomly commenting to fluff up the post count.

Anyway, after I read the article, I had to check to see if I missed news of MS ditching AMD for Nvidia GPUs or something :P

#12 Posted by TruthTellah (9311 posts) -

@fapathy said:

@truthtellah:

Next time read the article before randomly commenting to fluff up the post count.

There's no need to be a jerk, fapathy. I checked out the article, but I didn't find it as funny as you did. Thus, your original post came off a different way. Sorry I didn't understand your intent, but that's still no reason to be rude.

#13 Posted by Intelon (1 posts) -

It was kind of expected, considering the Xbox 360 had two Mac G5 towers running at E3 back when they revealed that machine, right?

I think out of all of E3 the most shocking aspect to me was that the PS4 and the Xbox One are pretty much identical, and the only difference will be slightly better rendering of certain things, like a PC stepping up from high to ultra settings. Well, that is if developers decide to take advantage of the extra power, but since they're pretty much PCs, it wouldn't be too hard.

We all laughed at the Wii U, and I was honestly planning to take mine to GAME and trade it in for £190, but I dunno now.

#14 Posted by ch3burashka (5112 posts) -

>comparable dev kit specs

>high end PCs

I see no problem here, especially since the new consoles are basically high-end PCs, more so than the current/previous gen.

#15 Edited by SomeJerk (3304 posts) -

Best way to handle this is to wait, and give Microsoft the most hell ever if the finalized products end up running and looking worse.

I've started to think it's like this, from all the talk (Twitter, leaks):
- Xbox One dev kits were too unstable to use on the floor, for one reason or another
- As an emergency solution, watercooled fat office PCs were brought in
- And laptops for the Forza demo machines

#16 Posted by believer258 (11991 posts) -

@ch3burashka said:

>comparable dev kit specs

>high end PCs

I see no problem here, especially since the new consoles are basically high-end PCs, more so than the current/previous gen.

Yeah, but they were using an old operating system and a graphics card from a different manufacturer than the one they're using in the Xbox One. Said graphics card is also far, far more powerful than the one reportedly in the Xbox One. The Xbox One's is similar to an AMD HD 7790, a card on the low-to-medium end of the generation that is about to be replaced. The one in these computers is a very high-end one from the generation that's coming.

It wasn't comparable dev kit specs.

#17 Edited by Bismarck (432 posts) -

This image started popping up on the internet. Gotta hand it to Microsoft, well done.

#18 Edited by The_Laughing_Man (13629 posts) -

Isn't this pretty normal for E3? Them not wanting to chance a dev kit crashing too much for anything to be shown off?

#19 Posted by BaneFireLord (2950 posts) -

Well that's good. When the few XBONE exclusives I'm interested in are eventually ported to PC, I won't have to get Windows 8 to run them.

#20 Edited by Warchief (658 posts) -

Meh. When the 360 was on the show floor for the first time, all the 360s were in fact Apple G5 towers.

#21 Edited by thetenthdoctor (291 posts) -

@believer258:

You obviously don't understand how coding straight to the metal affects performance. A PC running Windows and a 3GB GTX 780 would probably give comparable performance to a closed console architecture running something in the neighborhood of an 8GB Radeon HD 7790 (the approximate spec of the XB1).

I've said in numerous threads here (and there are plenty of articles on Digital Foundry) about how consoles and PCs differ. While the quantifiable capabilities of the CPU/GPU in a console might match mid/low-end PC hardware, the actual performance capabilities far exceed them. The Xenos chip in the Xbox 360 is comparable to a Radeon X1800 card, a GPU that, if it could even boot up Tomb Raider or BioShock Infinite, probably couldn't render 10fps on low settings, yet the Xenos runs them at 720p30 on settings equivalent to PC "medium". How? Closed architecture, code optimization for fixed specs, direct access to RAM and processor clocks (instead of having to shuttle stuff around between the system RAM, GPU RAM, northbridge, CPU, GPU, etc.).

It's embarrassing for AMD that they were using a competitor's brand instead of a 7970 GHz Edition AMD card, but I'd bet that GTX 780 PC is pretty close to real-world XB1 performance levels.
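To put toy numbers on the overhead argument: here's a quick Python sketch. The raw TFLOPS and utilization figures below are illustrative guesses, not benchmarks, but they show how a much bigger raw spec can net out close to a smaller fixed one once you discount for software overhead.

```python
# Toy model: effective throughput = raw peak * utilization.
# Both numbers per platform are illustrative guesses, not measured
# figures; nobody outside the platform holders has the real ones.

def effective_tflops(raw_tflops: float, utilization: float) -> float:
    """Peak compute discounted by API/driver/OS overhead."""
    return raw_tflops * utilization

# Hypothetical: a fixed console spec with thin APIs runs much
# closer to peak than a general-purpose PC software stack.
console = effective_tflops(1.3, 0.80)  # XB1-class raw, high utilization
pc = effective_tflops(4.0, 0.40)       # GTX 780-class raw, heavy overhead

print(f"console effective: {console:.2f} TFLOPS")
print(f"PC effective:      {pc:.2f} TFLOPS")
```

Change the utilization guesses and the gap moves a lot, which is exactly why arguing raw specs alone gets nowhere.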

#22 Posted by The_Laughing_Man (13629 posts) -

@believer258:

You obviously don't understand how coding straight to the metal affects performance. A PC running Windows and a 3GB GTX 780 would probably give comparable performance to a closed console architecture running something in the neighborhood of an 8GB Radeon HD 7790.

I've said in numerous threads here (and there are plenty of articles on Digital Foundry) about how consoles and PCs differ. While the quantifiable capabilities of the CPU/GPU in a console might match mid/low-end PC hardware, the actual performance capabilities far exceed them. The Xenos chip in the Xbox 360 is comparable to a Radeon X1800 card, a GPU that, if it could even boot up Tomb Raider or BioShock Infinite, probably couldn't render 10fps on low settings, yet the Xenos runs them at 720p30 on settings equivalent to PC "medium".

It's embarrassing for AMD that they were using a competitor's brand instead of a 7970 GHz Edition AMD card, but I'd bet that GTX 780 PC is pretty close to real-world XB1 performance levels.

That, and if they are using an Xbox One version of a game, putting it on something that might be stronger would not magically make it look better. Aren't the graphics of a game tailored to look the way they should?

#23 Posted by thetenthdoctor (291 posts) -

The fact of the matter is that we're in a weird time for comparing consoles and PCs. There isn't a single PC GPU on the market with 8GB of dedicated RAM, so it's frankly impossible to build a PC that perfectly mimics the performance capability of the new consoles. The easiest way to do it right now is to make something that can brute-force its way past the missing RAM and CPU cores via higher clock speeds, shader units, FLOPS and compute capability, in this case a GTX 780. Yes, it's a more powerful GPU overall than the XB1 has, but it's also weaker in some areas AND has to deal with motherboard delays and system overhead that the bespoke units in the XB1 won't.

#24 Posted by NoobSauceG7 (1258 posts) -

I think the funniest thing is that the video card was supposed to be some sort of high-end GTX one, while the Xbox One is supposed to have an AMD card, right?

#25 Edited by bitcloud (646 posts) -

@truthtellah: There have been reports that Microsoft is behind by about 6 months on just about everything and is really scrambling to get its shit together. This is coming from reliable people on Beyond3D and NeoGAF, the same people who have been right about every hardware and software piece to come out. They were right about the specs for both systems and the yield issues with the Xbox One's processor.

@thetenthdoctor: Remember that the PS3's GPU is even worse, somewhere in between a 7600 GT and 7800 GT. Even though the Cell processor makes up for a lot of that, the cards we both mentioned are useless. Even cards that are 1 to 2 generations better are pretty much useless for any of these games.

The 8800 GTX wasn't even capable of running Crysis higher than medium settings at 1280x1024, and that was a $600 card at the time, released around the PS3 launch and a generation ahead of both consoles in terms of hardware shaders.

#26 Edited by Silver-Streak (1369 posts) -

So, the only thing about this non-news that's interesting at all: they're using Nvidia GPUs. The XBO is running an AMD APU. This could mean whatever performance was seen on the floor was far less optimized than it will be at launch, or (sadly) it could mean it won't perform as well at launch as it did on the floor.

Also, the fact they were running Windows 7 is kind of hilarious.

#27 Posted by Liquidinsurgency (5 posts) -
#28 Edited by Tidel (360 posts) -

@fapathy said:

Windows 7 system

Awesome.

#29 Posted by Lind_L_Taylor (3966 posts) -

Sounds like a PC might be the best console to choose from.

#30 Edited by pause422 (6201 posts) -

A 700-series GTX card isn't anywhere near comparable to Xbone specs; if you think it is, you're so dead wrong it's funny. I ultimately just find it a bit disappointing that most PS4 games had PS4 dev kits (confirmed in several places) and Microsoft wasn't confident enough to have theirs on actual Xbone hardware. More to the point, though, I guess even MS hates Windows 8, seeing as Windows 7 was the choice. Says a lot right there.

#31 Posted by dsi1 (169 posts) -

Not only a high end PC nothing like either of the consoles, but also Win7! Good job Microsoft!

#32 Posted by Pachtar_Klepek (89 posts) -
@pause422 said:

A 700-series GTX card isn't anywhere near comparable to Xbone specs; if you think it is, you're so dead wrong it's funny. I ultimately just find it a bit disappointing that most PS4 games had PS4 dev kits (confirmed in several places) and Microsoft wasn't confident enough to have theirs on actual Xbone hardware. More to the point, though, I guess even MS hates Windows 8, seeing as Windows 7 was the choice. Says a lot right there.

They did what they did because it was stable, not for some snarky political reason. X1 dev kits aren't stable enough to publicly show a game, which says a lot.

#33 Edited by thetenthdoctor (291 posts) -

@pause422:

Read post 21 again, then post 25. If you still don't understand, read them again. Repeat as necessary.

#34 Posted by bitcloud (646 posts) -
#35 Edited by thetenthdoctor (291 posts) -

@bitcloud:

Yes, it was a compliment. You're trying to make people understand the same thing I am- that direct access to the GPU RAM and the programmers' ability to code for a fixed spec allows relatively antiquated hardware to perform substantially better in a console vs a PC. That's why the PS3 can render Uncharted and the 360 can render Gears 3 on ancient tech and tiny amounts of vRAM. A PC built to the actual physical spec of those consoles (in terms of shader units, CPU clock, RAM, etc) would straight up choke on those games.

It's not the best analogy, but comparing the new (or old) consoles to PCs of the same spec is like saying a Ferrari should have the same performance as an F150 pickup truck since they both have V8 engines.

#36 Posted by pause422 (6201 posts) -

@thetenthdoctor I read everything once, so no thanks. I know you would like to believe you are the one correct person in this thread above all else, but no one is buying it. Good try, though.

#37 Edited by thetenthdoctor (291 posts) -

@pause422:

I'm not alone in knowing what I'm talking about. Bitcloud (and a ton of articles online) also understand the difference between how hardware performs in consoles and PCs. Your proud declaration that a GTX 780 PC isn't even close to the performance of an XB1 is foolish and uninformed. If you want to stick to those guns and keep being ignorant, then more power to ya I suppose.

#38 Edited by EXTomar (4843 posts) -

The reason no one bothers selling video cards with "huge" amounts of VRAM is that they are fill-bound instead of resource-bound. Making video cards that can do more processing matters more than making video cards with more RAM. To that end, there is no comparable PC video card that matches the Xbox One, because it would be a waste in desktop systems. There are comparable laptop systems, but they always trade off power for portability.

#39 Posted by PufferFiz (1379 posts) -

@the_laughing_man said:

Isn't this pretty normal for E3? Them not wanting to chance a dev kit crashing too much for anything to be shown off?

Not sure, but in the Sony booth the PS4s were real dev kits. I was playing Contrast when the machine next to me locked up, and I was talking to a dev I met the night before about how it had been locking up all event. When it rebooted, the screen was black until the game started executing again, so no Windows, unless it was running a custom Linux setup, but there was no POST screen.

#40 Edited by Benny (1953 posts) -

It's like nobody has ever seen E3 coverage before. They always run console games on PCs.

#41 Posted by Angouri (233 posts) -

@benny: Jonathan Blow's point on this was that if they can't run it on development hardware at the biggest show for video games prior to launch, it looks bad. These things are launching in months, and some of those games are launch software. No need to shit on the other guys, but Sony confirmed that they were running their games on PS4s (indies too). Xbox One is coming in hot, and even PS4 games have stability issues.

And for those who think Microsoft won't make stuff Win8-exclusive, you forget: they now have a Windows 8 store. Skulls of the Shogun never came to 7, and Halo 2 was Vista-only (not XP). First-party Microsoft is releasing everything for Windows 8.

#42 Edited by mosdl (3229 posts) -

@thetenthdoctor said:

@bitcloud:

Yes, it was a compliment. You're trying to make people understand the same thing I am- that direct access to the GPU RAM and the programmers' ability to code for a fixed spec allows relatively antiquated hardware to perform substantially better in a console vs a PC. That's why the PS3 can render Uncharted and the 360 can render Gears 3 on ancient tech and tiny amounts of vRAM. A PC built to the actual physical spec of those consoles (in terms of shader units, CPU clock, RAM, etc) would straight up choke on those games.

It's not the best analogy, but comparing the new (or old) consoles to PCs of the same spec is like saying a Ferrari should have the same performance as an F150 pickup truck since they both have V8 engines.

Uncharted 3/Gears 3 look the way they do because of years of experience optimizing for the console hardware. They got around VRAM limits by coding specifically to them (texture reuse, clever use of hidden loading screens, smart use of geometry).

And yes, an exact-replica PC would struggle, but that would be a 6-7 year old PC.

PCs will always be able to outperform next gen consoles due to much faster cores, gpus, etc.

#43 Posted by JJOR64 (19023 posts) -

So what you are telling me is that my Windows 7 computer with my Nvidia graphics card is basically an Xbox One? Great! Now I don't have to buy one.

#44 Posted by davidwitten22 (1708 posts) -

Windows 7 ahahaha. Microsoft clearly has tons of faith in Windows 8.

#45 Posted by Colourful_Hippie (4419 posts) -

Of course they were on PCs.

#46 Edited by thetenthdoctor (291 posts) -

@mosdl:

I can't tell if you're trying to agree or disagree with me, because you basically said the exact same thing I did.

Yes, PCs are capable of better graphics than a console, but they require much higher specs to get the same results as the consoles do. You said it yourself: a 7-year-old PC would choke on games that run fine on 7-year-old 360 hardware. The XB1's GPU will resemble a Radeon 7790, but a PC with a 7790 will not be able to turn out graphics like the XB1, FACT. To match the graphics capability of the PS4 and XB1, PC users will need a 4+ GHz quad-core CPU, 8-12GB of RAM and at LEAST a GTX 770 or 7970 GHz Edition.

When the PC versions of these games start hitting, you'll see.

#47 Posted by mosdl (3229 posts) -

@thetenthdoctor said:

@mosdl:

I can't tell if you're trying to agree or disagree with me, because you basically said the exact same thing I did.

Yes, PCs are capable of better graphics than a console, but they require much higher specs to get the same results as the consoles do. You said it yourself- a 7 year old PC would choke on games that run fine on 7 year old 360 hardware. The XB1's GPU will resemble a Radeon 7790, but a PC with a 7790 wil not be able to turn out graphics like the XB1- FACT. To match the graphics capability of the PS4 and XB1, PC users will need a 4+ ghz quad core CPU, 8-12gb of RAM and at LEAST a GTX 770 or 7970 Ghz Edition.

When the PC versions of these games start hitting, you'll see.

You missed the point: the 7-year-old 360 took 5 years to start rendering graphics that a PC with the exact same specs couldn't, due to specific optimizations. It takes time to build those optimizations. The fact that so many console games had to render at sub-720p and barely 30fps while old PCs could render them at 1080p and 60fps proves it.

So no, you are wrong. Right now I bet comparable PCs will outperform them, since drivers have been optimized by Nvidia/AMD for years while the new systems are brand new. The performance benefits of consoles (direct access to the GPU, optimizing the OS [Windows is already well optimized for DirectX/OpenGL]) will take time to achieve. Also, no game is using all 8 gigs of RAM; they're probably in the 3-4 gig space.

And the fact that the cores on consoles are so much slower than PC cores, I think, negates any advantages short of power consumption, as you can easily brute-force it on current quad cores.

#48 Edited by thetenthdoctor (291 posts) -

Here's a little comparison to illustrate my point.

A Radeon 3870 is the published recommended GPU for the PC version of Assassin's Creed: Brotherhood. As you know, published specs on a PC game are probably only good for medium-quality settings at around 720p and 30fps, which is just what the 360 version runs at. The 360 runs this game just fine using the Xenos, equivalent to an AMD X1800 GPU. Take a look at the gulf in specs between these two cards:

X1800: 500MHz core clock, 1GHz memory clock, 30GB/s memory bandwidth.

3870: 775MHz core clock, 2.25GHz memory clock, 72GB/s memory bandwidth.

Those PC specs are roughly double what it takes the 360 to run that game, and that's just to run it at Xbox quality. If you want 60fps and ultra settings, you'd need almost triple the core and memory clocks of the 360.

Comparing consoles to PCs based on the numbers is foolish. If you think a PC with similar core, shader and memory clocks will run XB1 or PS4 games, you're living in a dream world.
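If anyone wants to sanity-check the gap, here are the ratios from the card specs quoted above, worked out in a few lines of Python:

```python
# Spec ratios between the two cards quoted above (X1800 vs HD 3870).
x1800 = {"core_mhz": 500, "mem_mhz": 1000, "bandwidth_gbs": 30}
hd3870 = {"core_mhz": 775, "mem_mhz": 2250, "bandwidth_gbs": 72}

for key in x1800:
    print(f"{key}: {hd3870[key] / x1800[key]:.2f}x")
# core clock ~1.55x, memory clock 2.25x, bandwidth 2.4x;
# "double" is about right for memory, generous for core clock.
```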

#49 Posted by mellotronrules (1223 posts) -

so apparently this is news, guys.

pro tip: e3 is a trade show and essentially a giant commercial. practically no hardware or software is final. act accordingly.

#50 Posted by thetenthdoctor (291 posts) -

@mosdl:

Heck, I can still brute-force past 360 levels of graphics with an overclocked dual core and a GTX 770, but that's all going to change in the next year. PC hardware right now is all built around brute-forcing 60fps and crazy anti-aliasing and anisotropic filtering into games written for the consoles. Once developers begin optimizing for 8 cores and instant access to 4-5GB of on-GPU RAM, PCs will begin struggling and people will need big upgrades to keep up.

It's the reason I've held onto my dual core as long as I have. No sense in going to socket 1155 or even 2011, because Intel and AMD will need to respond with better CPUs to match these new consoles.