It should have a 3-core Power7 by IBM and some Radeon HD derivative. From the presentation the games looked super jaggy to me, but maybe that's because AA wasn't working. Well, I'm looking forward to it.
The Nintendo Wii U, the follow-up to the monstrously popular Nintendo Wii console, launched in North America on November 18th 2012.
Some Wii U Tidbits on Hardware Performance, Screen Sharing
@DS23: Here's the thing: the 360 and the PS3 both shipped with outdated hardware, and it's likely that the 720 and the PS4 will too. Why? A few reasons.
- Cost. Older technology costs less money to license, and less money to manufacture as processes improve.
- R&D Time. Consoles are not PCs. You can't just throw in whichever graphics card is the latest and be happy. A slightly larger GPU might cause components on the board not to fit, and that means more money spent redesigning chips, boards, and the actual hardware enclosure itself. Space is limited, so the internal components are, too. Not to mention power and cooling requirements, which may make the system bulkier, heavier, or noisier.
- Hardware needs to be set in place long beforehand so that developers can get launch titles developed and optimized. If you change the spec too close to launch, it's likely developers will miss their launch date.
Maybe "severely" outdated hardware is a stretch, but don't expect to see AMD 9900s inside of these new consoles.
@Khann said:
I still don't actually understand what this console is, and how it works.
Seriously now? Have you seen any of the presentations? 'Cause I've got a pretty good idea of what it is and what it does from watching them.
Besides, the Wii U wiki page has tons of info on the system and its features.
@DeF said:
@Khann said:
I still don't actually understand what this console is, and how it works.
Seriously now? Have you seen any of the presentations? 'Cause I've got a pretty good idea of what it is and what it does from watching them.
Besides, the Wii U wiki page has tons of info on the system and its features.
Ah, cheers for that.
Now I'm even less interested.
@habibyjohnson said:
@DS23 said:
I keep seeing this despite hearing the complete opposite in my own wanderings regarding MS and Sony's next gen plans, especially MS. Weird.
The thing is, if released soon, no matter what hardware it is, it'll end up not looking too different from what we have now. Graphics have pretty much peaked. This is why we're not ready for next-gen yet. We're obviously ready for Wii U, cos Wii is very outdated, but there is no reason for a PS4/Xbox 3 right now. I think these rumors are false anyway. I doubt we WILL see them for a few years yet.
Not to mention that anything Microsoft or Sony may put in their consoles will be unavoidably outdated in comparison to current PC hardware. You really want a substantial, noticeable leap in graphics and performance? Put a GTX 690 in there. It gives the same performance as three GTX 580s. The Samaritan Unreal Engine demo showcased some time ago can be rendered at full speed on a single GTX 690. The thing is, that GPU alone costs $1,000 right now. A GTX 570 is a more than decent GPU, and it still sells for around $150-$200. There is absolutely no way any company can include that (or any similar) GPU in its console without either selling it at a huge loss or skyrocketing the price of the console. They'll have to take a massive hit to their finances either way. That is why consoles always use "outdated" hardware. It can't be any other way, given the nature of the console market.
What I'm trying to say is that if you really want cutting-edge graphics, play on PC with the latest hardware. Otherwise, try to focus on what a console can achieve with its technical limitations. The Wii U can be an extremely interesting product, if developers are willing to explore the controller's potential to its fullest.
Why is everyone focused on the hardware specs, this isn't how Nintendo sells its consoles. Look at the Wii, it sold millions and not because of its graphical capabilities, but the unique controller and games. Wii U is marketed towards the mainstream again, with the hopes that they grab a chunk of the hardcore gamers too.
If you want next generation graphics, then Microsoft and Sony have you covered. Nintendo has another unique controller to sell their console with. It’s exactly how they sold the Wii.
For fuck's sake, the new console is called Wii U. Brand recognition is key here. Nintendo wants the Wii audience to buy their new console, and if they can sucker in some of the hardcore market, then that's where Batman, Ninja Gaiden 3, Assassin's Creed 3, etc. come in.
Honestly, Pikmin, Batman, and Lego City all looked like 360 or PS3 games. There wasn't anything that looked better than what's currently out there. I hope the system renders and outputs in 1080p and we get games running at 60fps. That will be enough for me.
I don't understand how you guys can expect games to magically be 60FPS on Wii U when it's rumored to have similar, perhaps only slightly better technology than the PS3 and 360...
@MAGZine said:
@PlasmaBeam44: That's plenty! Most systems nowadays only render 720 and upscale to 1080 anyhow. Many PC rigs won't do 1080p at 60fps... at least not with all the glitter on.
1080 is a pretty paltry resolution on the PC; you'd need to be pretty far down the GPU/CPU food chain to struggle to run it. With the rare exception of something like BF3 or DX11 Crysis 2, a GTX 560 Ti will do just dandy at that resolution in current games.
@MAGZine: Modern PC hardware can render many games at 1080 resolution at 60 frames per second easily.
For example, running the PC version of Max Payne 3 at 2560x1600, DX11, very high FXAA, 16x AF, "very high" settings with HDAO and very high tessellation, you can get 60fps easily on a GTX 680, 6GB of RAM, and an Intel i7-2600K.
Those resolution, performance, and quality settings are above anything available on current console hardware, and likely next-generation hardware as well.
@Ominou5: The GTX 680 alone costs more than four Xboxes and is a top-of-the-line card. Unfair comparison. When @MAGZine says many PCs won't do 60fps at 1080p at maximum settings, he's correct. Only a few PCs are like yours.
Let's just all agree that we won't be happy with console graphics until they do real-time ray tracing at 60+ FPS. Seeing as the GTX 6xx family of cards will have been out for over a year by the time the "next-gen" consoles come out, we're basically guaranteed to be very disappointed. So... the next E3 will be underwhelming too :-/
I believe Nintendo is rushing into the "next gen" market first so that it can establish dominance. By the time Sony and Microsoft release their respective consoles, Nintendo will already have a huge chunk of the market share, making its console the new common denominator for development. But so far, with the reveal of the new Pro Controller and the connectivity interface, it seems like Nintendo has only just reached parity with the others. That's how I see it.
@Insectecutor: I said "Modern PC hardware can render many games at 1080 resolution at 60 frames per second easily."
I was not referring solely to the GTX 680. When I was talking about the GTX 680, I was explaining that "high-end" hardware is capable of running the latest games at a resolution higher than 1080, namely 2560x1600, with the highest in-game settings and still obtaining 60 frames per second or more.
If you were to purchase a mid-range gaming PC right now for around $930 USD, you would be able to play any new title on the market at 1080 with high settings and get a frame rate of 60fps.
Most serious PC gamers will have at least a mid-range PC, and to be honest, considering the age of current-gen consoles and the games being developed for them, most mid-range computers built and/or purchased in the last two years should have no problem running those games at 1080 at 60fps.
But to be honest, the term "many" in this context, considering the wide scope of computer hardware on the market today, both high end and low end, is too vague to be useful. There are millions upon millions of PC users in the world who use their computers for tasks other than gaming, so if you were to say that
"Many gaming PC rigs won't do 1080p at 60fps... at least not with all the glitter on."
which is how I interpreted it, I still disagree.
Why didn't they show any of this stuff at the press conference? It sounds more interesting than what was shown.
@JoeyRavn said:
@habibyjohnson said:
@DS23 said:
I keep seeing this despite hearing the complete opposite in my own wanderings regarding MS and Sony's next gen plans, especially MS. Weird.
The thing is, if released soon, no matter what hardware it is, it'll end up not looking too different from what we have now. Graphics have pretty much peaked. This is why we're not ready for next-gen yet. We're obviously ready for Wii U, cos Wii is very outdated, but there is no reason for a PS4/Xbox 3 right now. I think these rumors are false anyway. I doubt we WILL see them for a few years yet.
Not to mention that anything Microsoft or Sony may put in their consoles will be unavoidably outdated in comparison to current PC hardware. You really want a substantial, noticeable leap in graphics and performance? Put a GTX 690 in there. It gives the same performance as three GTX 580s. The Samaritan Unreal Engine demo showcased some time ago can be rendered at full speed on a single GTX 690. The thing is, that GPU alone costs $1,000 right now. A GTX 570 is a more than decent GPU, and it still sells for around $150-$200. There is absolutely no way any company can include that (or any similar) GPU in its console without either selling it at a huge loss or skyrocketing the price of the console. They'll have to take a massive hit to their finances either way. That is why consoles always use "outdated" hardware. It can't be any other way, given the nature of the console market.
What I'm trying to say is that if you really want cutting-edge graphics, play on PC with the latest hardware. Otherwise, try to focus on what a console can achieve with its technical limitations. The Wii U can be an extremely interesting product, if developers are willing to explore the controller's potential to its fullest.
I agree with you, but you have to remember that buying a chip for personal use and buying in bulk are considerably different. Speaking out of my ass now: whereas a single GTX 690 costs you $1,000, someone who buys a million GTX 690s might only pay $250 per unit. And they're dealing with chips, not manufactured hardware, so they get additional "discounts" on that.
The 'game not pausing while you look at your inventory' mechanic was implemented in Demon's/Dark Souls. It definitely added to the tension of the game, but it seemed unfair: you can't see the screen at all with the inventory up, so there's no chance to avoid an incoming attack if you didn't notice a lurking monster. Having it on a separate screen avoids this problem while still adding some tension to the game overall. Looks good to me.
@niamahai: I don't necessarily think that's true. They might be getting to market first for the holiday season dominance, but I don't think Nintendo has any delusions of grandeur regarding their console as the go-to platform for the "hardcore" gamer. Microsoft's new console will hit and that market will continue to belong to them.
@skrutop: Nintendo doesn't play that game, and announced ahead of time that they would not do that. They'll wait a while, mull over reactions, etc., then release a date and price.
How much hard drive space will it have? Will it come with an HDMI cable? Will the boxed games be available digitally? How will two controllers work when there are only the big controller vs remotes type of gameplay in a particular game and... oh, nevermind.
I guess we'll have to settle for "Assassin's Creed III looks like itself (there's a philosophical conundrum)" and "this next-gen console appears (yeah, there's some confidence) to be better in some areas (which ones, exactly?) than last gen's." Well, I would hope so.
@Manhattan_Project said:
Yet another game that makes me look away from the main screen to manage my inventory. WHY? This sounds terrible, right? Someone tell me I'm not crazy!
Well, in most games bringing up the inventory stops the game, which "in theory" would allow you to look down at the screen in your hands. What we should all hope for is the use of the touch screen in ALL inventories. That way you can drag and drop, or drag to combine items. That at least would be a SUITABLE use of the technology. Another suitable use of the tech would be bringing up the inventory and shaking the screen to sort it by weight; shake it again and the inventory is sorted by value; shake it again and it is sorted by type or number of items.
There are smart ways to use the tech, but Nintendo needs to be forceful in setting some standards that MUST be possible. Any game with an inventory should have a way to view it on the Wii U pad, and the inventory controls on that pad should be the same across all games. The same goes for maps: if there is a map in the game, there needs to be a way to view and use it on the pad with THE SAME controls across all games. That way, when you are playing, you know how it works.
Listen, I think the Wii U has low-ball specs and is a huge mistake. However, there are some ideas in there that, IF used well, could become STANDARDS going forward for how to use a secondary touch screen in a game. Nobody believes more than I do that the Wii U is a catastrophe... but it could be a car crash where some good might leak out of the wreckage.
@FMinus: That is not how it works. Console makers buy "the plans" for graphics chips, and then it is entirely up to them to manufacture them. While they don't pay for each additional card, they do pay a hefty amount for the plans, and there are extra costs involved given that they must tool their own factories (or pay someone to do it), etc. All said and done, the actual cost is virtually unknown, but it's not going to be much cheaper.
@Ominou5: Alas, this is not what I said. Most PCs out there, including ones used for gaming but not built for gaming, will not run games at 1080p on high. This is a fact. "Gaming PCs," i.e. those built to game, may be a different story. Don't kid yourself into thinking that even most gaming machines run 690s, though. There are still tons of cards like AMD 4900s and nVidia 2xx out there.
Everyone else... just remember that Nintendo doesn't like to lose money on hardware. It's one of the reasons the Wii was so successful. They also don't care for expensive, bad-selling hardware. I'd say expect decent hardware and an attractive price point for the Wii U.
It's hard for me to be excited about this. For the last decade, Nintendo has seemed to rely more and more heavily on gimmicks, whether they be two screens, 3D, or motion control. Game development seems to take a back seat and it shows in their game catalog. Five or so stellar Nintendo developed games, a handful of good third parties, and a heap of terrible ports and shitty movie licensed games does not make for a worthwhile purchase in my opinion.
I think Nintendo should put out the most powerful system they can muster, and make it with a smart, intuitive architecture. Make developers WANT to make games for your platform. Isn't that what it's all about at the end of the day? Games? Who's going to want to make games for this thing when there are likely 2 other powerful, game centric consoles in the wings?
@Manhattan_Project said:
Yet another game that makes me look away from the main screen to manage my inventory. WHY? This sounds terrible, right? Someone tell me I'm not crazy!
It might actually make sense in some cases, like crafting, or survival horror, where you're forced to decide whether you want to divide your attention when monsters are about.
Still, those are very limited use cases, and I hope it's used more like what a secondary display should be used for, or better, as a secondary vision mode you peek at from time to time, like Detective Vision.
I'd been holding out hope we might see a 60-frames-per-second Assassin's Creed III on Wii U, but apparently not, at least not in the first wave of games coming to the platform.
Of course, the second wave (or, if we're charitable, third) will have to compete with the PS4 and Xbox 3, so that point is somewhat moot.
@FMinus said:
I agree with you, but you have to remember that buying a chip for personal use or buying bulk is considerably different. Speaking out of my ass now; Where's a single GTX 690 costs you $1000, someone who buys one million of GTX 690s might only pay $250 per unit. And they're dealing with chips not manufactured hardware, so they get additional "discounts" on that.
Yes, you are talking out of your ass there. You have to remember that when the PS3 came out, they were still LOSING money on every PS3 sold at 600 dollars, and that was using a trimmed-down GeForce 7800, closer to a 7600. By the time the PS3 came out, the 8000 series was already available, and any PC gamer should remember how huge a performance jump that was.
If the PS4 is coming out next year they'll never be able to put a top-of-the-line chip in there and will most likely go with one that's a generation or two behind. So it might be something similar to a Radeon HD 7950 if we're lucky. I think whoever said that the next console is going to be able to display 4K resolutions is also talking out of his ass. That's most likely just a theoretical limit, running at sub-20 FPS.
@paulunga said:
@FMinus said:
I agree with you, but you have to remember that buying a chip for personal use and buying in bulk are considerably different. Speaking out of my ass now: whereas a single GTX 690 costs you $1,000, someone who buys a million GTX 690s might only pay $250 per unit. And they're dealing with chips, not manufactured hardware, so they get additional "discounts" on that.
Yes, you are talking out of your ass there. You have to remember that when the PS3 came out, they were still LOSING money on every PS3 sold at 600 dollars, and that was using a trimmed-down GeForce 7800, closer to a 7600. By the time the PS3 came out, the 8000 series was already available, and any PC gamer should remember how huge a performance jump that was.
If the PS4 is coming out next year they'll never be able to put a top-of-the-line chip in there and will most likely go with one that's a generation or two behind. So it might be something similar to a Radeon HD 7950 if we're lucky. I think whoever said that the next console is going to be able to display 4K resolutions is also talking out of his ass. That's most likely just a theoretical limit, running at sub-20 FPS.
That had nothing to do with the GPU on the PS3, but with the CPU and the BR drive, which at that point cost a fortune; even a standalone BR player cost close to a PS3 at the time.