By Pepsiman
The first thing to note before actually going inside the Wii itself and examining its specifications is that the equipment already exists for it to output HD resolutions. Naturally, I'm referring to the component cables. These do exist for the system, and although they're only officially used for progressive scan output, there's nothing preventing them from otherwise being ordinary HD cables. So out of the box, there's no need to create homemade cables to hack together the capability for the Wii to output HD resolutions. The necessary cables can already be bought; they merely aren't being used to their fullest potential as of now.
Some gold-plated prongs attached to wires don't make a full argument, though, so now let's take a gander at what's actually powering the Wii underneath. The actual specifications for the system have never been publicly released in their entirety by Nintendo, ATI, or any other company involved in the system's creation. Nevertheless, enough has either been leaked or hacked to give a decent understanding of the console's inner workings. With that said, let's get a pretty interesting revelation out of the way: the Wii's specifications closely resemble those of the original Xbox, both in terms of the CPU and the GPU. There are some differences, such as the Wii's CPU clock speed being lower by just a handful of megahertz, but other aspects of the system do enough to compensate and make Microsoft's console a really good analog for it. This comparison matters because it grounds the claim, for reasons outlined below, that the Wii is already capable of HD.
Much like the Wii, the first Xbox wasn't heavily marketed as an HD-capable machine, but it too was perfectly capable of it in theory. The main difference between the two was that Microsoft actually acknowledged the fact, and developers weren't prohibited from making their games output at those resolutions if they pleased. You might recall that, using the odd checkbox system on the back of game cases, some games would indicate features such as Xbox Live connectivity, LAN capabilities, etc. Some also marked the box for 720p output. The games which did indicate this, such as Soul Calibur II, may not have looked that much prettier because of it, but what needs to be kept in mind is that 720p, 1080p, etc. are solely resolutions, meaning it's purely the number of pixels the system has to work with that's upped, not the polygon count. So naturally, you wouldn't suddenly be getting games on the Wii which superficially resemble games like Crysis, but you could in theory find ones which could at least output the same resolutions that the GPU abuser can.
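Since the difference between SD and HD output comes down to pixel count alone, the gap can be made concrete with a bit of arithmetic. Here's a minimal sketch comparing the standard resolutions mentioned above (the figures are the nominal broadcast resolutions; a console's actual render target can vary by game):

```python
# Resolution only changes how many pixels the GPU must fill per frame,
# not the polygon budget available for the scene itself.
resolutions = {
    "480i/480p (SD)": (640, 480),
    "720p (HD)": (1280, 720),
    "1080i (HD)": (1920, 1080),
}

sd_pixels = 640 * 480
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / sd_pixels:.2f}x SD)")
```

The jump from SD to 720p roughly triples the pixel count, and 1080i nearly septuples it, which is exactly why the checkbox on an Xbox case promised a sharper image rather than more detailed geometry.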
But what does this actually mean in practice? Put simply, the Wii is probably capable of a lot more than it's given credit for today. Again, the same was more or less the case with Microsoft's first system, but that one actually has some games to prove it, so let's keep using it for comparison. Lost Levels is a community of gamers mostly devoted to covering cancelled games and, whenever possible, uploading dumps of prototypes that the staff have acquired so regular users can poke and prod them with emulators, debuggers, and the like. Although the majority of its members are programming laymen, there is a small contingent who are also very familiar with the development side of games. This is especially noticeable in this thread, which discusses which games really pushed the hardware of their systems. Some suggestions are debunked, such as Donkey Kong on the SNES, but the main post to look at is one by ProgrammingAce, a user who is intimately familiar with Sony and Microsoft consoles, including this generation's batch. He devotes part of his post to discussing games which pushed the Xbox hardware really far. Most surprising of all is when he discusses games such as Ninja Gaiden Black and Pariah, pointing out that they could have already "passed certification" for 360 games in the state in which they were released. Passing certification, in short, means that the games would at least meet Microsoft's internal standards for letting developers develop and publish games on the 360. These standards are mostly related to technological usage, as well as overall stability, and those are what matter the most anyway, especially when considering that 360 games are supposed to natively output in 720p. (We'll save Halo 3 and other such controversies for another day.)
It was already well known that games like Ninja Gaiden Black pushed the Xbox hardware significantly, but learning that they did so to the extent that an unimproved 360 port could, in theory, pass muster with Microsoft shows just how much the original system, and, by extension, the Wii, is really capable of doing.
Now I recognize that the biggest argument to be made at this point against Wii HD already "existing" is probably that making games for the Wii is a different proposition and isn't like working on the original Xbox. This is naturally true; despite the similarities in specifications, the quirks of the actual hardware mean that developing for the Wii is going to be different from developing for the Xbox. To defuse this point, though, I'd like to state that the real takeaway from all this is that the Wii can still be manipulated to output in HD resolutions; it's just that the methods for doing so have to be different. This is true even in today's multiplatform releases. A game like Mirror's Edge may look and play the same on the 360 and PS3, but the differences in hardware mean that the developer has to use different tricks to achieve the same effect across consoles. What matters is that the end result is overall the same, and that's why the Wii has been repeatedly compared to the Xbox throughout this post. The raw specifications are similar, and getting the Wii to also output HD natively is a matter of working around its technological behaviors. Getting from point A to point B may entail different journeys for the two systems, but point B is still going to be point B for both consoles. Different means simply have to be used to achieve the same end.
If that's the case, then why hasn't the Wii shown such capabilities off by now? While it could very possibly be due to Nintendo preventing developers from working in HD simply because of demographic issues (i.e., most Wii owners using SD televisions), the bigger culprit is probably developer fears. Despite the fact that the Wii is indeed able to output games in HD resolutions, the worries are still understandable. It all boils down to how much more data the Wii has to process on the fly if it's made to go HD. When you increase the number of pixels that have to be dealt with in the transition from rendering the polygons to producing a workable, two-dimensional image for televisions, the sheer size of the data is naturally going to increase. The problem is that if it's not dealt with properly, the resolution increase can make the system suffer significantly. After all, Gran Turismo 4 proved that even the PS2, a system less powerful than the Wii, is technically capable of outputting 1080i; a stable frame rate just gets thrown out the window at that point. If the developer goes even further and makes higher resolution textures for HD modes instead of relying merely on upscaling, the problem becomes all the more severe. Indeed, the main issue is making sure that, if the Wii as it currently exists is ever made to go HD, the added data load is handled well enough that the system doesn't come screeching to a halt. But even that can be overcome so long as the right techniques are employed and made stable. After all, the Xbox was the original console host for games such as Chronicles of Riddick and Doom 3, games which many people predicted could never plausibly work on the available non-PC hardware at the time. But they still did, thanks to some clever work on the part of the developers.
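The added data load is easy to estimate. Here's a rough sketch of the per-frame and per-second framebuffer traffic implied by the jump from 480p to 720p, assuming a 32-bit color framebuffer and 60 fps purely for illustration (real consoles differ in pixel format, memory layout, and refresh behavior):

```python
# Bytes the system must write for each completed frame, assuming
# 4 bytes per pixel (32-bit color); a hypothetical round number.
def frame_bytes(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel

for name, (w, h) in [("480p", (640, 480)), ("720p", (1280, 720))]:
    fb = frame_bytes(w, h)
    print(f"{name}: {fb / 2**20:.1f} MiB per frame, "
          f"{fb * 60 / 2**20:.0f} MiB/s at 60 fps")
```

Even under these simplified assumptions, the 720p framebuffer alone triples the write traffic before textures, depth buffers, or anti-aliasing enter the picture, which is exactly the kind of load a developer would worry about on hardware tuned for SD output.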
The Wii isn't inherently handicapped in a way that prevents the same thing from happening, either; it's only a matter of finding people bold enough to push it that far.
The notion that the Wii needs an actual hardware revision in order to output HD resolutions is one I find to be ignorant. Researching the specifications and making real-world comparisons to similar systems such as the Xbox shows that it's a much more plausible notion than commonly perceived. The hardware may not be as conducive to doing such things as one would hope, but it already has more than it needs to do so out of the box. The main hurdles are therefore related to the demographics playing the system and developer motivation. Regarding the former, while component cables exist for the system officially, they are neither bundled with the system nor widely known. Clearly the intention on Nintendo's part is to remain SD for at least one more generation while waiting for HD setups to penetrate more households. But again, it's the latter hurdle which is more damning in the end. The Wii was not deliberately designed to consistently output higher resolutions, so while it's still possible, many developers are probably hesitant to push it that far. HD remains uncharted water for the system; if it's not navigated correctly, things could go awry very easily. It's an understandable, albeit disappointing, predicament. With that said, if Nintendo does come out with a version of the Wii that is more openly capable of handling HD, it will be to ease the developer frustrations which exist now, not because the current hardware is inherently unable to do it at all. Wii HD is here today, and a precedent exists with other now-underpowered systems such as the Xbox. It just requires a bit more magic to make it happen than it does on the other systems today.