Nin10do's Theory

In the same forum as the discussion of John Lucas’ idea of how the Wii U will dominate is another theory. Nin10do focuses more on the architecture of the system rather than its style or feel. His theory is that the Wii U is so efficient that it is more powerful than the other two competing systems.

Simply put, the idea is that the Wii U's entire structure does much more with much less. With its GPGPU, PowerPC architecture, and the help of 38MB of eDRAM (plus some eSRAM), the Wii U is designed very differently from the PlayStation 4 and Xbox One, which have a relatively wasteful architecture that is easy to develop for but requires far more resources.


This means that despite having less RAM, fewer CPU cores, and a slower clock speed, the Wii U should be capable of pushing out much more than the PS4 and XBO can. This results in better overall performance (seen in the locked 60FPS frame rate on all first-party Wii U titles) and, given time, better visuals.


Frames per second, or FPS, is a measure of how many complete frames can be rendered and shown every second. I asked him when he began to see something wrong.


Various reports by EA executives, among other companies and devs (id, Epic, etc.), stating that the system was not as powerful as the 360 or the PS3, and that engines weren't running on the system, followed by later reports that they were. Before the launch of the system there were devs praising the hardware and the tablet (Gabe Newell), and others saying the opposite: that it was weak and the tablet was more of a barrier and a nuisance.


I started the 8th generation with an open mindset and a clean slate, as I do every generation; what has happened before matters little. The PS4 reveal intrigued me and I was curious about Sony's plans, while remaining cautious, remembering previous Sony launches. The Xbox One conference didn't engage me, obviously, but I didn't have too much of an issue with it until after the fact, when I realized the policies were confirmed. That left a sour taste in my mouth. When E3 came I was leaning more towards the PS4, before really looking into the Wii U.


Well, performance is key and it's already starting to show. Of course, with stylized graphics it isn't obvious yet. I think Bayonetta will be the first clear display of its capabilities, by being irrefutably graphically intensive while delivering great performance. Mario Kart is sure to get people curious about its potential too, running just as well with four players in split screen as with one. Q3-Q4 2014 should be around the time it's clear there's a difference, with the release of X and other third parties that have been working with the Wii U from the start.


In the post laying out his theory, he explains what he saw at E3, beginning with a video of Killer Instinct for the Xbox One.

So if you watched all the way you'll see what was mentioned; if not, I'll summarize. It quotes a KI developer talking about the game, and when asked about the resolution he says it technically runs at 720p 60FPS and anything that drops the framerate or resolution is removed or optimized. This is a launch downloadable title. Even if it were retail, whatever, they are hitting walls. He goes on to mention what I've been saying for the longest time: there's no progression with these systems. Bear in mind this dude is unbiased and knows his stuff. More so than I can say. He also mentions that it's incredibly bad that they are hitting walls before launch and the results aren't that great, and that maybe the Wii U isn't as far behind as everyone expects, despite the RAM.


After this he goes into the central processing units of the PS4 and the Xbox One. He talks about how the systems' CPUs are not built for advanced gaming needs. Nin10do was confused, and asked a friend to explain what was going on. His friend explained that the Wii U's cores are more efficient than the other systems', and that the efficiency gap is even bigger than the raw numbers suggest. The Wii U runs on a PowerPC core, and the others run on x86 cores. If you are wondering what this all means, his friend explains.


The x86 instruction code was designed back in the 1970s when RAM was very limited, 1024 to something like 4096 bytes. It used variable-length instruction code to be space-efficient in RAM. This was fine when the CPU was 8-bit. Going into 16-bit and 32-bit, the RAM size moved up to 65,536 bytes and then 2 million bytes. Further, the first chip did not have expansion libraries, but was restricted to a library of 127 functions on the CPU, because one bit in the first byte on x86 (the operation to perform) was 0 if the instruction was in the primary 127 functions and 1 if it wasn't. When the 286 came out, extended functions required a second byte to be fetched for the operation code.
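
To make the variable-length point concrete, here is a small illustrative sketch in C. The encoding below is a made-up toy format, not real x86; it only shows the general cost his friend is describing: the decoder has to inspect the first byte before it even knows how many more bytes to fetch.

    #include <stdint.h>
    #include <stdio.h>

    /* Toy variable-length format (NOT real x86 encoding): if the top bit
     * of the first byte is 0, the instruction is a single primary opcode
     * byte; if it is 1, the decoder must fetch a second byte for the
     * extended opcode. An instruction's length is only known after
     * looking at it. */
    static size_t decode_one(const uint8_t *code, size_t pos)
    {
        uint8_t first = code[pos];

        if ((first & 0x80) == 0) {
            printf("primary opcode 0x%02X (1 byte)\n", first);
            return pos + 1;
        }

        uint8_t second = code[pos + 1];   /* extra fetch for extended opcodes */
        printf("extended opcode 0x%02X%02X (2 bytes)\n", first, second);
        return pos + 2;
    }

    int main(void)
    {
        /* A short pretend instruction stream. */
        const uint8_t code[] = { 0x12, 0x90, 0x0F, 0x7C, 0xA1, 0x3B };
        size_t pos = 0;

        while (pos < sizeof code)
            pos = decode_one(code, pos);

        return 0;
    }

Real x86 decoding is far messier than this toy (prefixes, ModRM bytes, immediates of varying sizes), which is exactly the kind of overhead being argued about here.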


Intel never intended for x86 to live past the 1990s. They developed the Itanium instruction code to replace it. When I get to the megahertz myth, I will explain that. x86 was designed for small systems, which today are appliances. It was efficient in early models, but not for the advanced systems of today.




IBM developed POWERPC in the 1990s, two decades after x86. Systems had evolved, and this newer architecture takes advantage of that. The reason I put POWERPC in caps here is that it really is an acronym. To be short, POWERPC is a RISC design, Reduced Instruction Set Computer, built for performance.
For POWERPC, the instruction code is fixed-width, 32-bit for most POWERPC CPUs, like the Wii U's. This takes fewer clock cycles to fetch instructions and requires less memory. Overall, it requires less power from the CPU and runs at cooler temperatures. This leaves room for higher clock speeds and allows multiple instruction operations per clock cycle.
Another point about POWERPC CPUs: many are not backwards compatible, because IBM's engineers may adjust the instruction library to be more efficient for the CPU to process.
Overall, POWERPC is more powerful than x86 in current systems by a long shot, because it does not require as many resources per instruction to perform operations.
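
For contrast, here is an equally rough sketch of fixed-width decoding. The field layout below is my own simplified illustration, only loosely in the spirit of a 32-bit RISC word rather than the actual PowerPC encoding: every instruction is exactly four bytes, so the fetch size never changes and the fields fall out with a few shifts and masks.

    #include <stdint.h>
    #include <stdio.h>

    /* Toy fixed-width format: a 6-bit opcode, two 5-bit register fields,
     * and a 16-bit immediate. Every instruction is the same size, so the
     * fetch and the field extraction are identical for all of them. */
    struct insn {
        unsigned opcode;     /* bits 31..26 */
        unsigned reg_dst;    /* bits 25..21 */
        unsigned reg_src;    /* bits 20..16 */
        unsigned immediate;  /* bits 15..0  */
    };

    static struct insn decode(uint32_t word)
    {
        struct insn out;
        out.opcode    = (word >> 26) & 0x3F;
        out.reg_dst   = (word >> 21) & 0x1F;
        out.reg_src   = (word >> 16) & 0x1F;
        out.immediate =  word        & 0xFFFF;
        return out;
    }

    int main(void)
    {
        /* Arbitrary example words, not real instructions. */
        const uint32_t program[] = { 0x38600005u, 0x3863002Au };

        for (size_t i = 0; i < sizeof program / sizeof program[0]; i++) {
            struct insn d = decode(program[i]);
            printf("opcode=%u dst=r%u src=r%u imm=%u\n",
                   d.opcode, d.reg_dst, d.reg_src, d.immediate);
        }
        return 0;
    }

This is the design difference the quote is pointing at: a uniform fetch and decode path instead of a variable one.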
When his friend was given the theory, he said:


The Wii U follows the design of the GameCube. It is not as hard to develop for as claimed.


x86 is very inefficient today. It was efficient back when it was made for cheap 4-bit and 8-bit systems.


The Wii U uses less RAM and loses fewer cycles than the XB1 and PS4. Smaller is better, as clock rate is governed by the speed of light.


In other words, the Wii U runs faster and better than the more advanced PS4 and Xbox One, while needing less RAM and less power.
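
As a rough aside on the "clock rate is governed by the speed of light" remark, here is a quick back-of-the-envelope calculation of my own (the clock figures are commonly cited ones, not numbers from the original post): even in a vacuum, a signal can travel at most a few tens of centimeters in one clock cycle, and on real silicon considerably less, which is part of why compact designs and on-die memory like eDRAM matter.

    #include <stdio.h>

    /* Back-of-the-envelope upper bound: how far light travels in one
     * clock cycle at a few example clock rates. Signals in real chips
     * propagate well below the speed of light, so practical distances
     * are shorter still. */
    int main(void)
    {
        const double c = 299792458.0;  /* speed of light in m/s */
        /* Roughly: commonly cited Wii U CPU clock, PS4/XBO CPU clock,
         * and a last-gen 3.2 GHz clock, for comparison. */
        const double clocks_hz[] = { 1.24e9, 1.6e9, 3.2e9 };

        for (int i = 0; i < 3; i++) {
            double cm_per_cycle = (c / clocks_hz[i]) * 100.0;
            printf("%.2f GHz -> at most %.1f cm per cycle\n",
                   clocks_hz[i] / 1e9, cm_per_cycle);
        }
        return 0;
    }

The numbers are only an upper bound; the point is simply that at gigahertz speeds, physical distance on the die and board starts to matter.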


Nin10do then explains that the games on display seem to be hitting walls.


All of this supports my admittedly half-baked theory; well, it isn't half baked anymore. Of course my word isn't everything, just watch that video I linked: the caps are being hit on the PS4/XBO. Bayonetta, the first obviously graphically intensive game coming out of Nintendo, is going to be at 1080p 60FPS. There's no dicking around on that one. It's not a racer or a fighter or something with simple textures/models. It's a full-blown game with great performance at year 2. With the consoles shipping at their max potential, what excuse is there for Killer Instinct? For Knack at 720p, Killzone at 30FPS with dips, Ryse looking like a buggy mess with dips as well, AC4 at 30FPS aiming for 1080p? As stated before, graphics don't matter too much, neither does power in general, but let's put that aside since you are all so insistent. What does this show? This is year 2; hell, I wouldn't even say that, since the devs had pushbacks due to understaffing. Even then, what's Mario Kart doing at just over a year in: 1080p 60FPS with 2-player split screen ready at E3, and with them aiming for 4-player split screen at 60FPS native 1080p. Battlefield is struggling to do 720p with unsatisfactory visuals. What say you of this?


He finally gives his theory.


Here's my theory, one that will never be confirmed of course, because this information will never be public, but regardless. My thought is that Sony and MS saw Nintendo put their system on the market, and they got flustered and rushed to designing a new system. 2011 was right when they started profiting on the PS3/360 with no signs of sales faltering; had it gone their way they would have kept it up for another 3-5 years. It didn't go their way, and 3 months after that inspiring speech about continuing to make the impossible possible, Nintendo did just that, revealing the Wii U and throwing their competition a curveball out of nowhere. Sure, Cerny said they've been planning it since '08, but I don't buy that, and there's a difference between the conceptual stage and actually starting R&D; something tells me that bit began for both MS and Sony in 2011. They rushed so as not to get beaten to the market badly, and both designed something easy for them, a laptop/budget PC with their firmware, and sent it through testing. These consoles are clearly rushed, as has been the software, looking at their displays this year objectively. The surprise announcements 6 months before release, the last-second hardware changes (not policies), nothing ready for E3. News of Xbox One cases MELTING! and them having to underclock the CPU, which is why they've been slowly increasing it bit by bit.


He goes on to say that the displays of amazing graphics were prerendered, or CG. This means they were made like a Pixar movie, where a computer slowly renders each scene frame by frame and then outputs the video. The hardware was not producing those images in real time. In fact, this is an old trick that was also pulled for the PS3, and even the PS2.

I asked him how he did his research. He pointed out that the research was continuing; for example, Shin’en had made very kind comments about the Wii U, saying the architecture was very efficient and capable. Then he described his usual research techniques.


Finding articles, tracking down the sources, figuring out the specs of these systems. Through all of this I've found the same numbers everyone mentions, but then it comes down to finding out what the numbers mean. I've always been tech savvy and I'm moving towards learning about what makes computers tick; CPUs, GPUs, and RAM are a major part of that. I've got a buddy that knows a lot more than me when it comes to the bare bones of CPUs, and I often pose questions to get a broader understanding of certain things. Basically, I get led in the right direction to find the answers, which helps give perspective so I can figure out which questions to ask and where to go searching for the answers. Connecting all the dots is fun :D


When I asked about his knowledge of the subject, he stated:


87/potato: Baked. No Fixings.


It's just something that's always clicked with me, and I learn it fast. I've been operating computers since I was 4 years old; hell, I taught my parents how to operate them, if that says anything.


It's in my nature.

He is expected to have more information to show any day now, but for now this is what we have. When I asked what would be said, he gave me this link.
