Re: Game of the year?
psz;335750 said:
The Original XBox was a Celeron3 733Mhz CPU, 64MB shared ram, a GeForce 3/4, DVD, NIC, 8gb Hard Drive, and retailed 6 years ago at $300 (ended up at $150).
Technically, the Xbox's GPU, the NV2A, was a GeForce 3, not a 4: it only supported Shader Model 1.1, as did the GeForce 3, while the GeForce 4 Ti supported up to 1.3 (the last shader model of the Direct3D 8.0 line). It was, however, upgraded to have two Transform & Lighting units, the same number as the GeForce 4 Ti. Just as a clarification.
psz;335750 said:
1) The PowerPC 729Mhz CPU is, hands-down, considerably more powerful than the 733Mhz Celeron (but still inexpensive compared to "modern" CPUs).
I think that this is what Nintendo was aiming for: a very high performance-to-cost ratio. It's been demonstrated time and again that a top-end CPU isn't necessary for high-performance gaming.
psz;335750 said:
2) 88MiB GDDR3 shared ram (split into Internal and External). More Ram. Faster Ram. Cheap.
That figure is incorrect; it's 64MB, as memory chips only come in sizes equal to powers of 2. The other 24MB is actually eDRAM mounted inside the "Hollywood" GPU package itself, as the "Napa" daughter die, similar to the Xbox 360's GPU, except that the Wii's GPU can use that memory as general-purpose VRAM rather than just as a tile-rendering backend.
As for the 64MB of GDDR3, this can be pretty easily confirmed: a quick glance at the Wii's inner guts shows a single memory chip with much the same markings as those found in the PS3 and Xbox 360. In other words, it's a 512-megabit GDDR3 module, a 90nm-fabricated part rated for timings as tight as 1.4 nanoseconds (allowing a clock of roughly 700MHz, or technically an effective "DDR" rate of 1400MHz).
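If anyone wants to check that arithmetic themselves, here's a minimal sketch; the only inputs are the figures above (a single 512-megabit chip rated at 1.4ns), and the variable names are just mine for illustration:

```python
# Back-of-the-envelope check of the memory figures above.
# Assumed inputs: one 512-megabit GDDR3 chip with a 1.4 ns cycle time.

MODULE_MEGABITS = 512
cycle_time_ns = 1.4

megabytes = MODULE_MEGABITS / 8     # 512 Mbit / 8 bits per byte = 64 MB
clock_mhz = 1000 / cycle_time_ns    # 1 cycle per 1.4 ns ~= 714 MHz
effective_mhz = 2 * clock_mhz       # GDDR3 is double data rate

print(f"{megabytes:.0f} MB, ~{clock_mhz:.0f} MHz core, ~{effective_mhz:.0f} MHz effective")
# -> 64 MB, ~714 MHz core, ~1429 MHz effective (rounded to 700/1400 above)
```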
psz;335750 said:
3) Now here's where it gets sticky: People claim that the Wii GPU is just the GC GPU with a better clock. This is *OBVIOUSLY* not the case. I'd say it ranks similar to the Xbox's GPU when it first came out: Lower end of the MODERN cards. Since it's ATI, assume similar specs to the X1x00 series. OBVIOUSLY an improvement of the GC and the XBox (more pipelines, shaders, etc). This may also be off. It may be a non-HD version of an HD 2x00 series. Doubtful, though, due to the costs.
Well, you can compare it to ATi/AMD's other GPUs of the time, namely those made on a 90nm process. Going by die area alone, it's clearly not the same GPU as the GameCube's: it would need around 120-175% more transistors to take up that much silicon.
Obviously, I'd say more hardware was added. It's worth noting that the GameCube's GPU did not actually have pixel shader hardware; it just had hardware T&L (which only it, the Xbox, and the Nintendo 64 had up to that point).
As for the "HD" bit, I'd remind everyone that it's just a buzzword: the GPU itself cannot dictate the maximum resolution; that is purely determined by the maximum framebuffer size (which, in the Wii, is apparently around 1MB, or large enough for anything up to 480p widescreen). I doubt it has any connection to the Radeon HD 2x00 series: those are much newer parts.
Rather, I'd judge that, like the Xenos in the Xbox 360, it was based on the R5xx architecture. Amongst the 90nm chips in that lineup, the Wii's MEASURABLE specifications (both die size and TDP range) seem closest to the RV530, the chip used in the Radeon X1600 series of cards. That chip comes with the equivalent of 4 pixel pipelines (4 render output units and 4 texturing units, the same numbers as in the GameCube) as well as 32 shader ALUs (stream processors) organized into 16 pixel shader units. To stay within a Thermal Design Power (TDP) envelope of around 10 watts, the chip would likely have to run somewhere around or above 300MHz; for comparison, the RV530 in the Radeon X1600 Pro runs at 500MHz, and the GPU itself consumes perhaps 15-20 watts.
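To show where a clock figure like that could come from, here's a rough sketch of the scaling logic. Big assumption on my part: that power scales roughly linearly with clock at a fixed voltage and process, which is only a first-order approximation:

```python
# Rough sanity check on the ~300MHz clock estimate above.
# Assumes power scales roughly linearly with clock (first-order only).

RV530_CLOCK_MHZ = 500       # Radeon X1600 Pro core clock
RV530_POWER_W = (15, 20)    # the 15-20 W guess above, for the chip alone
WII_GPU_BUDGET_W = 10       # the ~10 W envelope estimate above

for watts in RV530_POWER_W:
    est_clock = RV530_CLOCK_MHZ * WII_GPU_BUDGET_W / watts
    print(f"at a {watts} W baseline -> ~{est_clock:.0f} MHz")
# -> ~333 MHz and ~250 MHz, bracketing the ~300MHz ballpark
```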
psz;335750 said:
Also: The XBox360 and Wii both use IBM PowerPCs based on the same "generation". The 360 has the higher-end chip, but you can expect the CPUs to have at least SOME similarities, in terms of commands and operations.
EDIT: Had the Wii CPU at 724, fixed it to be 729
Actually, I'd say both figures are suspect: there is no official listing of the CPU's clock speed, and the *ONLY* source out there is IGN, which is fishy, to say the least.
My estimate, to be honest, comparing it to other PowerPC CPUs, is that it probably runs more in the 1.2-1.5GHz range: even with its small size and efficient 90nm Silicon-on-Insulator fabrication, it still consumes a whole 7 watts or so.
As far as performance comparisons go, it could be presumed that the Wii's CPU is an extension of the architecture used in the GameCube's. It can't be more than twice as complex given its size (my estimate is more like 60-75% more transistors), which means it would effectively have to be running on mostly the same design, one that provided a rather strong performance-per-clock ratio.
By comparison, the Xbox 360's "Xenon" CPU has a particularly low performance-per-clock ratio; comments from the likes of John Carmack place each core as being half as efficient, per clock cycle, as a Pentium 4, which in turn was about 60% as efficient per core as a Pentium III (from whose design the Xbox's CPU was taken). So each core would equate to a Pentium III at 960MHz or so: more potent than the P3-based Celeron in the Xbox, but not by a WHOLE lot, and not too much beyond the GameCube's "Gekko" at 485MHz (which, due to efficient instructions, a much larger cache, and a FAR shorter pipeline, achieved vastly better per-clock performance than the Xbox's Celeron).
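And here's that per-clock chain worked out, using the Xenon's well-known 3.2GHz clock; the two efficiency factors are the rough figures cited above (via the Carmack comments), not measurements:

```python
# The per-clock comparison chain above, worked out.
# Efficiency factors are the rough figures from the post, not benchmarks.

XENON_CLOCK_MHZ = 3200    # Xbox 360 "Xenon" clock, per core
XENON_VS_P4 = 0.5         # half as efficient per clock as a Pentium 4
P4_VS_P3 = 0.6            # Pentium 4 at ~60% of a Pentium III per clock

p3_equivalent = XENON_CLOCK_MHZ * XENON_VS_P4 * P4_VS_P3
print(f"~{p3_equivalent:.0f} MHz Pentium III equivalent per core")
# -> ~960 MHz, matching the figure above
```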