Snickersnack: That's a lot of weak cores. Interesting.
doady: It's only 1.6 GHz, but clock speed isn't a reliable indicator of processing power. For example, the Wii processor was clocked at only 729 MHz, yet it was only slightly less powerful than a single core of the Xbox 360 processor clocked at 3.2 GHz, probably because it has out-of-order execution. As PC gamers, we know that AMD processors have higher clock speeds than Intel processors, so does that mean AMD processors are more powerful? It's the architecture that counts, not clock speed.
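To put rough numbers on why architecture beats raw clock, here's a toy model (Python; the IPC figures are made up purely for illustration, not measured values):

```python
# Rough model: throughput ~ clock speed x average instructions per clock (IPC).
# The IPC values below are hypothetical, chosen only to illustrate the idea.

def throughput_mips(clock_mhz: float, ipc: float) -> float:
    """Approximate throughput in millions of instructions per second."""
    return clock_mhz * ipc

# Wii's Broadway: low clock, but out-of-order execution keeps the pipeline fed.
wii = throughput_mips(clock_mhz=729, ipc=2.0)           # hypothetical IPC

# One Xbox 360 Xenon core: high clock, but in-order execution stalls more.
xenon_core = throughput_mips(clock_mhz=3200, ipc=0.5)   # hypothetical IPC

print(f"Wii: ~{wii:.0f} MIPS, Xenon core: ~{xenon_core:.0f} MIPS")
# With these illustrative numbers, the 729 MHz chip lands in the same
# ballpark as the 3.2 GHz core despite roughly a quarter of the clock speed.
```

The point isn't the exact figures, just that frequency alone tells you very little.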
StingingVelvet: 4GB is still plenty for PC gaming today; I have never felt a need to upgrade. Having to jump right from 4 to something like 16 (a PC always needs roughly twice the RAM of a console) is pretty significant.
It's cheap though, so whatever.
doady: Very few PC games actually use more than 4GB, so 16GB of RAM on a console would be an utter waste. 16GB is for things like video encoding. You don't need that much RAM for gaming.
Remember also that when making these comparisons about CPU/GPU/RAM or whatever, console games use the hardware much more efficiently than PC games do, thanks to low-level access. You don't need the same raw power as a PC to begin with.
Considering the way they're going, the next Xbox will last 8 years. Over the past 8 years, PCs went from 1GB to 16GB. Trust me, 16GB is NOT overkill. The whole point is that consoles have ALWAYS been held back by memory, and this same lame argument has been used a billion times in the past. When the PS2 was released, 32MB was also still used in many PCs, and look what it meant: GTA 3 had to be nerfed massively because of memory constraints. Considering it will be another few years before the thing is even released, I stand by what I said: 16GB would be the better choice.
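A quick back-of-the-envelope check on that growth rate (a toy Python calc, assuming typical PC RAM really did go from 1GB to 16GB over those 8 years):

```python
import math

# Typical PC RAM went from roughly 1 GB to 16 GB over one 8-year
# console generation (assumed figures from the post above).
start_gb, end_gb, years = 1, 16, 8

# 16x growth over 8 years -- how often does capacity double?
doublings = math.log2(end_gb / start_gb)   # log2(16) = 4 doublings
years_per_doubling = years / doublings     # 8 / 4 = 2 years

print(f"{doublings:.0f} doublings, one every {years_per_doubling:.0f} years")
```

If RAM keeps doubling every couple of years, a console that launches with 16GB is just keeping pace with where PCs will be mid-generation, not overshooting.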