PS3 = Dreamcast?

Back in the heyday of console marketing, it was all about bits–8-bit to 16-bit, 16 to 32, right up until Dreamcast's whopping 128 bits, at which point people finally stopped using such things as a yardstick for measuring a console's power and that, as they say, was that. In case you've ever wondered how many bits it takes to get to the tootsie roll center of a PS3, the answer, according to a "tech guy" at Sony, is apparently also 128. But it's "more 128 than the others." Is that like how some animals are more equal than others?

Here's how he explains the math, and why it's largely irrelevant when considering the power of a modern console:

Most single pieces of data fit in 32 or 64 bits. The benefit of 128 bits is that you can operate on 4 pieces of 32-bit data at the same time, which is called SIMD (Single Instruction, Multiple Data). This is only useful for data that needs the same operation on all 4 pieces, which is common in games for things such as 3D graphical transformations, physical simulation, collision detection, etc. 128 bits is the "sweet spot" of price and performance, so that is what everyone seems to have settled upon.

To get more power, people have instead now been moving to more processor cores. (PS3 has 8, Xbox 360 has 3, Wii has 1, PS2 had 1 + 2 special-purpose, Xbox had 1, etc.)

What good does any of this do us? Erm, none, really, except in a "pursuit of knowledge is its own reward" kinda way.

Shock: The number of bits in the PS3 [Insert Credit]