Originally posted by Peach
And yet terrible gameplay covered up with shiny graphics is just...gold-plated garbage. Graphics can make a good game better, but they can't make a bad game good.
Here's the problem: I can't think of a single game that went overboard on graphics but utterly failed at the gameplay. At worst the gameplay is "passable". That's why I can't see your "every time" claim holding up every time; in fact, it's almost never the case.
Gameplay is crap if the visuals are crap. And don't take that out of context: most of the old games I played as a kid were crap because of the graphics. Utter and complete crap, and nothing about that has changed as I've gotten older. Some games pull off an artistic style that doesn't need millions of pixels, and that works too.
I will never hold the stance that gameplay beats out graphics every time. It is both. I must have both.
Originally posted by ArtificialGlory
32 and 64 cores? Are you a time-traveler from the future? Quad cores are more or less still the norm, though 6- and even 8-core CPUs exist (there are CPUs with even higher core counts than that, but they aren't for your average end-user). Actually, the X360 has the highest core count of the current-gen consoles, which is 3.
Fair enough; going by those terms, you're correct.
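For what it's worth, anyone curious what their own machine reports can check with a few lines of C++. A quick sketch (note that hardware_concurrency() counts logical threads, cores times SMT, not physical cores, so a hyper-threaded quad will report 8):

    #include <iostream>
    #include <thread>

    int main() {
        // Logical hardware threads (cores x SMT), not physical cores;
        // returns 0 if the count can't be determined.
        unsigned n = std::thread::hardware_concurrency();
        std::cout << "Logical CPU threads: " << n << '\n';
        return 0;
    }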
Originally posted by Smasandian
It's all about the graphics chip. It's the same on PC. My CPU is 4-5 years old but my GPU is brand new, and I can run most games at the highest settings. It's all about the GPU.
Actually, it's not. It's a combination of the CPU, the memory (RAM), and the graphics capability (discrete or integrated). Whichever part of that chain is slowest is what holds you back.
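To make that concrete, here's a toy sketch of the bottleneck idea (the millisecond numbers are made up for illustration): every frame, the CPU, memory/streaming, and GPU each need some amount of time, and the slowest one sets your frame rate, which is why a brand-new GPU behind an old CPU doesn't buy what you'd expect.

    #include <algorithm>
    #include <cstdio>

    // Toy bottleneck model: each frame, the CPU (game logic, draw calls),
    // RAM/streaming, and GPU all have to finish their share of the work,
    // and the slowest one sets the frame time.
    double frame_ms(double cpu_ms, double mem_ms, double gpu_ms) {
        return std::max({cpu_ms, mem_ms, gpu_ms});
    }

    int main() {
        // Old CPU (10 ms) + new GPU (4 ms): the CPU is the bottleneck,
        // so the shiny GPU only gets you ~100 fps, not ~250.
        double ms = frame_ms(10.0, 2.0, 4.0);
        std::printf("Frame time: %.1f ms (~%.0f fps)\n", ms, 1000.0 / ms);
        return 0;
    }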
Originally posted by ArtificialGlory
I would love to see one of these Macs. Maybe they have dual CPUs or some crazy shit like that. PS3 has 1 core and 7 SPEs (1 of which is redundant), but those can hardly be called cores.
This guy knows.
It's a Cell processor, not a conventional multi-core processor. Different architectures: one general-purpose PPE core plus specialized SPE coprocessors, instead of a set of identical cores.
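To show the difference: on a symmetric multi-core CPU, every core runs the same instruction set, so you can just throw identical worker threads at them, like the plain C++ sketch below (nothing Cell-specific, a generic parallel sum). The Cell can't be programmed this way; the PPE has to load separately compiled programs onto the SPEs and feed them via explicit DMA into each SPE's 256 KB local store. Same "lots of cores" headline, very different programming model.

    #include <algorithm>
    #include <cstdio>
    #include <numeric>
    #include <thread>
    #include <vector>

    // Generic symmetric-multicore pattern: identical workers, shared memory.
    int main() {
        std::vector<int> data(1 << 20, 1);
        const unsigned n = std::max(1u, std::thread::hardware_concurrency());
        std::vector<long long> partial(n, 0);
        std::vector<std::thread> pool;

        for (unsigned t = 0; t < n; ++t) {
            pool.emplace_back([&, t] {
                // Each thread sums its own contiguous slice of the data.
                size_t chunk = data.size() / n;
                size_t begin = t * chunk;
                size_t end = (t == n - 1) ? data.size() : begin + chunk;
                partial[t] = std::accumulate(data.begin() + begin,
                                             data.begin() + end, 0LL);
            });
        }
        for (auto& th : pool) th.join();

        long long total = std::accumulate(partial.begin(), partial.end(), 0LL);
        std::printf("Sum: %lld across %u threads\n", total, n);
        return 0;
    }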