Dark1x is a nice guy but he's like the phoniest graphics whore ever. He was all about 60fps until it was obvious that the PS3 couldn't handle 60fps. Then it became all about a locked 30 fps and some messy vaseline-like motion blur that fudges everything up. He makes a big deal about Burnout Paradise looking slightly better on the PS3, but whenever the 360 version looks better, his eyes suddenly become less discerning and the differences all look negligible to him.
All I know is that a real graphics whore wouldn't waste his time defending crap console hardware. Instead, he'd be playing games at 60 fps in 1080p or higher with everything at max, 4x AA, and 16x AF on a PC with at least three GPUs and a 4GHz CPU.
Wait, what prompted this?
I *DO* play PC games at 60 fps in 1080p. I run an i7 930 @ 3.8GHz + Radeon 5870 right now. I, of course, stick to a 9th gen Pioneer plasma, which absolutely smokes every single PC LCD on the market for gaming, but I always run in clone mode so my PC upstairs is usable on a second screen (a recent Samsung 25" LCD, which is pretty shit, but gets the job done).
Have multi-GPU setups improved? A friend of mine runs two GTX 275s, and there's a very subtle stuttering effect introduced in everything I tried that immediately disappeared when the second GPU was disabled. The numbers are high in FRAPS, but it doesn't look right.
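That's the classic micro-stutter complaint with alternate-frame rendering: the average framerate counter looks great, but frame delivery alternates between short and long gaps. Here's a toy sketch of why the FRAPS number hides it (the frame times are made up for illustration, not measured from anyone's rig):

```python
# Rough illustration of why a high FRAPS average can hide micro-stutter.
# Frame times below are hypothetical, not real measurements.

even_frames = [16.7] * 6                          # steady ~60 fps pacing (ms per frame)
afr_frames = [8.0, 25.4, 8.0, 25.4, 8.0, 25.4]    # alternating pacing typical of AFR stutter

def average_fps(frame_times_ms):
    """Average fps over the sample -- roughly what a framerate counter reports."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def worst_frame(frame_times_ms):
    """Longest single frame in ms -- closer to what the eye actually notices."""
    return max(frame_times_ms)

for name, frames in [("single GPU", even_frames), ("AFR pair", afr_frames)]:
    print(f"{name}: {average_fps(frames):.0f} fps average, "
          f"worst frame {worst_frame(frames):.1f} ms")
```

Both come out to the same ~60 fps average, but the alternating pattern has frames hanging around for 25+ ms, and your eye tracks those long gaps rather than the mean.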
Consoles are consoles, PCs are PCs. I have different standards for consoles than for PCs. Consoles are closed boxes and I find it fascinating to see what developers can achieve on these machines. In particular, the PS3 interests me because it is so different (a weak GPU paired with a seemingly powerful, but unique, CPU). I know Crysis 2 will look insane on the PC, but I'm much more interested in seeing what they can do with a more limited, but closed, platform. I definitely prefer the PS3, and I can't even explain why, but that preference tends to shine through more in a place like this where it is so damn hated. Seriously, I've never seen such venom towards a single platform in one place.
I've always loved motion blur, though. That has nothing to do with any platform in particular. It delivers a more CG-like appearance that I can't get enough of. What's wrong with that? It looks best at 60 fps, definitely, but I think a 30 fps game (a solid 30) with high-quality per-object blur can look better in motion than a 60 fps game with no blur in many cases (even on the PC).
Again, I maintain that I love all platforms. Heck, I cancelled that bullshit God of War III special edition box and used the money saved to buy Metro 2033 instead.
I remember how he was praising MGS2 on the PS2 while downplaying the obviously better-looking PC Substance version.
Oh no, you're not getting away with that bullshit. Back then, PCs were total shit at handling post-processing and the like. The first release of the game didn't even work properly on ATI cards (which were huge at the time thanks to the 9700 Pro release). The PC version was missing all sorts of details and effects (as was Silent Hill 2). In all honesty, it was probably down to the fact that the game was ported by a shit developer from a very specialized platform (the PS2). The PC version of Substance was simply a bad port.
framerate that drops from 120 fps to 117 fps and hurts the "fluidity" of the game.
I would never bitch about that, as I lock all games at a maximum of 60 fps. Going above that is useless. If you have a 120 Hz screen, sure, that's fine... but most people simply disable vertical sync to achieve those numbers, and 120 fps can look like absolute shit with the wrong configuration.
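For what it's worth, the idea behind a frame cap is simple enough to sketch in a few lines. This is a toy Python loop, purely illustrative and nothing like what an actual game or driver limiter runs: do the frame's work, measure how long it took, and sleep away whatever is left of the 1/60-second budget so frames never arrive faster than the cap.

```python
import time

TARGET_FRAME_TIME = 1.0 / 60.0  # cap at 60 fps (about 16.7 ms per frame)

def fake_render():
    """Stand-in for one frame's worth of update + draw work."""
    time.sleep(0.005)  # pretend rendering took 5 ms

def run_capped(n_frames):
    for _ in range(n_frames):
        start = time.perf_counter()
        fake_render()
        elapsed = time.perf_counter() - start
        if elapsed < TARGET_FRAME_TIME:
            # Spend the leftover budget waiting so the frame rate never exceeds 60.
            time.sleep(TARGET_FRAME_TIME - elapsed)

run_capped(10)
```

Vertical sync on a 60 Hz display effectively enforces the same limit in hardware; turning it off just lets the GPU push frames out mid-refresh, which is exactly where tearing and that "120 fps looks like shit" feeling come from.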