PC gaming didn't have the growth rate needed in 1999-2000 or so to keep up with the way that budgets and team sizes were rising in that period. When the dot.bomb crash hit and all the free capital dried up, American and British developers needed a place to make more money, and found that console games sold pretty well and that there was a real desire and audience for PC-styled shooters, action games, and RPGs there that had gone underserved in the PSX/N64 generation. A big enough audience, in fact, that they could break even or still turn a profit on much more expensive games by developing console versions alongside PC versions.
Then the 360 happened and games got even more expensive to make, while the hardware and platform became even more robust and easy to hook into. Build quality issues aside, the 360 was the first good console platform for the kind of online gaming that was already mainstream on the PC.
I don't think the hardware vendors have helped any, but I don't hold them at fault either. The driving economics of the industry are the reason PC gaming has been eclipsed by console gaming over the last decade or so, more than anything else. What's happening now would have happened even if Intel and Nvidia weren't fumbling things half the time.
I don't think PC gaming is a crapfest though. I think that impression is a product of revisionist history about the mid-to-late 1990s, mistaken bemoaning of lost genres, and totally ignoring the virtues of the emergent trends in PC gaming: independent and smaller-team development, and continued strong European developer support. It's a strong, vibrant, unique platform that holds one key virtue over the others: the people who make games for it don't automatically assume their users are stupid.