Character building
Aug. 1st, 2011 04:13 pm
I like consoles because they make things simple. If I want to futz with exactly what hardware I have, I'll use a PC. Consoles Just Work. That is one reason I didn't get a grown-up console (PS3 or Xbox 360): too many choices as to exactly what hard drive size I wanted, and too expensive to upgrade after the fact. I could save money by modding it myself, but that's a lot of... not exactly work, but a lot of decisions and anxiety, and with the red ring of death so common you really didn't want to void your warranty. Meanwhile, the Wii was there, telling me it was a perfect black box I never needed to think about. But then the Wii kept making up new peripherals, and the cost of the Xbox proprietary hard drives dropped to the point that I don't feel like I'm being ripped off for lacking the can-do spirit to mod it myself. Microsoft and Sony have both introduced motion control peripherals since then, but somehow it all seems less daunting now.
Meanwhile, games keep coming out for consoles. No AAA game coming out this year could run on even a top-of-the-line computer built in 2006 (the year the 360 was released), but they're all out for the consoles. Developers can do this because advances in algorithms expand the horizons of what the same hardware can do: most obviously in graphics, but I'm sure in other gameplay aspects as well. If you showed me Silent Hill 2 (2001) and Resident Evil 4 (2005) side by side, I would bet you money they had significantly different hardware requirements. RE4 just looks so much better, even though SH2 is wrapped in fog to lower the computational power needed. But unless there are little gremlins swapping out the processor when I change the disc (and I'm not ruling that out), it has to be software advancements.
I feel like this is vaguely good for the moral character of developers. No, you can't just require everyone to overclock their processor; you have to think about what you're doing. If you really, genuinely need something for gameplay reasons, invent some new math. It builds character.