Whenever the video game industry came out with a new console, I always assumed console technology would be about 1~1.5 generations behind the PC. But if I remember correctly, it wasn't always like that.
I was lucky enough to have the privilege of being the first one on the block to get the latest PC gizmos, thanks to my father. He brought home one of the first color-capable video cards, and I was thrilled to have my own 5.25" floppy disks to store the games I loved so much. I honestly didn't know what kind of PC we had, but I was the first kid in town who could play games in color. At least until the NES flooded in, owned many kids, and gave them random seizures. The NES and the PC were just gaming machines to me, nothing more.
Then I moved to the US and finally found out about the wonderful world of the PC. I was so fascinated that with whatever PC I could get my hands on, I would immediately open up the case and disassemble the parts. My first PC ever (an IBM Aptiva) had to suffer the same fate as my old PCs.
But it was that first PC that really brought me into PC gaming. The IBM Aptiva we purchased came bundled with MechWarrior 2 (ATI 3D Rage edition), and it was the first 3D game I ever played on my PC. I had the luxury of installing it with the 'Full Installation' option, which included the cinematics.
Back then, developers had limited tools for making realistic graphics, so they had to choose carefully how to implement them. The scale was a lot smaller, but in my opinion that's when developers could get more personal with their games. They pre-rendered 3D models, captured screenshots, and added those images to the game's asset library, because many PCs at the time couldn't handle 3D rendering in real time.
And within a few months, boom. 3dfx. Then all these framerate-squeezing, benchmarking, and tweaking tips floated around the net, and nVidia came up with a video card that supported DirectX. The PC 3D graphics market exploded, and yadda yadda, nVidia bought out 3dfx and killed it, blah blah blah, AMD bought ATI, and while all of this was happening, DirectX and OpenGL became the dominant standards of the PC graphics industry. And here we are, in the early 21st century. The graphics industry is now focusing on physics in 3D worlds, and with DX11, it will focus more on multi-threading technologies such as GPGPU and more effective multi-core (4+) CPU support. It's not so hard to implement lighting and shaders on 3D models anymore, with all the cutting-edge tools and great game engines available.
Somehow, though, these days when I play a ton of games, I just see that many games are... shinier. A lot shinier. I don't know which game started this trend, but it's so shiny that sometimes the textures are washed out. Cars, human models, buildings... It's like some developers think that making things shiny will make gamers happy. They'd even polish up the dog feces if they had it in the game.
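For the curious: the "shininess" I'm griping about mostly comes down to the specular term in standard lighting models. Here's a rough sketch (not any particular engine's code, just the classic Blinn-Phong formula in plain Python) of how cranking up specular strength swamps the underlying texture color:

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    n = math.sqrt(dot(v, v))
    return tuple(x / n for x in v)

def blinn_phong(texture_color, light_dir, view_dir, normal,
                specular_strength, shininess):
    """Classic Blinn-Phong shading for one pixel.

    Vectors are assumed normalized; colors are (r, g, b) in [0, 1].
    """
    diffuse = max(0.0, dot(normal, light_dir))
    # Half vector between light and view directions.
    half_vec = normalize(tuple(l + v for l, v in zip(light_dir, view_dir)))
    specular = specular_strength * max(0.0, dot(normal, half_vec)) ** shininess
    # The specular term adds (roughly white) light ON TOP of the texture;
    # a large specular_strength pushes every channel toward 1.0, washing
    # the texture out to a uniform gloss.
    return tuple(min(1.0, c * diffuse + specular) for c in texture_color)

# A brownish texture lit head-on:
tex = (0.5, 0.3, 0.2)
modest = blinn_phong(tex, (0, 0, 1), (0, 0, 1), (0, 0, 1), 0.2, 32)
shiny  = blinn_phong(tex, (0, 0, 1), (0, 0, 1), (0, 0, 1), 1.0, 32)
print(modest)  # texture color still visible
print(shiny)   # clamped to pure white: the "car polish" look
```

With a modest specular strength the texture still reads through; at full strength every channel clamps to 1.0 and you get that dipped-in-polish highlight regardless of what the texture underneath looks like.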
I don't want everything shiny. I used to see better lighting in the old games. I'm afraid that as the days go by, I'll find more and more games with everything polished up to be shiny. I would rather have a solid 60 FPS than super-shiny visual FX. Back in the day, I only had to invest about $200~$300 to enjoy top-notch graphics for the time. Nowadays, they want me to install two $300~$400 video cards with a 1000W PSU just to make everything on my screen shinier. Even on the gaming consoles, I see poor shadow/texture/shader quality but enhanced lighting effects to make everything shinier.
Am I exaggerating? I don't know. Not every game is getting shinier, but you can't deny that many 3D objects in games look like they've been dipped in car polish.
What do YOU think?