Video games are a unique medium in that they produce visual feedback in response to user input. In the past 30 years games have gone from simple lines and dots to near photo-realism. While this advancement is impressive, I think it is important to be wary of better graphics coming at the cost of gameplay. Today I briefly explore these points and a few others in the realm of video game graphics.
I think the holy grail for a great many of those who work with computer graphics is to produce something that is considered to be photo-realistic. In other words, to produce an image so lifelike in its depiction that it cannot be differentiated from the real thing. While I think this goal is certainly laudable, and its pursuit has led to many advances in graphical fidelity, I think that it is also a very narrow approach to design. Sometimes, graphics that strive to be as realistic as possible actually do a disservice to the game that they are attached to. For example, imagine a Mario game that abandoned the cartoon approach and instead opted for photo-realism; much of the charm that has made Mario the gaming icon that he is would immediately be lost. The realistic Mario game might not be bad, and indeed it could feature all of the best gameplay elements of the series, but it seems like a part of the X-factor that makes Mario so great would be missing. I think it is fair to say that the 'best' graphics are those that are appropriate for the subject matter and help to enhance the overall atmosphere, and are not necessarily what looks most realistic in a side-by-side screenshot comparison.
Unfortunately, many people equate 'best' with 'best looking', and I think that, at times, games suffer for it. Developers know that there is a portion of people out there who will purchase a title primarily based on the pictures on the back of the box. As such, it is understandable that, with limited resources, a developer may focus on making something that looks good instead of plays well. This is also the reason why quick-time events are still common in games; they provide great action sequences for trailers even if they represent the most basic form of gameplay. In many ways, I think that photo-realism is akin to the summer action blockbuster in film; it looks awesome and sells tickets, but it is also likely masking a weak plot and sparse character development. That said, not all games that strive for photo-realism come out badly (the Crysis series sets the bar in graphics but also offers compelling gameplay), but the emphasis on flashy graphics should be taken as a warning sign. If all the previews for a game can talk about is how gorgeous it looks, then you can be pretty sure that's the only thing going for it.
With the above said, I think that gameplay shouldn't be the sole quality that games are judged on. Video games are a unique medium that combines user input with visual response. As such, having that response be pleasing to the eye can enhance the gaming experience. I think this point is admirably demonstrated by the slew of indie games that have adopted a retro look. While, no doubt, this is due in large part to the lower budgets of indie titles, I think it is telling that these games seldom hearken back to the line graphics of the Atari era and instead opt for a 16-bit style. What this tells me is that simple line graphics are sufficiently ugly that they actually impact the play experience negatively. Indie developers have thus homed in on a level of graphics that is cost-effective without being visually displeasing to the point of hurting sales. If you consider a game like Braid, a basic platformer with a compelling time-manipulation mechanic, there is nothing about that game design that could not have been accomplished with Atari-style graphics. However, by adopting a style with more personality, and a level of fidelity that allows for the recognition of basic facial features, the player is able to better engage with the game. In fact, one of the positive points commonly mentioned about Braid is its intriguing art style. I think it is fair to say that Braid wouldn't have been the success it was if it had relied on gameplay alone.
Ultimately, graphics are an integral part of the overall gaming experience. The lofty pursuit of photo-realism, coupled with ever more powerful processors, has led to a dramatic increase in graphical fidelity over the past 30 years. However, I think it is important to recognize that graphics, while integral, are still only a part of the experience. Sacrificing gameplay for graphics is a bad trade for gamers.