View Full Version : Editorial IGDA: Videogame Graphic Advances - Not That Important?

07 July 2004, 01:43 PM
Graphics have never been better. Or worse.

Continuing the discussion from last month, this time focused on the revolutionary advances made in graphics technology over the past year: human modeling, thanks to near-cinema-quality shading and today's zillion-polygon video cards, has reached a new apogee and promises to go further. But there's still something missing from the graphics in general, and the people specifically, portrayed in our games. It is easier now to perceive and interpret visual information from game worlds and characters than ever before, but those characters are unsettlingly funereal in visage: cadaverous, ghastly, at times repellent. And yet our “face tech,” along with all other graphics technology, is better than ever. What gives?

This article sums it up: the author describes the new Alias game's Sydney Bristow as sepulchral and Botox-stuffed, warning that increased realism will decrease our willingness to be immersed in game worlds populated by such ghouls. This concept is not new. Roboticist Masahiro Mori postulated that human beings will willingly engage with avatars whose visual qualities are human up to a point, but will react with revulsion and even fear toward an appearance that is “more human” than that yet not “human enough.” We have evolved over thousands of years to recognize our own kind; we're not easily tricked, and attempts to fool us often result in characters that are unfamiliar and hideous.

That's why the faces and people of yesteryear's games looked more real, and were certainly more comforting, than what we see today. Consider the death masks of Thief: Deadly Shadows, the icewater-soaked aspect of Daniel Garner in Painkiller, even the fetching but waxlike Aki Ross from the Final Fantasy movie. Pixar, on the rare occasions it models human beings, takes care to render them in a very stylized way – thereby dodging Mori's “Uncanny Valley.” Game worlds are deep in the twilight of this cleft, and additional polygons or lengthier shader instructions won't necessarily make computer-generated faces seem more real. They're not real, they'll never be real, and millennia of social evolution are offended by what, for the first time, has a chance to visually resemble a human but can't. New software promises to make games look even more “realistic,” when in fact these products often render human models increasingly alien and horrifying. The day when the shared history of our species is fooled by a computer-generated person is a long, long way off.

Developers have noticed this, and a backlash against the bloodless mien of modern game characters has cropped up. Look at the cel-shaded protagonists of XIII and Viewtiful Joe. Meanwhile, Tomb Raider: The Angel of Darkness models a well-known character in a way that's realistic but consciously stops short of trying to make her look human. That game was an atrocity regardless, but imagine the backlash had the beloved Lara Croft suddenly taken on the ghastly mannequin-in-motion visage that's become so popular. Is it possible for graphics to get too good?

The obvious response is “no”; Cliff Bleszinski succinctly announced that “old games suck,” due in part to their limited graphics. Many would argue that progress is always good, especially when it comes to the enhancement of this art form. But graphics aren't the only source of emotion. The Dark Project's Return to the Cathedral mission evoked just as much I-want-mommy terror as Deadly Shadows' Shalebridge Cradle; it was the level design and the sound work of Eric Brosius that ignited those emotional reactions, not the look. The fact is, dependence on increasingly real visuals alone to generate emotion will inevitably hit a wall: at some point game graphics will look as good as real life. Developers have an arsenal of emotioneering tools at hand; to limit themselves to just one, however prominent, would be ill-advised.

There's another concern with the rapid increase in visual realism. The more “real” that games look, the more likely they are to be subjected to outsider scrutiny and, possibly, censorship. Manhunt, morbidly puerile and disturbingly satisfying as it is, stops just short of going too far visually. Were Manhunt not so blatant an attempt to cash in on the strategy of “produce sales by shocking grandmothers” that Rockstar has made a business model out of, we'd be hearing a lot more about it. While developers shouldn't limit technical or artistic growth in order to appease a segment that will never accept that games are not responsible for increases in evildoing, they should nonetheless take care not to inflame that segment more than is strictly necessary.

Overfocusing on hyper-realistic graphics and modeling, while not a bad idea in a general sort of way, can also impede the quality of gameplay. There is more to realism than just “looking real,” a fact that developers occasionally lose sight of but are always quick to re-embrace. Even the famously tech-fetishist id Software has brought on a professional writer for the development of DOOM 3. As awesome as technologies like advanced shading and Havok physics may be, they in and of themselves are not games.


07 July 2004, 01:53 PM
Better graphics in games are an improvement... until the graphics themselves hinder the game experience.

There has been such a push for photorealism in graphics, but I believe that non-photorealistic rendering (NPR) methods will become much more popular over the next couple of years.
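Not from the original post, but as a rough illustration of what one NPR method does: cel shading (the look behind XIII and Viewtiful Joe, mentioned in the editorial) quantizes a continuous diffuse lighting value into a few flat bands, deliberately giving up photorealism for a hand-drawn read. A minimal Python sketch of the idea, with made-up example vectors:

```python
def lambert(normal, light_dir):
    """Continuous diffuse intensity: clamped dot product of unit vectors."""
    d = sum(n * l for n, l in zip(normal, light_dir))
    return max(0.0, d)

def cel_shade(intensity, bands=3):
    """Snap a 0..1 intensity to one of `bands` flat tones (a toon ramp)."""
    level = min(int(intensity * bands), bands - 1)
    return level / (bands - 1)

# A grazing light angle that would shade as a smooth gradient in a
# photoreal renderer collapses to a single flat tone under the ramp.
print(cel_shade(lambert((0.0, 0.0, 1.0), (0.0, 0.6, 0.8))))  # → 1.0
```

In a real engine this quantization happens per pixel in a fragment shader (often via a 1D ramp texture lookup), but the principle is the same: the ramp, not polygon count, defines the style.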


creative destructions
07 July 2004, 02:34 PM
The Dark Project's Return to the Cathedral mission evoked just as much I-want-mommy terror as Deadly Shadows' Shalebridge Cradle.

Really glad that level is getting some recognition.

CGTalk Moderation
01 January 2006, 06:00 PM
This thread has been automatically closed as it remained inactive for 12 months. If you wish to continue the discussion, please create a new thread in the appropriate forum.