GDC 2011 Preview: Mike Ambinder

This week, we've asked the speakers of some of the most promising sessions at next week's Game Developers Conference to write for us about their presentations. Mike Ambinder is the author of 12 scientific articles and book chapters and holds a PhD in experimental psychology as well as a B.A. in computer science and psychology. He joined Valve in 2008 and has worked on a variety of the studio's games, including Portal 2, Alien Swarm, both Left 4 Dead games and Team Fortress 2.

Traditional methods of player input in videogames have involved the mapping of button presses, analogue stick direction and pressure, mouse manipulations, and now gestures to actions in-game. These established conventions of accepting input are excellent proxies for determining player intent, but they fail to indicate another crucial dimension of player input—player sentiment. With very few exceptions, games have ignored a player's emotional response when tailoring gameplay experiences. This lack of inclusion has been tied to the inability of traditional controllers – mouse and keyboard, gamepad, etc. – to adequately measure physiological signals. However, the technology to measure and quantify these signals is now more readily available and reliable, and the opportunity exists for videogames to incorporate emotion as an additional axis of player input.

With the addition of emotional input, gameplay can be tailored to the individual emotions of the player. For example, difficulty could be adjusted dynamically depending upon the player's current frustration level. Or indices of emotional arousal could be used as inputs to the game: one can imagine a sniper receiving an accuracy boost in-game if their heart rate remains low but suffering a penalty when it rises above a pre-determined threshold. With the ability to quantify a wide array of player emotions – frustration, enjoyment, engagement, boredom, fatigue, etc. – the design space of gameplay experiences expands, enabling interactions that are not possible with traditional controller inputs.
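
To make the sniper example concrete, here is a minimal sketch of how a heart rate signal might be mapped to an accuracy multiplier. The threshold, bonus and penalty values are invented for illustration, and the smoothing step is simply one reasonable way to keep beat-to-beat noise from making the multiplier flicker:

```cpp
// Illustrative sketch only – the threshold and multipliers are invented.
class HeartRateAccuracy {
public:
    // Call once per sensor sample; the moving average smooths out
    // beat-to-beat noise so the multiplier doesn't flicker.
    void AddSample(float bpm) {
        smoothedBpm_ = (smoothedBpm_ < 0.0f) ? bpm
                                             : 0.9f * smoothedBpm_ + 0.1f * bpm;
    }

    // Boost accuracy while the player stays calm; penalize them once the
    // smoothed heart rate crosses the (assumed) threshold.
    float Multiplier() const {
        if (smoothedBpm_ < 0.0f) return 1.0f;  // no data yet: no modifier
        const float kThresholdBpm = 90.0f;     // assumed "calm" cutoff
        return smoothedBpm_ < kThresholdBpm ? 1.25f : 0.75f;
    }

private:
    float smoothedBpm_ = -1.0f;  // negative sentinel: no samples yet
};
```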


Left 4 Dead 2

Biofeedback, defined here as the measurement, display, analysis, modification and manipulation of – and response to – physiological signals, can be used to measure and quantify emotion through a variety of means. Traditional biofeedback devices measure heart rate, SCL (skin conductance level, coarsely defined as how much someone sweats and very highly correlated with physiological arousal), facial expressions, eye movements, pupil dilation, EEG waveforms, body posture and other indications of physiological arousal or valence (the positivity or negativity of the signal) to classify various emotional states. My talk at GDC will discuss the use of SCL and eye movements in current research performed at Valve to investigate the viability of biofeedback-based gameplay experiences.
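
By way of illustration, a raw SCL reading (in microsiemens) has to be normalized before it is useful, since baseline conductance varies considerably from player to player. The sketch below uses a simple per-player min/max normalization; this is an assumed approach for illustration, not necessarily the one used in Valve's experiments:

```cpp
#include <algorithm>

// Illustrative sketch only – per-player min/max normalization is one simple
// way to turn raw skin conductance into a comparable 0-to-1 arousal index.
class SclArousalIndex {
public:
    void AddSample(float microsiemens) {
        minSeen_ = std::min(minSeen_, microsiemens);
        maxSeen_ = std::max(maxSeen_, microsiemens);
        latest_  = microsiemens;
    }

    // 0 = the calmest this player has been, 1 = the most aroused.
    float Arousal() const {
        float range = maxSeen_ - minSeen_;
        return (range > 0.0f) ? (latest_ - minSeen_) / range : 0.0f;
    }

private:
    float minSeen_ = 1e9f;
    float maxSeen_ = -1e9f;
    float latest_  = 0.0f;
};
```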

Currently, Valve is looking into the use of physiological signals to dynamically adjust gameplay based upon the player's arousal, to create gameplay elements that depend directly upon the player's emotional state, to investigate alternative methods of player control, to quantify responses to gameplay during playtesting and to examine potential uses in matchmaking and social gameplay experiences. In particular, this talk will cover how Valve is incorporating actual measurements of player arousal into the algorithms governing the AI Director in Left 4 Dead 2, how we created a mod of Alien Swarm that dynamically adjusts difficulty based upon the player's SCL response and how we are using eye movements as a replacement for the mouse and gamepad in Portal 2.

In Left 4 Dead and Left 4 Dead 2, the AI Director dynamically creates a gameplay experience based upon an inferred state of player arousal. The quantity and character of enemies, the placement of health and ammo, the sound cues utilized and various other aspects of gameplay are procedurally generated based upon an estimate of arousal derived from various gameplay events – for example, proximity to enemies killed and a player entering a low-health state. In a current experiment, we replaced this arousal metric with a new metric derived solely from measurements of physiological arousal as indexed by SCL with the hope of creating a more enjoyable gameplay experience. In addition to improving the overall gameplay experience, we should be able to analyze the patterns of arousal that lead to enjoyable experiences and modify the director to elicit them accordingly.
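
While Valve's actual Director code is not public, the following hypothetical sketch illustrates the general idea of driving a build-up/peak/relax pacing cycle from a measured arousal signal rather than an inferred one. The phase names, thresholds and structure are invented for illustration:

```cpp
// Hypothetical sketch – names and thresholds are invented, not Valve's API.
enum class DirectorPhase { BuildUp, Peak, Relax };

DirectorPhase UpdatePhase(DirectorPhase phase, float measuredArousal) {
    const float kPeakArousal  = 0.8f;  // assumed: ease off above this level
    const float kRelaxedLevel = 0.3f;  // assumed: rebuild pressure below this

    switch (phase) {
        case DirectorPhase::BuildUp:  // ramp up pressure until arousal peaks
            return (measuredArousal >= kPeakArousal) ? DirectorPhase::Peak
                                                     : phase;
        case DirectorPhase::Peak:     // a real director would hold the peak
            return DirectorPhase::Relax;  // for a while; we ease off at once
        case DirectorPhase::Relax:    // give the player room to calm down
            return (measuredArousal <= kRelaxedLevel) ? DirectorPhase::BuildUp
                                                      : phase;
    }
    return phase;  // unreachable; keeps some compilers happy
}
```

The substitution the experiment describes amounts to feeding `measuredArousal` from the SCL index rather than from gameplay-event heuristics, leaving the pacing logic itself untouched.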

While the previous experiment focused on modifying the gameplay experience based upon a player’s emotional state, an alternative use for physiological signals is to use them as direct inputs to gameplay. For example, the aforementioned modified version of Alien Swarm requires a player to kill a target number of enemies in a set amount of time. The key here is that the timer is tied to the player’s arousal level – it ticks down quicker as arousal rises and levels off if a player is able to remain calm. In this scenario, a player’s ability to remain calm is directly tied to their success in-game. These sorts of novel gameplay experiences are not possible without the inclusion of physiological signals.
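
A minimal sketch of such an arousal-driven timer might look like the following; the calm cutoff, maximum speed-up and the linear mapping are invented tuning values, not those of the actual mod:

```cpp
#include <algorithm>

// Illustrative tuning only – cutoff, speed-up and the linear map are invented.
float TickTimer(float timeRemaining, float arousal01, float dtSeconds) {
    const float kCalmCutoff = 0.2f;  // below this, the timer levels off
    const float kMaxSpeedup = 3.0f;  // fully aroused: the clock runs 3x faster

    // Calm players stop the clock entirely; otherwise the tick rate scales
    // linearly with the measured arousal.
    float rate = (arousal01 <= kCalmCutoff)
                     ? 0.0f
                     : 1.0f + (kMaxSpeedup - 1.0f) * arousal01;
    return std::max(0.0f, timeRemaining - rate * dtSeconds);
}
```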


Portal 2

A final experiment aimed to replace traditional movement controls with eye movements. Since the eyes move more quickly than the wrist or fingers can flex, replacing mouse or analogue stick movements with an eye movement proxy could create a more precise (and hopefully more enjoyable) navigation or targeting mechanic. We tested this idea by uncoupling the firing location of the portal gun from the movement of the viewing window. In other words, players used their eyes to aim and the mouse to shift their viewpoint. Removing the need to center the crosshair on the intended firing location frees the player to use the mouse primarily for navigation rather than as a navigation/firing hybrid (a mechanic perhaps best taken advantage of in more action-oriented games). One final application of this approach is the ability to classify players' eye movement scan patterns in the hope of gaining rudimentary insight into player cognition (are they lost, haphazardly searching for something, fixating on the exit but not realizing it's an exit, etc.).
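
In code, the core of this decoupling is simply that mouse input and gaze input drive two separate pieces of state. The sketch below assumes a hypothetical eye tracker that reports a gaze point in screen space; the types, names and sensitivity value are placeholders rather than anything from Portal 2:

```cpp
// Placeholder types and values – no real eye-tracking SDK is implied.
struct Vec2 { float x = 0.0f, y = 0.0f; };

struct AimState {
    Vec2 cameraYawPitch;  // driven only by the mouse
    Vec2 fireTarget;      // driven only by the eye tracker, in screen space
};

void UpdateAim(AimState& s, Vec2 mouseDelta, Vec2 gazePointOnScreen) {
    // Mouse input rotates the view; it no longer moves the crosshair.
    const float kSensitivity = 0.1f;  // assumed look sensitivity
    s.cameraYawPitch.x += mouseDelta.x * kSensitivity;
    s.cameraYawPitch.y += mouseDelta.y * kSensitivity;

    // The firing reticle simply follows the player's gaze, independent of
    // where the camera is pointing.
    s.fireTarget = gazePointOnScreen;
}
```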

These sorts of experiments are intended to investigate the possibility of incorporating metrics of player emotion into gameplay. While we are just scratching the surface of what is possible with these sorts of inputs, the potential applications are tantalizing: one can imagine NPCs reacting to a player's emotional state, in-game tutorials popping up when frustration is detected, or players using a health potion with the blink of an eye or the beat of a heart. The game industry is just starting to branch out into alternative forms of player input, and looking ahead, the inclusion of emotional input seems like a natural next step in this ongoing evolution.