XEODesign, “Why We Play Games: Four Keys to More Emotion Without Story” and “Why We Play Games Together: The People Factor”

Why We Play Games

This piece opens with a statement: “To create more emotion in innovative future games, we at XEODesign want to know more about the role of emotion in games and identify ways to create emotion other than story cut-scenes” (p. 1). The authors follow this statement with a series of thought-provoking questions:

[H]ow many emotions do games create? What makes failing 80% of the time fun? Do people play to feel emotions as well as challenge? If emotions are important to play, where do they come from? Do people modify games to feel differently? Is it possible to build emotions into games by adding emotion-producing objects or actions to game play rather than cut scenes? To what extent are game developers already doing this?

Their answers to these questions are a systematic exploration, and in that sense thorough, though they also seem (perhaps because of the years that have passed since the article’s publication) like just a start.

“Why We Play”’s discussion of concepts like “flow” and “fiero” reminded me of Jane McGonigal’s book, Reality Is Broken, which explores many of the same issues at much greater length.

Why We Play Games Together: The People Factor

XEODesign’s consideration of the in-the-room, observational, turn-taking social aspect of gaming struck me as particularly exciting. They advise designers to

[m]ake sure the game is fun to watch so those not playing have something to do. The game will be more likely to be shared if experienced players see new things in the game when a buddy takes over the controls. Support casual play experiences for the beginning game. Make it easy to start and spectacular enough to inspire commentary. (p. 1)

The authors also advise “[offering] a range of items and features that have a different emotion attached to them” (p. 1). They further explain this notion with one of the more fascinating formulations I’ve come across in reading about games: “Not just a bigger gun, but also one that lets players say something about themselves and about their target” (p. 1).

Buchenau and Suri, “Experience Prototyping”

Buchenau and Suri define “Experience Prototyping” as “a form of prototyping that enables design team members, users and clients to gain first-hand appreciation of existing or future conditions through active engagement with prototypes” (p. 1).  The authors “use examples from commercial design projects to illustrate the value of such prototypes in three critical design activities: understanding existing experiences, exploring design ideas and in communicating design concepts” (p. 1).

I think Buchenau and Suri make a good and important distinction in their definition of this concept of experience design: “the quality of people’s experience changes over time as it is influenced by variations in these multiple contextual factors” (p. 1). It sounds obvious, to a certain degree, when you read it, but I think this change-over-time factor can sometimes get lost in the immediacy of working with an idea in the here and now (perhaps against a deadline, while trying to bring that idea into reality).

The authors also offer an “operational definition” of the notion of an “Experience Prototype”—“any kind of representation, in any medium, that is designed to understand, explore or communicate what it might be like to engage with the product, space or system we are designing” (p. 2). “Experience Prototyping is less a set of techniques,” they go on to note, “than it is an attitude, allowing the designer to think of the design problem in terms of designing an integrated experience, rather than one or more specific artifacts” (p. 2).

I also liked their observation that “[l]ow-tech solutions seem to promote the attitude that it is the design question that is important, not the tools and techniques that can be brought to bear” (p. 9). I think this idea works well in combination with the notion that finished/polished prototypes can lead an “audience” (clients, higher-ups, etc.) to see them as nearly finished (and therefore less mutable) products.

Andresen, “Playing By Ear: Creating Blind-Accessible Games”

In this fairly fascinating gamasutra.com article, Andresen makes the case for, and lays out his team’s process of, creating blind-accessible games. He also argues, more generally, for paying more attention to sound in game design than the average designer does:

So what if you’re not creating games for visually impaired players? Even if you are creating another first-person shooter with a target demographic of able-bodied 18-to-34-year-old males, you should still consider using audio for more than just gunshots, grunts, and death screams. No matter what type of game you are creating, paying careful attention to the audio user interface and 3D audio environment will enhance the player’s experience.

One particularly interesting point Andresen raises relates to what might seem like a minor issue—what voice to use to narrate a game’s menus. The main character in his team’s game, “Momo the monkey, has a distinct, silly accent. Using Momo’s voice for the initial game-setup menus was a great way to introduce the player to Momo and to set the right mood for the rest of the game.” Reading “discoveries” like these makes me wonder how much of game design is determined by the instincts of its creators. Though perhaps, as the field/industry has developed over the years, instinct gets replaced more and more by something like research and testing?

I also appreciated the consideration Andresen’s team gave to truly detailed audio design:

We follow a couple of general design principles to ensure our game is fully accessible to blind players. First, we make sure that if two items look different, they must sound different. That isn’t usually a problem; most objects in the real world make unique sounds, if they make any sound at all. We just avoid populating our game with items that make no sound. We also make sure that item or game state changes are accompanied by audio cues. For example, items make a “grabbed” sound when they are picked up. Pick up a chicken and you hear it squawk. While it’s in your hand, it will make a disgruntled clucking noise, instead of its normal, “I’m a happy chicken” noise.

It would be interesting to see how this line of thought and work has progressed in the decade since this article was written.

Reading this piece made me (re-)consider the sound design of some of the games I play. Of these, I think Skyrim is the only one I won’t—or really can’t—play with the sound off. You miss too much if you’re not listening. You’re also in greater danger. The depth of sound in that game really does help you orient yourself—and it adds to the story and the overall experience in significant ways.

Prototyping assignment – “Recycle Race”


Star Wars: Race to Escape the Garbage Pit!

I imagine this could work as a browser game (using clicks/drags on a mouse or taps/drags on a trackpad), tablet game (using taps/swipes) or a simple console game (using a controller or motion sensor), but I think it would be most engaging as an exhibit—a motion-sensing input device (e.g. Xbox Kinect) hooked up to a large screen display.

The player, a Youngling (a Jedi-in-training), and their friend (an NPC who is not a Jedi-in-training) stumble upon the infamous garbage pit races under Coruscant. The villainous Tunnel Master, amused by the children’s intrusion, tells them that they can go free if they can win a race—one where contestants jump down through levels of flying garbage skiffs and grab an item off the back of one of the giant Garbage Worms that inhabit the pit.

Even as a Jedi-in-training, the Youngling knows that they could win this race with one arm tied behind their back. But the two children have to finish the race together, and the player’s friend is a bit of a klutz. The Youngling notices that all the other—grown-up, experienced—contestants just land on and scramble across the garbage piled on top of the skiffs. It’s unlikely, though, that the friend could even make one jump before falling onto the mountains of garbage at the bottom of the pit.

So, the Youngling decides to use what Jedi powers they do know to help them get out of this mess. The player will have to jump down to each skiff first and, before their friend follows, clear all the garbage off the skiff and help cushion the friend’s fall. But the Jedi-in-training can’t just—using their telekinetic Force powers—push or fling all the garbage off the skiff; it could fly into or onto someone else. And even if the Youngling wasn’t concerned about other racers retaliating against the two children, the Jedi Code wouldn’t allow the Youngling to harm others in such a way. What the Youngling can do, however, is task three Sanitation Droids to help the player collect the garbage. There are three kinds of these droids—one that collects plastic and metal, one that collects paper products, and one that collects all other garbage.

The game, then, consists of the player (or their Youngling avatar) standing on a garbage skiff with three kinds of items piled in front of them—metal/plastic, paper, and garbage. Each round consists of the avatar (via P.O.V. “cinematic,” no interaction) jumping down onto a skiff and then (via the motion-sensing input device) using their telekinetic Force powers to “grab” these items and push or throw them into the appropriate Sanitation Droid, all of which are hovering around/above the skiff.

Each skiff pile will contain ten items—three plastic/metal items, three paper, and four garbage—randomly arranged. Items will never fall off the skiff/screen. The three droids are positioned at 10:30, 12:00 and 1:30, and the game will simply help guide an inaccurately/indeterminately thrown item into the nearest droid, though more accurate throws will take less time to complete their trajectory than less accurate ones. If an item is thrown at the incorrect droid (e.g. the player throws a metal item at the paper droid), the item will simply bounce back onto the skiff. Items that are thrown at the correct droid register in the droid with a satisfying “*thunk*.” After five successful hurls (i.e. the correct item into the correct droid), the player’s “Force Lightning bar” (located on the right-hand side of the screen) goes up one increment (of three). Once the Force Lightning bar is “full” (i.e. after a total of fifteen successful throws) the player can use their Force Lightning to zap all the garbage remaining in a pile, thus reducing the number of items they need to sort/throw by up to 40% (but dissipating the Force Lightning, which must be built up again).
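
To make those numbers concrete, here is a minimal sketch (in Python, with invented names; it is just an illustration of the mechanic described above, not part of the actual design or any engine code) of how the pile, the throw resolution, and the Force Lightning bar might be modeled:

```python
import random

# Hypothetical sketch of the sorting / Force Lightning mechanic described above.
# None of these names come from the design itself; they are for illustration only.

ITEM_TYPES = ("plastic_metal", "paper", "garbage")  # one Sanitation Droid per type

def make_skiff_pile():
    """Each skiff holds ten items: three plastic/metal, three paper, four garbage."""
    pile = ["plastic_metal"] * 3 + ["paper"] * 3 + ["garbage"] * 4
    random.shuffle(pile)
    return pile

class ForceLightningBar:
    """One increment (of three) per five successful throws; full after fifteen."""
    def __init__(self):
        self.successes = 0

    def record_success(self):
        self.successes += 1

    @property
    def increments(self):
        return min(self.successes // 5, 3)

    def is_full(self):
        return self.increments == 3

def resolve_throw(pile, bar, item, target_droid):
    """A throw at the wrong droid bounces back onto the skiff; a correct one lands with a *thunk*."""
    if item == target_droid:
        pile.remove(item)
        bar.record_success()
        return "thunk"
    return "bounce"

def zap_garbage(pile, bar):
    """A full bar clears all remaining 'garbage' items (up to 40% of the pile), then dissipates."""
    if bar.is_full():
        pile[:] = [item for item in pile if item != "garbage"]
        bar.successes = 0
```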

The first round/skiff/jump will last for 30 seconds. Each game lasts five rounds. (In an exhibition environment, this length seems appropriate.) Successive rounds are five seconds shorter—so, by the fifth round players will have just 10 seconds to sort all 10 items successfully. If a player fails to sort all ten items within the allotted time, the player can see, off in the distance, other racers landing on and then jumping from their skiffs, while the Youngling’s friend lands on their skiff, tumbling and complaining, before the next round/jump begins. Every player can play through all five rounds, getting a score at the end of the game that depends on their overall performance. Players that successfully sort all ten items in every round can retrieve an object (again, via motion-sensor device) from the back of the Garbage Worm, win the race, and escape the pit. Players that fail to do so in one or more rounds lose the race and are goaded by the Tunnel Master to try again some other time.
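
And, in the same spirit as the sketch above (hypothetical names, just an illustration of the paragraph above), the round timing and the win/lose check could be expressed as simply as:

```python
# Hypothetical sketch of the round structure and win/lose conditions described above.

ROUNDS = 5
FIRST_ROUND_SECONDS = 30
SECONDS_SHORTER_EACH_ROUND = 5

def round_duration(round_number):
    """Round 1 lasts 30 seconds; each successive round is 5 seconds shorter (round 5 = 10)."""
    return FIRST_ROUND_SECONDS - SECONDS_SHORTER_EACH_ROUND * (round_number - 1)

def game_outcome(items_sorted_per_round):
    """Win only by sorting all ten items in every one of the five rounds."""
    if len(items_sorted_per_round) == ROUNDS and all(n == 10 for n in items_sorted_per_round):
        return "win"   # grab the item off the Garbage Worm and escape the pit
    return "lose"      # the Tunnel Master goads you to try again

# For example: round_duration(5) == 10, and game_outcome([10, 10, 10, 10, 9]) == "lose".
```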

The “timer” for each round is the action of the Youngling’s friend jumping. It doesn’t actually take the friend 30 seconds to fall from one skiff to the next, but Jedi can perceive time differently, and so things seem to be moving slower for them than for those not trained in the way of the Force. Since the player is just a Jedi-in-training, however, this slowed-down perception of time, and therefore the time the player has to act/react, lessens after each round. The jumping friend—the Youngling’s mental image of which is represented on the left-hand side of the screen—makes some kind of noise (yelling, hollering, whooping, etc.) at five-second intervals, each of which sounds louder and louder as the image of the friend grows bigger and bigger (as you would expect observing a noise-making object falling towards you).

———-

If I were to prototype this game in class, I would ask for six volunteers—one to serve as the player/Youngling/Jedi-in-training, one as the friend/NPC/timer, three as Sanitation Droids, and one to operate the Force Lightning bar. I would ask the player to sit on their heels, with the three Sanitation Droids sitting on their heels approximately three feet away from—but facing—the player, their hands forming a big “C” on the floor, at 10:30, 12:00 and 1:30. On the floor, in front of the player, I would place 10 golf balls that represent the ten items to be sorted on the skiff. Three would be of one color (for metal/plastic items), three of another (for paper items), and four (for “all other garbage” items) of a third color. The Sanitation Droids would be instructed to catch the golf balls rolled at them, reaching out to scoop in balls that are not necessarily on-target but closer to them than to the droid next to them. If a droid was rolled an incorrect ball (e.g. the volunteer representing the garbage droid was rolled a paper ball), they would simply roll it back to the player—or, rather, the player’s pile/group of items.

The friend/NPC/timer would stand to the player’s left, five paces away but within the player’s view. The Force Lightning bar operator would sit on their heels to the player’s right, about two feet away and similarly within the player’s view. After I start the round, the timer would start a stopwatch they had in their hand and take one step closer to the player every five seconds, uttering some exclamation at each interval—a bit louder with each step closer. The volunteer operating the Force Lightning bar would be responsible for counting the number of correct throws. Once (or if) the player reached five, the bar operator would place one wooden block on the floor (within the player’s view). Once (or if) the bar was filled, the operator would say “Full!” The player would then have the choice when to use their Force Lightning—shouting “Zap!” when they do, at which point the timer would pause their stopwatch (and paces), while the operator cleared away all the existing garbage balls left in the pile (and the three accumulated Force Lightning blocks). After this process was completed, the operator would say “Ready? Go!”, and play would resume.

Animation scribble

I’ve watched a great deal of animation over the course of my lifetime. This is one of the most memorable scenes from the latter half of my life:

http://www.nick.com/videos/clip/happy-happy-joy-joy.html

Ultimately, I think what gets me–and has stuck with me–over the years is the incredible straddling of happiness-joy-carelessness and madness-rage-idiocy. Repetition of elements is key here, I think. The audio helps with this straddling effect, of course, but I think even when the sound is off the mania of that state comes through.

“Elements of Experience Design,” by Nathan Shedroff

In this article Shedroff unpacks the notion of an “experience” as it pertains, in particular, to digital media design.

I found his framing of experience stages–attraction, engagement, and conclusion–to be fantastically simple and straightforward. It also reminded me of the traditional three-act structure that organizes so much theater and film. But what makes this additionally interesting, as far as I’m concerned, is that a digital media experience need not proceed in such a linear fashion as a film or play–which I think makes the whole notion of narrative much more challenging and much more exciting.

I also appreciated Shedroff’s differentiating presentation from organization and his focus on visualization.

There were a number of points in this article that made me think of our class’s WSF website design challenge. One of the main ones was Shedroff’s argument about interactivity–which I find myself very much in agreement with. Any number of things (i.e. digital media products) describe themselves as “interactive,” but are only so in the most basic (and unimaginative) of terms. WSF referring to its site having “interactive videos” is, in this day and age, approaching false advertising. Seeing such a thing gives me the same reaction I have whenever I see a DVD listing “Interactive Menus” as one of its selling points.

“Gesture-based tech: A future in education?”, by Charlie Osborne for iGeneration

I did find this piece to have lots of interesting stuff in it, but–while I realize this is a blog post and not an academic article–there are a couple of broad (and sometimes big) statements she makes that leave me wanting more: more consideration, more detail, more supporting evidence. Namely:

  • “What makes gesture-based technology unique in this respect is that it has the potential to allow collaborative efforts on a wider scale–more than setting up a classroom blog, or using Powerpoint to create a presentation, and can be used to further promote content engagement.”
  • “Gesture-based technology can be considered a medium within itself that students can learn from, as an interactive, active learning platform, rather than simply a means to play or access study material.”

This all sounds really, really exciting! But exactly how does it do all this stuff?

On another point–I am genuinely excited by the potential for gesture-based design to tap into ways of learning that are more intuitive–we might even say “natural”? Watching some of the videos of experiments and implementations with this technology, however, got me wondering, and feeling a touch more skeptical (though it feels like productive skepticism). Gestures may be more natural than a pencil or a keyboard in some ways, but is it really natural for humans to keep their arms up for such long periods of time? We’ve evolved walking around with our hands hanging down at our sides for millions of years, no? Should we be aiming for motion capture technology to get precise enough to read hand and finger gestures as we, say, sit in a chair or on a couch–or, if we’re feeling ambitious, walk around a room?

(I also thought the video embedded in the post, “Active vs passive learning,” was wonderfully hilarious–like a Flight of the Conchords video taking the piss out of Kraftwerk; or, rather, a Kraftwerk Appreciation Society at an IT grad program.)

Warfel, Prototyping Chapters 1 & 4

In Chapter 1 Warfel makes the case for–as the chapter title indicates–“The Value of Prototyping”–primarily in terms of project and client management. His argument is direct, declarative, practical, and concise. The chapter sections’ titles themselves outline his case:

  • Prototyping Is Generative
  • Prototyping—The Power of Show, Tell, and Experience
  • Prototyping Reduces Misinterpretation
  • Prototyping Saves Time, Effort, and Money
  • Prototyping Reduces Waste
  • Prototyping Provides Real-World Value

The argument of the only chapter section with a non-declarative title, “The Power of Show, Tell, and Experience,” is that “Prototypes go beyond the power of show and tell–they let you experience the design” (p. 3).

Within the “Reduces Waste” section, Warfel makes a compelling, enumerated argument not just in favor of prototyping but also points out a number of shortcomings of the “traditional requirements-driven design and development process.” Having produced, in a professional setting, a number of long, collaborative–or semi-collaborative–documents (i.e. grant proposals and reports), I found that a number of these criticisms hit home.

In Chapter 4 Warfel offers “Eight Guiding Principles” for prototyping. Like the first one, this chapter is absolutely packed with concise, direct, compelling critiques and practical advice. Reading a book like this in an academic environment is a very interesting experience for me. In no other discipline (or DMDL class, for that matter) have I come across descriptions/prescriptions for things like the “psychological technique known as priming” (p. 46) or “Principle 6: If You Can’t Make It, Fake It” (p. 51).