5 posts categorized "Mobile"

Design, Disruption, and Drunk Usability Testing

Touch Here

I held the drunk man's hand like a dance partner at a debutante ball, sashaying our way towards the front door of the Collins Pub.

We had both been at the Seattle Matsuri, a two-hour "all you can taste" exhibition of sakes that would soon be hitting the American market. At the event, most of us directed the delicious sakes from each brewer's bottle out of our mouths and into the handily provided metal spittoons, thereby avoiding imbibing dozens of ounces of these potent wines and the fallout that could follow.

Then there were fellows like this man—whom we shall call Jeff, to protect his identity—who chose to swallow from each glass a bit too liberally. Upon running into him on the street after the event, he seemed quite lucid. But as our party sat down at the pub, desperate for a late dinner of burgers, fish, and chips to counter the onslaught of wine, you could see the power light draining right out of his eyes, his speech slurring from complete sentences to fragments. When he announced that he needed to get outside to wake up a bit, his attempt to stand up caused him to flip another table and fall to the ground in a mixture of both bewilderment and humiliation.

Sitting outside with Jeff for a little fresh air, we chatted haltingly about where he lived and what he did for a living, all the while demurring the advances of the usual Pioneer Square drug dealers offering cut-rate deals on stimulants and muscle relaxants. (Seriously, does this guy look like he needs a muscle relaxant?) But our real adventure began when he said the following: "Let's call my wife. She can pick me up."

First, we had to find the phone.

Continue reading "Design, Disruption, and Drunk Usability Testing" »

Phone Knows Best

Trust Me

Can a phone be your trusted best friend? Your personal trainer? Your confidante? Can it cheer you up when you're stressed? Can it know what you're feeling, and why you're feeling it? Can it go away when you just want to sit in the corner and cry?

If we're serious about pushing the utility of mobile devices to their absolute limit, then we will have to create software so sophisticated that it can discern the difference between the perceived intent of user actions and the actual intent contained in our brains and bodies. Computers will need to make us feel like they're reading our minds—not just our words, where our eyeballs are pointing, and where our body is positioned in physical space. And when we behave in an irrational manner (meaning, like human beings), these same computers will need to withhold judgment on what does not compute.

We will call these design challenges "HAL 9000 problems," because this leap in technological evolution brings up some very gnarly dilemmas for designers and developers—though not because an AI in a spaceship is preparing to kill us. (Yet.)

Continue reading "Phone Knows Best" »

Doing the UI Pantomime

Unifi In Progress User Flow

In my last 80 Works class, I asked the talented designer Scott Scheff to come as a guest. He brought a great exercise that has a lot of practical application for a group of designers looking to explore the nooks and crannies of an interface. Scott dubbed the exercise the "UI Pantomime," and it is a twist on a few of IDEO's role-playing methods.

The students were tasked with helping create the in-store experience for a "record store of the future" called Unifi. At this store, you could use an iPhone application as an adjunct to the shopping experience. The app would add the following to your retail browsing experience:

1. When looking at a CD, you could sample audio from the CD's tracks
2. When you purchased a CD in the store, you would automatically get MP3s of the CD tracks downloaded to your device
3. The application would also prompt you with related artists and featured artists (this week: Wilco)
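Those three features could be sketched as a minimal app model. Everything below—class names, method names, data shapes—is invented for illustration and wasn't part of the actual Unifi brief:

```python
# Hypothetical sketch of the Unifi app's three features from the brief.
# All names and data shapes here are invented for illustration.

class UnifiApp:
    def __init__(self, catalog, library):
        self.catalog = catalog    # CD id -> metadata (tracks, related artists)
        self.library = library    # the shopper's downloaded MP3s

    def sample_tracks(self, cd_id):
        """Feature 1: sample audio from a CD's tracks while browsing."""
        return [track["sample_url"] for track in self.catalog[cd_id]["tracks"]]

    def purchase(self, cd_id):
        """Feature 2: buying a CD in-store auto-downloads its MP3s."""
        for track in self.catalog[cd_id]["tracks"]:
            self.library.append(track["mp3_url"])

    def related_artists(self, cd_id, featured="Wilco"):
        """Feature 3: prompt the shopper with related and featured artists."""
        return {"related": self.catalog[cd_id]["related"], "featured": featured}
```

Even a toy model like this surfaces the questions the pantomime exercise is designed to flush out: when does the sample play, where do the downloads land, and when does the related-artist prompt interrupt the shopper?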

After the students were briefed, we set them loose with about 30 minutes to work through the details of the interface via the "UI Pantomime."

Here's how the exercise played out:

First 5-10 minutes: The students chose roles. One was the shopper entering the store. She held an eraser, which was a stand-in for an iPhone. A second student played the interface of the iPhone app and had to act out what was happening within the interface. The shopper and the app then worked through the use cases we'd provided, while another student was responsible for documenting a rough user flow and UI ideas based on the conversation between the phone and the shopper.

Second 5-10 minutes: All the students stopped role-playing/documenting the action and examined the user flow and user interface sketches, making refinements to screens based on the varying perceptions of each student.

Third 5-10 minutes: The students playing the shopper and the iPhone interface attempted to follow the user flow/UI sketches as documented. The third student observed the tension between the real-world interaction and the documented flow and recorded any new screens/areas that emerged.

Fourth 5-10 minutes: The students debriefed and revised the flow and screens.

If this were an at-work exercise, this iterative cycle could continue until the final "performance" felt complete.

When watching the students work through this exercise, there were a number of aha! moments for all of us.

Working out user flows and UI designs on paper is never a substitute for living through one. When creating a user flow and UI, it's fairly easy to document and improve upon what already exists. But if you're making a Web app or site from scratch, you should try to find a way to "live through it". Acting the flow out in the physical world affords us a much wider range of observed behaviors, which helps us select the one that is most usable and human.

What may not be apparent as an issue to one designer can be immediately apparent as an issue to a group of designers. Moving from free-form improvisation to literal, documented flows causes powerful tension that immediately calls into question every detail you've documented to date. After only a minute or two of directly following along with the documented user flow and UI design, the students asked to break out of the exercise and revise the UI right on the spot. They immediately knew which details weren't right.

Great Web sites and applications require friendly dialogue. Acting out interfaces forced us to bring a conversational nature to our application design. Alan Cooper rightly said that we treat computers like people, not boxes of logic -- we expect a tiny bit of emotion in how we exchange information. One of the students joked at the start of the exercise that he didn't feel like he was acting like an interface... he was too human-sounding. I shot back that he was acting how a good interface should behave.

Give this exercise a whirl and let me know how it works for you. We'll be trying it again in our next class!

Texting Your Fridge Was Never Easier

T-Mobile Cameo

We're all abuzz about items that consolidate all of our media-related needs into one hand-held device, such as the iPhone. Or, alternatively, we geek out over those little electronic doodads that just do things more simply. The Flip camcorder comes to mind, capturing 13 percent of the camcorder market within a year.

These two trends are buzzing merrily right along. We drool over devices of both stripes accordingly.

But there was a dearth of coverage this holiday season about a technological doodad that is pointing towards a new trend in our innovation-led technodevice industry. And lord knows you couldn't find it on T-Mobile's Web site -- they don't even feature it online.

Continue reading "Texting Your Fridge Was Never Easier" »

Give Your Phone the Finger


Multi-touch, gestural interfaces are the new black. And for the next four to five years, they're the immediate future of our ever-evolving human/computer interactions. But for us designers, I'd like to project a little further into the future and discern an even more likely scenario: true sense integration on mobile and desktop computing devices.

As designers, we usually only get to consider how media looks, sounds, and feels in a mildly tactile sense. In the future, we'll be able to consider these variables at a much greater depth and dimension than that of a static, unchanging substrate. I also wouldn't be surprised if smell and taste gained much greater prominence in the designer's arsenal.

Specifically, there are certain kinds of interactions regarding mobile and desktop devices that don't seem very far off from a technology standpoint. They do, however, require weaning us off the idea of doing our computing through a screen-topped device with a gestural input mechanism. Multi-touch interfaces don't have a ton of utility if you have disabilities, and definitely don't exploit other mechanisms we humans have for conveying and receiving information.

Here's what I'm dreaming of...

An earpiece that doubles as a phone and really understands what I want.

I don't always need to see the Internet to be able to grasp the information from it.

If you're looking to access the visual Internet, the iPhone dominates the field for ease of use and clarity and will likely be the gold standard for some time. But what if I'm going out on the town and don't want that phone in my pocket? Make the earpiece a phone as well, and pair it with trainable natural-language voice recognition software driven through the cell-phone network that learns my voice, my needs, and my quirky slang.

I could imagine the earpiece phone recognizing commands such as "give me turn-by-turn directions to Pacific Place," "pay my cell phone bill with my credit card," or "text my friend Joanie that I'll be twenty minutes late," and being smart enough to fulfill those requests without any major hiccups.
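At its simplest, this kind of command recognition is a mapping from utterances to intents and their slots. Real natural-language systems are vastly more sophisticated, but a toy matcher makes the shape of the problem concrete; all patterns and action names below are invented for illustration:

```python
import re

# Toy intent matcher for the hypothetical earpiece phone.
# Each pattern maps a spoken command to an action plus named slots.
INTENTS = [
    (re.compile(r"directions to (?P<place>.+)"), "navigate"),
    (re.compile(r"pay my (?P<bill>.+?) bill"), "pay_bill"),
    (re.compile(r"text (?:my friend )?(?P<name>\w+) that (?P<msg>.+)"), "send_text"),
]

def parse_command(utterance):
    """Map a spoken command to an (action, slots) pair, or None if unrecognized."""
    for pattern, action in INTENTS:
        match = pattern.search(utterance.lower())
        if match:
            return action, match.groupdict()
    return None
```

The hard part the post is dreaming about isn't this lookup -- it's getting from messy, quirky, slang-filled speech down to clean utterances in the first place, and learning a particular user's phrasing over time.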

This is a true expression of cloud computing, separate from the desktop, and it's where Google is starting to lay the groundwork with services such as 1-800-GOOG-411. They claim it's a not-for-profit venture, but it makes a heck of a lot of sense in their long-term strategy: a universe of cloud-driven Internet tools with great utility for a broad audience that further helps them sell search advertising.

Knowing how excited people get about these kinds of interfaces, I could see them being smart enough to recognize patterns of behavior and quietly prompt you: "Did you mean to pass by the cereal aisle? I know you like Lucky Charms." (Okay, that would be scary...)

A touch interface that communicates through sense of touch, not screen activity.

What's the weather going to be? I go to the weather service on my phone, and when I touch the screen to see what the upcoming weather's going to be like through the weekend, the surface of the touch interface gets hotter or colder depending on the time period my finger hovers over. Sounds frilly, right? Sure, if you aren't blind. Blind people should be able to ask their phone, "What's the temperature going to be tomorrow?" and have the phone adjust its heat output in relation to today's temperature to indicate the relative difference.

Another example. Let's say I'm considering taking SR-520 over I-90 to get to the Eastside. I ask my phone (using my voice interface) how the traffic is on SR-520. The steering wheel gets harder by 30%. Should I take I-90 then? The steering wheel softens dramatically. There are other ways of getting data besides barking orders to my phone/car/computer and having it bark back a series of choppily voiced words, which interrupt my enjoyment of the new MGMT album.
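Both of these ideas boil down to mapping a data value onto a physical output channel. A minimal sketch of those mappings, with all scale factors and ranges invented for illustration:

```python
# Toy mappings from data values to haptic output, as imagined above.
# The neutral temperature, scale factors, and congestion range are
# invented for illustration, not drawn from any real device spec.

def surface_temperature(today_f, forecast_f, neutral=90.0, scale=0.5):
    """Warm or cool a touch surface relative to a neutral resting
    temperature to convey tomorrow's change (degrees Fahrenheit)."""
    return neutral + scale * (forecast_f - today_f)

def wheel_stiffness(congestion, base=1.0):
    """Stiffen a steering wheel in proportion to traffic congestion
    (0.0 = free-flowing, 1.0 = gridlock doubles the resistance)."""
    return base * (1.0 + congestion)
```

The design question isn't the arithmetic -- it's choosing mappings that people can feel and interpret without conscious effort, and that degrade gracefully when the data is stale.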

Yes, the multi-touch gestural interface is very cool and gets rid of that mousy thing on the desk. But I want more sense out of my touch interactions.

Forget the idea of the phone altogether. It's part of the devices around me.

I know phone manufacturers and carriers want devices that earn money for large, publicly traded companies through the use of night and weekend minutes... but doesn't that idea sound... quaint?

I'd be perfectly happy if phone calls followed me from device to device around me, instead of me having to carry a device around in my pocket. Sure, there is the love that I'd lavish on a phone as part of my technological pocket arsenal next to the iPod, the (soon to be smart) wallet, my house keys, my sketch notebook, and my pack of mints. But I'm of the "less is more" camp, and less means no phone whenever possible.

Since I'm Gen X, I'm cool with being a little out of touch. I'm already seeing that use of cell phones will stratify: phones designed for the youth will be part of their uniform, while from Gen X on up, the phone is seen as a necessity, not an entertainment device. Higher-end luxury phones will be wispy, while phones for the youth will be badges.

But really, I'd like to get rid of the word phone altogether. Or at least call this new category of devices something else. The whole beauty of the term "mobile device" is that you don't have to say it's a phone/MP3 player/GPS/Knife/Wii remote. Let's just tack the word "multi-sensory" onto mobile devices and hope that the device manufacturers can pay it off with something that delivers some real utility to us technology junkies.