
This Week's Challenge: Chatty Baby Bib

(don't) Cry

While wandering through the Gap, you notice a hat that can broadcast your latest Facebook status. Or a scarf that displays @replies to your Twitter account, writ large in sparkling letters. Or a belt buckle that warns you when traffic becomes busy on your usual route home.

Mobile phones and tablets are only the first wave of connected devices, and our notion of what a "device" looks like is going to change radically. I find my designerly eye drawn most to wearable technology: clothing that can gather and display information, control other devices or services, or otherwise free human-computer interaction from the screen-based interface. If the current Arduino craze is any indication, our clothing will soon be enhanced with embedded processors, sensors, and lightweight software that talks to cloud services over Wi-Fi or cellular data networks.

How do designers create concepts that exploit these novel uses of technology? In this challenge, you're going to try to envision how a piece of wearable tech can be used by our most demanding technology users: babies.

You've been hired by a technology firm that wants your help devising a line of baby clothing that can monitor body heat, pulse rate, blood pressure, and other biometric information. The clothing can then change color, or display information about the data it has collected over time.
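To make the brief concrete, here's a minimal sketch of the kind of logic such a garment might run. Everything here is invented for illustration: the sensor readings, the thresholds, and the color names are assumptions, not part of the firm's actual product.

```python
# Hypothetical sketch: map biometric readings to a fabric display color.
# Thresholds below are illustrative only, not medical guidance.

def status_color(temp_c, pulse_bpm):
    """Pick a display color from a temperature and pulse reading."""
    if temp_c >= 38.0 or pulse_bpm >= 180:   # fever or elevated pulse
        return "red"
    if temp_c <= 36.0 or pulse_bpm <= 90:    # unusually low readings
        return "blue"
    return "green"                            # within the assumed typical range

# Readings collected over time become a color history the garment
# (or a companion app) could replay for a caregiver.
readings = [(36.8, 120), (38.4, 150), (37.0, 185)]
history = [status_color(t, p) for t, p in readings]
print(history)  # → ['green', 'red', 'red']
```

A real garment would sample continuously and smooth the data before changing color, but even this toy version suggests a storyboard beat: the moment a parent notices the fabric turn red.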

In 60 minutes, create a six-panel storyboard that depicts a critical usage scenario for this baby clothing, clearly showing the context of its use.

If you want to take it further, move from a drawn storyboard to a photo-real video scenario showing a faked prototype. Or, if you have the skills, a real one!

The above photo is by Pedro Klein, shared on Flickr via a Creative Commons Attribution 2.0 license.



Hi David,
Reading this made me think of what I saw at Microsoft Research this summer. They have built a device that uses the body as an input surface: it can accurately measure where you tap on the skin of your arm and fire an event on a device, or trigger an action, based on that touch. The project is called Skinput. We go from baby clothing that touches our skin and is connected to our vitals, to the skin itself becoming part of what can be leveraged in wearable devices.

They have one successful prototype that changes the song and adjusts the volume on an iPhone while running.

It is as if the skin becomes a touch interface.
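The interaction model is easy to picture as a dispatch table. This is a toy sketch, not Skinput's actual pipeline: it assumes the hard part (bio-acoustic sensing that classifies a tap to a named spot on the arm) has already happened, and the location names and actions are invented.

```python
# Toy sketch of "skin as touch interface": a classified tap location
# on the forearm is dispatched to a media-player action.
# Location names and actions are hypothetical.

ACTIONS = {
    "wrist": "play/pause",
    "mid-forearm": "next track",
    "elbow": "volume up",
}

def on_tap(location):
    """Return the action bound to a tap location, or ignore it."""
    return ACTIONS.get(location, "ignored")

print(on_tap("wrist"))     # play/pause
print(on_tap("shoulder"))  # ignored
```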


Here is a link to the skinput project page:
