August 2010 Archives

The following is the transcript of the talk I gave at Device Design Day last Friday.

Information as a Material from Kicker Studio on Vimeo.

You can download the PDF (885K), with all images and transcript, or look through it (and download the original PowerPoint) on Slideshare.


Good Morning! Thank you very much for inviting me. It's a pleasure to be here.

First, let me tell you a bit about myself. Most of my professional career over the last 17 years has been spent as a consulting creative director, interaction designer, user researcher and user experience strategy consultant. I started doing Web design in the Jurassic era of the Web, the early 1990s--this is the logo of the first commercial website I designed in 1993. Since then I've worked with probably hundreds of web sites, and I've also helped a number of large electronics companies with their user experience issues.

I sat out the first dotcom crash writing a book based on the work I had been doing. It's a cookbook of user research methods.

And in 2001 I co-founded a design and consulting company called Adaptive Path.

I left the Web behind in 2005 and founded a company with Tod E. Kurt called ThingM. We're a small ubiquitous computing company and we design, manufacture and sell ubicomp hardware.

This talk is based on a chapter from my upcoming book on ubiquitous computing user experience design. It's called "Smart Things" and it's published by Morgan Kaufmann. They have a couple of copies here.

This book is my attempt to create a framework for the different kinds of activities, and the products of those activities, involved in device design, and to create some useful constraints to help move the field forward. Language is a pretty effective way to create conceptual constraints, so there's a strong undercurrent of defining terms in the book. I try not to invent completely new terms, but to clarify how existing terms apply to the practice of designing ubiquitous computing user experiences, if for no other reason than so that there's some shared terminology to use when describing what I do to stakeholders.

The book also has lots of illustrations, techniques and in-depth case studies of a number of commercial products, so don't worry, it's not all about words and concepts. This talk is largely from the wordy/conceptual side of it.

I want to start by mentioning a curious phenomenon. If any of you follow developments in microprocessors, you'll have noticed that the clock speed of today's new CPUs is basically the same as that of CPUs from five years ago. For those of us who used computers in the 80s and 90s, this is very confusing. We watched clock speeds go from 6MHz in 1983 to 3GHz in 2003. During that time, we became used to clock speed as the measure of power and value in information processing.

But after 20 years of exponential increase spanning 3 orders of magnitude, clock speed abruptly stopped going up in 2004. The industry went from exponential growth in clock speed to no growth, zero growth, in one season. It's like someone slammed the emergency brake.

I call this phenomenon Peak MHz.

Unlike oil, we're not literally running out of CPU clock cycles, but we are seeing a reevaluation of how we understand the value that computers provide, and this has resulted in a shift in the strategy of microprocessor makers. What happened in 2004 was, broadly speaking, that chip manufacturers saw that we were running out of uses for big, energy-hungry, hot processors, and they shifted the game. Since 2004 the competition has shifted from raw CPU speed to making smaller, cooler, cheaper chips that can do as much work as older chips, while using fewer resources.

Here's a slide from a talk Paul Otellini, the CEO of Intel, gave last year. Notice that instead of talking about numbers going up, processor manufacturing has become all about pushing numbers down. Instead of competing on doing more with more, they are now competing on doing the same with less. Less power, smaller size, and lower cost.

One of the main effects of this shift is that in addition to pushing the price, size and energy consumption of the latest CPUs down, it also pushes the price of all previous processing technologies down along with it. For example, at the beginning of the Internet era we had the 486 as the state of the art and it cost $1500 in today's dollars. It's the processor that the Web was built for and with. Today, you can buy that same amount of processing power for 50 cents, and it uses only a fraction of the energy. That decrease in price is the same 3 orders of magnitude as the increase in speed up to 2004. This is not a coincidence: both are products of the same underlying technological changes.

When a technology falls in price this much, it opens up enormous possibilities for new products, while creating fundamental changes in society as the new technologies displace established social systems and networks.
Steam engines, for example, lowered the price of harnessing energy by orders of magnitude...and the Industrial Revolution was born as people found all kinds of new uses for mechanical energy. Mechanization suddenly became an option for making and using things where it never existed, or was highly impractical.

You can see similarly transformative effects if you look at what happened when the price of extracting aluminum dropped by two orders of magnitude in the late 19th century, or when electric motors became significantly cheaper and smaller in the 1920s. When something becomes cheap, it quickly joins the toolkit of things we create our world with. It becomes a design material. Sometimes for better and other times for worse.

In the last five years cheap, small processors have drastically lowered the cost of taking information in, evaluating it, manipulating it, rearranging it, and acting on it. It is no longer unthinkable to have an everyday object use an embedded processor to take a small piece of information--say the temperature, or the orientation of a device, or your meeting schedule--and autonomously act on it to help the device do its job better. Information processing is now part of the set of options we can practically consider when designing just about any object.

In other words, information is quickly becoming a material to design with.

This capability of everyday objects to make autonomous decisions and act using arbitrary information is as deep an infrastructural change in our world as electrification, steam power, and mechanical printing. Maybe it's as big of a deal as bricks. Seriously, it's a huge change in how the world works, and we're just at the beginning of it.

If information is a design material, what are its material properties? Sure, at some level there are basic information-theoretic properties such as bandwidth, noise and complexity, but those are microscopic properties, the equivalent of basic nuclear forces in materials science. They won't help us design a Tickle Me Elmo Extreme, a device that's only practical to make using cheap information as a material. What are the MACROSCOPIC properties of information that we can use to design with?

  • Automatically sense the world

It can sense the world. There are thousands of sensors that convert states of the world into electrical signals that can be manipulated as information. This also includes sensors that sense human intention. We call these "buttons", "levers", "knobs" and so on.

  • Autonomously act on the world

Actuators, the generic term for anything that can make a physical change based on input, can be triggered by information. Thus, information can be used to autonomously affect the world in a way that no previous material was capable of.

  • Remember

Information can be used to store knowledge about the state of the world and act on it later. This could be just a single piece of data, such as what a mechanical thermostat does when it stores the temperature you'd like to keep your house at, or something much more sophisticated, say, storing an image of everything you look at.

  • Repeat exactly

One of the most transformative qualities of information is that it can be duplicated exactly and transmitted flawlessly. This has already changed the music and video industries forever.

But it also means that device behavior can be replicated exactly. We've become acclimated to it, but--stepping back--the idea of near-exact replication in a world full of randomness and uncertainty is a pretty amazing thing, and is a core part of what makes working with information as a material so powerful.

  • Create complex behavior

Information enables behavior that's orders of magnitude more complex than is possible with mechanics alone, at a fraction of the cost. This is a modern small-airplane avionics system. It consists of a bunch of fairly standard small computers running special software.

Compare that to a traditional gyroscopic autopilot where every single component is unique, it does very little, and to change its behavior you have to completely reengineer it.
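The first three properties above, sensing, remembering, and acting, are enough to describe something as ordinary as a digital thermostat. Here's a minimal sketch in Python of what that sense-remember-act loop looks like; all the names are hypothetical stand-ins, not any real product's firmware:

```python
SETPOINT = 20.0     # remembered state: the temperature the user asked for
HYSTERESIS = 0.5    # dead band so the heater doesn't rapidly cycle on and off

def control_loop_step(current_temp, heater_on):
    """One sense-decide-act step; returns the new heater state."""
    if current_temp < SETPOINT - HYSTERESIS:
        return True            # too cold: turn the heat on
    if current_temp > SETPOINT + HYSTERESIS:
        return False           # warm enough: turn the heat off
    return heater_on           # inside the dead band: leave it alone

# On a real device this would run forever, with read_temperature() and
# set_heater() standing in for the sensor and actuator drivers:
#
#   while True:
#       heater_on = control_loop_step(read_temperature(), heater_on)
#       set_heater(heater_on)
```

A dozen lines of logic and a fifty-cent processor replace a bimetallic coil, and the same loop can be copied exactly into a million devices.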

If you just thought, "Wait a minute. I know all this and, besides, Norbert Wiener covered this in Cybernetics in 1948," you're right. This is not new. We are intuitively familiar with these properties because we've been using computers for a long time. However, it is now more relevant than ever, because these same qualities can be distributed throughout the environment in a way that has never been economically feasible before. Wiener was writing from the equivalent position of Leonardo da Vinci, who could see that mechanized flight was possible, but it was not until four hundred years later that the technology, which included new manufacturing techniques, design techniques, and materials such as aluminum, made widespread commercial flight practical.
We're now at the point where theory can become reality, and we're in the position of actually having to make it happen.

So how does treating information as a material affect device design?

Object-oriented hardware
First, it changes the way that we think about hardware.

Because information can abstract knowledge, it makes it easier to reduce complexity, including the complexity of information technology itself.

Embedded processors make it possible to create an abstraction layer around basic sensing, processing and actuation components to create building blocks that are meaningful in human terms, rather than just electronic terms. Each block is an atom of functionality that has a CPU and communicates with other blocks over a network. This is the start of object-oriented hardware. What you see here are mostly prototypes that make it easier to demonstrate this idea, but this is already how many modern devices are constructed. A modern digital device is already more like a small network of interacting components than a monolithic product of a single ground-up engineering process.

From an interaction design standpoint, object-oriented hardware means that rather than starting from basic principles of electronics, you get to focus on the experience you're trying to create instead of on which capacitor to use. Most designers don't smelt their own iron to make things out of steel, or grow their own hardwood trees to make things out of wood. Similarly, object-oriented hardware turns information from a raw material into a design material.
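To make the abstraction concrete, here's a hedged sketch of what object-oriented hardware looks like from the designer's side. Every class and method name here is hypothetical, invented for illustration: each block hides its electronics behind a small, human-meaningful interface, and a device becomes a composition of blocks rather than a circuit.

```python
class Block:
    """A hypothetical atom of functionality: a CPU plus a network address."""
    def __init__(self, address):
        self.address = address

class LightBlock(Block):
    """Exposes light in designer terms (a named color), not capacitor terms."""
    def fade_to(self, color):
        self.color = color   # a real block's firmware would drive the LED

class MotionBlock(Block):
    """Wraps a motion sensor; the designer never sees the raw voltages."""
    def on_motion(self, callback):
        self._callback = callback   # a real block fires this from firmware

# A "device" is now wired at the level of intent, not electronics:
lamp = LightBlock(address=9)
sensor = MotionBlock(address=12)
sensor.on_motion(lambda: lamp.fade_to("warm white"))
```

The designer's vocabulary is "when someone walks by, fade the lamp up", and the electronics stay inside the blocks.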

ThingM, my company, makes a set of such atoms of information processing that emit light. Our BlinkM line of smart LED products makes it very easy to put controllable RGB light into arbitrary locations with no electronics knowledge or color theory. Pick some up today at fine electronics retailers worldwide.

OK, end of sales pitch.

Smart things
So what's made with these atoms?
On the next larger scale, we will see new personal tools. Today we have digital pedometers, Internet-connected bathroom scales, networked parking meters, and cars that don't stall, but there will be many more. Pick nearly any object, add information to it, and you get a new object. My favorite example of this is the adidas_1 shoe, which was put out 5 years ago and then almost immediately discontinued. It has a pressure sensor that it uses to estimate the quality of the surface being run on, and it adjusts the heel between strides to optimize resiliency for whatever surface you're running on. Buttons adjust how it responds.
For me it represents how a small amount of information, carefully deployed, can profoundly change an object.

Information as decoration
We will see information used as a decorative material, because just as you can use wood to hold up a house or make a sculpture with it, so information can be used to create incredibly beautiful, profound esthetic experiences. It has already revolutionized music and cinema, but treating it as a permanent material, rather than a medium, creates fantastic new opportunities.

Intelligent environments
When taken all together, these changes mean that our environment is becoming information-based at every scale, fractally: small information devices combine into large information devices, which combine into environment-sized devices made with information as a core material.

This is the big change that we're going to see happen in the world very soon.

I want to shift gears a bit and talk about two kinds of devices that I believe are important to distinguish in terms of how they use information as a material. These are two different classes of object that are actually made with the same material.

Two kinds of devices
One class consists of narrow-function devices whose value is primarily in the effect that they create locally. They are made with information to help them to do a small set of specific things much better than similar devices made without information. I call these devices appliances.

The second class consists of general-purpose computing devices that are designed to do many things. They have a wide variety of sensors to maximize the breadth of potential functionality, and much of their value comes from the remote services they provide access to, rather than their local technical capabilities. I call them terminals, because they evoke the tight relationship early terminals had with mainframes. In a sense, terminals are all the same object, but one which comes in a wide variety of sizes.

Both appliances and terminals are made of the same materials, and they have fundamentally identical capabilities, which tends to be confusing. If you know the old Slashdot refrain about Beowulf clusters made from random Linux-based devices, that's a joke about how any device that runs Linux can be used as a general-purpose computing platform. Yes, technically that's true, but in the era of cheap processing, it's no longer interesting. It's like joking that you can make an airplane out of melted soda cans because they're both made of aluminum. Yeah. So?

The key difference between the two classes of devices is of course the user experience they create, and that's where the design difference has to come in. If you try to make a terminal experience into an appliance experience, you break its core values as a terminal, and vice versa.

So when starting a project you can ask yourself: am I designing an experience that's more appliance-like or more terminal-like?

Appliances, terminals and networks
Probably the key consideration is how your device is going to work with a network.

Appliance + network = The Internet of Things
As narrow-function devices whose primary user value is local, appliances do one of two things over a network: they provide telemetry, or they serve as interfaces for a single, simple data feed.

This kind of simple, but highly relevant, data communication is what forms the core of the Internet of Things. In this approach appliances communicate with other appliances and people to create highly focused user experiences that connect physical products to each other in highly efficient, deliberately predetermined ways. Each device becomes more valuable because it is made with information, but only in one specific way.

For example, you can check on the status of your Amazon order because hundreds of devices, hundreds of appliances, are being used to track nearly every single atom Amazon is responsible for. Right now they're using barcodes. Soon these will become RFIDs, and after that they'll be active devices, like the FedEx SenseAware smart tag, which has a bunch of sensors, a GPS and the equivalent of a phone in it for sending data about where a package is and what conditions it's traveling in.
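That narrow telemetry role is worth sketching, because it shows how little an appliance has to say over the network. A minimal sketch, with hypothetical field names (not SenseAware's actual message format): the tag's entire network vocabulary is one small, fixed, self-describing report, emitted on a schedule.

```python
import json
import time

def telemetry_message(tag_id, lat, lon, temp_c):
    """The appliance's whole network job: one fixed, self-describing report."""
    return json.dumps({
        "tag": tag_id,
        "timestamp": time.time(),
        "position": {"lat": lat, "lon": lon},
        "temp_c": temp_c,
    })

# The tag would send this over its radio every few minutes, and nothing else:
report = telemetry_message("pkg-001", 37.77, -122.42, 4.5)
```

The simplicity is the point: the appliance stays cheap and predictable precisely because its communication is deliberately predetermined.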

Terminal + Network = Service Avatar
In the case of terminals, adding networked data has a different effect.

One of the core values of terminals is that they can make the same information accessible through a variety of devices. This has the effect of shifting value from the device, which is a generic container, to the information it carries. The terminal becomes a temporary representation of whatever information-based services it provides access to. It becomes the physical manifestation, what I call the avatar, of the service it represents. It is not the service itself, but people see "through" it to the service it represents.

For example, I have every expectation that I should be able to pause a movie on one terminal running Netflix and then unpause it on another. Why not? It's just a hole in space, a short-term manifestation of a single service I have subscribed to. The value is in the service, not the frame around it.
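That pause-here, resume-there expectation falls out naturally once the playback state lives in the service rather than the device. A minimal sketch, with hypothetical names; real services presumably work along these lines, but the details here are invented for illustration:

```python
class StreamingService:
    """The service holds the state; terminals are windows onto it."""
    def __init__(self):
        self.positions = {}   # movie_id -> seconds watched

    def pause(self, movie_id, position):
        self.positions[movie_id] = position

    def resume(self, movie_id):
        return self.positions.get(movie_id, 0)

class Terminal:
    """A generic container: phone, TV, laptop. Only the frame differs."""
    def __init__(self, service):
        self.service = service

    def pause(self, movie_id, position):
        self.service.pause(movie_id, position)

    def play(self, movie_id):
        return self.service.resume(movie_id)

service = StreamingService()
tv, phone = Terminal(service), Terminal(service)
tv.pause("movie-42", position=3600)   # pause on the living-room TV...
resumed_at = phone.play("movie-42")   # ...and the phone picks up there
```

Notice that the Terminal class contains no state of its own; any avatar of the service behaves identically, which is exactly the "hole in space" quality described above.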

The design of terminals is then a challenge to create the most transparent window, and the device design challenge is not in the device, but primarily in the design of the service it provides access to.

I feel that these are the kinds of questions we're going to have to ask, and the kinds of relationships we're going to have to examine as we extract device design from the hodgepodge of design techniques that still treat hardware, software and service design as separate entities.

Working with information as a material becomes a negotiation with this combination of technologies treated as a single thing. New materials create both possibilities and problems. We didn't get our flying cars, but neither did we have to fight atomic hydroplaning Soviet battleships.

The most important thing is to engage with the material as a single material that you work with, as a unified set of ideas, rather than separate things that are divided and abdicated to others. If you are here, you create technology. This means that it's your responsibility to understand the properties of information, explore its capabilities, and build tools that make it easier to do the right thing with information than to do the wrong thing. It is our responsibility as designers to do this exploration much more than it is Intel's, or LG's or the government's. They're just mining the raw ore. We're the ones who decide what to make with it.

Thank you!

I was doing some writing for my upcoming Device Design Day talk and started to make a list of two common kinds of smart things that I've been seeing out in the world. For lack of better terminology, I'm calling these appliances and terminals. I haven't yet processed all of these ideas, but here is an initial stab at distinguishing two major classes of smart thing.

| | Appliances | Terminals |
|---|---|---|
| Most functionality is | Local | Remote |
| Technical capabilities | Narrow. Technology is only included if it supports the core purpose. | Broad. Many possible sensors and actuators are included in case they're needed by a service. |
| Effectiveness | High. They're very good at the small number of things they do. | Low. They're OK at many things. |
| Interface complexity | Low. A narrow vision means the interface is relatively straightforward. | High. The general-purpose nature of the devices means that the burden of efficacy is on the interface design. |
| A group of them that is interoperating is called... | An ensemble | A service |
| A single member of the group is called... | An instrument | An avatar |
| Barriers to interoperability | High, unless they're designed to work together from the start. | Theoretically low, since they're designed to be avatars of the same service; in practice high, since cross-avatar UX is still in its infancy. |
| Distinguished from each other by | Specific function | Size |
| Strength of links between devices | Low. Connecting appliances that aren't designed to be connected is difficult. | Theoretically high, but service avatars don't often communicate easily in practice. |
| Examples | Digital pedometers, Internet-connected bathroom scales, networked parking meters, cars, Nike+iPod, cameras | Smart phones, netbooks, laptops, connected TVs |


Smart Things: Ubiquitous Computing User Experience Design

By me!
ISBN: 0123748992
Published in September 2010
Available from Amazon

Observing the User Experience: a practitioner's guide to user research

By me!
ISBN: 1558609237
Published April 2003
Available from Amazon

