New data for old senses

(photos (cc) by eecue and decade_null, found on Flickr)

A couple of years ago I wrote about an idea I had for visualizing the implicit heat maps in Wifi signal strength using actual heat.

I never made the device, but I thought about the ideas it was pointing at and generalized them into an observation I called "new data for old senses," and wrote some notes about it that I never shared here. Today PT at Makezine posted a link to a project along the lines of what I was thinking about. It's a Wifi sensor that uses vibration to give you a sense of the Wifi landscape around you without having to look at anything, which was the crux of my idea in 2005. So, since the idea is now out there, here are my notes:
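At its core, a device like that vibrating Wifi sensor is doing one simple thing: mapping a signal-strength reading onto an intensity for a haptic channel. Here's a minimal sketch of that mapping in Python; the dBm range and the linear scaling are my own assumptions for illustration, not details of the actual device:

```python
def rssi_to_intensity(rssi_dbm, floor=-90.0, ceiling=-30.0):
    """Map a Wifi signal-strength reading (in dBm) to a 0.0-1.0
    vibration intensity.

    floor/ceiling are assumed values: -90 dBm (barely detectable)
    maps to no vibration, -30 dBm (very strong) to full vibration.
    """
    # Clamp the reading to the usable range, then scale linearly.
    clamped = max(floor, min(ceiling, rssi_dbm))
    return (clamped - floor) / (ceiling - floor)
```

In a real device, that 0.0-1.0 value would drive something like a pager motor's duty cycle, so the strength of the buzz tracks the strength of the network as you move through it.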


I'm interested in the idea of using senses that don't normally get used for device communication as secondary display channels. This is a way to open up what Jon Udell calls the vast middle ground between devices that demand either our full attention or none at all.

We have more senses than sight and sound, and those two channels are already saturated with important information. So how do we use our "secondary" senses to communicate "secondary" information?

What are other kinds of senses and other kinds of data we can use?

Here are the somatic senses (thanks, Google!):

  • touch
  • pressure
  • vibration
  • heat
  • cold
  • pain
  • proprioception (the sense of body position and joint movement)

What to visualize? Liz has been doing a bunch of stuff about visualizing people's relationship with the RF spectrum and geography, but I've been thinking that there are several granularities that would change in perceptible and interesting ways. At human scale in a city there's Wifi strength; at car scale there are things like crime maps, and at airplane scale there are political boundaries (voting records, natural phenomena).


The bottom line is:

How can we introduce secondary information into people's awareness in a secondary way, using their less-used senses, without adding cognitive noise to the primary channels of sight and sound?

1 Comment

Well, there's the sense of smell, or olfaction, that you haven't mentioned, as far as I can see. Jofish Kaye, while at the MIT Media Lab, wrote a thesis and built several olfactory displays. At Telenor Research & Innovation, we also built an olfactory display, used as a presence indicator in a SmartHome. I wrote a brief, informal review in 2002 about "smelly interfaces" and added some preliminary guidelines. Olfactory displays have limitations, both in the production of smells and in the types of information they can represent, but they are still interesting for "background information," as ambient displays are.





About this Entry

This page contains a single entry by Mike Kuniavsky published on March 15, 2007 7:33 PM.
