Last week I had the pleasure and honor of presenting at O'Reilly's Emerging Technology Conference. It was a great party, and I had a wonderful time presenting with Matt Cottam of Tellart. Our presentation was essentially a follow-up to a blog post I wrote in 2004. Matt and I met at CHI 2004 in Vienna and some of his observations led me to that blog post, and we've been in touch ever since, discussing these ideas.
Our presentation consisted of two parts. I led with a discussion of why sketching is a good metaphor for the kind of rapid hardware prototyping that is required as we move from the definition of basic technologies to designing products and experiences with those technologies. This dovetailed well with Bruce Sterling's keynote, in which he talked about defining the future of smart objects by defining the language we use. My point is related: the definition of our technological future rests in the tools we use. It's not a new idea, but I think it's important to be thinking about it right now, as the field moves from the component engineering stage to subassemblies defined by end-user experience rather than by engineering constraints.
Here's the abstract:
Robust physical computing prototyping systems are appearing continuously. More than just Lego Mindstorms, the BASIC Stamp, and microcontrollers, physical computing prototyping kits are a lightweight way to create real-world objects that have interesting functionality without having to learn (too much) electronics or mechanical engineering. New software glue layers allow for much easier interfacing with existing software products, which opens up a whole new world of hacking beyond the screen and beyond basic circuit bending.
And here are my slides (330K PDF).
Matt's half of the talk introduced NADA, his company's software that's explicitly created to enable sketching in hardware by quickly and easily connecting hardware prototyping toolkits to software created for designers, rather than engineers (specifically, Flash).
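To give a flavor of what a glue layer like that does, here's a minimal sketch of the general pattern, not NADA itself: a few lines of Python (using pyserial) that read sensor values off a serial port and push them to a Flash movie over its XMLSocket connection. The port name, baud rate, XML format, and one-reading-per-line serial protocol are all placeholders I've assumed for illustration.

    # Generic "glue layer" sketch (not NADA): serial sensor -> Flash XMLSocket.
    import socket
    import serial          # pyserial

    board = serial.Serial("/dev/ttyUSB0", 9600)          # assumed port and baud rate
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind(("127.0.0.1", 8080))                     # the Flash movie's XMLSocket connects here
    server.listen(1)

    flash, _ = server.accept()                           # wait for the Flash movie to connect
    while True:
        reading = board.readline().strip()               # e.g. b"512" from an analog sensor
        if reading:
            # XMLSocket messages are null-terminated strings
            flash.sendall(b"<sensor value='" + reading + b"'/>\0")

The point of a tool like NADA is that a designer never has to write this plumbing by hand; the sketch just shows how little stands between a sensor and something a Flash designer can work with.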
Here's Regine Debatty's coverage, and here are Liz's notes. Thank you, both!
(photos by James Duncan Davidson/O'Reilly Media)
Watch this space for more action along these lines. WOOHOO! Thank you, Matt!
[Addendum: I forgot to mention that in the hour before our talk, Matt and Mike Migurski of Stamen adapted Mike's IRC backchannel visualizer into a virtual controller for an old table lamp. As people typed in the ETech IRC backchannel, they controlled the brightness of the lamp. It was possibly the fastest real-world data visualization mashup ever. Unfortunately, as we were lifting the foamcore onto the table right before the talk, we must have created a short in the relay controlling the lamp, and it didn't really work while we were on stage. Next time, we're planning to use it as a realtime ambient display of speaker ratings, so speakers can know what the audience is really thinking about their talk. It'll be a good social experiment. ;-)]
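For the curious, here's a rough sketch of the kind of plumbing behind that mashup, not Mike's actual code: a Python script that counts messages in an IRC channel and maps the message rate to a brightness value sent over serial to whatever is driving the lamp. The server, channel, serial port, and one-byte brightness protocol are all assumptions.

    # Sketch of the idea only: IRC backchannel activity -> lamp brightness.
    import socket
    import time

    try:
        import serial                                    # pyserial; the lamp controller is hypothetical
        lamp = serial.Serial("/dev/ttyUSB0", 9600)       # assumed port and baud rate
    except Exception:
        lamp = None                                      # no hardware attached; just print

    HOST, PORT, CHANNEL = "irc.freenode.net", 6667, "#etech"   # assumed backchannel

    irc = socket.create_connection((HOST, PORT))
    irc.sendall(b"NICK lampbot\r\nUSER lampbot 0 * :lamp bot\r\n")
    irc.sendall(("JOIN %s\r\n" % CHANNEL).encode())
    irc.settimeout(1.0)

    messages, window_start = 0, time.time()
    while True:
        try:
            data = irc.recv(4096).decode("utf-8", "ignore")
        except socket.timeout:
            data = ""
        for line in data.splitlines():
            if line.startswith("PING"):                  # keep the connection alive
                irc.sendall(("PONG" + line[4:] + "\r\n").encode())
            elif "PRIVMSG " + CHANNEL in line:           # someone typed in the backchannel
                messages += 1
        if time.time() - window_start >= 5:              # every five seconds...
            brightness = min(255, messages * 25)         # ...turn message rate into 0-255 brightness
            if lamp:
                lamp.write(bytes([brightness]))          # assumed one-byte brightness protocol
            else:
                print("brightness:", brightness)
            messages, window_start = 0, time.time()

A chattier channel means a brighter lamp; silence lets it fade, which is exactly why it would make a fun ambient speaker-rating display.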