WineM, a ThingM technology sketch

At my new company ThingM, Mike and I have completed a technology sketch for WineM, a smart wine rack. Below is a video demonstration and an abstract. A full description can be found on the ThingM site. We periodically create Technology Sketches as a way to explore the ideas we’re thinking about.


(video hosted on Revver)

Abstract:
WineM is a Technology Sketch of a smart wine rack. It’s designed to locate wines in a wine rack using RFIDs attached to bottles and to display which wines have been located using LED backlights behind the bottles. Collectors (or anyone with a large wine cellar) can use it to search through collections, track the location of specific bottles and manage inventory with a minimum of data entry. Linking bottles to networked databases can provide information that would otherwise be too time consuming or difficult to obtain (for example, the total value of a collection, or all the wine that is ready to drink).
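
Since the abstract is light on mechanism, here’s a toy Python sketch of the core locate-and-light logic. Everything in it (the per-slot reader layout, the database schema, the LED call) is my own hypothetical stand-in, not the actual WineM implementation:

  # Hypothetical sketch of WineM-style locate-and-light logic.
  # Assumes one RFID read per rack slot and an LED backlight per slot.

  rack = {
      # slot (row, col) -> RFID tag read in that slot (None if empty)
      (0, 0): "tag-a13f", (0, 1): "tag-77c2", (0, 2): None,
  }

  wine_db = {
      # networked database keyed by RFID tag (made-up schema)
      "tag-a13f": {"name": "1999 Ridge Monte Bello", "ready": True, "value": 150},
      "tag-77c2": {"name": "2004 Bogle Petite Sirah", "ready": False, "value": 11},
  }

  def find_slots(predicate):
      """Return the slots whose bottle matches a search predicate."""
      return [slot for slot, tag in rack.items()
              if tag is not None and predicate(wine_db[tag])]

  def light_slots(slots):
      for slot in slots:
          print("LED on at", slot)  # stand-in for driving the backlight

  # Light every bottle that's ready to drink; total the collection's value.
  light_slots(find_slots(lambda w: w["ready"]))
  print("collection value:", sum(wine_db[t]["value"] for t in rack.values() if t))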

Word of the day: interfaction

interfaction:
An interface that interacts with you.
A portmanteau of interface and interaction.

For interfaces with a touch response, interfaction == haptic. But there are other kinds of interfaction: the ring of LEDs that surrounds a rotary encoder to show a parameter’s value, or keys that beep when you press them, are non-haptic interfactions.
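
As a toy illustration of that LED-ring example, here’s how the display half of the interfaction might look in Python (the ring size and the hardware call are made up):

  # A non-haptic interfaction: an LED ring echoing an encoder's value.
  NUM_LEDS = 16

  def ring_state(value, lo=0.0, hi=1.0):
      """Which LEDs in the ring should be lit for a parameter value."""
      fraction = (value - lo) / (hi - lo)
      lit = round(fraction * (NUM_LEDS - 1))
      return [i <= lit for i in range(NUM_LEDS)]

  print(ring_state(0.5))  # about half the ring lit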

The Wisdom of Emergent Crowds of Tipping Points

I’ve recently finished three books that have more to do with one another than one might expect:

The last one, while not as well written, does acknowledge the space it’s in by referencing the previous two books (and mentions our own Dave Pennock).

At some point I want to sit down with all three of these again and write down my slowly gelling thoughts. This is how we should design software and build robots.

Physical Music

There are at least two ideas I’d love to see implemented:

  • Lego-esque modular MIDI controllers
  • Physical tile-based Music Clip UI for Acid/Garageband/Logic/Cubase/etc.

Both of these ideas have been percolating around my local noosphere
for some time. So that I don’t forget, and before the ideas mutate,
I’m going to try to describe each one.

  • Lego-esque modular MIDI controllers
    For better or worse, we’re in an age of software synthesizers, software mixers, software effects, and so on. Plug-in hell. While the malleability of software allows a diverse culture of interfaces, we humans must still interact with them through the anemic interface of a single 2-D pseudo-analog pointer. Interacting with these software doppelgangers via such an input is tiresome and inefficient compared to the rich multivariate interfaces of the hardware devices they are based on.

    One might argue that there exist many external MIDI controllers, or ‘control surfaces’, to help one break out of the mouse jail, but these devices suffer from being either too generic (a bank of blank knobs) or too specific (a bank of mixer sliders).

    Instead, imagine a collection of physical UI modules: a slider, a knob, a button, a display, and a controller that has the MIDI interface. To make a mixer, you grab a bunch of sliders, click them together, and click the controller onto the end. To make a sequencer trigger, click together a bunch of buttons and displays.

    Ahh, but there’s a problem with this concept: no feedback from the software app about the state of the button, knob, or slider. This could perhaps be solved by embedding simple displays within the ‘input’ devices, like what the Nord Lead has. (A rough sketch of this idea follows the list.)

  • Physical Tile-based Music Clip UI
    The ‘clip’ interface present in Acid/GarageBand/Logic/etc. is pretty powerful: the horizontal axis is time, the vertical axis is ‘track’. An audio clip can be ‘drawn’ across a time range to indicate when it should play, and it can be truncated on either end to set at what point in the clip it should start and stop (or loop).

    Now think of how this could be implemented physically. I’m thinking: tiles representing audio clips on a surface that reads both the tile identity and its position (a sketch of this follows too). […tbd…]
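
Here’s a rough Python sketch of the first idea, modeling the module chain as data and the controller walking it to emit MIDI control-change messages. The module types and the positional CC-assignment scheme are my guesses at how it might work:

  from dataclasses import dataclass

  @dataclass
  class Module:
      kind: str       # "slider", "knob", "button", or "display"
      value: int = 0  # current position/state, 0-127

  class Controller:
      """The end-cap module that owns the MIDI interface."""
      def __init__(self, chain):
          # assign a CC number to each input module by its chain position
          self.cc_map = {i: m for i, m in enumerate(chain) if m.kind != "display"}

      def poll(self):
          for cc, module in self.cc_map.items():
              yield (0xB0, cc, module.value)  # MIDI CC message, channel 1

  # Click eight sliders together and cap the chain with a controller: a mixer.
  mixer = Controller([Module("slider") for _ in range(8)])
  for msg in mixer.poll():
      print(msg)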
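
And a sketch of the second idea: given raw reads of (tile id, x, y) from the sensing surface (which is hand-waved here), reconstruct which clip plays on which track and when. The grid-to-beats mapping is an assumption:

  BEATS_PER_COLUMN = 4  # hypothetical grid resolution

  clips = {"t1": "drum_loop.wav", "t2": "bass_line.wav"}  # tile id -> clip

  def arrangement(tile_reads):
      """Turn raw surface reads into (track, start_beat, clip) triples."""
      return sorted((y, x * BEATS_PER_COLUMN, clips[tile_id])
                    for tile_id, x, y in tile_reads)

  # Two tiles on the surface: drums at beat 0, track 0; bass at beat 8, track 1.
  for track, start, clip in arrangement([("t1", 0, 0), ("t2", 2, 1)]):
      print("track", track, ": play", clip, "at beat", start)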

Highlighting referring search engine text in web pages

On some random blog I arrived at via a search engine,
I noticed that the page had highlighted all the words
that were part of my query. This is awesome.

I wonder if this functionality could be added as a post-processing
step for all web pages before they are sent to the user
(an Apache module?). The logic could be quite simple:

  if (http_referer_exists() && http_referer_is_search_engine()) {
    my @words = get_query_words_from_referer();
    foreach my $word (@words) {
      # naively wrap each query word in a highlight span
      # (a real version would have to skip text inside HTML tags)
      $response =~ s/(\Q$word\E)/<span class="hilite">$1<\/span>/gi;
    }
  }
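
The only non-trivial stub up there is pulling the query out of the
referrer. Here’s a rough Python sketch of how that could work (the ‘q’
parameter and the list of search-engine hosts are my assumptions; real
engines vary):

  # Sketch of get_query_words_from_referer(): parse the engine's query string.
  from urllib.parse import urlparse, parse_qs

  SEARCH_HOSTS = ("google.", "yahoo.", "msn.")  # assumed, not exhaustive

  def query_words_from_referer(referer):
      parsed = urlparse(referer)
      if not any(host in parsed.netloc for host in SEARCH_HOSTS):
          return []
      # assume the engine puts the query in a 'q' parameter
      return parse_qs(parsed.query).get("q", [""])[0].split()

  print(query_words_from_referer("http://www.google.com/search?q=smart+wine+rack"))
  # -> ['smart', 'wine', 'rack']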