Interactive tables or large displays are great for exploring and interacting with urban data, for example in urban observatories or exhibitions. They turn working with maps, visualizations, or other urban information into a fun and social experience. However, such interactive tables and large displays are also expensive – far too expensive for schools, public libraries, community centres, hobbyists, or bottom-up initiatives whose budgets are typically small.

We therefore asked ourselves how we could use the countless tablets and smartphones that typically idle away in our pockets and bags to compose a low-cost but powerful multi-user, multi-device system. How can we enable users to temporarily share their personal devices to create a joint cross-device system for social and fun data exploration?



Video of HuddleLamp demo applications

Our result is HuddleLamp, a desk lamp with an integrated low-cost depth camera (e.g. the Creative Senz3D). It enables users to compose interactive tables (or other multi-device user interfaces) from their tablets and smartphones simply by placing them under the lamp.

Technical setup of HuddleLamp with an integrated RGB-D camera (tracking region: 1.0 × 0.6 m).

HuddleLamp uses our free and open-source computer vision software to continuously track the presence and positions of devices on a table with sub-centimetre precision. At any time, users can add, remove, or rearrange devices without installing any apps or attaching markers. In addition, the users’ hands are tracked to detect interactions above and between devices.
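To give a feel for what “continuously tracking the presence and positions of devices” involves, here is a deliberately simplified sketch (not HuddleLamp’s actual vision pipeline): given the screen centres detected in each camera frame, it keeps device identities stable across frames by greedy nearest-neighbour matching, assigns fresh IDs to newly placed devices, and forgets devices that disappear. The function name and the 5 cm matching threshold are hypothetical.

```python
import itertools
from math import hypot

_ids = itertools.count(1)  # fresh device ids

def track_devices(previous, detections, max_jump=0.05):
    """Match detected screen centres (metres, table coordinates) to the
    devices tracked in the previous frame.

    previous:   dict id -> (x, y) from the last frame
    detections: list of (x, y) centres found in the current frame
    Returns a new dict id -> (x, y). Unmatched detections get fresh ids
    (a device was added); unmatched previous ids vanish (a device was removed).
    """
    current, unmatched = {}, list(detections)
    for dev_id, (px, py) in previous.items():
        if not unmatched:
            break
        # nearest detection to this device's last known position
        nearest = min(unmatched, key=lambda p: hypot(p[0] - px, p[1] - py))
        if hypot(nearest[0] - px, nearest[1] - py) <= max_jump:
            current[dev_id] = nearest
            unmatched.remove(nearest)
    for pos in unmatched:  # newly placed devices
        current[next(_ids)] = pos
    return current
```

A real tracker would also fuse RGB and depth cues to tell screens apart from hands and paper, but the identity-maintenance idea is the same.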

All this information is provided to our free and open-source Web API, which enables developers to write cross-device Web applications. These applications can use the tracking data to become “spatially aware”, i.e. to react to how the devices are arranged or moved in space. For example, physically moving a tablet on a desk can also pan and rotate the content of its screen, so that each device appears to be a kind of peephole through which users view a spatially situated virtual workspace. When users put multiple tablets or phones side by side, these peepholes merge into a single huddle, or federation, of devices that users can interact with as if it were one large display.
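The geometry behind this peephole behaviour can be sketched independently of the (JavaScript) Web API itself. Assuming the tracker reports each screen’s top-left corner in table coordinates, and assuming a workspace scale that matches the devices’ pixel density (both assumptions; the function name is hypothetical), each device’s viewport into the shared workspace follows directly from its physical position:

```python
def peephole_viewport(device_pos_m, screen_px, px_per_m=4000.0):
    """Map a device's tracked top-left corner on the table (metres) to the
    rectangle of the shared virtual workspace that it displays.

    device_pos_m: (x, y) of the screen's top-left corner on the table
    screen_px:    (width, height) of the screen in pixels
    px_per_m:     workspace pixels per metre of table surface
    Returns (left, top, right, bottom) in workspace pixel coordinates:
    physically moving the device pans its view, and devices placed
    edge-to-edge show contiguous regions of the workspace.
    """
    x, y = device_pos_m
    w, h = screen_px
    left, top = x * px_per_m, y * px_per_m
    return (left, top, left + w, top + h)
```

With this mapping, a tablet whose 800-pixel-wide screen measures 0.2 m shows a view that ends exactly where a second tablet placed flush against its right edge begins – which is why a huddle of devices behaves like one tiled display. (A full implementation would also apply the tracked rotation of each device.)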


Multiple tablets side-by-side form a huddle of tablets.

Peephole navigation and using multiple tablets as one tiled display.

HuddleLamp was created by Hans-Christian Jetter of the Intel ICRI Cities at UCL in London and Roman Rädle of the Human-Computer Interaction Group of the University of Konstanz together with colleagues from the UCL Interaction Centre.

Thanks to the great work of our student research interns Oscar Robinson (UCL), Jonny Manfield (UCL), and Francesco De Gioia (University of Pisa), who visited the ICRI during summer 2014, we were able not only to present HuddleLamp in a talk at the ACM ITS Conference 2014 but also to give a live demonstration there.

HuddleLamp is a first step towards a “sharing economy” for excess display and interaction resources in a city. We envision that in future cities users will be able to seamlessly add or remove their devices to or from shared multi-device systems in an ad-hoc fashion without explicit setup or pairing. Instead, this will happen implicitly as a by-product of natural use in space, for example, by bringing multiple devices into the same room, placing them side by side on a table or desk, and moving them around as needed. Ideally, users will experience these co-located cooperating devices and reconfigurable displays as one seamless and natural user interface for ad-hoc co-located collaboration.

Having created our free and open-source base technology, we are now looking at creating and studying examples of the visual exploration of urban data. Our goal is to enable citizens to create their own bottom-up urban observatories for community engagement and activism in spaces such as schools, public libraries, community centres, or museums.

Further reading:

To learn how to build your own HuddleLamp and HuddleLamp applications, please visit: http://www.huddlelamp.org or join the HuddleLamp Facebook group.

The ITS paper on HuddleLamp is also available here.

In their pursuit of “natural” or “intuitive” interaction, researchers and designers in Human-Computer Interaction have created a multitude of post-WIMP (post-“Windows, Icons, Menus, Pointer”) user interfaces and interaction techniques in recent years. Examples range from “perceptual computing” with depth cameras for gesture and body tracking and simultaneous pen & multi-touch interaction to tangible displays for augmented reality or entire rooms equipped with display walls and interactive tables.

The workshop “Blended Interaction – Envisioning Future Collaborative Interactive Spaces”, held at CHI 2013 in Paris on 28 April and organized by Christian Jetter with colleagues from Konstanz, Dresden, Edinburgh, St. Andrews and Hagenberg (see http://hci.uni-konstanz.de/blendedinteraction2013/), established “Blended Interaction” as a novel concept for understanding what makes interactive technologies “natural” or “intuitive”. In brief, Blended Interaction combines the virtues of physical and digital artifacts, so that the desired properties of each are preserved while computing power is integrated in a considered manner. In a world of Blended Interaction, computing is woven into the fabric of our natural physical and social environment (e.g. our cities) without being too obtrusive or disruptive.

Keynote speakers Robert Jacob (Tufts University), Michel Beaudouin-Lafon (Université de Paris-Sud) and Andy Wilson (Microsoft Research) gave exciting and inspiring talks about the theory, technology and vision of Blended Interaction. During the workshop, Christian also presented his vision of future self-organizing user interfaces that would be particularly appropriate for interacting in rapidly changing physical and social environments and usage contexts such as cities (slides: http://hci.uni-konstanz.de/downloads/blend13_jetter_slides.pdf, paper: http://hci.uni-konstanz.de/downloads/blend13_jetter.pdf).


Johannes Schöning (ICRI Cities, UCL & Hasselt University) and Hans-Christian Jetter (ICRI Cities, UCL) were invited to the European Commission’s Joint Research Centre (JRC) in Ispra, Italy, to attend a workshop at the European Crisis Management Laboratory of the Institute for the Protection and Security of the Citizen (http://old.gdacs.org/cr/ECMLWorkshops/BigWallHCI/JRC-ECML-BigWallHCI-Workshop-Programme_FINALv3LQ.pdf).


A view into the European Crisis Management Laboratory.

Johannes and Christian presented their vision of collaborative interactions in future crisis rooms together with Harald Reiterer and Simon Butscher from the University of Konstanz. In a joint presentation, they showed how different designs and technologies from their research in Human-Computer Interaction and Information Visualisation (e.g., collaborative tangible search on tabletops, folding views and lenses for collaborative geographical visualization, user identification and hand tracking using RGB and IR cameras, curved displays) can be used to enable multi-user geospatial analysis of real-time data such as Twitter feeds.


Different components for future crisis rooms developed by the University of Konstanz, University of Hasselt and UCL.

There was a lively exchange with the 40 workshop attendees from civil protection agencies, industry and academia about ideas and visions for future multi-user interaction with large displays in different application scenarios. For the research at ICRI Cities, the event was a particularly valuable opportunity to learn more about the many different data sources, sensor networks and simulation tools currently used by civil protection practitioners to monitor, analyse, make sense of and report on critical incidents at a global, regional or urban scale. Because the workshop took place in the actual crisis room that is used to provide the President of the European Commission with situation reports, attendees could experience typical tasks, tools and challenges first-hand. The inspiring event, organized by Markus Rester from the Crisis Monitoring and Response Technologies (CRITECH) group at the JRC, helped us to generate novel ideas about how to make data and simulations from a connected city accessible to analysts, policy makers, practitioners or city dwellers in a future “urban observatory”.