45. Tangible Interaction

Tangible Interaction has come to be the 'umbrella term' for a set of related research and design approaches that have emerged in several disciplines. It became noticeable as a research topic in the late 1990s and then rapidly grew into a research area.

Broadly, Tangible Interaction encompasses user interfaces and interaction approaches that emphasize

  • tangibility and materiality of the interface

  • physical embodiment of data

  • whole-body interaction

  • the embedding of the interface and the users' interaction in real spaces and contexts.

Tangible Interaction is a highly interdisciplinary area. It spans a variety of perspectives, such as HCI and Interaction Design, but specializes in interfaces or systems that are in some way physically embodied, be it in physical artefacts or in environments. Furthermore, it has connections with product/industrial design, the arts, and architecture. Finally, new developments in Ubiquitous Computing, actuation, sensors, robotics, and mechanics contribute enabling technologies to the field of Tangible Interaction.

45.1 A history of Tangible Interaction: influences, perspectives, and influential prototype systems

Tangible Interaction has been influenced by work from different disciplines, in particular Computing, HCI, and Product/Industrial Design. For Computing and HCI, the notion of a 'Tangible User Interface' (as originally conceived in the mid/late 1990s) constituted an alternative vision for computer interfaces, one that brings computing back 'into the real world' (Wellner, Mackay, Gold 1993; Ishii, Ullmer 1997). A general dissatisfaction with traditional screen-based interfaces and with Virtual Reality, both of which were seen as estranging people from 'the real world', motivated the development of the first prototypes, while technological innovations (e.g. RFID) made it feasible to build them. In contrast, the field of Industrial Design came to engage with Tangible Interaction out of necessity, as appliances increasingly contain electronic and digital components and become 'intelligent'. For designers, this constituted new challenges as well as new opportunities (Djajadiningrat, Overbeeke, Wensveen 2000; Djajadiningrat et al 2004).

An interesting point is that the challenges and established skills of the above-mentioned disciplines are complementary: where considerations of physical form factors, choice of materials, and so on forced computer scientists and HCI researchers out of their comfort zone, industrial designers now had to focus on designing complex behaviour that is digitally controlled and has no inherent relationship to product form.

These practice and research fields had no common discussion forum and intersected only occasionally or through personal contacts, with, for example, particular product ideas and sketches inspiring the notion of a Tangible User Interface. The Marble Answering Machine, devised by Durrell Bishop while studying design at the Royal College of Art, is one such sketch; it used marbles to represent incoming messages. The marbles fall out of the machine and can be played back by placing them into a mould on the machine (Poynor 1995). Generalizing this design yielded the idea of representing data through physical objects and of manipulating the data by physically handling those objects: Ishii's Tangible Bits vision (Ishii, Ullmer 1997).

In the early years of the new century, researchers with a design background participated more frequently in HCI-related conferences, starting a dialogue. From about the same time, the number of workshops addressing Tangible User Interfaces or Tangible Interaction (a term proposed by parts of the design community) as a topic increased steadily. From this grew an interdisciplinary research community that adopted the term 'Tangible Interaction' to describe its shared focus and has had its own conference since 2007.

With emerging technologies coming quickly onto the market, the field has become more diverse (e.g. some systems involve actuation, some rely on complex sensor-based data collection, some are based on conductive fabrics) and also more inclusive, as it has become easier and cheaper to build working prototypes and functioning systems. Whereas in the late 1990s specialized hardware and expertise were required to build a prototype with comparatively simple functionality, by 2009 this had become a standard project assignment in many industrial or interaction design courses.

The following gives an overview of the major influencing perspectives. As much of the conceptual and visionary development went hand in hand with the building of prototype systems, this is very much in the style of ‘a history through examples’.

45.2 HCI and Computing: Tangible User Interfaces

Within Computing and HCI, Tangible Interaction first became prominent with the notion of 'Tangible User Interfaces' (TUIs) proposed by Hiroshi Ishii and his group at the MIT Media Lab in 1997 (Ishii, Ullmer 1997). This work built on prior work by George Fitzmaurice in collaboration with Bill Buxton and Ishii himself (Fitzmaurice, Ishii, Buxton 1995). Fitzmaurice's PhD thesis (1996) explored the use of graspable bricks as a more direct input mechanism for interaction with graphical representations. It further suggested employing multiple graspable objects distributed in space, each with strong and specific functionality, instead of the generic input device we know as the mouse, which distributes input over time. The bricks were laid on top of graphics (displayed on a horizontal screen), which then became anchored to them. Moving a brick thus moved the graphics, and moving two corners of a triangle apart with two bricks would stretch the triangle correspondingly.
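
To make this 'space-multiplexed' idea concrete, the following is a minimal, hypothetical Python sketch (not Fitzmaurice's implementation): each tracked brick is bound to one handle of a graphical object, so moving a brick moves the corner it is anchored to.

```python
# Minimal sketch (not Fitzmaurice's code) of space-multiplexed tangible input:
# each physical brick is bound to one handle of a graphical object. Brick
# positions are assumed to arrive from a tracker; here they are plain tuples.

class Brick:
    def __init__(self, brick_id, x, y):
        self.id = brick_id
        self.x, self.y = x, y

class AnchoredTriangle:
    """A triangle whose corners are anchored to bricks placed on top of them."""
    def __init__(self, corners):
        self.corners = list(corners)   # [(x, y), (x, y), (x, y)]
        self.anchors = {}              # brick id -> corner index

    def anchor(self, brick, corner_index):
        self.anchors[brick.id] = corner_index

    def on_brick_moved(self, brick):
        # Moving a brick moves the corner anchored to it, stretching the shape.
        if brick.id in self.anchors:
            self.corners[self.anchors[brick.id]] = (brick.x, brick.y)

# Usage: two bricks anchored to two corners; pulling one outward stretches the triangle.
triangle = AnchoredTriangle([(0, 0), (4, 0), (2, 3)])
b1, b2 = Brick("b1", 0, 0), Brick("b2", 4, 0)
triangle.anchor(b1, 0)
triangle.anchor(b2, 1)
b2.x = 8                        # the user slides the second brick outward
triangle.on_brick_moved(b2)
print(triangle.corners)         # [(0, 0), (8, 0), (2, 3)]
```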

Tangible User Interfaces were envisioned as an alternative to graphical displays that would bring some of the richness of interaction we have with physical devices back into our interaction with digital content (Ishii, Ullmer 1997). The proposal was to represent digital content through tangible objects, which could then be manipulated via physical interaction. The core idea was to quite literally allow users to grasp data with their hands and to unify representation and control. Digital representations were to be closely coupled with the physical ones, usually through graphical projections on and around the tangible objects, which came to be referred to as 'tokens'.

One of the first examples developed by MIT's Tangible Media Group was a map that was manipulated by placing iconic representations of central buildings on it and moving these apart. Later on, the research group developed Urp, a system that supports urban planning (Underkoffler, Ishii 1999). Urp integrates a physical model with an interactive simulation of the effects of building placement on sunlight and wind flow. The tangible models of buildings cast (digital) shadows that are projected onto the surface, and simulated wind flow is projected as lines onto the surface. Several tools are available to probe, for example, the wind speed or the distance between points in space, and to change the properties of buildings (glass or stone walls) or the time of day, which makes the shadows move. Over the years, a series of related systems have been built, and the notion of TUIs was taken up by many other research groups worldwide.
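
The shadow behaviour can be pictured with a little solar geometry. The sketch below is illustrative only, assuming a crude sun model; it is not Urp's actual simulation code.

```python
import math

# Illustrative sketch only (not Urp's implementation): project a building's
# shadow length from its height and a simplified sun elevation derived from
# the chosen time of day.

def sun_elevation_deg(hour):
    """Very rough model: sun rises at 6:00, peaks at 12:00, sets at 18:00."""
    if hour <= 6 or hour >= 18:
        return 0.0
    return 75.0 * math.sin(math.pi * (hour - 6) / 12)  # assumed peak elevation: 75 degrees

def shadow_length(building_height, hour):
    """Shadow length on flat ground; None when the sun is down."""
    elevation = sun_elevation_deg(hour)
    if elevation <= 0:
        return None
    return building_height / math.tan(math.radians(elevation))

# Changing the 'time of day' moves the shadows, as in Urp.
for hour in (8, 12, 16):
    print(hour, round(shadow_length(building_height=30.0, hour=hour), 1))
```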

45.3 Influences from other disciplines: Product/Industrial Design and the Arts

Within other disciplines, a similar merging of physical form with digital content and behaviour occurred. Product Design increasingly concerns complex computational behaviour, and designers need to rethink how to make IT-related appliances legible and usable. Some design researchers have come to investigate how form and digital behaviour can be more closely coupled and how users could interact in richer ways with digital products (Djajadiningrat et al 2004; Jensen, Buur, Djajadiningrat 2005). The Marble Answering Machine is an early example of this endeavour. The term 'Tangible Interaction' originated in this context.

Djajadiningrat et al (2004) describe a concept sketch for a video deck that integrates the physical controls within the mechanism of the mechanical device, creating physical legibility of the controls. For example, the contours of the device are broken where there is interaction with the outside world, and the eject button has turned into a ribbon which lies under the tape and is pulled outward. They further describe the concept design of a digital camera that attempts to replace all of the typical menu functions and identical-looking buttons with physical manipulations of the camera. Here the user, for example, slides the screen towards the memory card in order to save an image and slides it towards the lens to return to ready mode.
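
At its core, the camera concept is a mapping from physical manipulations to functions. The sketch below is purely hypothetical (the gesture and state names are invented for illustration, not taken from the paper) and simply shows such a mapping in code.

```python
from enum import Enum, auto

# Hypothetical sketch of mapping physical manipulations to camera functions;
# the names are illustrative, not from Djajadiningrat et al's concept design.

class Gesture(Enum):
    SLIDE_SCREEN_TO_CARD = auto()   # user slides the screen towards the memory card
    SLIDE_SCREEN_TO_LENS = auto()   # user slides the screen towards the lens

class Camera:
    def __init__(self):
        self.mode = "ready"
        self.saved_images = []
        self.current_image = None

    def handle(self, gesture):
        # Physical manipulation replaces menu navigation and look-alike buttons.
        if gesture is Gesture.SLIDE_SCREEN_TO_CARD and self.current_image is not None:
            self.saved_images.append(self.current_image)   # save to the card
            self.current_image = None
            self.mode = "review"
        elif gesture is Gesture.SLIDE_SCREEN_TO_LENS:
            self.mode = "ready"                            # back to shooting

cam = Camera()
cam.current_image = "IMG_0001"
cam.handle(Gesture.SLIDE_SCREEN_TO_CARD)
cam.handle(Gesture.SLIDE_SCREEN_TO_LENS)
print(cam.saved_images, cam.mode)   # ['IMG_0001'] ready
```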

A further merging of digital and physical design can be seen in the worldwide emergence of 'Physical Computing' within design, driven by a culture of tinkering and making things (cf. Igoe and O'Sullivan 2004). Physical Computing involves fast prototyping with electronics and often reuses and scavenges existing technology (tinkering). It is defined as the design of interactive objects that are controlled by software and that people interact with via sensors and actuators.
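
In spirit, a Physical Computing program is a loop that reads sensors and drives actuators, with the behaviour defined in software. The minimal sketch below uses simulated, hypothetical DistanceSensor and Motor classes rather than the API of any particular board or toolkit.

```python
import random
import time

# Minimal sketch of the sensor -> software -> actuator loop at the heart of
# Physical Computing. DistanceSensor and Motor are hypothetical stand-ins
# (simulated here), not the API of any specific toolkit or microcontroller.

class DistanceSensor:
    def read_cm(self):
        return random.uniform(5, 100)   # pretend reading from real hardware

class Motor:
    def set_speed(self, fraction):
        print(f"motor speed set to {fraction:.2f}")

def run(sensor, motor, cycles=5):
    for _ in range(cycles):
        distance = sensor.read_cm()
        # The behaviour lives in software: the closer a person gets, the faster it spins.
        motor.set_speed(max(0.0, min(1.0, 1.0 - distance / 100)))
        time.sleep(0.1)

run(DistanceSensor(), Motor())
```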

Within the interactive arts, a related development can be seen. Many installations employ 'interactive spaces' which are sensorized to track users' behaviour and integrate tangible objects into the installation (see e.g. Bongers 2002). Often, whole-body movement is used to interact within these environments. Interaction designers have also developed an interest in bodily interaction, which can be pure movement (gestures, dance) or related to physical objects (Hummels, Overbeeke, Klooster 2007).

In a sense, whole-body interaction and interactive spaces take Tangible Interaction to another scale: instead of interacting with small objects that we can grab and move around within arm's reach (the main focus of Tangible User Interfaces and Product Design), we interact with large objects within a large space and therefore need to move around with our whole body.

45.4 ‘Tangible Interaction’ brought different perspectives under one umbrella

The term 'Tangible Interaction' has come to embrace all these developments. As argued by Hornecker and Buur (2006), the field prioritizes the following as principles of design:

  • tangibility and materiality

  • physical embodiment of data

  • bodily interaction

  • embeddedness in real spaces and contexts.

Hornecker and Buur argue that the original definition of Tangible User Interfaces excludes many interesting developments and systems from product design and the arts and therefore suggest using a more inclusive, less strictly defined term. The shift in phrasing from Tangible Interface to Tangible Interaction was intentional, similar to the distinction between Interface and Interaction Design. It places the focus on the design of the interaction instead of the visible interface. This puts the qualities of the interaction into the foreground of attention, and requires system designers to think about what people actually do with the system (see also: Djajadiningrat, Overbeeke, Wensveen 2000; Jensen, Buur, Djajadiningrat 2005). It further encourages thinking of the tangible system as part of a larger ecology and as located in a specific context. This has been described as the 'practice turn' by Fernaeus et al (2008), with newer conceptualizations of Tangible Interaction focusing on human action, control, creativity and social action instead of the representation and transmission of information.

The adoption of 'Tangible Interaction' as an umbrella term has supported the development of a larger interdisciplinary research community (the TEI conference series) but, as a downside, results in some tension and ambivalence as to where to draw the line between Tangible Interaction and other areas (for a report on the TEI 2007 and TEI 2008 panel discussions, see Hornecker et al 2008). For example, it remains open whether a car is a Tangible Interface and whether gesture-based interaction can be considered Tangible Interaction. Different people in the research community would answer these questions in different ways.

Tangible Interaction therefore overlaps at its fringes with a range of other research areas, summarized in this encyclopedia entry under 'Related Topics'. Whether a particular paper is framed as 'tangible' or, for example, as gesture-based interaction often depends on the conference or journal it is submitted to. The research community seems well aware of this ambivalence but has decided to embrace it: in 2010 the TEI conference changed its name from 'Tangible and Embedded Interaction' to 'Tangible, Embedded and Embodied Interaction' in order to more explicitly invite research on whole-body or gestural interaction.

45.5 Research directions

Tangible Interaction is a growing research area. Its commercial relevance is still somewhat unclear (if we disregard standard product design for a moment). Yet companies like Philips Design and Microsoft Research increasingly invest in research in this area, and TEI 2009 was hosted by Microsoft Research in Cambridge, UK.

Furthermore, there are an increasing number of spin-off companies that market systems in this area. The system currently best known to the public and the media is probably the reacTable (http://www.reactable.com/, see Jordà et al 2007) from the Universitat Pompeu Fabra. This is a table-based music performance instrument combining tangible input (movement of tagged objects on a flat surface) with multitouch interaction on the surface, enabling users to manipulate the graphics projected around the tangible input objects with their fingers. It was used by Björk during her 2007 world tour, won a Golden Nica at the Prix Ars Electronica in 2008, and is now being marketed for museums and, soon, for musicians and DJs.
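
The kind of mapping such a tabletop instrument performs can be sketched as follows: a tracker reports each tagged object's identity, position, and rotation, and the software maps that pose onto sound parameters. The mapping below is illustrative only and not the reacTable's actual implementation; the parameter choices are invented.

```python
import math

# Illustrative sketch of a tabletop-instrument style mapping (not the
# reacTable's actual code): a tracker reports tagged objects as
# (id, x, y, angle), and the software maps pose to sound parameters.
# Positions are assumed to be normalized to the range 0..1.

def map_object_to_sound(obj_id, x, y, angle_rad):
    """Map a tracked tangible's pose to hypothetical synth parameters."""
    frequency = 110.0 * 2 ** (x * 4)                 # horizontal position -> pitch (110-1760 Hz)
    amplitude = 1.0 - y                              # vertical position -> loudness
    filter_cutoff = 200 + (angle_rad % (2 * math.pi)) / (2 * math.pi) * 8000  # rotation -> filter
    return {"id": obj_id, "freq": frequency, "amp": amplitude, "cutoff": filter_cutoff}

# One tracker frame: object 7 sits at (0.5, 0.25), rotated a quarter turn.
print(map_object_to_sound(7, x=0.5, y=0.25, angle_rad=math.pi / 2))
```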

Application areas for Tangible Interaction are diverse. Many projects are aimed at supporting learning and education; this is where, so far, most systems have been deployed outside of the lab. Also common are domestic appliances, interactive music installations or instruments, museum installations, and tools to support planning and decision making.

Research still needs to tease apart what exactly the advantages of tangible interaction systems are and for which contexts and application areas they are most suitable. While there is good evidence that tangibles tend to support collaboration and social interaction (Hornecker, Buur 2006), it is, for example, less clear what kinds of tangibles are most effective in supporting learning (see Marshall 2007). Relatedly, design knowledge and guidelines are still scarce.

The availability of toolkits for physical computing has made it significantly easier to develop systems, contributing to the interdisciplinarity of the field.

An exciting new direction lies in the use of actuation. While initially only the input to Tangible User Interfaces was tangible, actuation allows for tangible system output beyond visual and auditory feedback.
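
As a toy illustration of actuated output (hypothetical, not any particular system), the sketch below drives a motorized token step by step towards a target position that represents a changed digital value.

```python
# Hypothetical sketch of actuated output: when a digital value changes, the
# system drives a physical token towards a matching position on the table.

def step_token_towards(token_pos, target_pos, max_step=1.0):
    """One control step for a motorized token (positions in cm along one axis)."""
    error = target_pos - token_pos
    step = max(-max_step, min(max_step, error))
    return token_pos + step

pos = 0.0
for _ in range(5):                 # the data value maps to a target position of 3.5 cm
    pos = step_token_towards(pos, target_pos=3.5)
print(round(pos, 1))               # 3.5
```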

45.6 Relevant conference series

TEI (Tangible, Embedded, and Embodied Interaction) is the first conference series dedicated to Tangible Interaction. It first took place in 2007, in Baton Rouge, Louisiana. TEI is a yearly conference with proceedings published in the ACM Digital Library, and since 2010 it has been organized in collaboration with ACM SIGCHI.

Other conferences such as CHI, NordiCHI, OzCHI, DPPI, Interact, IDC, Pervasive, UbiComp, DesForm, DIS (Designing Interactive Systems), and IEEE Tabletop also tend to invite submissions on Tangible Interaction or Tangible Interfaces. In general, most conferences in the HCI or (Interaction) design area nowadays consider tangible interaction a standard topic. The same holds for journals, with Personal and Ubiquitous Computing being the most prominent, and featuring the first special issue on ‘Tangible User Interfaces in Perspective’ in 2004.

Before TEI was established, tangible interaction had been a focus of, or was listed as a topic at, several workshops.

45.7 Related Topics

Given that the field of Tangible Interaction is still developing and has multiple origins and inspirations, there are numerous related topics that are only loosely demarcated from it.

Among the closely related areas there are, for example, Physical Computing (a term made popular by Tom Igoe and employed by designer/artist makers), Tangible User Interfaces (see Ishii 2007), Graspable User Interfaces (cf. Fitzmaurice et al 1995), physical-digital appliances (a focus on designing interactive intelligent products), interactive spaces (as discussed earlier, important inspiration for Tangible Interaction came from interactive spatial art installations), and Tangible Augmented Reality (which employs principles of tangible input in an Augmented Reality context).

Somewhat wider related areas are, for example, Appliance Design, Whole-Body Interaction and Movement-Interaction (which rely less on tangible objects), Interactive Tabletops/Surfaces (which might feature tangible input elements but may rely on pure touch), Embodied Interfaces, Ambient Technology, Ubiquitous and Pervasive Computing, Interactive Buildings and Interactive Furniture, or Organic Interfaces. As fields, these have less of a focus on tangibility, although example systems from these areas might very well fit within the area of Tangible Interaction.

45.8 Suggestions for further reading

Djajadiningrat, Tom, Wensveen, Stephan, Frens, Joep and Overbeeke, Kees (2004): Tangible Products: redressing the balance between appearance and action. In Personal and Ubiquitous Computing, 8 (5) pp. 294-309

Dourish, Paul (2001): Where the Action Is: The Foundations of Embodied Interaction. MIT Press

Fernaeus, Ylva, Tholander, Jakob and Jonsson, Martin (2008): Towards a new set of ideals: consequences of the practice turn in tangible interaction. In: Proceedings of the 2nd international conference on Tangible and embedded interaction 2008, Bonn, Germany. pp. 223-230

Fitzmaurice, George W. (1996): Graspable User Interfaces. PhD Thesis, University of Toronto. Available at http://www.dgp.toronto.edu/~gf/papers/PhD%20-%20Graspable%20UIs/Thesis.gf.html

Fitzmaurice, George W., Ishii, Hiroshi and Buxton, Bill (1995): Bricks: Laying the Foundations for Graspable User Interfaces. In: Katz, Irvin R., Mack, Robert L., Marks, Linn, Rosson, Mary Beth and Nielsen, Jakob (eds.) Proceedings of the ACM CHI 95 Human Factors in Computing Systems Conference May 7-11, 1995, Denver, Colorado. pp. 442-449

Hornecker, Eva and Buur, Jacob (2006): Getting a grip on tangible interaction: a framework on physical space and social interaction. In: Proceedings of ACM CHI 2006 Conference on Human Factors in Computing Systems 2006. pp. 437-446

Hummels, Caroline, Overbeeke, Kees and Klooster, Sietske (2007): Move to get moved: a search for methods, tools and knowledge to design for expressive and rich movement-based interaction. In Personal and Ubiquitous Computing, 11 (8) pp. 677-690

Igoe, Tom and O'Sullivan, Dan (2004): Physical Computing: Sensing and Controlling the Physical World with Computers. Course Technology

Ishii, Hiroshi (2007): Tangible User Interfaces. In: Sears, Andrew and Jacko, Julie A. (eds.). "The Human-Computer Interaction Handbook: Fundamentals, Evolving Technologies and Emerging Applications (2nd Edition)". Lawrence Erlbaum Associates. pp. 469-487

Ishii, Hiroshi and Ullmer, Brygg (1997): Tangible Bits: Towards Seamless Interfaces between People, Bits and Atoms. In: Pemberton, Steven (ed.) Proceedings of the ACM CHI 97 Human Factors in Computing Systems Conference March 22-27, 1997, Atlanta, Georgia. pp. 234-241

Shaer, Orit and Hornecker, Eva (2010): Tangible User Interfaces: Past, Present and Future Directions. In Foundations and Trends in Human-Computer Interaction, 3 (1) pp. 1-138

Ullmer, Brygg and Ishii, Hiroshi (2001): Emerging Frameworks for Tangible User Interfaces. In: Carroll, John M. (ed.). "Human-Computer Interaction in the New Millennium". Addison-Wesley Publishing. pp. 579-601

45.9 Extended literature list

Bongers, Bert (2002): Interactivating Spaces. In: Proceedings of the 4th Annual Symposium on Systems Research in the Arts 2002.

Djajadiningrat, J. P., Overbeeke, Kees and Wensveen, Stephan (2000): Augmenting fun and beauty: a pamphlet. In: Proceedings of DARE 2000, Designing Augmented Reality Environments. pp. 131-134

Hornecker, Eva, Jacob, Robert J. K., Hummels, Caroline, Ullmer, Brygg, Schmidt, Albrecht, Hoven, Elise van den and Mazalek, Ali (2008): TEI goes on: Tangible and Embedded Interaction. In IEEE Pervasive Computing, 7 (2) pp. 91-96

Jensen, Mads Vedel, Buur, Jacob and Djajadiningrat, Tom (2005): Designing the user actions in tangible interaction. In: Bertelsen, Olav W., Bouvin, Niels Olof, Krogh, Peter Gall and Kyng, Morten (eds.) Proceedings of the 4th Decennial Conference on Critical Computing 2005 August 20-24, 2005, Aarhus, Denmark. pp. 9-18

Jordà, Sergi, Geiger, Günter, Alonso, Marcos and Kaltenbrunner, Martin (2007): The reacTable: exploring the synergy between live music performance and tabletop tangible interfaces. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 139-146

Marshall, Paul (2007): Do tangible interfaces enhance learning? In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction 2007. pp. 163-170

Poynor, R. (1995): The Hand That Rocks the Cradle. In I.D. The International Design Magazine, pp. 60-65

Underkoffler, John and Ishii, Hiroshi (1999): Urp: A Luminous-Tangible Workbench for Urban Planning and Design. In: Altom, Mark W. and Williams, Marian G. (eds.) Proceedings of the ACM CHI 99 Human Factors in Computing Systems Conference May 15-20, 1999, Pittsburgh, Pennsylvania. pp. 386-393

Wellner, Pierre, Mackay, Wendy E. and Gold, Rich (1993): Computer-Augmented Environments: Back to the Real World - Introduction to the Special Issue. In Communications of the ACM, 36 (7) pp. 24-26
