Article

The Content-Specific Display: Between Medium and Metaphor

by Lukas Van Campenhout *, Elke Mestdagh and Kristof Vaes
Product Development, Faculty of Design Sciences, University of Antwerp, 2000 Antwerp, Belgium
* Author to whom correspondence should be addressed.
Designs 2024, 8(6), 109; https://doi.org/10.3390/designs8060109
Submission received: 15 August 2024 / Revised: 13 September 2024 / Accepted: 24 September 2024 / Published: 25 October 2024
(This article belongs to the Section Smart Manufacturing System Design)

Abstract: This paper examines the current generation of displays, as found primarily in smartphones, laptops and tablet computers, from an interaction design perspective. Today’s displays are multifunctional, versatile devices with a standardized, rectangular shape and a standardized interaction. We distinguish two pitfalls. First, they facilitate an interaction that is isolated and detached from the physical environment in which they are used. Second, their multi-touch interface provides users with few tangible clues and handles. From our background in embodied interaction, we establish an alternate, hypothetical vision of displays: the content-specific display. The content-specific display is defined as a display designed for one specific function and one type of on-screen content. We explore this concept in three student projects from the First Year Master’s program at Product Development, University of Antwerp, and present two key themes that emerge from it: causality and transformation. Both themes reside in the field of coupling, a concept well-known within the field of embodied interaction, and aim at a more seamless integration of on-screen content within the physical world. Finally, we discuss how the content-specific display influences the design process of digital products, and how it fosters collaboration between product designers and designers of graphical user interfaces.

1. Introduction: The Current Generation of Displays

In the last two decades, the rapid proliferation of displays has been undeniable and possibly irreversible. Today, displays are everywhere. They form the main interface of the current generation of computers and computer-like devices: smartphones, laptops and tablet computers. In addition, displays are a standard feature of most function-specific digital products, including cars, household appliances, sports equipment, digital toys, medical devices, public devices and industrial machines. In fact, displays have become so omnipresent in daily life that it is hard to find a place or situation where they are absent. We dare to say that the display has changed the face of industrial design [1]. As a consequence, displays have penetrated the personal space of people, and affect their behavior and identity. In this section, we discuss the standardized shape and interaction of the current generation of displays and formulate two forms of criticism.

1.1. Standardization

A reason for the ubiquity of today’s displays is their versatility. Today’s displays are conceived as multipurpose interfaces that can be employed in a wide range of applications and fulfill a diverse range of functions. Such versatility would not be attainable if the display were not, to a certain degree, a standardized device. From an interaction perspective, we distinguish three forms of standardization.
First, most displays share the same overall shape. They follow the gold standard of a flat, mostly rectangular and rigid form on which 2D images, be they static or moving, are presented [1].
Second, what is depicted on today’s displays, their on-screen content, is standardized. Today, this on-screen content is mostly provided by the Graphical User Interface (GUI), which has been the dominant paradigm in interaction with digital products for the past three decades [2]. The GUI makes digital information visually perceivable and accessible by generating on-screen content in the form of windows, icons and menus, and by spatially arranging them over the display’s surface. Originally, the GUI was developed for the desktop PC, a device that fulfills many different tasks. In order to suit all these tasks, the GUI was conceived as a generic interface: a dialogue between the user and the product, where the user manipulates icons and navigates through menus. As the PC left the office and the desktop, and as digital technology found its way into everyday life, the GUI has been increasingly applied in various digital products and systems [3,4]. It is said to be “generic and not sensitive to the particular nature or character of some given piece of information or specific act of use” [5]. In other words, the GUI has standardized the interaction with displays.
A third form of standardization stems from a specific hardware evolution: the advent of the multi-touch display. Products equipped with multi-touch displays offer enhanced versatility. Consequently, smartphones can adopt many identities. Besides a mobile phone, a smartphone is an Internet browser, a music player, a GPS system, a pocket calculator, an agenda and a camera. We referred to this phenomenon as the multi-touch computing paradigm [4]. The flip side of this versatility is, once again, a standardized interaction. Specific interaction routines with dedicated artifacts are replaced by standard finger gestures on a glass plate. The desktop PC was already criticized for the extent to which it homogenized different human tasks and activities [6]. The multi-touch display has further accelerated this evolution.

1.2. Two Forms of Criticism

1.2.1. A Detached Interaction

Driven by standardization, today’s displays are windows on a separate, detached digital world. They are distinct black boxes within the physical environment, predominantly utilizing a GUI, through which one person interacts with digital phenomena. When this person engages in such an interaction, they are drawn into a dialogue with this other, separate digital world, away from their physical environment and social context. The GUI display forms a standalone entity, which is decoupled from its physical surroundings. The digital information that it provides, its on-screen content, resides on the display and never becomes an integral part of the environment in which it is used [7,8]. One popular term that speaks volumes is the smartphone zombie, denoting the many inattentive smartphone users who pose a danger in today’s city traffic [9]. This paradigm, where the digital is a separate realm, only accessible through discrete windows in the physical environment, forms a foundation of traditional Human–Computer Interaction (HCI) [10], and is still dominant today. To illustrate this point, consider the following example. In-vehicle GPS systems do not display navigation instructions directly on the car’s windshield, where they would match the reality of the road. Instead, this information stays within the boundaries of the vehicle’s built-in display. As such, the driver is required to divert their attention away from the road in order to obtain the information. By separating the information from its contextual surroundings, the GPS system does not fit in the action of driving a car, and disrupts the interaction flow. Such systems poorly correspond with the way the driver interacts with their physical environment [11]. Furthermore, they are distracting and unsafe to use [12].

1.2.2. Touching the World through a Glass Plate

Before 2007, the year Apple launched its iPhone, displays were mainly output devices. They provided visual information, including text, graphics, animations, pictures or movies. The display was only one part of a digital system, and the user interacted with this system and the display through a physical control element, such as a button setup (keyboard) and/or a pointer device (mouse, joystick). With the advent of multi-touch technology, displays became input devices in addition to output devices. The multi-touch display has the characteristic that it replaces physical control elements with on-screen ones. Push buttons, levers, sliders and dials are all represented and directly manipulated on the display. We can say that the multi-touch display has dematerialized the physical control element [4]. A pitfall of the multi-touch display is its lack of tangibility. The on-screen control element is less tangible than its physical counterpart. You can touch it, but you cannot feel it [13]. There is no graspable shape or texture, and no mechanical movement. The information that the display provides is visual and symbolic, and requires cognitive interpretation from the user. By relying solely on cognitive skills and not stimulating the body, GUIs lead to an impoverishment of interaction and are often experienced as monotonous and repetitive to use [14].

1.3. Research Question

Following the desktop computing era, during which personal computers were used to fulfill office-related tasks, we are now in the era of ubiquitous computing [15]. Computers have “left the office and interweaved themselves in the fabric of everyday life” [16]. In both eras, the GUI display together with a standardized form of interaction, whether with a mouse and keyboard or with multi-touch technology, plays a central role. Desktop PCs, laptops, smartphones and tablet computers are highly versatile devices that can perform different tasks and adopt multiple identities. While there is obviously merit in this versatility, we question its one-sidedness, and regret the loss of tangibility and rich physical actions that it has brought about. We want to explore another side of displays, and formulate our research question: How can a display be defined that does not convey a sense of detachment, and that offers tangible richness and the capacity for expressive action routines?

2. Foundations for a New Perspective

In this section, we present two frameworks that shed a different light on displays. Next, we coin the concept of the content-specific display.

2.1. Embodied Interaction

PCs, smartphones, laptops and tablet computers are exemplars of traditional HCI [17] (p. 206). We adhere to another perspective on interaction with digital phenomena: embodied interaction [17] (pp. 99–126). Embodied interaction has roots in phenomenology [18,19] and ecological psychology [20]. It considers humans as physical creatures with a mind and a body that are strongly intertwined, and advocates a deep integration of digital phenomena in the physical world, by giving them material form, or embodying them. Consequently, these digital phenomena become tangible and graspable, and fit better in the physical world. People’s interaction with them becomes more natural and intuitive than the “abstract, symbolic styles of representation and interaction” that characterize traditional HCI interfaces [17,21] (p. 206). Embodied interfaces feel physical, while traditional HCI interfaces feel computer-like [3,22].
An example of embodied interaction with a digital system is riding an e-bike. The pedaling movements of the cyclist, together with environmental factors such as hills and headwinds, are detected by the e-bike’s motor system. This motor system adapts to these inputs by generating extra power when needed, maintaining a similar level of effort for the cyclist throughout their cycling journey. In the case of an e-bike, the user interacts with a digitally controlled machine in a natural and intuitive way that does not feel computer-like and relies on bodily skills.
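Purely as an illustration of the adaptive behavior just described, the following minimal Arduino-style sketch shows one way such a proportional-assist loop could work. All pins, constants, the linear sensor scaling and the control law are our own assumptions, not a real e-bike implementation.

```cpp
// A minimal sketch of a proportional-assist loop, assuming a torque
// sensor on an analog pin and a PWM-driven motor controller.

const int   TORQUE_PIN = A0;   // hypothetical torque-sensor input
const int   MOTOR_PWM  = 9;    // hypothetical motor-driver PWM output
const float TARGET_NM  = 15.0; // rider effort the system tries to maintain
const float GAIN       = 12.0; // PWM counts per Nm of excess effort

void setup() {
  pinMode(MOTOR_PWM, OUTPUT);
}

float readRiderTorqueNm() {
  // Assume the sensor maps 0..1023 ADC counts linearly to 0..50 Nm.
  return analogRead(TORQUE_PIN) * (50.0 / 1023.0);
}

void loop() {
  // When hills or headwinds raise the rider's effort above the target,
  // assist power rises accordingly, keeping perceived effort constant.
  float excess = readRiderTorqueNm() - TARGET_NM;
  int   pwm    = constrain((int)(excess * GAIN), 0, 255);
  analogWrite(MOTOR_PWM, pwm);
}
```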

2.2. Strong Specific Products

Within the field of embodied interaction, we adopt a particular perspective. This perspective advocates strong specific products, or digital products that are specifically designed to perform one particular function. Buxton [23] states that single-purpose products offer advantages over the one-size-fits-all approach of the personal computer. The reason for this is specialization [17] (pp. 195–196). If a product is dedicated to a specific task, its user interface can be designed around this task and its design can reflect a physical commitment to it. The authors in [3,24] speak of strong specific products, which they contrast with the weak general PC. The single-purpose character of strong specific products forms the blueprint for the very nature of their interaction concept, and opens the gate to the physical world and embodied interaction [4]. In order to clarify our stance, we provide an example. A well-known strong specific product is the pen display (Figure 1). The pen display fulfills one particular function, on-screen sketching and illustration, and it was designed to do so. Its single-purpose character and its design stimulate a physical, embodied interaction that differs substantially from interaction with a GUI and a mouse-keyboard setup. For example, using the mouse to drag a file into the on-screen wastebin merely evokes the feeling of dropping a paper sheet into a real wastebin. Interacting with a pen display, on the contrary, does not only feel like drawing on a flat surface; it is, in fact, drawing on a flat surface. The on-screen content of the pen display is part of this interaction concept, since it was designed for interaction with a pen, instead of a mouse or finger gestures.

2.3. The Content-Specific Display

The perspective of strong specific products inspired us to coin a hypothetical concept, which we refer to as the content-specific display [1]. The content-specific display is a specialized, single-purpose display, tailored to a single task, and to a single type of on-screen content. This implies that it only supports on-screen content that was specifically designed for that type of display. Consequently, the content-specific display can reflect a physical commitment to the on-screen content that it displays. In such instances, the display and its on-screen content form a synergy [1]; they are coupled. Coupling is a recurring theme in the field of embodied interaction [17] (pp. 138–144). It has been the focus of our previous research [25,26], and will continue to be so.

3. Research Design

The design research that we conducted adheres to the Design Science Research Methodology (DSRM) process model, proposed by Peffers et al. [27,28]. This model provides a nominal sequence of six activities, which are listed below. For each activity, we briefly explain how and to what extent it was implemented in our research process.
1. Problem identification and motivation
In Section 1.2, we identified two problems that are related to the current generation of smartphone, laptop and tablet displays. These problems are a detached interaction, and a dematerialization of physical control elements. In Section 1.3, we formulated our research question.
2. Definition of objectives for a solution
In Section 2.3, we defined a qualitative objective, by proposing the content-specific display as a concept to support solutions for both problems. In this study, we did not define quantitative objectives.
3. Design and development
In order to articulate the content-specific display, and to deepen our understanding of coupling, we employed a Research through Design [29] approach. We initiated the IxD (Interaction Design) Project in the First Year Master’s program in Product Development, University of Antwerp. As part of this project, the students designed and built a conceptual product incorporating a content-specific display. The results of this project are presented in Section 4.
4. Demonstration
The resulting demonstrators and videos form an integral part of the first author’s lectures on embodied interaction at the University of Antwerp, and at public events.
5. Evaluation
It should be noted that the demonstrators have not undergone a formal user test. Their objective was to explore and critically reflect [30], rather than to verify. It is therefore up to the viewer of the accompanying videos to assess the extent to which the demonstrators provide a solution for the identified problems.
6. Communication
This publication is aimed at design students, design researchers and design professionals. It uses videos to present the demonstrators and the interaction they afford [31], while simultaneously establishing a formal framework.

4. IxD Project Results

In this section, we present and discuss three results from the IxD project. In the IxD project, students were asked to define a specific product function and design a conceptual product that fulfills this function [32,33]. The conceptual product had to contain a content-specific display. Although the product was not required to align with an existing market context, it was essential that it possessed the necessary product characteristics in order to serve as an inspiration to design students, design researchers and professional designers. The objective of the project was to construct a working demonstrator that would enable the spectator to experience what the product looks and feels like [34,35]. The resulting demonstrators comprise 3D printed components, wooden elements and spare parts. The electronics of the demonstrators (LEDs, actuated movements) are realized with the Arduino prototyping platform, while the displays are simulated with a small projector. The most distinctive aspect of the IxD project is its integration of two seemingly separate design tasks: the design of the physical product and the design of the on-screen content. Both tasks are completed simultaneously, by the same students. We present three key results from this project and distill two themes that extend the concept of coupling.
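The demonstrator code itself is not published with this paper. Purely as an illustration of the kind of setup described above, the following Arduino-style sketch couples a slider potentiometer to a strip of RGB LEDs and streams the slider position to the projection software over serial. All pin assignments, LED counts and colors are hypothetical.

```cpp
#include <Adafruit_NeoPixel.h>

// Hypothetical demonstrator wiring: a slider potentiometer on A0, a
// strip of RGB LEDs standing in for an LED matrix, and the projection
// PC listening on the serial port for the slider position.
const int SLIDER_PIN = A0;
const int LED_PIN    = 6;
const int NUM_LEDS   = 30;

Adafruit_NeoPixel strip(NUM_LEDS, LED_PIN, NEO_GRB + NEO_KHZ800);

void setup() {
  Serial.begin(9600);
  strip.begin();
}

void loop() {
  int pos = analogRead(SLIDER_PIN);          // 0..1023
  int lit = map(pos, 0, 1023, 0, NUM_LEDS);  // LEDs 'follow' the slider

  for (int i = 0; i < NUM_LEDS; i++) {
    strip.setPixelColor(i, i < lit ? strip.Color(255, 180, 40) : 0);
  }
  strip.show();

  Serial.println(pos);  // projection software animates the on-screen content
  delay(20);
}
```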

4.1. Balans—Sien Heirbaut, Chiara Rousseau and Laurien Wouters

Demonstrator video can be seen in Supplementary Material: Video S1—Balans.

4.1.1. Description of the Interaction

Balans is situated in a house where two people live together, as a couple, friends or roommates. With Balans, each of the two persons can record an audio message for the other, who then listens to the recorded message. In the video, Balans is situated on a cabinet. It is a white, spiral-shaped object with two endings (Figure 2). The smaller ending is for recording a message. The larger ending is for playing messages. When not in use, the device is in neutral mode. Both endings are balanced, and the spiral’s side plane displays lines that accentuate the outer shape of the product by running across it. Chiara records a message for her roommate Sien, by pulling out the slider at the small ending (Figure 3a). The lines on the side plane of Balans follow the pull movement and run toward Chiara. The device is now in record mode. As Chiara speaks her message, the small ending of Balans fills up with light, representing the recorded message (Figure 3b). When Chiara stops recording, she pushes the slider back in (Figure 3c). The on-screen elements are pushed through the spiral shape, representing the recorded message that flows to the other side of Balans. At that point, the whole product tilts over to this ending (Figure 4a). Once tilted, the large ending of the product is filled with light, and the product is in play mode. The 3D shape of the product, as well as its on-screen content, shows that a message is waiting to be heard. Chiara leaves the house, and sometime later, Sien comes home. She sees that Chiara has left a message, and slides her finger into the slot at the large ending, toward the speaker (Figure 4b). The light follows the movement of her finger, and flows out of the device, as the message is played (Figure 4c). Once the message is played, Balans tilts back to neutral mode.

4.1.2. Balans as a Content-Specific Display

The side plane of Balans is a content-specific display (Figure 2). It was specifically designed, together with its on-screen content, for this particular product. It is a simple RGB LED matrix and does not require multi-touch or OLED technology. The display differs from the aforementioned display paradigm in the following ways.
The display has a 3D shape that follows the spiral shape of the product. The on-screen content seems to ‘be aware’ of this shape, as it itself is shaped by it. The light, representing the message, moves from one product ending to the other and shows the user where they should approach the product.
The product features two physical shape changes. First, there is the pulling of the slider at the small ending in order to record the message, a shape change performed by the user (Figure 3a). Second, there is the tilting of the product toward the large ending, an actuated movement (Figure 4a). Both shape changes are meaningful in that they reveal something about the product’s state at that moment. Together with these physical shape changes, the on-screen content on the side plane moves along.

4.1.3. Coupling

In the context of the content-specific display, coupling refers to the relationship between the physical display on one hand, and its on-screen content on the other. Henceforth, movements of the physical display, whether initiated by the user [36] or automatically actuated [37,38], are referred to as physical events. On-screen movements, i.e., movements of the display’s on-screen content, are called on-screen events. Coupling encompasses the user’s perception of these events as belonging together and forming an integrated whole.
In a previous study [26], we defined two aspects of coupling, coupling of meaning and coupling of expression. Both aspects are interrelated, as coupling of meaning is a prerequisite for coupling of expression. We begin by explaining coupling of meaning, using a simple example. Consider a concept for a wall thermostat that regulates the temperature in a two-story flat (Figure 5). The thermostat has a display and two rotary dials. Two on-screen thermometers show the temperature of the ground and first floor, respectively. This concept couples a physical event, the rotating of the dial, with an on-screen event, the vertical movement of the thermometer level. Both events are coupled on the following aspects:
  • Time: When the left dial is rotated by the user, the temperature on the left thermometer, corresponding to the ground floor, is adjusted (Figure 5). The same process applies to the right dial and the temperature on the first floor.
  • Location: Each rotary dial is positioned next to the thermometer that it controls, in accordance with the principle of natural mapping [39].
  • Direction: The direction of the rotation is coupled to the movement of the thermometer level. When this direction is reversed, the movement of the thermometer level is reversed as well.
Figure 5. Concept for a wall thermostat.
These three aspects belong to coupling of meaning. They concern pragmatic values [40], the product’s usability and utility: how well a product fulfills the task it was designed to fulfill, and how easily the user can accomplish this task.
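Purely as an illustration, the thermostat’s three aspects of coupling of meaning could be implemented as follows in a minimal Arduino-style sketch, with the rotary dials simplified to potentiometers and the thermometer levels sent to the display software over serial. All pins, temperature ranges and the serial protocol are our own assumptions.

```cpp
// Hypothetical sketch of the thermostat's coupling of meaning. Two dials
// (potentiometers here) each drive one on-screen thermometer level,
// which the display software reads from the serial port.

const int LEFT_DIAL  = A0;  // sits next to the left thermometer  -> location
const int RIGHT_DIAL = A1;  // sits next to the right thermometer -> location

void setup() {
  Serial.begin(9600);
}

void loop() {
  // Both levels are updated in the same cycle as the dials are read
  // -> time: the thermometer moves the moment the dial moves.
  int groundFloor = map(analogRead(LEFT_DIAL),  0, 1023, 5, 30);
  int firstFloor  = map(analogRead(RIGHT_DIAL), 0, 1023, 5, 30);

  // map() preserves the sign of the movement -> direction: turning a
  // dial up raises its thermometer level; reversing the rotation
  // reverses the movement of the level.
  Serial.print("L:");  Serial.print(groundFloor);
  Serial.print(" R:"); Serial.println(firstFloor);
  delay(50);
}
```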
Coupling of expression extends beyond coupling of meaning and emerges as a consequence of it [26,41]. It is related to hedonic values or the user’s emotional well-being [40,42]. Balans clearly illustrates how both forms of coupling relate to one another. When the user pulls the slider to record a message, the lines on Balans’ display follow this movement, and move toward the user, as if they were pulled toward them (Figure 3a). The physical event of pulling the slider is coupled to the on-screen event on the aspects of time, location and direction, much like the rotary dials in the thermostat example. The user experiences coupling of meaning, which makes the interaction logical and intuitive. But there is more. When pulling the slider, the on-screen content seems to possess real physical properties, and the user gets the impression that they actually pull the on-screen elements toward them. This effect is further enhanced when the user, after recording the message, pushes the slider back (Figure 3c). The on-screen lines follow this movement, and cause Balans to tilt toward play mode, as if influenced by their ‘weight’ (Figure 4a). The user realizes that it is impossible to grasp and manipulate an on-screen event, and that on-screen elements are weightless as they inherently lack physical properties of their own [43,44,45]. The seemingly natural coupling has been deliberately staged in the design of the product and causes an experience of magic and surprise. We referred to this form of coupling, which seems natural and staged at the same time, as coupling of expression [26].
In Section 4.2, we will show that coupling of expression appears in two forms. Balans demonstrates one of them. We refer to the relationship where Newtonian laws seem to join physical and on-screen events as causality [1]. Newton’s third law states that when two bodies interact, they exert forces on each other: for every action in nature, there is an equal and opposite reaction. When this Newtonian logic is applied to both components of the content-specific display, this implies that they both act in pairs. Causality can operate in both directions, within the same content-specific display. A movement of a physical element causes a movement of the on-screen content, which in turn causes a movement of the physical element.
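To make this bidirectional chain concrete, here is a minimal Arduino-style sketch of how Balans’ causal loop could be staged: a physical event (the slider) drives the on-screen content, and the on-screen content, upon reaching the large ending, drives a physical event (the actuated tilt). Pins, angles, thresholds and the lag model are illustrative assumptions, not the students’ actual implementation.

```cpp
#include <Servo.h>

// Hypothetical sketch of the causal chain in Balans: a physical event
// (pushing the slider) drives an on-screen event (the lines flowing to
// the large ending), which in turn drives a physical event (the tilt).

const int SLIDER_PIN = A0;
const int TILT_PIN   = 9;

Servo tiltServo;
float contentPosition = 0.0;  // 0 = small ending, 1 = large ending

void setup() {
  Serial.begin(9600);
  tiltServo.attach(TILT_PIN);
  tiltServo.write(90);        // balanced, neutral mode
}

void loop() {
  // 1. Physical -> on-screen: the slider position pulls the content
  // along, with a slight lag, as if the lines had inertia.
  float slider = analogRead(SLIDER_PIN) / 1023.0;
  contentPosition += (slider - contentPosition) * 0.1;
  Serial.println(contentPosition);  // projector animates the lines

  // 2. On-screen -> physical: once the content's 'weight' has gathered
  // at the large ending, the product tilts over into play mode.
  if (contentPosition > 0.95) {
    tiltServo.write(60);  // actuated tilt toward the large ending
  }
  delay(20);
}
```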

4.2. Furo—Anthony Collin, Christophe Demarbaix and Jasper Verschuren

Demonstrator video can be seen in Supplementary Material: Video S2—Furo.

4.2.1. Description of the Interaction

The Furo concept (Figure 6) is a beverage dispenser that is conceived as a tap with a digital display, mounted on a horizontal surface that also serves as a display. The concept enables the user to set the drink volume in a direct, intuitive way. At the beginning of the interaction routine, a red liquid substance is displayed on the horizontal surface, around the tap base (Figure 7a), indicating that the tap is ready for use. The user pulls the slider at the top of the tap, and seemingly absorbs fluid from the horizontal surface into the tap’s display (Figure 7b). They do this until the desired drinking volume is reached (Figure 7c). Subsequently, an indication is displayed on the horizontal surface (Figure 8a), and the user is asked to place a glass bowl in the indicated position. The user then places the glass bowl (Figure 8b), and the virtual fluid on the tap flows downwards, ‘becoming’ real fluid as it fills the glass bowl (Figure 8c). Together with the downward movement of both the virtual and the real fluid, the slider moves down to its original position (Figure 9a). The user then takes the glass bowl (Figure 9b), and the virtual fluid reappears around the tap base (Figure 9c).

4.2.2. Furo as a Content-Specific Display

The frontal curved surface of the tap body, together with the horizontal surface, forms a content-specific display (Figure 6). The shape of the tap display is vertical, positioned upright, and it comprises a top and a bottom end. The slider at the top end of the tap body is aligned with the display surface. At the bottom end, the tap body ends in a physical opening where the fluid leaves the tap. Furo features three physical shape changes, which may be performed by the user or actuated by the product: the movement of the slider at the top of the tap body (Figure 7c), the placing of the glass bowl (Figure 8b), and the fluid flowing out of the tap into the bowl (Figure 8c).

4.2.3. Coupling

Coupling of meaning is clearly present in Furo. The movement of the slider in the Furo concept is coupled to an on-screen event, namely, the change in the fluid level, on the aspects of time, location and direction (Figure 7b).
The causality theme, and thus coupling of expression, is also present in Furo. When the user pulls the slider upwards, the fluid at the tap base flows upwards as well, as if attracted by the movement of the slider. The act of pulling the slider upwards is perceived as a literal pulling up of the fluid in the display. As such, the movement of a physical element appears to naturally cause a movement on the display. At the same time, the user realizes that what they experience is not real.
However, Furo’s most distinctive interaction sequence occurs when the on-screen fluid descends, and causes the physical fluid to flow out of the tap (Figure 8c). Here, the causality is reversed. But there is something else. The form-giving of Furo brings the on-screen fluid very close to the physical fluid. Where the on-screen fluid stops, the physical fluid begins, as if the on-screen fluid materializes, and becomes physical. Apparently, the wall between digital and physical is but a thin membrane, which is permeable in some places [17] (p. 36). We call this theme, where an on-screen element becomes a physical one or the other way around, transformation [1]. In a content-specific display, an entity can have two states, a physical state and an on-screen one. When the interaction is designed in such a way that the states alternate with one another, i.e., do not occur simultaneously, the entity is required to switch its state at certain moments. In the Furo concept, the transformation from virtual to physical fluid establishes a tangible link between the selected fluid volume, and the actual volume of fluid dispensed. This link makes the functioning of the concept easy to understand and adds to its usability. Furthermore, it evokes a feeling of surprise and alienation, which contributes to the concept’s aesthetic appeal. The user is aware that the transformation is not real, yet they experience it as real. The transformation theme belongs to coupling of expression and can be seen as the superlative of causality. In the transformation theme, the coupling between physical and on-screen events is taken to an extreme. The on-screen event literally becomes a physical one, as exemplified by Furo. We further developed transformation in the next project.
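As an illustration of this alternation of states, the following minimal Arduino-style sketch models Furo’s fluid as an entity that is either on-screen or physical, never both at once: when the on-screen level reaches the bottom edge of the display, the entity switches state and a valve releases real fluid. The valve pin, bowl switch and fixed dispensing time are our own simplifying assumptions.

```cpp
// Hypothetical state machine for the transformation in Furo: the fluid
// exists either on-screen or physically, never both at once.

enum FluidState { ON_SCREEN, PHYSICAL };
FluidState state = ON_SCREEN;

const int VALVE_PIN = 7;    // opens the physical tap
const int BOWL_PIN  = 2;    // switch that detects the glass bowl

float onScreenLevel = 0.6;  // fraction of the tap display filled

void setup() {
  pinMode(VALVE_PIN, OUTPUT);
  pinMode(BOWL_PIN, INPUT_PULLUP);
  Serial.begin(9600);
}

void loop() {
  if (state == ON_SCREEN && digitalRead(BOWL_PIN) == LOW) {
    // The on-screen fluid descends and, at the bottom edge of the
    // display, switches state: it 'becomes' physical.
    while (onScreenLevel > 0.0) {
      onScreenLevel -= 0.02;
      Serial.println(onScreenLevel);  // projector animates the descent
      delay(50);
    }
    state = PHYSICAL;
    digitalWrite(VALVE_PIN, HIGH);    // real fluid now flows from the tap
    delay(2000);                      // stand-in for metering the volume
    digitalWrite(VALVE_PIN, LOW);

    while (digitalRead(BOWL_PIN) == LOW) delay(10);  // wait for the bowl
    state = ON_SCREEN;                // fluid reappears around the tap base
    onScreenLevel = 0.6;
  }
}
```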

4.3. Opus—Victor Warndorff, Stijn Vanvolsem and Rens Musters

Demonstrator video can be seen in Supplementary Material: Video S3—Opus.

4.3.1. Description of the Interaction

The Opus terminal enables the transfer of data between a personal, mobile data carrier (e.g., a memory card or USB stick) and a fixed data storage device (Figure 10). The user approaches the terminal and inserts her data carrier, a blue, brick-shaped token, into the opening on the front (Figure 11a). Subsequently, the carrier slides further into the terminal, thereby transforming into an on-screen rectangle on the terminal’s display (Figure 11b). This on-screen rectangle has the same dimensions and blue color as the physical carrier. The on-screen rectangle moves away from the user, while the data are copied (Figure 11c). Upon reaching the rear of the terminal, the rectangle causes a physical push button to move upwards and present itself to the user (Figure 12a). By pushing this button down, the user consents to the transfer of her data to the storage device (Figure 12b). The blue rectangle moves out of the terminal’s display (Figure 12c), and a white rectangular outline returns to the user (Figure 13a). The outline then transforms back to its original state as a physical data carrier, which simultaneously slides out of the terminal (Figure 13b). The physical data carrier is now white, as the user retrieves it (Figure 13c).

4.3.2. Opus as a Content-Specific Display

The interaction of Opus resembles that of Furo, in that the main surface of Opus is a content-specific display with a rectangular shape, and two interaction touchpoints, situated at the narrow ends of the rectangular shape (Figure 10). Opus features two distinct physical shape changes, the movement of the physical data carrier (Figure 11b), and the movement of the push button (Figure 12a).

4.3.3. Coupling

Similar to Furo, Opus employs transformation as the most distinctive aspect of its interaction routine. However, in contrast to Furo, the transformation occurs in both directions, from physical to on-screen and the other way around. At the start of the routine, the physical token transforms into an on-screen counterpart, the blue rectangle (Figure 11b). This rectangle passes through several stages. It forms the status indicator of an on-screen loading bar, as it moves away from the user. Subsequently, it causes a push button to move upwards (Figure 12a). After the data transfer, it slides back to the user, appearing white and empty. It then transforms back into the token, which has adopted the white color of the rectangle (Figure 13b). During the course of the interaction, the token dematerializes temporarily [44], and subsequently reverts to its material state.
The two distinct states that an entity can adopt in a content-specific display afford different interaction possibilities. During the course of interaction, the entity switches between the two states depending on which one is preferred at that particular moment. This interplay between physical and on-screen states is exemplified by Opus. When the physical token slides into the terminal and transforms into its on-screen counterpart, it becomes impossible to grasp and retrieve it. This guarantees a robust physical connection between the token and the terminal, while the data are being transferred. The on-screen rectangle is now an icon and part of the device’s GUI. It moves away from the user, and once it has reached the button icon, the physical button rises out of the display surface and toward the user. This is the point at which the user decides to release her data. Once more, the concept of transformation is employed, this time to provide tactile information during a touch action on a display. The user pushes the button icon and feels the physical button going down. When the button is pushed down, the data are transmitted to the device. When the empty rectangle icon moves back to the user, it transforms again into the physical token, which slides out of the device and is taken by the user. Just like its on-screen counterpart, the physical token has become white, indicating that its data have been transferred. The interaction with the on-screen icon has left real traces on the physical token.
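The materializing button lends itself to a similar sketch. Purely as an illustration, the following Arduino-style fragment raises a physical button out of the display once the on-screen rectangle reaches it, and lets the press send the release command; the servo, switch and serial protocol are hypothetical stand-ins for the students’ actual mechanism.

```cpp
#include <Servo.h>

// Hypothetical sketch of Opus's materializing push button: once the
// on-screen rectangle reaches the button icon, a servo raises a real
// button out of the display surface; pressing it releases the data.

const int BUTTON_PIN = 2;  // microswitch under the physical button
const int LIFT_PIN   = 9;  // servo that raises and lowers the button

Servo lift;
float iconPosition = 0.0;  // 0 = user side, 1 = at the button icon

void setup() {
  pinMode(BUTTON_PIN, INPUT_PULLUP);
  lift.attach(LIFT_PIN);
  lift.write(0);           // button flush with the display surface
  Serial.begin(9600);
}

void loop() {
  if (iconPosition < 1.0) {
    iconPosition += 0.01;          // rectangle moves away from the user
    Serial.println(iconPosition);  // projector animates the icon
    delay(30);
  } else {
    lift.write(90);                // the icon 'pushes' the button upwards
    if (digitalRead(BUTTON_PIN) == LOW) {
      Serial.println("RELEASE");   // user consents; data are transferred
      lift.write(0);               // button sinks back into the GUI
      iconPosition = 0.0;          // the empty rectangle returns to the user
    }
  }
}
```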

5. Discussion

In this discussion, we look back on the delivered work and articulate how the content-specific display deals with the two forms of criticism formulated in Section 1.2. Next, we lay out how it reflects our stance on embodied interaction.

5.1. Integrated Displays

Each of the three presented concepts features a display that is integrated into the physical environment in which it is used. This integration is twofold.
First, the shape of the display is adapted to this environment. In all three cases, the displays are not flat and rectangular. Rather, they have a 3D shape with a semantic meaning, which offers affordances to the user. This is certainly not novel. Emerging technologies allow for the fabrication of non-rectangular displays with unlimited shape possibilities [46], both 2D and 3D [47]. For example, in-vehicle displays are shaped such that they physically fit into the car’s dashboard [48].
Second, the spatial layout and movements of the display’s on-screen content also match the display’s physical environment. As such, the on-screen content constitutes an integral part of the display’s form semantics and affordances. We illustrate this with examples from the three concepts.
  • The Balans display presents two distinct endings, a small ending for recording messages, and a large ending for playing messages (Figure 2). The display is positioned in between these endings and its on-screen content is mostly conceived as a flow from the smaller ending to the larger one. The physical control element, the slider, is positioned in line with this flow, and its movement is coupled with the movement of the on-screen elements. As such, the combination of the display’s physical shape and on-screen content form a fundamental element of the product’s spatial topology, and thus of the physical environment that the product generates around itself.
  • Similarly, the tap display of Furo has two endings (Figure 6), and its on-screen content is likewise conceived as a fluid flow between both endings. First, the on-screen fluid flows upwards within the tap’s body, and later, it flows downward and transforms into real fluid. The front surface of the slider, situated at the top of the tap, is aligned with the display surface, and its movement is coupled with the movement of the on-screen fluid.
  • Opus features, just like Furo, a longitudinal display, positioned between the data storage device and the user (Figure 10). The Opus display visualizes the actual flow of data from the user toward the storage device as if it were a bridge over which data can pass. This bridge has a tangible length, and the data appear to cross this length in real time, as they flow into the storage device. Consequently, the Opus display and its on-screen content constitute an active component of the spatial concept surrounding the storage device.
This second form of integration is a manifestation of embodied interaction. The on-screen content of each display does not merely represent a flow of fluid, data or spoken messages. Rather, it embodies this flow by giving it a physical place, direction, length and speed, situated within the display’s physical environment. The on-screen content literally flows out of the display into this environment, and out of the environment into the display, each time at the borders of the display. We argue that the content-specific displays in these projects are not decontextualized windows in a digital world. Rather, they are windows on specific places within the physical environment. When the user looks through one of these windows, they see a digitally augmented version of the physical environment.

5.2. Physical Interaction

In contrast to mechanical products such as umbrellas and scissors [13], which possess only moving physical parts and no display, today’s multi-touch displays are devoid of moving physical parts or controls. In this sense, they form the opposite of mechanical products. In a multi-touch display, the user interacts with on-screen push buttons, sliders and rotary dials by means of finger gestures on a glass plate, without actually feeling these control elements. Instead of tactility, the multi-touch display offers a seemingly endless variety of control elements that can appear or disappear at will. This is particularly well suited to a multi-purpose device like the smartphone, but for the content-specific display, which is designed for a specific task, the number of different on-screen interfaces is less of a concern. Consequently, the content-specific display offers the potential for the integration of specific physical control elements that enrich the on-screen content with tangible qualities and physical form semantics. The three conceptual products presented demonstrate this integration in a specific manner. None of them are conventional display-button configurations, like the thermostat presented in Section 4.1.3. As previously discussed, the movements of their physical control elements are coupled on the aspect of expression to the movement of their on-screen content. As such, these control elements physically express and afford the function that they control, in a manner that is more distinctive than a generic push button.
Balans and Furo provide physical control elements that are positioned next to the display. In contrast, the control element of Opus is integrated into the display itself. Opus features an on-screen push button that, when activated, literally transforms into a physical button, rising out of the display and the GUI (Figure 12a). When the user pushes this button down (Figure 12b), she authorizes the release of her data. The students who designed Opus considered the authorization of data release to be a pivotal moment in the interaction flow, and an important decision that should be taken consciously. They wanted to emphasize the action that accompanies this decision and make it stand out from the broader interaction routine. To this end, they temporarily materialized the on-screen button, thereby making the pushing action more tangible and prominent for the user. In order to lend the data release action the requisite importance, the students drew upon the rich values of the physical world [45].

5.3. Between Medium and Metaphor

According to Dourish [17] (pp. 101–102), the difference between “using the real world as a metaphor for interaction, or using it as a medium for interaction” lies at the heart of the distinction between traditional HCI and embodied interaction. PCs, laptops, smartphones and tablet computers, with their displays and GUI interface, rely on metaphors of the real world, with the desktop metaphor being the most prominent one. The icons on the GUI are visual representations of real-world artifacts, such as files and folders. By means of their on-screen shape, they suggest and guide action, and help people to understand the digital system. On the contrary, embodied interaction seeks to integrate digital phenomena into the real world, so that people interact with them simply by interacting in the real world. In that case, the real world becomes a medium for people to interact with digital phenomena.
The concept of the content-specific display advocates a stance between both interaction perspectives. It is still a traditional display in the sense that it contains on-screen content that is a representation of real-world things, and as such employs the real world as a metaphor. On the other hand, the content-specific display integrates its physical form and controls with its on-screen content to such an extent that its on-screen events feel physical. Balans, Furo and Opus employ the real world as a medium. Balans tilts over and changes its position under the weight of its on-screen content (Figure 4a). Furo turns on-screen fluid into real water (Figure 8c), and Opus transforms physical elements into on-screen ones (Figure 11b) and back again (Figure 13b). The interaction that emerges lies in a sweet spot between digital and physical events. This sweet spot is consciously pursued by the designer, in an iterative design process propelled by coupling. They can employ coupling of meaning, in order to make the interaction with the content-specific display more straightforward and intuitive. In addition, the designer may choose to go further and apply coupling of expression, in order to blur the distinction between physical and on-screen events. In our research, we have uncovered two forms that coupling of expression can assume: causality and transformation.
  • Causality starts from the fictitious idea that on-screen elements have physical properties: mass, weight and inertia. Because of these physical properties, they coexist with physical elements in a natural way. In the experience of the user, both on-screen and physical elements cause and influence each other’s movements, thereby following Newtonian laws. This causality goes back and forth. A physical movement may start an on-screen movement, which, in its turn, may cause another physical movement.
  • Transformation is a radical version of causality. The on-screen and the physical element do not just cause each other’s movements, they literally transform into one another, thereby adopting each other’s state. The on-screen element becomes physical and as such acquires physical properties. On the other hand, a physical element can dematerialize and transform into an on-screen one, thereby gaining digital characteristics such as form flexibility and multiplicability.
Because of its focus on coupling, the design process of a product with a content-specific display is different from that of a traditional digital product. In the next section, we illustrate what it means to design for the content-specific display.

6. Designing for the Content-Specific Display

6.1. An Integrated Process

When a display’s form and interaction are entirely designed around one specific function, this means that both the physical display and its on-screen content can be designed simultaneously, as a whole. Moreover, they can influence each other: the physical shape of the display determines the shape and nature of its on-screen content, while, conversely, the on-screen content can shape the physical display. Coupling physical and on-screen events thus becomes an important, driving aspect in the design process. We illustrate this with one of the presented conceptual products, Balans.

6.2. The Design Process of Balans

When we look back at the design process of Balans, we distinguish three important steps or breakthroughs. In each one of them, the coupling between physical and on-screen events played a propelling role.
Already in the first brainstorms, the idea of the tilting spiral emerged, with the display integrated into the side plane (Figure 14a). The on-screen content and its movement were already present in the initial sketches (Figure 14b). A cardboard model was made to get a grip on the overall dimensions (Figure 15a). The first projections of the on-screen content were performed on this cardboard model (Figure 15b). At this stage, the product was rotationally symmetrical. Both roommates had their own ending, and at both endings, messages could be recorded and played. The tilting movement went both ways.
In a second phase, the students decided to enlarge the surface of the side plane, in order to emphasize the display and its on-screen content (Figure 16a). The spiral shape obtained more body and character, and was refined. The product was still rotationally symmetrical. Two identical sliders were designed at each product ending. The movement of the slider was coupled with the movement of the on-screen content. A physical model was built, and projections were created to simulate the on-screen content (Figure 16b).
In the third phase, each ending was assigned a specific function. One ending is for recording messages, and the other one is for playing them. As a result, recording and playing functions were given specific controls that fit the physical shape of the product. The on-screen content was redesigned, and its design drove the physical shape of the large ending. The metaphor of the post horn was adopted (Figure 17), and the overall product’s shape was further refined. The final product only tilts to one side. A final physical model was built, including an actuated mechanism for tilting. The projection was finalized.

6.3. On Design Practice

The design process of Balans goes radically against the prevailing design practice of today’s consumer electronics, in which the physical and the digital are designed separately from each other, and sequentially: first a generic hardware structure is created (a PC, a smartphone or a tablet), and only after that, software applications are designed and added. Today, software design is driven by hardware design.
With regard to digital products, Buxton distinguishes two types of design [23]. The first type, Design behind the glass, refers to the creation and streamlining of on-screen content. The second, Design in front of the glass, refers to the design of the physical product around the display. From the perspective of the content-specific display, on-screen design can precede and drive product design, so that both occur simultaneously and are mutually intertwined. This implies that product designers and designers of graphical user interfaces collaborate closely with each other, while even sharing overlapping activities. Designing on-screen content becomes a part of the product designer’s task, next to the design of the physical product and its display. Product designers must therefore acquire the skills to cope with this on-screen content. On the other hand, designers of graphical user interfaces become more involved in the design of the physical product. They must widen their focus from the display to the physical product that surrounds it. The space where both types of designers meet each other is coupling. It is through the creation of couplings that product designers will find themselves fascinated by on-screen design, and that designers of graphical user interfaces will discover the appeal of physical form and movement. Both types of designers share a passion for functional beauty. The joint creation of couplings will allow them to find common ground.

7. Conclusions

Today’s displays, as witnessed in smartphones and tablet computers, instigate an interaction that detaches the user from their physical environment. Moreover, the multi-touch technology that accompanies these displays deprives this interaction of the rich values of the physical world: 3D shape and movement, texture and tactility. Both phenomena contain pitfalls. As a reaction to them, we formulated the following research question: How can a display be defined that does not convey a sense of detachment, and that offers tangible richness and the capacity for expressive action routines?
To address this question, we initiated a Research-through-Design project. We devised the concept of the content-specific display and explored it in three design projects, performed by First Year Master students. The student projects demonstrate that the content-specific display facilitates a more engaged interaction with the physical environment, and offers a more physical, tangible experience than that provided by smartphones or tablet computers. The key to successfully designing for these values lies in coupling. With the content-specific display as a vehicle, we deepened our understanding of coupling and identified two key themes: causality and transformation. Both themes are instantiations of coupling of expression. Finally, we presented the design process of one of the projects and outlined how it was propelled by the dual nature of the content-specific display and by coupling.

7.1. Design for a Hypothetical Concept

The content-specific display is a hypothetical concept [49]. We used it as a thought exercise that we conducted through the act of designing [50]. Upon reflection on the delivered work, we identify three merits in our research. First, we sought to contribute to the embodied interaction research agenda by developing new insights into the theory of coupling. A second contribution lies embedded in the demonstrators themselves, which act as exemplars of what embodied interaction can become in a world full of displays. These exemplars are particularizations of the newly formed theoretical coupling principles. They reveal these principles with a degree of nuance that goes beyond the theory itself. First and foremost, we want the reader of this paper to understand the theory. Furthermore, by offering an imagined experience with the videos [51], we want the reader to grasp what this theory means when brought into practice. As a third contribution, we aim for this work to serve as an exemplar of design methodology, as we advocate a holistic design process that encompasses both physical product design and on-screen design.
We are aware that the three demonstrators are not representative of commercial products. It is not always feasible or preferable to design and develop a content-specific display for each function-specific product, either from a commercial or a sustainability perspective [52]. Nevertheless, the principles of coupling that we delineated in this publication can inspire novel interaction styles for the function-specific products from our introduction, but also for smartphones and tablet computers. By combining a specifically designed physical control element with a traditional display, and by coupling the on-screen content of this display with the control element, traditional push button and multi-touch interaction can be elevated and made more intuitive and appealing.

7.2. Future Research

The demonstrators presented in this publication are decidedly exploratory in nature, and were not intended for verification. In future work, we will test and evaluate the content-specific display. We will conduct a comparative user experiment, in which we will compare a commercial smartphone or tablet application with a self-designed content-specific display.
We believe that causality and transformation are not the only manifestations of coupling of expression. In order to further explore the research space that is formed by coupling in general, and by coupling of expression in particular, further research is required. Our intention is to conduct this research in the same way as we did for this paper, through the act of designing [29].

Supplementary Materials

All images and videos can be found on Google Drive. https://drive.google.com/drive/folders/1XzqZFoGnp6FUfK7MM2N7-o0kfCfZ6mAN?usp=sharing (accessed on 23 September 2024).

Author Contributions

Conceptualization, L.V.C.; methodology, L.V.C., K.V. and E.M.; validation, L.V.C., K.V. and E.M.; investigation, L.V.C. and K.V.; writing—original draft preparation, L.V.C.; writing—review and editing, L.V.C., K.V. and E.M.; visualization, L.V.C.; supervision, L.V.C. and K.V.; project administration, L.V.C.; All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

The pictures presented in this study are available in the article. The presented videos are available in the Supplementary Material.

Acknowledgments

We want to thank the students from Product Development, University of Antwerp, for the conception and realization of the three demonstrators and videos presented in this article. We also want to thank Marieke Van Camp, for her research on the content-specific display, and for defining the concepts “causality” and “transformation”.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Van Camp, M. Embodied Interaction in a World Full of Displays. Doctoral Dissertation, University of Antwerp, Antwerp, Belgium, 2022. [Google Scholar]
  2. Bravo, R.Z. Descriptive Study on the Use of Bimanual and Same-Hand Multifinger Interaction on a Multitouch Display. Master’s Thesis, Södertörn University, Huddinge, Sweden, 2013. [Google Scholar]
  3. Djajadiningrat, T.; Wensveen, S.; Frens, J.; Overbeeke, K. Tangible Products: Redressing the Balance between Appearance and Action. Pers. Ubiquitous Comput. 2004, 8, 294–309. [Google Scholar] [CrossRef]
  4. Van Campenhout, L.D.E.; Frens, J.; Hummels, C.; Standaert, A.; Peremans, H. Touching the Dematerialized. Pers. Ubiquitous Comput. 2016, 20, 147–164. [Google Scholar] [CrossRef]
  5. Redström, J. Tangled Interaction: On the Expressiveness of Tangible User Interfaces. ACM Trans. Comput.-Hum. Interact. 2008, 15, 1–17. [Google Scholar] [CrossRef]
  6. Klemmer, S.R.; Hartmann, B.; Takayama, L. How Bodies Matter: Five Themes for Interaction Design. In Proceedings of the 6th ACM Conference on Designing Interactive Systems–DIS’06, University Park, PA, USA, 26–28 June 2006; ACM Press: New York, NY, USA, 2006; p. 140. [Google Scholar] [CrossRef]
  7. Ishii, H.; Ullmer, B. Tangible Bits: Towards Seamless Interfaces between People, Bits and Atoms. In Proceedings of the ACM SIGCHI Conference on Human Factors in Computing Systems, Atlanta, GA, USA, 22–27 March 1997; ACM Press: New York, NY, USA, 1997; pp. 234–241. [Google Scholar] [CrossRef]
  8. Ong, S.K.; Wang, X.; Nee, A.Y.C. 3D Bare-Hand Interactions Enabling Ubiquitous Interactions with Smart Objects. Adv. Manuf. 2020, 8, 133–143. [Google Scholar] [CrossRef]
  9. Appel, M.; Krisch, N.; Stein, J.-P.; Weber, S. Smartphone Zombies! Pedestrians’ Distracted Walking as a Function of Their Fear of Missing out. J. Environ. Psychol. 2019, 63, 130–133. [Google Scholar] [CrossRef]
  10. Hornecker, E.; Buur, J. Getting a Grip on Tangible Interaction: A Framework on Physical Space and Social Interaction. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Montréal, QC, Canada, 22–27 April 2006; ACM Press: New York, NY, USA, 2006; Volume 10, pp. 437–446. [Google Scholar]
  11. Riener, A.; Wintersberger, P. Natural, Intuitive Finger Based Input as Substitution for Traditional Vehicle Control. In Proceedings of the 3rd International Conference on Automotive User Interfaces and Interactive Vehicular Applications, Salzburg, Austria, 30 November–2 December 2011; ACM Press: New York, NY, USA, 2011; pp. 159–166. [Google Scholar] [CrossRef]
  12. Ramnath, R.; Kinnear, N.; Chowdhury, S.; Hyatt, T. Interacting with Android Auto and Apple CarPlay When Driving: The Effect on Driver Performance; TRL: Berkshire, UK, 2020. [Google Scholar] [CrossRef]
  13. Wensveen, S.A.G.; Djajadiningrat, J.P.; Overbeeke, C.J. Interaction Frogger: A Design Framework to Couple Action and Function through Feedback and Feedforward. In Proceedings of the 2004 Conference on Designing Interactive Systems Processes, Practices, Methods, and Techniques—DIS’04, Cambridge, MA, USA,, 1–4 August 2004; ACM Press: New York, NY, USA, 2004; p. 177. [Google Scholar] [CrossRef]
  14. Bodin, T.; Berglund, K.; Forsman, M. Activity in Neck-Shoulder and Lower Arm Muscles during Computer and Smartphone Work. Int. J. Ind. Ergon. 2019, 74, 102870. [Google Scholar] [CrossRef]
  15. Bødker, S. When Second Wave HCI Meets Third Wave Challenges. In Proceedings of the 4th Nordic conference on Human-Computer Interaction: Changing Roles, Oslo Norway, 14–18 October 2006; ACM Press: New York, NY, USA, 2006; pp. 1–8. [Google Scholar] [CrossRef]
  16. Weiser, M. The Computer for the 21st Century. Sci. Am. 1991, 265, 94–104. [Google Scholar] [CrossRef]
  17. Dourish, P. Where the Action Is: The Foundations of Embodied Interaction; The MIT Press: Cambridge, MA, USA, 2001. [Google Scholar] [CrossRef]
  18. Merleau-Ponty, M. Phenomenology of Perception, 1st digital ed.; Routledge: London, UK, 1982. [Google Scholar] [CrossRef]
  19. van Dijk, J.; Moussette, C.; Kuenen, S.; Hummels, C. Radical Clashes: What Tangible Interaction Is Made of. In Proceedings of the 7th International Conference on Tangible, Embedded and Embodied Interaction—TEI’13, Barcelona, Spain, 11–14 February 2013; ACM Press: New York, NY, USA, 2013; p. 323. [Google Scholar] [CrossRef]
  20. Gibson, J.J. The Ecological Approach to Visual Perception: Classic Edition, 1st ed.; Psychology Press: London, UK, 2014. [Google Scholar] [CrossRef]
  21. Maximilaan Smeitink, H.; Frens, J. Domino: Twin Interfaces to Unveil the Influence of Personality Traits on Usability. In Proceedings of the Sixteenth International Conference on Tangible, Embedded, and Embodied Interaction, Daejeon, Republic of Korea, 13–16 February 2022; ACM Press: New York, NY, USA, 2022; pp. 1–14. [Google Scholar] [CrossRef]
  22. Øritsland, T.A.; Buur, J. Taking the Best from a Company History—Designing with Interaction Styles. In Proceedings of the Conference on Designing Interactive Systems Processes, Practices, Methods, and Techniques—DIS ’00, New York City, NY, USA, 17–19 August 2000; ACM Press: New York, NY, USA, 2000; pp. 27–38. [Google Scholar] [CrossRef]
  23. Buxton, W. Less Is More (More or Less). In The Invisible Future: The Seamless Integration of Technology in Everyday Life; McGraw Hill: New York, NY, USA, 2001; pp. 145–179. [Google Scholar]
  24. Frens, J. Designing for Rich Interaction—Integrating Form, Interaction and Function. Doctoral Dissertation, Eindhoven University of Technology, Eindhoven, The Netherlands, 2006. [Google Scholar]
  25. Van Campenhout, L.; Frens, J.; Vaes, K.; Hummels, C. The Aesthetics of Coupling: An Impossible Marriage. Int. J. Des. 2020, 14, 1–16. [Google Scholar]
  26. Van Campenhout, L.; Vancoppenolle, W.; Dewit, I. From Meaning to Expression: A Dual Approach to Coupling. Designs 2023, 7, 69. [Google Scholar] [CrossRef]
  27. Peffers, K.; Tuunanen, T.; Rothenberger, M.A.; Chatterjee, S. A Design Science Research Methodology for Information Systems Research. J. Manag. Inf. Syst. 2007, 24, 45–77. [Google Scholar] [CrossRef]
  28. Dresch, A.; Lacerda, D.P.; Antunes, J.A.V., Jr. Design Science Research: A Method for Science and Technology Advancement; Springer International Publishing: Cham, Switzerland, 2015. [Google Scholar] [CrossRef]
  29. Koskinen, I.K. (Ed.) Design Research through Practice: From the Lab, Field, and Showroom; Morgan Kaufmann: Waltham, MA, USA; Elsevier: Amsterdam, The Netherlands, 2011. [Google Scholar]
  30. White, J.; Odom, W.; Brand, N.; Zhong, C. Memory Tracer & Memory Compass: Investigating Personal Location Histories as a Design Material for Everyday Reminiscence. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, Hamburg Germany, 23–28 April 2023; ACM Press: New York, NY, USA, 2023; pp. 1–19. [Google Scholar] [CrossRef]
  31. Yoo, M.; Berger, A.; Lindley, J.; Green, D.P.; Boeva, Y.; Nicenboim, I.; Odom, W. Beyond Academic Publication: Alternative Outcomes of HCI Research. In Designing Interactive Systems Conference, Pittsburgh PA USA, 10–14 July 2023; ACM Press: New York, NY, USA, 2023; pp. 114–116. [Google Scholar] [CrossRef]
  32. Smith, N. Present/Future Objects: Creating Material Knowledge in Speculative Design. In Proceedings of the 6th Annual Symposium on HCI Education, New York, NY, USA, 5–7 June 2024; ACM Press: New York, NY, USA, 2024; pp. 1–8. [Google Scholar] [CrossRef]
  33. Nabuurs, J.; Heltzel, A.; Willems, W.; Kupper, F. Crafting the Future of the Artificial Womb—Speculative Design as a Tool for Public Engagement with Emerging Technologies. Futures 2023, 151, 103184. [Google Scholar] [CrossRef]
  34. Lee, K.-R.; Kim, B.; Kim, J.; Hong, H.; Park, Y.-W. ADIO: An Interactive Artifact Physically Representing the Intangible Digital Audiobook Listening Experience in Everyday Living Spaces. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, Yokohama Japan, 8–13 May 2021; ACM Press: New York, NY, USA, 2021; pp. 1–12. [Google Scholar] [CrossRef]
  35. Kim, S.; Jang, S.; Moon, J.; Han, M.; Park, Y.-W. Slide2Remember: An Interactive Wall Frame Enriching Reminiscence Experiences by Providing Re-Encounters of Taken Photos and Heard Music in a Similar Period. In Proceedings of the Designing Interactive Systems Conference, Virtual Event Australia, 13–17 June 2022; ACM Press: New York, NY, USA, 2022; pp. 288–300. [Google Scholar] [CrossRef]
  36. Moon, J.-Y.; Kim, N.; Goh, G.; Lee, K.-R.; Kim, H.; Park, Y.-W. Stubbi: An Interactive Device for Enhancing Remote Text and Voice Communication in Small Intimate Groups through Simple Physical Movements. In Proceedings of the 2023 ACM Designing Interactive Systems Conference, Pittsburgh, PA, USA, 10–14 July 2023; ACM Press: New York, NY, USA, 2023; pp. 1773–1788. [Google Scholar] [CrossRef]
  37. Zhong, C.; Wakkary, R.; Chen, A.Y.S.; Oogjes, D. deformTable: A Design Inquiry and Long-Term Field Study into Creative and Contingent Appropriations of a Shape-Changing Artifact. Int. J. Des. 2023, 17, 55–70. [Google Scholar] [CrossRef]
  38. Winters, A.; Barati, B.; Van Oosterhout, A.; Bruns, M. The Material Aesthetics Lab: Creating Interactive Experiences with Matter. Interactions 2023, 30, 12–15. [Google Scholar] [CrossRef]
  39. Norman, D. The Design of Everyday Things; Basic Books: New York, NY, USA, 2013. [Google Scholar]
  40. Hassenzahl, M. Experience Design: Technology for All the Right Reasons. Synth. Lect. Hum.-Centered Inform. 2010, 3, 1–95. [Google Scholar] [CrossRef]
  41. Djajadiningrat, T.; Matthews, B.; Stienstra, M. Easy Doesn’t Do It: Skill and Expression in Tangible Aesthetics. Pers. Ubiquitous Comput. 2007, 11, 657–676. [Google Scholar] [CrossRef]
  42. Diefenbach, S.; Kolb, N.; Hassenzahl, M. The “hedonic” in Human-Computer Interaction: History, Contributions, and Future Research Directions. In Proceedings of the 2014 Conference on Designing Interactive Systems, Vancouver, BC, Canada, 21–25 June 2014; ACM Press: New York, NY, USA, 2014; pp. 305–314. [Google Scholar] [CrossRef]
  43. Hallnäs, L.; Melin, L.; Redström, J. Textile Displays: Using Textiles to Investigate Computational Technology as Design Material. In Proceedings of the Second Nordic Conference on Human-Computer Interaction—NordiCHI ’02, Aarhus, Denmark, 19–23 October 2002; ACM Press: New York, NY, USA; p. 157. [Google Scholar] [CrossRef]
  44. Van Campenhout, L.D.E.; Frens, J.; Overbeeke, C.J.; Standaert, A.; Peremans, H. Physical Interaction in a Dematerialized World. Int. J. Des. 2013, 7, 1–18. [Google Scholar]
  45. Van Campenhout, L.; Frens, J.; Hummels, C.; Standaert, A.; Peremans, H. The Enriching Limitations of the Physical World. Pers. Ubiquitous Comput. 2019, 23, 81–98. [Google Scholar] [CrossRef]
  46. Serrano, M.; Roudaut, A.; Irani, P. Investigating Text Legibility on Non-Rectangular Displays. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, San Jose, CA USA, 7–12 May 2016; ACM Press: New York, NY, USA, 2016; pp. 498–508. [Google Scholar] [CrossRef]
  47. Benko, H.; Wilson, A.D.; Balakrishnan, R. Sphere: Multi-Touch Interactions on a Spherical Display. In Proceedings of the 21st Annual ACM Symposium on User Interface Software and Technology, Monterey, CA, USA, 19–22 October 2008; ACM Press: New York, NY, USA, 2008; pp. 77–86. [Google Scholar] [CrossRef]
  48. Simon, F.; Roudaut, A.; Irani, P.; Serrano, M. Finding Information on Non-Rectangular Interfaces. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, UK, 4–9 May 2019; ACM Press: New York, NY, USA, 2019; pp. 1–8. [Google Scholar] [CrossRef]
  49. Zimmerman, J.; Stolterman, E.; Forlizzi, J. An Analysis and Critique of Research through Design: Towards a Formalization of a Research Approach. In Proceedings of the Conference on Designing Interactive Systems, Aarhus, Denmark, 16–20 August 2010; Volume 10, pp. 310–319. [Google Scholar]
  50. Gasca, O.M.; van Campenhout, L. The Silent Influence of Designed Things: A Postphenomenological Approach to Embodied Interaction. 2024. Available online: https://www.designsociety.org/publication/47681/The+Silent+Influence+of+Designed+Things%3A+A+Postphenomenological+Approach+to+Embodied+Interaction (accessed on 23 September 2024).
  51. Van Schaik, P.; Hassenzahl, M.; Ling, J. User-Experience from an Inference Perspective. ACM Trans. Comput.-Hum. Interact. 2012, 19, 1–25. [Google Scholar] [CrossRef]
  52. Holmquist, L.E. Bits Are Cheap, Atoms Are Expensive: Critiquing the Turn Towards Tangibility in HCI. In Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems, Hamburg, Germany, 23–28 April 2023; ACM Press: New York, NY, USA, 2023; pp. 1–8. [Google Scholar] [CrossRef]
Figure 1. Drawing on a pen display.
Figure 2. Balans in neutral mode.
Figure 3. (a) Chiara pulls out the slider, Balans goes to record mode. (b) Chiara records a message. (c) Chiara pushes the slider back in.
Figure 4. (a) Balans in play mode. (b) Sien slides her finger toward the speaker. (c) Balans plays the recorded message.
Figure 6. Furo.
Figure 7. (a) The red fluid is displayed on the horizontal surface. (b) The user pulls the slider up. (c) The desired drinking volume is reached.
Figure 8. (a) A position is indicated on the horizontal surface. (b) The user places a glass bowl. (c) The fluid flows out of the tap into the bowl.
Figure 9. (a) As the bowl fills up, the slider moves down. (b) The user takes the bowl. (c) The red fluid reappears on the horizontal surface.
Figure 10. Opus.
Figure 11. (a) The user inserts the data carrier. (b) The data carrier slides into the terminal and transforms into an on-screen rectangle. (c) The on-screen rectangle moves toward the data storage device.
Figure 12. (a) The push button moves up. (b) The user pushes the button. (c) The blue rectangle moves out of the terminal’s display.
Figure 13. (a) The white outline moves to the user. (b) The outline transforms into the data carrier. (c) The user takes the data carrier out of the terminal.
Figure 14. (a) First sketches. (b) Different shapes offer different action possibilities.
Figure 15. (a) First cardboard model. (b) First projections.
Figure 16. (a) Sketches of the enlarged side plan. (b) A second physical model, with projections.
Figure 17. The image of the post horn appears in the sketches.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
