Article

Designing Man’s New Best Friend: Enhancing Human-Robot Dog Interaction through Dog-Like Framing and Appearance

1 School of Biomedical Engineering, Science and Health Systems, Drexel University, Philadelphia, PA 19104, USA
2 Warfighter Effectiveness Research Center, United States Air Force Academy, Air Force Academy, Colorado Springs, CO 80840, USA
3 Department of Psychology, George Mason University, Fairfax, VA 22030, USA
4 Institute of Creative Technologies, University of Southern California, Los Angeles, CA 90007, USA
5 Department of Psychology, College of Arts and Sciences, Drexel University, Philadelphia, PA 19104, USA
6 Drexel Solutions Institute, Drexel University, Philadelphia, PA 19104, USA
7 Department of Family and Community Health, University of Pennsylvania, Philadelphia, PA 19104, USA
8 Center for Injury Research and Prevention, Children’s Hospital of Philadelphia, Philadelphia, PA 19104, USA
* Authors to whom correspondence should be addressed.
Sensors 2022, 22(3), 1287; https://doi.org/10.3390/s22031287
Submission received: 24 November 2021 / Revised: 28 January 2022 / Accepted: 2 February 2022 / Published: 8 February 2022

Abstract
To understand how to improve interactions with dog-like robots, we evaluated the importance of “dog-like” framing and physical appearance on interaction, hypothesizing multiple interactive benefits of each. We assessed whether framing Aibo as a puppy (i.e., in need of development) versus simply as a robot would result in more positive responses and interactions. We also predicted that adding fur to Aibo would make it appear more dog-like, likable, and interactive. Twenty-nine participants engaged with Aibo in a 2 × 2 (framing × appearance) design by issuing commands to the robot. Aibo and participant behaviors were coded second-by-second and evaluated via an analysis of commands issued, an analysis of command blocks (i.e., chains of commands), and a T-pattern analysis of participant behavior. Participants were more likely to issue the “Come Here” command than any other type of command. When framed as a puppy, participants used Aibo’s dog name more often, praised it more, and exhibited more unique, interactive, and complex behavior with Aibo. Participants exhibited the most smiling and laughing behaviors with the Aibo framed as a puppy without fur. Across conditions, after interacting with Aibo, participants felt Aibo was more trustworthy, intelligent, warm, and connected than at their initial meeting. This study shows the benefits of introducing a social robotic agent with a particular frame and an emphasis on realism (i.e., introducing the robot dog as a puppy) for more interactive engagement.

1. Introduction

1.1. Robot Dogs Designed as Social Actors

Dog-like robots have become more commonplace both in people’s personal lives and in their workplaces. For example, the social robot Aibo can alleviate loneliness [1], serve as a companion [2,3], or be used as a tool for group therapy [4]. More recently, the Air Force adopted four-legged, dog-like robots with autonomous capabilities that can assist with patrolling and guarding remote parts of operational bases, thus freeing up personnel to focus on other tasks [5].
Dog-like robots are commonly designed and deployed as autonomous social actors because they are arguably one of the more successful social robot paradigms to date. This robot form may be so effective because it is modeled after human-dog relationships, which exhibit effective communication, cooperation, attachment, and bonding [6]. Humans and real dogs have biologically co-evolved in mutually beneficial symbiotic relationships developed over centuries [7,8]. Thus, creating a social machine in dog form may foster more accurate mental models of trust in and bonding with these robotic companions [9,10] than other, less familiar forms.
However, not all robotic dogs receive an equally warm welcome. Boston Dynamics’ Spot, a dog-like robot, was praised online for its dancing moves, but when deployed in New York for street surveillance, it caused concern among residents who were disturbed by the robots [11]. Depending on the context and use, the perception and trustworthiness of these autonomous “dogbots” can vary greatly and may need to be re-evaluated. Additionally, such systems could benefit from enhancing a robot dog’s “dog-likeness,” defined here as the degree to which an object is designed to mimic the form, characteristics, and behavior of a real dog.
The purpose of the current study was to investigate how enhancing the dog-likeness of a robot may improve human-robot interaction, mutual bonding, and trust. We used the ERS-1000, Sony’s newly released and improved version of the Artificial Intelligence roBOt, or Aibo (https://us.aibo.com/, accessed on 4 February 2022). Aibo is a social robot dog designed to be a partner, companion, and friend (Aibo means pal or partner in Japanese) and can learn and evolve with long-term use. In the mid-2000s, there was a wave of studies investigating earlier models such as the ERS-110 and ERS-210, but few studies to date have investigated the ERS-1000 [12,13]. This study expands on previous work by evaluating the new model and assessing the benefits of enhancing the dog-likeness of a robot through framing and appearance [14,15,16].

1.2. Improving the Dog-Likeness of the Aibo Robot Dog for Long-Term Bonding and Trust

Aibo was specifically designed to emulate a dog-like pet [17,18] and has been utilized as a platform for comparing human-animal interactions with human-robot interactions (HRI). Studies have shown that, compared to non-animal-like robots, animal-like robots such as Aibo lead to more interactive engagements due to their zoomorphic appearance and movements [12]. Previous research has further demonstrated that users express empathy towards Aibo and form trust relationships and long-term attachments with this robot [19,20,21,22,23,24,25,26]. For example, long-time owners of Aibo robots have shown extraordinary commitment and attachment to their robotic pets, demonstrated by persistent efforts to repair them and even by holding funeral services for their pets once repairs could no longer be made [20].
Although Aibo is designed like a dog, research investigating the behaviors of people interacting with Aibo shows that a robotic dog is not the same as a real one. Even though Aibo is treated similarly to a real dog, people report enjoying interactions with a living dog more and engaging in them for longer [15,27,28,29,30,31]. Aibo may not be loved as much as real dogs primarily because of its lack of emotions, aliveness, and personality [32]. Thus, robot dog design and behavior, to date, have not sufficiently emulated a living dog.
To address this design gap, Sony and independent researchers have tried to further improve Aibo’s dog-likeness by enhancing its appearance, behavior, and perception (see Figure 1). Improving on its original creation, Sony continuously re-designed Aibo’s form, and owners indicated that they preferred later generations of Aibo (ERS-7 and ERS-2xx) over the older generations [33]. Researchers have also found that enhancing an Aibo with puppy-scented fur stimulated adult dogs to approach the robot as they would approach a real puppy in some situations [34]. In terms of behavior, Aibo was given a complex and unique behavior system that enables each Aibo robot to be unique in its internal “emotional states” [35]. Furthermore, extensive research has shown that framing the role, background, or origin of a robot can change how humans perceive and interact with it [36,37,38,39,40,41,42]. This may also be effective with Aibo as, for example, in Western society, dogs are already framed as “man’s best friend” [43]. Researchers have indeed shown that trust in Aibo can be increased by priming initial expectations through framing Aibo’s role and agency when introducing the robot to participants [44]. People with higher initial expectations of a zoomorphic robot’s life-likeness may also be more likely to initiate companion-like relationships with the robot that are similar to human-human relationships [45].
Not all efforts to improve a robot’s dog-likeness may result in a positive outcome. A well-studied phenomenon in human-robot interaction is the uncanny valley hypothesis, which predicts that as artificial agents become increasingly, but not perfectly, human-like, people’s responses towards these agents become increasingly positive (e.g., likable, trustworthy) up to a point, after which responses fall into a markedly negative “valley” (e.g., eerie, creepy) [46,47,48,49]. Recent studies have shown that there may also be an uncanny valley for robots that are animal-like (zoomorphic), such that robots high and low in animal-likeness (e.g., PARO, MiRO) were preferred over those mixing realistic and unrealistic features [50]. Recent news reports further indicate a potential public concern over creepy robot dogs [9]. The existence of an uncanny valley, or “uncanine valley” [13], for robotic dogs is, therefore, a possibility (see Figure 1), and attempts should be made to avoid it when enhancing the dog-likeness of a robot.
While most work defining human-likeness [51,52] or animal-likeness [50] has focused on the physical appearance and form of a robot, the overall assessment of a robot may also include additional elements such as unique characteristics, personality traits, and behaviors. Recent work has begun to chart the relative contributions of these dimensions to outcome measures relevant to human-robot interaction [53,54]. Depending on the dog-like feature modeled on the x-axis (a combination of form, characteristics, and behavior), the outcome measure (y-axis) may show a different pattern. In this paper, we focus on both the physical form of dog-likeness (Aibo’s shape and fur) and the behavioral component of dog-likeness by exploring how participants command Aibo and how it responds.

1.3. The Current Study

With Sony’s newly designed Aibo, the ERS 1000, we sought to re-evaluate its dog-like design and how this affects bonding, trust, and the potential for long-term interaction. Additionally, since previous studies have shown that enhancing the perception of Aibo can facilitate more engagement with the robot, we hypothesized the following:
Hypothesis 1 (H1):
Participants will respond more positively to and interact more with an Aibo that is framed as a puppy (i.e., in need of development) rather than framed as simply a robot.
Research has also shown that a robot whose appearance more closely approximates the agent it mimics (e.g., a dog) may receive more positive assessments, so we proposed a second hypothesis:
Hypothesis 2 (H2):
Adding fur to an Aibo to increase its dog-like form will result in more positive assessments and interactions than an Aibo without fur.
We designed a 10-min study in which students from a military academy interacted briefly with an Aibo by issuing commands and engaging with the robot. To analyze human-robot interactions in detail, we used T-pattern analysis, a detection technique originally developed by Magnusson to examine hidden real-time patterns in individual behavior [55]. This assessment method was previously used to compare individual behaviors of humans while interacting with an Aibo robot dog or with a real dog [15].

2. Materials and Methods

2.1. Participants

Twenty-nine participants (age 18–26 years, 20.53 ± 2.32 years; 12 female) completed the study. All participants were recruited from the U.S. Air Force Academy and volunteered to participate in exchange for course credit. One participant’s data were excluded from the analysis due to poor video recording quality.

2.2. Apparatus and Materials

Two ERS-1000 Sony Aibo dogs (software version 2.5.1) were used for participant interaction in this study. The ERS-1000 has several cameras, sensors, and microphones that allow it to sense its environment (see Figure 2). Aibo also has two expressive OLED eyes used to convey that the robot is tired, angry, or excited, and its ears, head, and tail can move to add further expressiveness and dog-like behavior. For the purposes of our experiment, we made one removable adaptation to Aibo by designing a fitted fur suit [13]. The suit was made from white textile faux fur that felt realistic when touched and shed like a real dog. The fur was designed as a bodysuit covering the dog’s back, shoulders, and hips, and was attached to the dog using Velcro. Along with the Aibos, we used a Lenovo ThinkPad laptop to video record participants’ interactions with the robots. Participants interacted with the dog in an open space with no other people present. All questionnaires were administered to participants using Google Forms before and after the interaction (see Appendix B).

2.3. Task Paradigm

To ease the interaction between the participant and Aibo, participants were given a list of commands that they could try with the Aibo. If Aibo understood a given command, it responded by engaging in the behavior associated with that command. The commands and associated behaviors are shown in Table 1.

2.4. Video Coding and Self-Report Measures

For this study, each participant’s interaction with the Aibo was recorded with a video camera from a perspective that allowed easy examination of both Aibo’s and the participant’s actions. As shown in Table 2, the behaviors occurring throughout the interaction were categorized in terms of Aibo’s and individuals’ actions. After categorizing, two experimenters manually coded each 10-min video second-by-second to identify each unit of behavior and its start/end times for analysis. In the event of a coding mismatch, a third experimenter reviewed the footage and made a final decision about the observed behavior in each case.
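The paper does not specify how the coded records were stored, so the sketch below is only a plausible illustration of a second-by-second behavior record and the two-coder reconciliation rule described above; the names (BehaviorEvent, reconcile, third_coder_decides) are hypothetical.

```python
from dataclasses import dataclass

@dataclass(frozen=True)  # frozen makes events hashable, so set operations work
class BehaviorEvent:
    actor: str     # "person" or "aibo"
    behavior: str  # a behavior unit from Table 2, e.g., "look" or "bark"
    start_s: int   # start time in seconds from the session start
    end_s: int     # end time in seconds

def reconcile(coder_a, coder_b, third_coder_decides):
    """Keep events both coders agree on; route mismatches to a third coder."""
    agreed = set(coder_a) & set(coder_b)        # identical codes from both coders
    disputed = set(coder_a) ^ set(coder_b)      # codes only one coder produced
    resolved = {e for e in disputed if third_coder_decides(e)}
    return sorted(agreed | resolved, key=lambda e: (e.start_s, e.end_s))
```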
We administered questionnaires after a brief introduction of the robot dog and again after the interaction to assess whether participants changed their opinions of the Aibo in a pre-post design. Participants filled out a customized scale developed in previous research [48], rating their perceptions of Aibo across 12 characteristics commonly used in studies of the uncanny valley effect: creepiness, likability, scariness, trustworthiness, uncanniness, dog-likeness, consciousness, lifelikeness, intelligence, friendliness, connection, and warmth (see Appendix B for all 12 items). To establish clarity across participant ratings, we provided participants with definitions of each characteristic, derived from the Oxford English and Merriam-Webster dictionaries and Dictionary.com. Participants rated the Aibo on each characteristic both before and after the interaction using Likert-type scales ranging from 0 (not at all) to 10 (extremely). The two remaining items, whether participants felt a connection with the dog and whether they thought the dog was warm and caring, were rated from 1 (strongly disagree) to 5 (strongly agree).

2.5. Experimental Design and Procedure

For the experiment, we used a 2 × 2 between-subjects design with two independent factors: framing of the Aibo (puppy vs. robot) and appearance (fur vs. no fur). Each of the 29 participants was randomly assigned to one of the four experimental conditions.
For framing, we introduced Aibo to participants either as a puppy or as a robot. Two different scripts were developed for this purpose. In the puppy condition, Aibo was introduced as a puppy in training. We also used its name (Kipling or Bernard) and gendered pronouns like “he” and “him,” consistent with how previous research has introduced robots using backstories and framing [41,56,57,58]. In the robot condition, Aibo was described as a robot, called “Aibo,” and referred to with the non-gendered “it.” The appearance manipulation consisted of Aibo either wearing a fur suit or not. In the fur condition, Aibo was outfitted with the fur suit (see Figure 2). In the no-fur condition, Aibo was presented to participants in its factory condition, without the fur suit. To differentiate these conditions, the name Kipling was used in the puppy/no-fur condition, while Bernard was used in the puppy/fur condition.
Participants began the experiment by completing an informed consent form. They were then randomly assigned to one of the four experimental conditions and guided to the experimental room, where they were introduced to an Aibo either with or without the fur suit. We then read the appropriate script, framing Aibo either as a puppy named Kipling or Bernard, or as a robot named Aibo. After this brief introduction to the dogbot, participants filled out the pre-interaction questionnaire, rating Aibo on the 12 characteristics described above. Participants were then instructed to interact with the Aibo robot for 10 min by attempting each of the eight commands provided in Table 1. Throughout the experiment, a laptop video recorded the participants interacting with Aibo, and the experimenter left the room to allow participants to freely interact and attempt commands with Aibo. After 10 min, participants filled out the post-interaction questionnaire, rating Aibo once again on the same characteristics. The entire experiment took approximately 20 min to complete.

2.6. Measures and Analysis of Commands Issued

Statistical analysis of the commands issued (including the total number of commands and command type) used linear mixed-effects models with repeated measures across the entire sample, allowing for population inference, implemented in NCSS (NCSS, LLC, Kaysville, UT, USA; www.ncss.com, accessed on 4 February 2022), a comprehensive statistics software package. The subject factor was treated as a random effect drawn from a larger population, while the framing and appearance conditions were treated as fixed effects. This analysis was employed to detect any main or interaction effect of framing or appearance on command-issuing behavior. The analyzed data are displayed in Table 3.
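The authors fit these models in NCSS; for readers working in open-source tooling, a roughly equivalent random-intercept model can be specified in Python with statsmodels. The file name and column names below are assumptions for illustration, not the study’s actual data schema.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per participant x command type,
# with the count of times that command was issued.
# Columns assumed: subject, framing, appearance, command, count
df = pd.read_csv("commands_issued.csv")

# A random intercept per subject captures the repeated-measures structure;
# framing and appearance enter as crossed (2 x 2) fixed effects.
model = smf.mixedlm("count ~ framing * appearance", df, groups=df["subject"])
result = model.fit()
print(result.summary())  # fixed-effect estimates for main and interaction terms
```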

2.7. Measures and Analysis of Interaction Blocks

Interaction blocks within this experiment were defined as beginning with the initiation of a command (e.g., “Come Here”) and ending with either the successful completion of that command by the Aibo or the utterance of a different command (e.g., “Come Here” followed by “Sit Down” without “Come Here” being completed first); a sketch of this segmentation follows below. Aibo and person behaviors were accounted for within interaction blocks, and linear mixed models with repeated measures were fit with the fixed factors of framing (puppy vs. robot) and appearance (fur vs. no fur) and subject as a random factor via the NCSS software. This modeling allowed for a higher-resolution behavioral assessment than an overall interaction approach. In addition, the duration and occurrence of the Aibo and person behaviors (see Table 2) were evaluated as dependent variables using the linear mixed-effects approach.
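As a rough illustration of the block definition above, the sketch below segments a hypothetical time-ordered event stream into interaction blocks. The event representation (dicts with "type", "command", and "time" keys) is invented for the example and is not the authors’ coding scheme.

```python
def segment_blocks(events):
    """Split a time-ordered event stream into interaction blocks.

    A block opens when a command is issued and closes when Aibo completes
    that command or the participant issues a different command.
    """
    blocks, current = [], None
    for ev in events:
        if ev["type"] == "command":
            if current and ev["command"] != current["command"]:
                blocks.append(current)      # block ended by a different command
                current = None
            if current is None:
                current = {"command": ev["command"], "start": ev["time"], "events": []}
        elif (ev["type"] == "aibo_complete" and current
                and ev["command"] == current["command"]):
            current["events"].append(ev)
            current["end"] = ev["time"]
            blocks.append(current)          # block ended by successful completion
            current = None
            continue
        if current is not None:
            current["events"].append(ev)    # any other behavior joins the open block
    if current is not None:
        blocks.append(current)              # block still open at session end
    return blocks
```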

2.8. T-Pattern Measures and Analysis

T-pattern analysis is a detection algorithm commonly used to discover hidden or non-obvious time patterns in behavior [55,59,60,61]. The basic assumption of this methodological approach is that the temporal structure of a complex behavioral system is mostly unknown; however, the system may contain a set of repeated temporal patterns (T-patterns) composed of much simpler and more distinguishable event types. These event types are coded in terms of their beginning and end points (e.g., “person, b, look” represents the behavior “person begins looking at Aibo”, while “Aibo, e, bark” indicates “Aibo ends barking”; see Figure 3a) [15]. The set of time-point series that results from such coding of behavior within a specific observation window acts as the input to the T-pattern definition and detection algorithms (e.g., the behaviors “person begins looking”, “Aibo begins barking”, “Aibo ends barking”, and “person ends looking”, in that order, create a T-pattern, represented by the black lines between the red dots in Figure 3b).
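To make the coding scheme concrete, the snippet below converts a few illustrative coded actions into the begin/end point series that a T-pattern detector takes as input. The values and the helper structure are hypothetical; only the “(actor, b|e, behavior)” form follows the description above.

```python
# Each coded action contributes two time-stamped event types: its beginning (b)
# and its end (e). Tuples: (actor, behavior, start_s, end_s) -- invented values.
raw_actions = [
    ("person", "look", 12, 19),
    ("aibo",   "bark", 14, 16),
]

point_series = []
for actor, behavior, start, end in raw_actions:
    point_series.append((start, f"{actor},b,{behavior}"))  # e.g., "person,b,look"
    point_series.append((end,   f"{actor},e,{behavior}"))  # e.g., "aibo,e,bark"
point_series.sort()  # time-ordered input to the T-pattern detection algorithm
```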
For T-pattern analysis in our study, the interactions were transcribed and analyzed using ThemeEdu 6.0 software (www.patternvision.com, accessed on 4 February 2022) [62]. During the coding procedure, we recorded the start and end points of each action of both participant and Aibo. In our search for temporal patterns (T-patterns), we required a minimum of three occurrences in the 10-min period for each pattern as a search criterion to filter out non-repetitive patterns. The tests for the critical interval were set at p = 0.005. Using the software, we extracted the total number of interactive behavior patterns, defined as patterns that contain both the participant and Aibo as “actors”. We extracted several measures, including the number of unique interactive patterns, the number of interactive pattern occurrences, and the average complexity level and length of the T-patterns in the 10-min period for each participant. For the number of unique interactive T-patterns, each different pattern inside the interaction was counted once, while for the number of interactive pattern occurrences, we counted the total number of T-patterns inside the 10-min interaction. T-pattern complexity refers to the number of behaviors inside a specific pattern minus one (e.g., a T-pattern that contains four behavior units is a third-level T-pattern); thus, more stacked levels of behavioral units result in a higher T-pattern complexity value. We also extracted the total number of times that any of the behavior units for Aibo or the participant listed in Table 2 occurred inside an interactive T-pattern. To analyze the extracted T-pattern features, we used linear mixed models with framing (puppy vs. robot) and appearance (fur vs. no fur) as fixed factors and subject as a random factor via the NCSS software.
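The pattern detection itself was performed by ThemeEdu. The sketch below only illustrates how the reported summary measures (unique interactive patterns, total occurrences, and average complexity) follow from a set of detected patterns under the three-occurrence criterion; the example patterns are invented.

```python
# Hypothetical detected patterns: each entry pairs an ordered tuple of event
# types with its occurrence count within the 10-min session.
patterns = [
    (("person,b,look", "aibo,b,bark", "aibo,e,bark", "person,e,look"), 7),
    (("person,b,command", "aibo,b,sit"), 2),  # below the 3-occurrence criterion
]

def is_interactive(pattern):
    """A pattern is interactive if both actors appear in it."""
    actors = {event.split(",")[0] for event in pattern}
    return {"person", "aibo"} <= actors

kept = [(p, n) for p, n in patterns if is_interactive(p) and n >= 3]
n_unique = len(kept)                           # number of unique interactive patterns
n_occurrences = sum(n for _, n in kept)        # total interactive pattern occurrences
avg_complexity = (sum(len(p) - 1 for p, _ in kept) / n_unique
                  if n_unique else 0.0)        # complexity = behavior units minus one
```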

3. Results

3.1. Aibo Performance and Participant Command Behavior

When a command was issued, the Aibo had an acknowledgment accuracy of 81% (SE = 1.3%), responding with an acknowledgment bark in 5.94 s (SE = 0.27 s). The Aibo responded correctly 55% (SE = 1.7%) of the time within about 14.61 s (SE = 0.60 s). The Aibo completed the commands incorrectly (with a wrong response) 39% (SE = 1.6%) of the time, generally within 8.45 s (SE = 0.98 s). The Aibo did not respond to a command 9% (SE = 1%) of the time.
We conducted the first analysis using a linear mixed-effects regression model with command issued (“Come Here” vs. “Very Lovely Aibo” vs. “Sit Down”, etc.) as a fixed factor, participant as a random factor, and Bonferroni corrections for post hoc analyses. Type of command issued was a significant factor (F(7,216) = 20.522, p < 0.001***). Post hoc analysis revealed that the “Come Here” command was issued at a significantly higher rate (between 5.75 and 7.39 additional issues) than any other command (p < 0.001*** for each post hoc comparison). No other command was issued at a significantly higher or lower rate than the others.
Neither framing nor appearance led to any significant main or interaction effects regarding total numbers and types of commands issued (see Table 3). These nonsignificant findings indicate a similarity in the frequency and type of commands issued by participants across manipulations.

3.2. Interaction Block Results

For the statistical analysis of each behavior of the individual and Aibo in the interaction blocks, we used linear mixed models with framing (puppy vs. robot) and appearance (fur vs. no fur) as fixed factors and subject as a random factor. As seen in Figure 4a,b, within the person behavior units, interaction blocks revealed a main effect of framing: when the Aibo was introduced as a puppy, participants called the Aibo by its dog name more often (F(1,26) = 5.842, p = 0.023*, d = 0.670). Furthermore, in the puppy condition, Aibos were praised significantly more often (F(1,25.8) = 6.807, p = 0.015*, d = 0.727). Aibo’s appearance was not a significant factor for either the number of praises (F(1,23.9) = 0.01, p = 0.92) or the number of dog name uses (F(1,24.0) = 0.53, p = 0.47).

3.3. T-Pattern Analysis

We used linear mixed models with framing (puppy vs. robot) and appearance (fur vs. no fur) as fixed factors to analyze the T-pattern measures. Before the statistical analysis, we excluded participants who had fewer than 100 or more than 10,000 interactive patterns according to the T-pattern analysis (four participants) to avoid outliers. We compared the number of T-patterns and their complexity levels between the framing and appearance groups, in addition to how many times each behavior listed in Table 2 appeared in the interactive patterns.
Participants interacted more with the Aibo when it was framed as a puppy, both in terms of the diversity (F(1,20.0) = 4.88, p = 0.04*, puppy mean = 935.4 vs. robot mean = 262.8, d = 0.90) and the quantity (F(1,20.0) = 4.71, p = 0.04*, puppy mean = 3331.1 vs. robot mean = 1008.3, d = 0.87) of the interactive patterns, compared to the Aibo introduced as a robot (see Figure 5). Furthermore, the interactive T-patterns with puppy-framed Aibos were, on average, more complex (F(1,20.0) = 7.25, p = 0.01*, d = 0.90) than those with robot-framed Aibos. We did not find any significant effect of appearance on any of the T-pattern measures, including the total number of interactive T-patterns (F(1,20.0) = 0.59, p = 0.49), the number of unique T-patterns (F(1,20.0) = 0.50, p = 0.49), and the average T-pattern complexity (F(1,20.0) = 0.13, p = 0.72).
The framing × appearance interaction was significant for participants’ smiling (F(1,20.0) = 5.75, p = 0.03*) and laughing (F(1,20.0) = 7.45, p = 0.01*) in interactive T-patterns. Participants smiled significantly more in interactive T-patterns with the Aibo framed as a puppy with no fur, compared to a puppy with fur (F(1,20.0) = 7.87, p = 0.01*, d = 1.48) and a robot with no fur (F(1,20.0) = 5.42, p = 0.03*, d = 0.96) (see Figure 6a). Participants likewise laughed significantly more in interactive T-patterns when the Aibo had no fur and was framed as a puppy, compared to the puppy Aibo with fur (F(1,20.0) = 6.26, p = 0.02*, d = 1.16) and the robot Aibo with no fur (F(1,20.0) = 4.77, p = 0.04*, d = 0.72) (see Figure 6b).

3.4. Questionnaire Results

To analyze the 12 questionnaire items, we used linear mixed models with framing (puppy vs. robot) and appearance (fur vs. no fur) as between-subjects factors and experience (pre- vs. post-interaction) as a within-subject factor.
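As with the earlier analyses, these models were fit in NCSS. A comparable specification in Python’s statsmodels, with the pre/post factor captured in long format and a per-subject random intercept linking each participant’s two ratings, might look as follows; the file and column names are assumptions.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long format: two rows per participant per item (phase = pre/post).
# Columns assumed: subject, framing, appearance, phase, trustworthy, ...
ratings = pd.read_csv("ratings.csv")

# One model per questionnaire item; shown here for trustworthiness.
m = smf.mixedlm("trustworthy ~ framing * appearance * phase",
                ratings, groups=ratings["subject"]).fit()
print(m.summary())  # the phase term tests the pre- vs. post-interaction change
```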
We did not find any significant effects of framing, appearance, or their interaction. Additionally, we did not find any significant pre- to post-interaction differences for the three “negative” characteristics: uncanniness, scariness, and creepiness (see Figure 7). However, after interacting with Aibo, self-reported scores for the robot’s “positive” perceived characteristics, including trustworthiness (F(1,24) = 5.02, p = 0.035*, d = 0.41; pre-mean = 4.57 vs. post-mean = 5.57), consciousness (F(1,24) = 13.45, p = 0.001**, d = 0.53; pre-mean = 3.70 vs. post-mean = 5.21), intelligence (F(1,24) = 6.10, p = 0.02*, d = 0.52; pre-mean = 4.61 vs. post-mean = 5.93), connection (F(1,24) = 9.08, p = 0.006**, d = 0.65; pre-mean = 2.45 vs. post-mean = 3.18), and warmth (F(1,24) = 6.28, p = 0.019*, d = 0.59; pre-mean = 2.89 vs. post-mean = 3.50), significantly increased compared to the ratings before the interaction (see Figure 8).

4. Discussion

The purpose of the current study was to investigate how enhancing the dog-likeness of a dog-like robot may improve human-robot interaction, mutual bonding, and trust. After interacting with Aibo, participants felt Aibo was more trustworthy, intelligent, warm, and connected than at their initial meeting. While instructing Aibo, participants used the “Come Here” command most often. When framed as a puppy, participants used Aibo’s dog name more often, verbalized more praise, and exhibited more unique, interactive, and complex behavior with Aibo. Framing and appearance also interacted. Participants exhibited the most smiling and laughing behaviors with Aibo framed as a puppy with no fur compared to a puppy with fur or a robot with no fur. The strong framing is consistent with previous human-robot interaction studies, which have shown that personification of a robot improves people’s engagement and positive perceptions of that robot [36,37,38,39,40,41,56].
Interestingly, adding fur inhibited social reactions like smiling and laughing when Aibo was framed as a puppy, contrary to Hypothesis 2, which posited that adding more dog-like fur would enhance perceptions of dog-likeness and improve the overall human-robot-dog interaction. One explanation may be that the fur suit did not cover the entire Aibo, but only part of its body. This may have produced a somewhat “uncanine” effect [50], although this was not reflected in the subjective reports. Furthermore, the fur did not reduce social reactions when the Aibo was framed as a robot, indicating tolerance amongst users for realistic additions to robots. It is possible that with the Aibo framed as a robot, expectations for matching the exact fidelity of a dog were lower than in the puppy frame. Because possible mechanisms of uncanniness are rooted in perceptual mismatches [49,63], the mismatch between the robot frame and the fur may have been much lower or non-existent compared to the other conditions.
Given the design iterations of Aibo, it is possible that co-evolution between humans and robot dogs is already underway [9]. The new version of Aibo elicits a sufficiently interactive response from human partners due to its dog-like form and basic behaviors. Yet, the Aibo does not always respond accurately or speedily and may not adapt as fast as a real dog, providing a potential limitation for longer-term bonding and trust development beyond the initial interaction [64,65,66]. For example, previous work has indicated that robots with very basic abilities are eventually abandoned after a few months [67]. On the other hand, early task successes with robots can reinforce people to form positive attitudes towards their interactions [12]. Newer and updated versions of the Aibo may thus need to focus on improving these performance characteristics.
There were several limitations of this study regarding the robot dog, the participants, and the measurements, which can be addressed in future explorations. First, we only used the most recent version of Sony’s Aibo dog. Other appearance configurations could be examined by comparing the Aibo to previous versions, the Spot, toy dog robots, other companion dog robots, or real dogs. The fur could also be made more realistic, cover the entire robot body, and take on additional colors and textures. The use of different names (Kipling or Bernard) across the fur conditions in the puppy framing could have contributed to some differences between those conditions, although recent research suggests that name use has a negligible effect on outcome measures in human-robot interaction [59]. Additionally, all interactions with Aibo were brief; future research could explore long-term interactions with Aibo across several days or weeks. Other populations could also be explored, including different age groups, robot dog owners, civilians, and military personnel. If robotic dogs are deployed more widely in a community, by the police, for example, additional design considerations may be needed to accommodate the composition of an entire community, including children, adults, and even pets like dogs [68], as each member may respond to the robot in a unique way. Furthermore, two-way interactions are critical for bonding between humans and dogs, such as when police or military personnel and canines work together to find explosives or when a blind handler works with their guide dog to navigate [8]. Studying more interactive, physical, and ecologically valid behaviors could be beneficial, including playing fetch, tug-of-war, games, or common work tasks. Measurements could have been improved by recording data directly from Aibo and by automatically classifying emotions using facial recognition software. Finally, while the study could have benefited from a larger sample, our approach mitigated potential concerns about analyzing a small sample: we report moderate to large effect sizes for all effects, analyzed multiple repeated measures per participant, and used linear mixed-effects modeling, an analytical method well suited to smaller sample sizes.
These limitations notwithstanding, our study showed that the Aibo platform and its dog-like form continue to be a successful interaction paradigm for human-robot interaction as evidenced by the increased connection, warmth, and trustworthiness felt before and after only a brief interaction with the Aibo ERS-1000.

5. Conclusions

Our study demonstrated that framing a dog-like robot as a learning puppy can enrich behavioral interactive patterns and perceptions between humans and robots. Future research could focus on developing unique robot dog framing depending on the collaborative working environments and situation with humans while executing collaborative activities such as guarding, herding, navigating, detecting explosives, and conducting urban search and rescue operations. Creating frames based on the demands of the context may further foster the co-evolution between humans and robot dogs and sustain long-term human-robot engagement and bonding.

Author Contributions

The data on which this study is based were conceptualized by E.J.d.V., E.P., and C.C.T. E.J.d.V. was involved in supervision. Data were coded and analyzed by E.J.d.V., H.A., S.J., and Y.T. The manuscript was drafted by E.J.d.V., S.J., Y.T., H.A., J.G., C.C.T., E.P., and F.K. All authors have read and agreed to the published version of the manuscript.

Funding

This material is based upon work supported by the Air Force Office of Scientific Research (to C.C.T. and E.J.d.V) under award numbers 16RT0881 and 21USCOR004. S.J. was supported by the Eunice Kennedy Shriver National Institute of Child Health & Human Development of the National Institutes of Health under Award Number F30HD103527. The views expressed in this paper are those of the authors and do not reflect those of the US Air Force, Department of Defense, National Institutes of Health, or US Government.

Institutional Review Board Statement

The study was conducted according to the guidelines of the Declaration of Helsinki and approved by the Institutional Review Board (or Ethics Committee) of the United States Air Force Academy.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Acknowledgments

We would like to thank USAFA cadets Heidi Schellin, Tatiana Oberley, and Kaitlyn Patterson for collecting the data and Drexel undergraduate students Hannah Brown and Micah Beiser for assistance with coding the data.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A. The Scripts for Puppy and Robot Conditions

Puppy Script: Today, you will be playing with (Bernard/Kipling). He is a robotic puppy who is still in training. He is only a puppy, so please be patient with him because he is still learning obedience. He might take a second to warm up because he is a little shy. The playdate today will be recorded and reviewed for future reference and data collection. Past researchers have found that playing with dogs like (Bernard/Kipling) reduces stress. You will play with (Bernard/Kipling) for 10 min to try out different commands. Please try the following commands:
“Very lovely Aibo”; “Sit down”; “Take a picture”; “Sing a song”; and “Come here”.
But after that, you can play with him however you please, in a respectful manner, of course. After you’re done playing with (Bernard/Kipling), you will be given a post-test. Please fill this test out thoroughly. Thank you and have fun! If you have any questions, please let us know.
Robot Script: Today, you will be interacting with Aibo. It is a robot whose actions and manners are similar to those of a dog. With the software programmed into Aibo, it might take some time to recognize your voice. The interaction today will be recorded and reviewed for future reference and data collection. Past researchers have found that playing with robots like Aibo can reduce stress. You will interact with Aibo for 10 min to try out different commands. Please try the following commands:
“Very lovely Aibo”; “Sit down”; “Take a picture”; “Sing a song”; and “Come here”.
But after that, you may interact with it however you please, in a respectful manner, of course. After you are done with Aibo, you will be given a post test. Please fill this test out thoroughly. Thank you, and if you have any questions, please let us know.

Appendix B

  • Creepiness is “the characteristic of causing an annoyingly unpleasant sensation”. Please rate this robot on a scale of 0 (not at all creepy) to 10 (extremely creepy).
  • Likability is “having qualities that cause others to feel warm familiarity, comfort, and favorable regard”. Please rate this robot on a scale of 0 (not at all likable) to 10 (extremely likable).
  • Scary is “the characteristic of causing fright or alarm”. Please rate this robot on a scale of 0 (not at all scary) to 10 (extremely scary).
  • Trustworthy is “the characteristic of inducing a sense of being reliable, capable, ethical and sincere”. Please rate this robot on a scale of 0 (not at all trustworthy) to 10 (extremely trustworthy).
  • Uncanny is “the characteristic of seeming mysterious, weird, uncomfortably strange or unfamiliar”. Please rate this robot on a scale of 0 (not at all uncanny) to 10 (extremely uncanny).
  • Dog-likeness is “the characteristic of having the appearance and traits of a dog”. Please rate this robot on a scale of 0 (not at all dog-like) to 10 (extremely dog-like).
  • Conscious is “the characteristic of being aware of one’s own existence, sensations, thoughts, and surroundings”. Please rate this robot on a scale of 0 (not at all conscious) to 10 (extremely conscious).
  • Lifelike is “the characteristic of being able to adapt to one’s environment”. Please rate this robot on a scale of 0 (not at all lifelike) to 10 (extremely lifelike).
  • Intelligent is “the characteristic of having good understanding or a high mental capacity; quick to comprehend”. Please rate this robot on a scale of 0 (not at all intelligent) to 10 (extremely intelligent).
  • Friendly is “the characteristic of being like a friend; kind, helpful”. Please rate this robot on a scale of 0 (not at all friendly) to 10 (extremely friendly).
  • I felt I had a connection with the robot. (Use a scale from 1 (Strongly Disagree) to 5 (Strongly Agree).)
  • The robot was warm and caring. (Use a scale from 1 (Strongly Disagree) to 5 (Strongly Agree).)

References

  1. Banks, M.R.; Willoughby, L.M.; Banks, W.A. Animal-assisted therapy and loneliness in nursing homes: Use of robotic versus living dogs. J. Am. Med. Dir. Assoc. 2008, 9, 173–177. [Google Scholar] [CrossRef] [PubMed]
  2. Coghlan, S.; Waycott, J.; Neves, B.B.; Vetere, F. Using robot pets instead of companion animals for older people: A case of ‘reinventing the wheel’? In Proceedings of the 30th Australian Conference on Computer-Human Interaction, Melbourne, VIC, Australia, 4–7 December 2018; pp. 172–183. [Google Scholar]
  3. Hudson, J.; Ungar, R.; Albright, L.; Tkatch, R.; Schaeffer, J.; Wicker, E.R. Robotic Pet Use Among Community-Dwelling Older Adults. J. Gerontol. Ser. B 2020, 75, 2018–2028. [Google Scholar] [CrossRef] [PubMed]
  4. Tanaka, K.; Makino, H.; Nakamura, K.; Nakamura, A.; Hayakawa, M.; Uchida, H.; Kasahara, M.; Kato, H.; Igarashi, T. The Pilot Study of Group Robot Intervention on Pediatric Inpatients and Their Caregivers, Using ‘New Aibo’. Eur. J. Pediatr. 2021. [Google Scholar] [CrossRef] [PubMed]
  5. USAF. Tyndall Air Force Base Receives Semi-Autonomous Robot Dogs. Available online: https://www.tyndall.af.mil/News/Article-Display/Article/2550793/tyndall-brings-in-the-big-dogs/ (accessed on 25 March 2021).
  6. Krueger, F.; Mitchell, K.C.; Deshpande, G.; Katz, J.S. Human-dog relationships as a working framework for exploring human-robot attachment: A multidisciplinary review. Anim. Cogn. 2021, 24, 371–385. [Google Scholar] [CrossRef]
  7. Nagasawa, M.; Mitsui, S.; En, S.; Ohtani, N.; Ohta, M.; Sakuma, Y.; Onaka, T.; Mogi, K.; Kikusui, T. Social evolution. Oxytocin-gaze positive loop and the coevolution of human-dog bonds. Science 2015, 348, 333–336. [Google Scholar] [CrossRef] [PubMed]
  8. Helton, W.S. Canine Ergonomics: The Science of Working Dogs; CRC Press: Boca Raton, FL, USA, 2009; pp. 1–349. [Google Scholar]
  9. Phillips, E.; Schaefer, K.E.; Billings, D.R.; Jentsch, F.; Hancock, P.A. Human-animal teams as an analog for future human-robot teams: Influencing design and fostering trust. J. Hum.-Robot Interact. 2016, 5, 100–125. [Google Scholar] [CrossRef]
  10. De Visser, E.J.; Monfort, S.S.; Goodyear, K.; Lu, L.; O’Hara, M.; Lee, M.R.; Parasuraman, R.; Krueger, F. A Little Anthropomorphism Goes a Long Way: Effects of Oxytocin on Trust, Compliance, and Team Performance with Automated Agents. Hum. Factors 2017, 59, 116–133. [Google Scholar] [CrossRef] [Green Version]
  11. N.Y.P.D. Robot Dog’s Run Is Cut Short After Fierce Backlash. Available online: https://www.nytimes.com/2021/04/28/nyregion/nypd-robot-dog-backlash.html (accessed on 28 April 2021).
  12. Kim, B.; Haring, K.S.; Schellin, H.J.; Oberley, T.N.; Patterson, K.M.; Phillips, E.; Visser, E.J.d.; Tossell, C.C. How Early Task Success Affects Attitudes Toward Social Robots. In Proceedings of the Companion of the 2020 ACM/IEEE International Conference on Human-Robot Interaction, Cambridge, UK, 23–26 March 2020; pp. 287–289. [Google Scholar]
  13. Schellin, H.; Oberley, T.; Patterson, K.; Kim, B.; Haring, K.S.; Tossell, C.C.; Phillips, E.; Visser, E.J.d. Man’s New Best Friend? Strengthening Human-Robot Dog Bonding by Enhancing the Doglikeness of Sony’s Aibo. In Proceedings of the 2020 Systems and Information Engineering Design Symposium (SIEDS), Charlottesville, VA, USA, 24 April 2020; pp. 1–6. [Google Scholar] [CrossRef]
  14. Kerepesi, A.; Jonsson, G.K.; Miklósi, Á.; Topál, J.; Csányi, V.; Magnusson, M.S. Detection of temporal patterns in dog-human interaction. Behav. Processes 2005, 70, 69–79. [Google Scholar] [CrossRef]
  15. Kerepesi, A.; Kubinyi, E.; Jonsson, G.K.; Magnusson, M.S.; Miklósi, Á. Behavioural comparison of human-animal (dog) and human-robot (AIBO) interactions. Behav. Processes 2006, 73, 92–99. [Google Scholar] [CrossRef]
  16. Kerepesi, A.; Jonsson, G.K.; Kubinyi, E.; Miklósi, Á. Can robots replace dogs? Comparison of temporal patterns in dog-human and robot-human interactions. In Human-Robot Interaction; Sarkar, N., Ed.; Itech Education and Publishing: Vienna, Austria, 2007; pp. 201–212. [Google Scholar]
  17. Fujita, M. How to make an autonomous robot as a partner with humans: Design approach versus emergent approach. Philos. Trans. Ser. A Math. Phys. Eng. Sci. 2007, 365, 21–47. [Google Scholar] [CrossRef]
  18. Fujita, M. On activating human communications with pet-type robot AIBO. Proc. IEEE 2004, 92, 1804–1813. [Google Scholar] [CrossRef]
  19. Friedman, B.; Kahn, P.H.; Hagman, J. Hardware companions? What online AIBO discussion forums reveal about the human-robotic relationship. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Ft. Lauderdale, FL, USA, 5–10 April 2003; pp. 273–280. [Google Scholar]
  20. Knox, E.; Watanabe, K. AIBO Robot Mortuary Rites in the Japanese Cultural Context*. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 2020–2025. [Google Scholar]
  21. Kahn, P.H.; Friedman, B.; Pérez-Granados, D.R.; Freier, N.G. Robotic pets in the lives of preschool children. Interact. Stud. 2006, 7, 405–436. [Google Scholar] [CrossRef]
  22. Ribi, F.N.; Yokoyama, A.; Turner, D.C. Comparison of Children’s Behavior toward Sony’s Robotic Dog AIBO and a Real Dog: A Pilot Study. Anthrozoös 2008, 21, 245–256. [Google Scholar] [CrossRef]
  23. Weiss, A.; Wurhofer, D.; Tscheligi, M. “I love this dog”—Children’s emotional attachment to the robotic dog AIBO. Int. J. Soc. Robot. 2009, 1, 243–248. [Google Scholar] [CrossRef]
  24. Okita, S.Y.; Schwartz, D.L. Young children’s understanding of animacy and entertainment robots. Int. J. Hum. Robot. 2006, 3, 393–412. [Google Scholar] [CrossRef]
  25. Ihamäki, P.; Heljakka, K. Robot Dog Intervention with the Golden Pup: Activating Social and Empathy Experiences of Elderly People as Part of Intergenerational Interaction. In Proceedings of the 54th Hawaii International Conference on System Sciences, Kauai, HI, USA, 5–8 January 2021; p. 1888. [Google Scholar]
  26. Leite, I.; Martinho, C.; Paiva, A. Social Robots for Long-Term Interaction: A Survey. Int. J. Soc. Robot. 2013, 5, 291–308. [Google Scholar] [CrossRef]
  27. Melson, G.F.; Kahn, P.H.; Beck, A.M.; Friedman, B.; Roberts, T.; Garrett, E. Robots as dogs? Children’s interactions with the robotic dog AIBO and a live Australian shepherd. In CHI ’05 Extended Abstracts on Human Factors in Computing Systems, Portland, OR, USA, 2–7 April 2005; Association for Computing Machinery: New York, NY, USA, 2005; pp. 1649–1652. [Google Scholar]
  28. Bartlett, B.; Estivill-Castro, V.; Seymon, S. Dogs or Robots—Why do Children See Them As Robotic Pets Rather Than Canine Machines? In Proceedings of the AUIC, Dunedin, New Zealand, 18–22 January 2004. [Google Scholar]
  29. Sinatra, A.M.; Sims, V.K.; Chin, M.G.; Lum, H.C. If it looks like a dog: The effect of physical appearance on human interaction with robots and animals. Interact. Stud. 2012, 13, 235–262. [Google Scholar] [CrossRef]
  30. Francis, A.; Mishra, P. Is AIBO Real? Understanding Children’s Beliefs About and Behavioral Interactions with Anthropomorphic Toys. J. Interact. Learn. Res. 2009, 20, 405–422. [Google Scholar]
  31. Pepe, A.A.; Ellis, L.U.; Sims, V.K.; Chin, M.G. Go, Dog, Go: Maze Training AIBO vs. a Live Dog, An Exploratory Study. Anthrozoös 2008, 21, 71–83. [Google Scholar] [CrossRef]
  32. Konok, V.; Korcsok, B.; Miklósi, Á.; Gácsi, M. Should we love robots?—The most liked qualities of companion dogs and how they can be implemented in social robots. Comput. Hum. Behav. 2018, 80, 132–142. [Google Scholar] [CrossRef]
  33. Kertész, C.; Turunen, M. Exploratory analysis of Sony AIBO users. AI Soc. 2019, 34, 625–638. [Google Scholar] [CrossRef] [Green Version]
  34. Kubinyi, E.; Miklósi, Á.; Kaplan, F.; Gácsi, M.; Topál, J.; Csányi, V. Social behaviour of dogs encountering AIBO, an animal-like robot in a neutral and in a feeding situation. Behav. Processes 2004, 65, 231–239. [Google Scholar] [CrossRef] [PubMed]
  35. Donath, J. The robot dog fetches for whom? In A Networked Self and Human Augmentics, Artificial Intelligence, Sentience; Routledge: London, UK, 2018; pp. 10–24. [Google Scholar]
  36. Swift-Spong, K.; Wen, C.K.F.; Spruijt-Metz, D.; Matarić, M.J. Comparing backstories of a Socially Assistive Robot exercise buddy for adolescent youth. In Proceedings of the 2016 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), New York, NY, USA, 26–31 August 2016; pp. 1013–1018. [Google Scholar]
  37. Kang, S.H.; Gratch, J. Socially anxious people reveal more personal information with virtual counselors that talk about themselves using intimate human back stories. Stud. Health Technol. Inform. 2012, 181, 202–206. [Google Scholar]
  38. Kang, S.H.; Gratch, J. People like virtual counselors that highly-disclose about themselves. Stud. Health Technol. Inform. 2011, 167, 143–148. [Google Scholar] [PubMed]
  39. Bickmore, T.W.; Schulman, D.; Yin, L. Engagement vs. Deceit: Virtual Humans with Human Autobiographies. In Proceedings of the IVA, Amsterdam, The Netherlands, 14–16 September 2009. [Google Scholar]
  40. Darling, K. ‘Who’s Johnny?’ Anthropomorphic Framing in Human-Robot Interaction, Integration, and Policy. In Robot Ethics 2.0: From Autonomous Cars to Artificial Intelligences; Lin, P., Abney, K., Jenkins, R., Eds.; Oxford University Press: Oxford, UK, 2015; Volume 2, Chapter 12. [Google Scholar]
  41. You, S.; Lionel, P.R., Jr. Human-Robot Similarity and Willingness to Work with a Robotic Co-worker. In Proceedings of the 2018 ACM/IEEE International Conference on Human-Robot Interaction, Chicago, IL, USA, 5–8 March 2018; pp. 251–260. [Google Scholar]
  42. Lucas, G.M.; Boberg, J.; Traum, D.; Artstein, R.; Gratch, J.; Gainer, A.; Johnson, E.; Leuski, A.; Nakano, M. Getting to Know Each Other: The Role of Social Dialogue in Recovery from Errors in Social Robots. In Proceedings of the 2018 13th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Chicago, IL, USA, 5–8 March 2018; pp. 344–351. [Google Scholar]
  43. Thacker Thomas, D.; Vermilya, J.R. Framing ‘Friend’: Media Framing of ‘Man’s Best Friend’ and the Pattern of Police Shootings of Dogs. Soc. Sci. 2019, 8, 107. [Google Scholar] [CrossRef] [Green Version]
  44. Groom, V.; Srinivasan, V.; Bethel, C.L.; Murphy, R.; Dole, L.; Nass, C. Responses to robot social roles and social role framing. In Proceedings of the 2011 International Conference on Collaboration Technologies and Systems (CTS), Philadelphia, PA, USA, 23–27 May 2011; pp. 194–203. [Google Scholar]
  45. De Graaf, M.M.A.; Allouch, S.B. The Influence of Prior Expectations of a Robot’s Lifelikeness on Users’ Intentions to Treat a Zoomorphic Robot as a Companion. Int. J. Soc. Robot. 2017, 9, 17–32. [Google Scholar] [CrossRef] [Green Version]
  46. Mori, M.; MacDorman, K.F.; Kageki, N. The Uncanny Valley [From the Field]. IEEE Robot. Autom. Mag. 2012, 19, 98–100. [Google Scholar] [CrossRef]
  47. Stein, J.-P.; Liebold, B.; Ohler, P. Stay back, clever thing! Linking situational control and human uniqueness concerns to the aversion against autonomous technology. Comput. Hum. Behav. 2019, 95, 73–82. [Google Scholar] [CrossRef]
  48. Kim, B.; Bruce, M.; Brown, L.; Visser, E.d.; Phillips, E. A Comprehensive Approach to Validating the Uncanny Valley using the Anthropomorphic RoBOT (ABOT) Database. In Proceedings of the 2020 Systems and Information Engineering Design Symposium (SIEDS), Charlottesville, VA, USA, 24 April 2020; pp. 1–6. [Google Scholar] [CrossRef]
  49. Diel, A.; Weigelt, S.; Macdorman, K.F. A Meta-analysis of the Uncanny Valley’s Independent and Dependent Variables. J. Hum.-Robot Interact. 2021, 11, 1–33. [Google Scholar] [CrossRef]
  50. Löffler, D.; Dörrenbächer, J.; Hassenzahl, M. The Uncanny Valley Effect in Zoomorphic Robots: The U-Shaped Relation Between Animal Likeness and Likeability. In Proceedings of the 2020 ACM/IEEE International Conference on Human-Robot Interaction, Cambridge, UK, 23–26 March 2020; Association for Computing Machinery: New York, NY, USA, 2020; pp. 261–270. [Google Scholar]
  51. Phillips, E.; Zhao, X.; Ullman, D.; Malle, B.F. What is Human-like? Decomposing Robots’ Human-like Appearance Using the Anthropomorphic roBOT (ABOT) Database. In Proceedings of the 2018 ACM/IEEE International Conference on Human-Robot Interaction, Chicago, IL, USA, 5–8 March 2018; pp. 105–113. [Google Scholar]
  52. Haring, K.S.; Watanabe, K.; Velonaki, M.; Tossell, C.C.; Finomore, V. FFAB—The Form Function Attribution Bias in Human-Robot Interaction. IEEE Trans. Cogn. Dev. Syst. 2018, 10, 843–851. [Google Scholar] [CrossRef]
  53. Roesler, E.; Manzey, D.; Onnasch, L. A meta-analysis on the effectiveness of anthropomorphism in human-robot interaction. Sci. Robot. 2021, 6, eabj5425. [Google Scholar] [CrossRef] [PubMed]
  54. Onnasch, L.; Roesler, E. Anthropomorphizing Robots: The Effect of Framing in Human-Robot Collaboration. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting, Seattle, WA, USA, 28 October–1 November 2019; Volume 63, pp. 1311–1315. [Google Scholar] [CrossRef] [Green Version]
  55. Magnusson, M.S. Discovering hidden time patterns in behavior: T-patterns and their detection. Behav. Res. Methods Instrum. Comput. 2000, 32, 93–110. [Google Scholar] [CrossRef] [PubMed]
  56. Darling, K.; Nandy, P.; Breazeal, C. Empathic concern and the effect of stories in human-robot interaction. In Proceedings of the 2015 24th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), Kobe, Japan, 31 August–4 September 2015; pp. 770–775. [Google Scholar]
  57. Wen, J.; Stewart, A.; Billinghurst, M.; Tossell, C. Band of Brothers and Bolts: Caring About Your Robot Teammate. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 1853–1858. [Google Scholar]
  58. Wen, J.; Stewart, A.; Billinghurst, M.; Dey, A.; Tossell, C.; Finomore, V. He who hesitates is lost (…in thoughts over a robot). In Proceedings of the Technology, Mind, and Society, Washington, DC, USA, 5–8 April 2018. Article 43. [Google Scholar]
  59. Borrie, A.; Jonsson, G.; Magnusson, M. Application of T-pattern detection and analysis in sports research. Metodol. De Las Cienc. Del Comport. 2001, 3, 215–226. [Google Scholar]
  60. Casarrubea, M.; Magnusson, M.S.; Anguera, M.T.; Jonsson, G.K.; Castañer, M.; Santangelo, A.; Palacino, M.; Aiello, S.; Faulisi, F.; Raso, G.; et al. T-pattern detection and analysis for the discovery of hidden features of behaviour. J. Neurosci. Methods 2018, 310, 24–32. [Google Scholar] [CrossRef] [PubMed]
  61. Magnusson, M.S. Hidden Real-Time Patterns in Intra- and Inter-Individual Behavior: Description and Detection. Eur. J. Psychol. Assess. 1996, 12, 112–123. [Google Scholar] [CrossRef]
  62. Magnusson, M.S. Why Search for Hidden Repeated Temporal Behavior Patterns: T-Pattern Analysis with Theme. Int. J. Clin. Pharmacol. Pharmacother. 2017, 2, 128. [Google Scholar] [CrossRef]
  63. Diel, A.; MacDorman, K.F. Creepy cats and strange high houses: Support for configural processing in testing predictions of nine uncanny valley theories. J. Vis. 2021, 21, 1. [Google Scholar] [CrossRef]
  64. Krueger, F.; McCabe, K.; Moll, J.; Kriegeskorte, N.; Zahn, R.; Strenziok, M.; Heinecke, A.; Grafman, J. Neural correlates of trust. Proc. Natl. Acad. Sci. USA 2007, 104, 20084–20089. [Google Scholar] [CrossRef] [Green Version]
  65. Krueger, F.; Meyer-Lindenberg, A. Toward a Model of Interpersonal Trust Drawn from Neuroscience, Psychology, and Economics. Trends Neurosci. 2019, 42, 92–101. [Google Scholar] [CrossRef]
  66. Lewicki, R.J.; Bunker, B.B. Developing and maintaining trust in work relationships. Trust. Organ. Front. Theory Res. 1996, 114, 139. [Google Scholar]
  67. De Graaf, M.M.A.; Allouch, S.B.; van Dijk, J. Why Do They Refuse to Use My Robot? Reasons for Non-Use Derived from a Long-Term Home Study. In Proceedings of the 2017 ACM/IEEE International Conference on Human-Robot Interaction, Vienna, Austria, 6–9 March 2017; pp. 224–233. [Google Scholar]
  68. Morovitz, M.; Mueller, M.; Scheutz, M. Animal-robot interaction: The role of human likeness on the success of dog-robot interactions. In Proceedings of the 1st International Workshop on Vocal Interactivity in-and-between Humans, Animals and Robots (VIHAR), London, UK, 29–30 August 2017; pp. 22–26. [Google Scholar]
Figure 1. Representation of the “uncanine” valley, a hypothesized adaptation of the uncanny valley. “Dog-likeness” is defined as the degree to which an object is designed to mimic the form, characteristics, and behavior of a real dog.
Figure 2. (a) Sony Aibo ERS-1000 without the fur; (b) Aibo outfitted with a fur suit.
Figure 3. An example of an interactive T-pattern in the ThemeEdu software, extracted from one participant’s interaction with Aibo. (a) The events occurring inside the specific pattern, listed in the order in which they occur; the first event in the pattern appears at the top and the last at the bottom. (b) The occurrence times of each behavior in the pattern; each red dot marks a single behavior event (e.g., the person begins looking, Aibo ends barking) at a given time point. The light green lines indicate which behavior each set of red dots represents, and the black lines connecting the red dots trace the detected pattern across behaviors. Smaller sub-patterns can also occur when some of the events within the pattern occur without the whole pattern being completed.
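As a rough illustration of the logic behind T-pattern detection, the sketch below tests whether one event type reliably follows another within a fixed critical interval. It is a deliberate simplification of Magnusson’s algorithm (which searches over candidate intervals and builds hierarchical patterns), and all event times are hypothetical; the study itself used the ThemeEdu software:

```python
import math

def binom_sf(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p): chance of seeing k or more hits."""
    return sum(math.comb(n, i) * (p ** i) * ((1 - p) ** (n - i))
               for i in range(k, n + 1))

def critical_interval_test(a_times, b_times, d1, d2, total_time):
    """Count how often a B event falls within [d1, d2] seconds after an A
    event, and compare against a binomial null where B occurs at random."""
    hits = sum(any(d1 <= b - a <= d2 for b in b_times) for a in a_times)
    # Approximate chance that a random window of length (d2 - d1) catches
    # at least one of the B events (linear approximation for low rates).
    p_null = min(1.0, len(b_times) * (d2 - d1) / total_time)
    return hits, binom_sf(hits, len(a_times), p_null)

# Hypothetical event onsets (in seconds) from one five-minute session.
person_praises = [12, 45, 88, 130, 201]   # person begins praising Aibo
aibo_tail_wags = [14, 47, 90, 133, 250]   # Aibo begins wagging tail

hits, p = critical_interval_test(person_praises, aibo_tail_wags,
                                 d1=0, d2=5, total_time=300)
print(f"{hits}/5 praises followed by tail wagging within 5 s (p = {p:.4f})")
```

A pair that passes this test would form a two-event T-pattern; the full algorithm then treats detected patterns as single events and searches again, yielding the nested, multi-event patterns shown in Figure 3.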
Figure 4. The average count of dog praises and dog-name use across command blocks in each framing condition: (a) participants praised puppy Aibo more than robot Aibo; (b) participants used Aibo’s assigned dog name significantly more while interacting with puppy Aibo than with robot Aibo.
Figure 5. The number of interactive T-patterns and unique T-pattern interactions in each framing condition: (a) puppy Aibo elicited more interactions than robot Aibo; (b) interacting with puppy Aibo produced more unique interactive T-patterns than robot Aibo; (c) interacting with puppy Aibo produced more complex interactive T-patterns than robot Aibo. * p < 0.05.
Figure 6. The number of times participants smiled and laughed in response to Aibo behavior in interactive T-patterns, across each framing and appearance condition: (a) participants smiled at puppy-framed Aibo without fur significantly more than at puppy-framed Aibo with fur and robot-framed Aibo without fur; (b) participants laughed more while interacting with puppy-framed Aibo without fur than with puppy-framed Aibo with fur. * p < 0.05.
Figure 7. Self-reported questionnaire ratings of “negative” perceived characteristics of the robot ((a) creepiness; (b) uncanniness; (c) scariness) pre-interaction (in orange) and post-interaction (in yellow).
Figure 8. Self-reported ratings of “positive” perceived characteristics of the robot pre-interaction (in orange) and post-interaction (in yellow). * p < 0.05, ** p < 0.01.
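A minimal sketch of how such pre/post rating differences can be tested, assuming hypothetical 7-point ratings and a paired-samples t-test (the paper’s exact analysis may differ):

```python
import numpy as np
from scipy import stats

# Hypothetical 7-point trustworthiness ratings from ten participants,
# collected before and after interacting with Aibo.
pre  = np.array([4, 3, 5, 4, 4, 3, 5, 4, 3, 4])
post = np.array([5, 4, 6, 5, 4, 4, 6, 5, 4, 6])

# Paired-samples t-test on the within-participant pre/post difference.
t_stat, p_value = stats.ttest_rel(post, pre)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```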
Table 1. Commands and their corresponding behaviors for Aibo.

Command                               Behavior
“Very Lovely Aibo”                    Dances and barks as a song plays
“Sit Down”                            Sits down in dog-like posture and pants
“Take a Picture”                      Counts down and snaps a picture with a camera
“Sing a Song”                         Strikes a sitting pose and sings a tune
“Come Here”                           Turns and walks towards the speaker
“Happy Birthday”                      Dances and barks as “Happy Birthday” song plays
“Let’s Pose”                          Rolls over on belly and moves feet
“If You’re Happy and You Know it”     Dances and barks to the famous song
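For readers scripting comparable sessions, a minimal sketch of mapping the spoken commands in Table 1 to behavior routines follows; it is purely illustrative and does not use Sony’s actual aibo API:

```python
from typing import Callable, Dict

def dance_and_bark(song: str) -> None:
    print(f"Aibo dances and barks while '{song}' plays")

# Hypothetical command-to-behavior dispatch table mirroring Table 1.
COMMANDS: Dict[str, Callable[[], None]] = {
    "Very Lovely Aibo": lambda: dance_and_bark("a theme song"),
    "Sit Down": lambda: print("Aibo sits in a dog-like posture and pants"),
    "Take a Picture": lambda: print("Aibo counts down and snaps a picture"),
    "Sing a Song": lambda: print("Aibo strikes a sitting pose and sings"),
    "Come Here": lambda: print("Aibo turns and walks towards the speaker"),
    "Happy Birthday": lambda: dance_and_bark("Happy Birthday"),
    "Let's Pose": lambda: print("Aibo rolls over on its belly and moves its feet"),
    "If You're Happy and You Know it":
        lambda: dance_and_bark("If You're Happy and You Know It"),
}

def handle(utterance: str) -> None:
    action = COMMANDS.get(utterance)
    if action is not None:
        action()
    else:
        print("Unrecognized command; Aibo continues idling")

handle("Come Here")  # -> Aibo turns and walks towards the speaker
```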
Table 2. Behavioral units recorded within the video for Aibo and the participant.
Aibo’s Behavior Units      Person’s Behavior Units
Wags Tail                  Laughs
Rotates Body               Smiles
Rotates Head               Pets Aibo
Kneels                     Praises Aibo
Sits                       Relocates Aibo
Barks                      Repeated Commands
Looks at Person            Looks at Aibo
Lays Down
Pivots Back and Forth
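One plausible way to represent these per-second behavioral codings as an event series for pattern detection is sketched below; the structure is purely illustrative (the study used ThemeEdu’s own data format):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class BehaviorEvent:
    actor: str      # "aibo" or "person"
    behavior: str   # a behavioral unit from Table 2, e.g., "Wags Tail"
    phase: str      # "begins" or "ends"
    t: int          # time point within the session, in seconds

# A few hypothetical coded events from one session.
events = [
    BehaviorEvent("person", "Looks at Aibo", "begins", 10),
    BehaviorEvent("aibo", "Barks", "begins", 12),
    BehaviorEvent("person", "Smiles", "begins", 14),
    BehaviorEvent("aibo", "Barks", "ends", 15),
]

# Sorting by time yields the event series that T-pattern detection scans
# for recurring temporal structure.
series = sorted(events, key=lambda e: e.t)
```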
Table 3. Total commands issued per command type and per framing and appearance factor.
Commands Issued                      Overall (n = 29)   Robot (n = 11)   Puppy (n = 18)   No Fur (n = 15)   Fur (n = 14)
“Very Lovely Aibo”                   2.14 (0.26)        2.19 (0.42)      2.08 (0.31)      2.52 (0.32)       1.75 (0.41)
“Sit Down”                           3.46 (0.67)        2.79 (1.08)      4.13 (0.79)      2.79 (1.05)       4.21 (0.82)
“Take a Picture”                     2.07 (0.33)        2.02 (0.53)      2.12 (0.39)      2.39 (0.41)       1.75 (0.52)
“Sing a Song”                        3.28 (0.52)        3.21 (0.85)      3.35 (0.62)      3.06 (0.65)       3.50 (0.83)
“Come Here” ***                      9.08 (1.36)        8.33 (2.20)      9.83 (1.61)      9.12 (1.68)       9.04 (2.15)
“Happy Birthday”                     2.40 (0.52)        1.88 (0.83)      2.92 (0.61)      2.79 (0.64)       2.00 (0.81)
“Let’s Pose”                         2.05 (0.36)        1.71 (0.45)      2.38 (0.33)      2.09 (0.34)       2.00 (0.44)
“If You’re Happy and You Know it”    2.02 (0.36)        1.75 (0.58)      2.29 (0.43)      2.25 (0.45)       1.79 (0.57)
Total                                27.20 (2.82)       24.17 (4.55)     30.24 (3.34)     29.37 (3.48)      25.04 (2.82)
Values are mean (standard error; SE) counts of command repetitions per experimental session, further split by the framing and appearance conditions. *** indicates that “Come Here” was issued at a significantly higher rate than any other command (p < 0.001), while all other commands were issued at similar rates to one another.
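As a reference for how the cell values in Table 3 are formed, a minimal sketch using hypothetical per-participant counts (numpy assumed available):

```python
import numpy as np

# Hypothetical "Come Here" counts for the participants in one condition.
come_here_counts = np.array([7, 12, 9, 11, 8, 13, 10, 9])

mean = come_here_counts.mean()
# Standard error of the mean: sample SD (ddof = 1) divided by sqrt(n).
sem = come_here_counts.std(ddof=1) / np.sqrt(len(come_here_counts))
print(f"{mean:.2f} ({sem:.2f})")  # formatted as "mean (SE)", as in Table 3
```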
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
