Article

Factors for Customers’ AI Use Readiness in Physical Retail Stores: The Interplay of Consumer Attitudes and Gender Differences

Faculty of Economics and Business, University of Maribor, 2000 Maribor, Slovenia
* Author to whom correspondence should be addressed.
Information 2024, 15(6), 346; https://doi.org/10.3390/info15060346
Submission received: 22 April 2024 / Revised: 29 May 2024 / Accepted: 4 June 2024 / Published: 12 June 2024
(This article belongs to the Section Information Applications)

Abstract

In addressing the nuanced interplay between consumer attitudes and Artificial Intelligence (AI) use readiness in physical retail stores, the main objective of this study is to test the impacts of prior experience with AI technologies, perceived risks associated with these technologies, and consumers’ self-assessment of their ability to manage them, as well as the moderating role of gender in these relationships. Using a quantitative cross-sectional survey, data from 243 consumers familiar with AI technologies were analyzed using structural equation modeling (SEM) methods to explore these dynamics in the context of physical retail stores. Additionally, the moderating impacts were tested after an invariance analysis across the two gender groups. Key findings indicate that positive prior experience with AI technologies positively influences AI use readiness in physical retail stores, while perceived risks with AI technologies serve as a deterrent. Gender differences significantly moderate these effects: perceived risks with AI technologies more negatively impact women’s AI use readiness, whereas self-assessment of the ability to manage AI technologies shows a stronger positive impact on men’s AI use readiness. The study concludes that retailers must consider these gender-specific perceptions and attitudes toward AI to develop more effective strategies for technology integration. Our research also highlights the need to address gender-specific barriers and biases when adopting AI technology.

1. Introduction

The rise of Artificial Intelligence (AI) in the retail sector marks a pivotal shift in how businesses engage with consumers and manage operations. As AI technologies carve out new pathways for interaction and efficiency, it becomes essential to understand their impact on consumer attitudes and readiness to adopt these technologies.
Our focus on the physical retail environment brings a distinctive dimension to the discourse on AI in retail, in contrast to the existing literature, which predominantly centers on digital or online interactions. We explore how in-store AI technologies influence consumer behavior, specifically analyzing gender differences—a facet that has not been extensively explored in previous research. This approach not only provides actionable insights for retailers in physical stores but also contributes to the academic discourse by highlighting the importance of context in the deployment of AI technologies.
The proliferation of AI has transformed numerous sectors, including marketing, where its capability to mimic human cognition—encompassing perception, reasoning, learning, and prediction—promises substantial enhancements [1]. Identified as a pivotal application of AI, the marketing sector leverages these technologies to refine strategies and improve consumer engagement [2]. However, the swift advancements in AI, driven by increased computational power and data availability [3], have ushered in both opportunities and significant challenges. Concerns regarding the inevitable replacement of human roles with AI technologies spotlight the need for a balanced exploration of AI’s advantages and societal impacts. Despite its benefits, such as enhanced marketing outcomes, efficiency improvements, and cost reductions, the integration of AI into marketing strategies raises crucial ethical issues, emphasizing the importance of safeguarding privacy and ensuring fair AI use.
To address these challenges and opportunities systematically, it is crucial to understand the factors influencing AI use readiness. Anica-Popa et al. [4] highlight the positive outcomes of AI use readiness, such as improved user experiences, advanced data analytics capabilities, and tailored product offerings. However, Mahmoud et al. [5] caution against overlooking the challenges and ethical considerations inherent in AI deployment, such as workforce displacement, privacy concerns, and potential societal impacts. These studies underscore the necessity for a granular examination of AI readiness factors, particularly as they relate to ethical considerations and consumer acceptance in physical retail environments, which is imperative for stakeholders seeking to navigate this dynamic landscape effectively. The research questions we pose are: How do prior experience, perceived risks, and self-assessment of the ability to manage AI technologies influence AI use readiness among consumers in physical retail stores? What is the role of gender in moderating the impact of consumers’ prior experience, perceived risks, and self-assessed ability to manage AI technologies on their readiness to use AI?
Drawing on a quantitative cross-sectional survey involving 243 AI-familiar consumers, this research leverages structural equation modeling (SEM) in AMOS to navigate the terrain of consumer attitudes toward AI. By incorporating the multifaceted perspectives on AI provided by Xu et al. [1] and the marketing-centric insights of Sterne [2], the study examines the relationships between key constructs influencing AI use readiness. Further, it resonates with the concerns that Buchanan et al. [6] raised regarding AI’s societal implications, ensuring a balanced exploration of AI’s benefits and challenges in the retail context, such as job displacement, security issues, privacy concerns, etc.
According to previous studies, we propose a positive link between prior experience with AI technologies and AI use readiness, in contrast to the deterrent effect of perceived risks with AI technologies. Furthermore, our analysis considers the role of gender differences in shaping these dynamics, revealing varying sensitivities to perceived risks and self-assessed capabilities to manage AI technologies. This aspect of our study echoes the findings of recent literature on digital transformation in retail by Crittenden [7] and Hansen [8], emphasizing the importance of tailoring strategies to different demographic groups to foster broader AI acceptance, and Nouraldeen [9], who uncovers that gender moderates the relationship between technology readiness and perceived usefulness in AI adoption.
Our study employs structural equation modeling (SEM) to analyze the complex interplay of consumer attitudes, prior experiences, perceived risks, and AI use readiness specifically within the physical retail context. This methodological approach provides a comprehensive analysis of the effects among these factors, including tests of how these relationships differ by gender, offering a level of detail not commonly found in the existing literature. To our knowledge, such a nuanced application of SEM in exploring these dynamics in physical retail settings has not been extensively undertaken in previous research; prior work typically reveals only differences in AI adoption between genders, not the relational differences that are essential for strategically planning activities to enhance adoption rates. This work contributes to the ongoing academic and practical discourse on AI in retail by providing a nuanced understanding of consumer attitudes, underscored by the significance of gender differences. It builds on foundational research by Martinez-Lopez and Casillas [10] and Grewal [11], using an adapted methodology from Meuter et al. [12], providing a fresh viewpoint that enhances discussions in both academic and practical realms concerning strategies for implementing AI.
As AI continues to redefine the retail landscape, this study stands as a testament to the complex interplay of factors shaping consumers’ readiness to use these technologies, urging a thoughtful consideration of the ethical and societal dimensions of AI integration.

2. Literature Review

The evolution of artificial intelligence (AI) stands as a testament to humanity’s enduring quest to simulate and replicate cognitive faculties within machines. As delineated by Xu et al. [1], AI represents a sophisticated emulation of human intelligence, encompassing a spectrum of capabilities, from perception to reasoning, learning, planning, and prediction. This pursuit has historical roots, aligning with the rise of computing machinery. Yet, it has recently experienced significant growth, driven by remarkable advancements in computational capabilities and the widespread adoption of various technologies, including machine learning and natural language processing [3]. The rapidly evolving digital environment, characterized by swift technological innovation and transformative digitalization, underscores the prevalence and indispensability of AI across various domains, including marketing. Crittenden [7] explains the profound impact of digital transformation, portraying it as a dynamic force reshaping business practices and customer interactions, with AI emerging as a cornerstone technology driving this metamorphosis. In concurrence, Hansen [8] underscores the significant impact of AI on marketing paradigms, highlighting how technological progress has revolutionized marketing strategies and led to a need for industry practitioners to adjust their skills and expertise to meet the new demands brought about by AI in marketing. Martinez-Lopez and Casillas [10] highlight the potential of AI-based tools to enhance strategic decision-making, customer acquisition and retention, and marketing planning in business-to-business contexts. Amidst these advancements, ethical considerations remain paramount, as articulated by Buchanan et al. [6], who caution against overlooking the societal implications of AI development, including job displacement, security concerns, and privacy infringements.
Marketing researchers recognize the growing prevalence of interactions between consumers and AI, identifying significant potential benefits these interactions bring to consumers and their lives [13,14]. However, increased AI utilization also brings inherent tensions for consumers, including concerns about privacy, dehumanization, and even addiction [15].
The advantages of AI implementation in marketing have been highlighted by Haleem [16], who points out that neural networks develop dynamic tools for providers, facilitating the processing of large datasets and providing more meaningful insights to understand better and cater to refined customer segments. The precision of AI in targeted marketing, improved advertising efficiency, and campaign optimization, among others, has been found to yield significant benefits across various business environments, including business-to-consumer (B2C) and business-to-business (B2B) settings [16]. However, despite these positive aspects, concerns persist, particularly regarding trust issues and power asymmetries, emphasizing the need for comprehensive approaches to balance AI usage’s advantages and potential drawbacks in marketing [11].
Recent research elucidates the dynamics between consumer perceptions, demographic characteristics, and organizational readiness in AI adoption. Flavián et al. [17] illustrate that technological optimism can enhance consumer intentions to use AI services like robo-advisors, while technological discomfort, paradoxically, may also promote adoption by being offset by the perceived benefits of AI.
Further, the importance of how consumers are informed about such services profoundly impacts their acceptance, emphasizing the need for effective communication strategies. Tavera-Mesías et al. [18] extend this analysis to lower-income urban consumers, finding that perceived usefulness is the main driver of intention to use mobile payment apps, with gender differences notably influencing adoption pathways; specifically, optimism plays a critical role for women, reducing their technological insecurity.
Complementing these findings, Tehrani et al. [19] connect consumer attitudes to organizational capacities, identifying eight dimensions of AI readiness crucial for successful AI implementation: informational, environmental, infrastructural, participant, process, customer, data, and technological readiness.
Integrating AI into physical retail stores has also become a focal point of scholarly inquiry, reflecting its profound implications for consumer behavior and retail operations. In line with Strube et al. [20], who demonstrated that individuals’ task choices are largely driven by self-assessment motives, AI in retail could be leveraged to enhance consumers’ ability to assess products and services effectively. Lin [21] defines AI in retail as utilizing AI technologies and algorithms to harness customer data and drive business growth. This encompasses a spectrum of applications delineated by Chen et al. [22], including smart retail outlets, autonomous shopping processes, customized services, contactless transactions, data analytics, labor cost reduction, efficiency enhancement, and competitive advantage strategies. These multifaceted implementations underscore AI’s transformative potential within the retail sector, enabling personalized marketing strategies, optimized inventory management, and enhanced customer experiences.
Murugan and Kumar [23] found that AI-driven recommendation systems significantly impact consumer purchasing decisions by providing tailored product suggestions, enhancing real-time assistance through chatbots, and improving the overall shopping experience. This increased level of personalized service fosters customer satisfaction and repeat purchases, highlighting the importance of these systems in preparing customers for AI use in retail.
The existing literature offers significant insights into the role of artificial intelligence (AI) in retail environments and its impact on consumer experiences. Beyari and Garamoun [24] contribute to this understanding by demonstrating how AI tools can tailor product offerings and influence customer consideration sets through data-driven insights. This can enhance the effectiveness of AI-powered digital assistants in retail by providing a more customized and responsive consumer experience. Appreciating these findings is essential for devising effective strategies to optimize the deployment of AI-powered digital assistants in stores, ensuring they meet consumer needs and cultivate positive shopping environments.
Nouraldeen [9] explores how technology readiness (TR) and perceived usefulness positively influence AI adoption among accounting and auditing students, while perceived ease of use does not significantly affect their decision to adopt AI. This study also highlights that gender plays a moderating role in AI adoption, with males showing a higher tendency towards adopting AI compared to females.
Moore et al. [25] conducted an ethnographic inquiry shedding light on the complex dynamics observed between consumers and AI-powered digital assistants in retail settings. Their findings revealed a diverse spectrum of experiences, ranging from apprehension and social discomfort to amusement and enjoyment, highlighting the challenges and opportunities inherent in these interactions.
Expanding on this discourse, Chen et al. [22] conducted a study involving twenty in-depth interviews to explore consumer perceptions of artificial intelligence and its marketing communication. The findings revealed that consumers perceive artificial intelligence through a multidimensional and relational lens, focusing on functionality, emotions, and the comparison between artificial intelligence and humans.
Gursoy et al. [26] proposed a conceptual framework to understand consumer acceptance of AI devices within service contexts, emphasizing the interplay of factors such as social influence and hedonistic motivations. On the other hand, Pillai et al. [27] further explored the predictors of consumer purchase intentions in AI-driven retail environments, identifying factors like perceived enjoyment and adaptability as key drivers of consumer behavior.
Lv et al. [28] explored the impact of AI application aesthetics on consumer readiness to engage in emotional versus knowledge-based service tasks. Their research uncovered the contextual factors influencing consumer receptivity, indicating a preference for visually appealing interfaces in emotional tasks, while more functional designs were favored for knowledge-based activities. Additionally, Ho et al. [29] emphasize crucial governance and design considerations that require attention to ensure that emotional AI systems and devices effectively serve the welfare of individuals and communities.
Sharma et al. [30] found that effort expectancy, performance expectancy, facilitating conditions, and social influence positively influence customer adoption of autonomous decision-making processes. Additionally, they noted that collectivism strengthens the effect of social influence on customer attitudes, while uncertainty avoidance weakens the impact of performance and effort expectancy, as well as social influence. These findings underscore the complexity of consumer adoption of AI technologies, influenced by both technological expectations and cultural contexts. Building on this understanding, Kelly et al. [31] further contribute to the discourse by demonstrating that perceived usefulness, performance expectancy, attitudes, trust, and effort expectancy significantly and positively predict behavioral intention, willingness, and use behavior of AI across multiple industries. However, they also highlight a cultural caveat: the need for human contact in some scenarios cannot be replaced by AI, pointing to the limits of technological substitution in contexts where human interaction is highly valued. Additionally, the study by Song and Kim [32] reveals that consumers’ anxiety toward robots moderates the impact of humanoid robots’ social capabilities and appearance on attitudes towards human–robot interaction, indicating that perceived risks can significantly influence AI use readiness and acceptance. Interestingly, the study by Borau et al. [33] revealed a preference for the female chatbot over the male counterpart, as it is perceived as more human-like and capable of considering individual needs. These findings underscore the ethical dilemma encountered by AI designers and policymakers: while there is a concern about women being objectified in AI, imbuing AI entities with women’s traits tends to humanize them and increase their acceptability.
Ameen et al. [34] underscore the crucial role played by trust and perceived sacrifice in mediating the impact of perceived convenience, personalization, and AI-enabled service quality on the overall customer experience with AI. Furthermore, the research emphasizes the considerable influence of relationship commitment on the AI-enhanced customer experience. Furthermore, Guha et al. [35] mention that it is worth noting that the immediate effects of AI on retailing may not be as significant as portrayed in mainstream media. Additionally, AI is anticipated to achieve greater efficacy when it is employed to enhance managerial decisions rather than entirely supplanting them.
These studies collectively contribute to understanding the complex dynamics surrounding AI use readiness in retail and its implications for consumer behavior. By leveraging these insights, retailers can develop strategic initiatives aimed at harnessing the potential of AI-driven digital assistants to curate immersive shopping experiences that resonate with consumer expectations and preferences.
The consumer behavior literature understands perceived risk as the consumer’s expectation that an action may result in unpleasant consequences. This view has been adopted by Sweeney et al. [36], who state that perceived risk can be defined as the subjective expectation of a certain loss or the subjective assessment of consumers regarding the potential consequences of wrong decisions. Therefore, perceived risk plays a crucial role in consumer behavior, especially in the context of innovative products or technology [37], when consumers might be unsure about the functionality of a new product or service.
Jacoby and Kaplan [38] extended Bauer’s work and were the first to propose that perceived risk is a multidimensional concept that includes different types of risks, namely, financial risk, i.e., the risk that the consumer will “lose” money; psychological risk, i.e., the risk that a poor choice will negatively affect the consumer’s ego; physical risk, i.e., the risk that the consumer will harm their safety or the safety of others; functional risk, i.e., the risk that the product or service will not function as the consumer expected; social risk, i.e., the risk that the consumer’s status in their social environment (among friends, family, colleagues) will change; and perceived risk of losing time, i.e., the risk that the time spent will be wasted.
Recent studies have highlighted that perceived risk affects various consumer behaviors, including online shopping intentions, impulse buying, and customer loyalty [39]. Perceived risk remains a critical factor in consumers’ acceptance of new technologies and services [40]. Furthermore, recent literature on the acceptance of a certain new technology emphasizes perceived risk as an essential factor affecting consumers’ willingness to use technology [41]. Because AI is a recent, emerging, and sophisticated technology, we can understandably assume that the average consumer may not correctly understand how the technology works [42].
Self-assessment is defined by Panadero et al. [43] as “a wide variety of mechanisms and techniques through which students describe (i.e., assess) and possibly assign merit or worth to (i.e., evaluate) the qualities of their own learning processes and products”, while Eva and Regehr [44] state that self-assessment is often (implicitly or otherwise) conceptualized as a personal, unguided reflection on performance for the purposes of generating an individually derived summary of one’s own level of knowledge, skill, and understanding in a particular area. Ross [45], citing Klenowski [46], defines self-assessment as “the evaluation of judgment of ‘the worth’ of one’s performance and the identification of one’s strengths and weaknesses with a view to improving one’s learning outcomes”. Epstein et al. [47] define concurrent self-assessment as “ongoing moment-to-moment self-monitoring; self-monitoring refers to the ability to notice our own actions, curiosity to examine the effects of those actions, and willingness to use those observations to improve behavior and thinking in the future”.
Prior experience, as explored by Varma and Marler [48], is defined within the context of technology usage as a multifaceted construct that encompasses both the competence and duration of an individual’s engagement with technology. This concept is defined through multiple dimensions: competence, which includes technical skills and general literacy as highlighted in studies such as Gallivan et al. [49], and time-based factors, which refer to the duration and frequency of use as explored by Harrison et al. [50]. Varma and Marler’s study elucidates how these dimensions collectively influence technology acceptance, indicating that prior experience affects individuals’ adaptability to new technologies [48].
Prior experience, as defined by Yi and Choi [51], encompasses the knowledge, feelings, memories, evaluations, and skills acquired through participation in or observation of events, particularly involving technology use. They further specify that experience with technology, such as artificial intelligence (AI), includes direct interactions with AI products and services that shape users’ knowledge and emotional evaluations. According to Taylor and Todd [52], cited in Yi and Choi’s study, past experience is a significant determinant of behavioral intentions, with knowledge from past experiences playing a pivotal role in the intention formation process.

3. Conceptual Framework and Hypothesis

In our research, we examined four hypotheses within the context of consumer Artificial Intelligence (AI) use readiness in physical retail stores. The formulation and testing of these hypotheses draw upon the existing literature, aiming to contribute to the ongoing discourse in smart retail and AI use readiness.
The increasing integration of AI technologies within retail environments presents a significant shift in consumer interaction and the retail landscape. The foundational research conducted by Chen and Chang [22] highlights the critical role that consumer readiness and the convenience offered by smart technologies play in facilitating retail purchases. This study, along with subsequent research by Sohn [53] and Abed [54], underscores the positive impact of prior experience with AI technologies and similar technologies (e.g., IoT devices) on consumers’ perceptions of utility and ease of use, as well as their AI use readiness in physical retail stores. Specifically, Sohn’s findings indicate that AI’s ability to reduce consumer uncertainty about product fit significantly enhances the attractiveness of retail platforms. Similarly, Abed emphasizes how prior experience with technology-enhanced shopping, such as augmented reality (AR), can predispose consumers to be more receptive to new technological integrations, including AI, in retail. This body of research suggests that previous positive interactions with AI and related technologies can mitigate reservations about and enhance the acceptance of new AI implementations in physical stores. Although Bandeira et al. [55] emphasize that augmented reality has not been established as a leading marketing tool or a particularly profitable investment, it is deemed applicable in the contexts of blended marketing and immersive augmented reality, where it has elicited favorable reactions from participants. Liu et al. [56] further corroborate this perspective, exploring the dynamics of technology acceptance and consumer behavior in retail. Chen et al. [57] report that consumers view AI marketing communications as an unavoidable and largely acceptable aspect of their interactions with brands. If consumers find AI marketing communication generally acceptable, this acceptance might reflect a familiarity and comfort with AI, suggesting a positive prior experience that could enhance their readiness to engage with AI technologies more broadly, including in physical retail stores. This reinforces the hypothesis that prior experience is likely to influence consumers’ AI use readiness in physical retail stores. Accordingly, we propose the first hypothesis:
H1: 
Consumers’ prior experience with AI technologies has a statistically significant positive impact on their AI use readiness in physical retail stores.
The consumer interface with AI technologies in physical retail stores raises significant questions about privacy and data confidentiality, which are becoming increasingly critical to consumer AI use readiness. The study by Schepman and Rodway [58] serves as a foundational reference in this discourse, highlighting consumer apprehensions about personal data mining by AI applications. This privacy concern is not isolated to AI but extends to related technologies such as the Internet of Things (IoT), as noted by Abed [54], suggesting a broader apprehension towards digital technologies’ encroachment on personal privacy. Further exploration by Joshi et al. [59] into the dynamics of retail system reliability and trust elucidates the integral role of perceived security and trustworthiness in consumer decision-making processes, particularly in AI use readiness. This body of work collectively underscores a significant barrier to AI use readiness in physical retail stores: the perceived risks regarding confidentiality and privacy. Donepudi [60] identifies several additional risks associated with the integration of AI technologies in retail environments, including customer concerns, technical issues, dependency on technology, and inadequate training. Additionally, Liang et al. [61] found that perceived performance risk negatively influences consumers’ attitudes towards AI, and the key findings from Wang et al. [62] indicate that perceived risk significantly moderates the relationship between consumers’ attitudes towards using unmanned technology and their behavioral intentions to make purchases. The study by Choung et al. [63] also demonstrates that trust plays a significant role in determining users’ intention to use AI technologies, and Hasan et al. [42] show that factors such as perceived risk, consumer trust, interaction, and novelty value significantly influence brand loyalty for AI-supported devices. Their findings emphasize the importance of considering these factors in assessing consumer readiness for AI adoption. Such perceptions can profoundly influence consumer trust and, by extension, consumers’ readiness to use AI technologies, suggesting that concerns over data privacy might significantly hinder AI use readiness in physical retail stores. Hence, we formulate our second hypothesis:
H2: 
Perceived risks with AI technologies have a statistically significant negative impact on consumers’ AI use readiness in physical retail stores.
Chang and Chen [64] found that perceived ease of use significantly influences perceived usefulness and perceived enjoyment, which directly impact shopping intentions. Additionally, they observed that the effect of perceived ease of use on perceived usefulness and shopping intentions is amplified in customers with high technology readiness. This suggests that customers’ positive self-assessment of their capability to manage technology enhances their readiness to engage with it, supporting the notion that a self-assessed ability to handle AI technologies has a significant positive impact on AI use readiness in physical stores. Similarly, Alam et al. [65] found that self-efficacy significantly influences behavioral intention to use augmented reality technology in retail, mediated by customer attitudes towards technology use. Park and Zhang [66] found that technology readiness influences user attitudes and intentions toward unmanned stores indirectly through technology paradoxes; higher technology readiness leads to more positive attitudes and greater usage intentions. Jan et al. [67] also explored various factors influencing the adoption and resistance of AI-powered conversational agents in retail settings, identifying both motivators and inhibitors affecting customer behavior. Their findings reveal that motivators such as optimism and innovativeness, which can be seen as aspects of technology readiness, significantly enhance the likelihood of AI usage, whereas inhibitors like discomfort and insecurity detract from it. Similarly, Shim et al. [68] emphasize that consumer readiness, which can be seen as a form of self-assessment of the ability to manage technology, significantly enhances perceptions of service quality in self-service technologies, and this enhancement in perceived service quality positively affects attitudes and usage intentions toward these technologies in restaurant settings. Yin et al. [69] further corroborate these observations by demonstrating that customers with higher technology readiness are more engaged and perform better in AI-enhanced service environments, particularly when they perceive a high degree of self-congruity and trust in such settings. The study by Wang and Zhao [70] illuminates the importance of consumer-related mitigators like innovativeness and self-efficacy in navigating the perceived risks associated with autonomous retail technologies. Specifically, their findings suggest that a higher degree of self-assessed ability to manage AI technologies correlates with reduced perceived risks and an enhanced willingness to engage with these technologies. This relationship between a consumer’s confidence in their technological capabilities and their readiness to use AI forms the cornerstone of our understanding of consumer behavior in tech-enhanced retail stores. Consequently, we propose the third hypothesis:
H3: 
Customers’ self-assessment of their ability to manage AI technologies has a statistically significant positive impact on their AI use readiness in physical retail stores.
The examination of demographic factors, particularly gender, in the context of AI use readiness in physical retail stores, offers a nuanced understanding of consumer behavior. The foundational work by Meuter et al. [12] provides a comprehensive framework that underscores the influence of individual differences—including technological anxiety, need for interaction, and demographic characteristics—on technology acceptance. This framework, together with the findings from Joshi et al. [59], which indicate that demographic factors significantly impact consumer preferences and acceptance of new retail technologies, suggests that gender might play a moderating role in the relationship between prior experience with AI technologies, perceived risks with AI technologies, self-assessment of consumers’ ability to manage AI technologies, and AI use readiness in physical retail stores. Chung et al. [71] specifically found that AI models trained on gender-specific data show decreased performance when applied to the opposite gender, emphasizing the significant impact of gender on AI model efficacy and highlighting its potential implications for AI use readiness in diverse applications. Noor et al. [72] further corroborate these views by showing that gender and prior experience with AI significantly affect the development of parasocial relationships with AI service agents, impacting subjective well-being and potentially AI use readiness. Adding to this perspective, Nouraldeen [9] uncovers that gender moderates the relationship between technology readiness and perceived usefulness in AI adoption. Although derived from a study within an educational setting, these insights about gender moderation could provide valuable implications for understanding how gender might similarly influence AI adoption and readiness in retail environments, where consumer interactions with technology also vary by demographic characteristics. We seek to explore whether there are statistically significant differences in AI use readiness between male and female consumers in physical retail stores, influenced by their prior experience with AI technologies, perceived risks with AI technologies, and self-assessments of their ability to manage AI. Drawing from these insights, we introduce the fourth hypothesis:
H4: 
Gender moderates the relationships of (1) prior experience with AI technologies, (2) perceived risks with AI technologies, and (3) self-assessment of consumers’ ability to manage AI technologies with AI use readiness in physical retail stores.

4. Materials and Methods

4.1. Measurement Instrument

For the empirical research, a structured questionnaire was developed as the primary measurement instrument. The instrument was based on quantitative cross-sectional research and included questions on both 7-point and 5-point Likert scales. AI use readiness was measured on a 5-point Likert scale with six items, perceived risks with AI technologies on a 7-point Likert scale with three items, self-assessment of consumers’ ability to manage AI technologies on a 7-point Likert scale with five items, and prior experience with AI technologies on a 7-point Likert scale with three items, reflecting our aim to match each scale’s complexity with the nuanced nature of its construct and to optimize the respondent experience. AI use readiness is conceptually more straightforward, often presenting a clearer dichotomy between readiness states; the 5-point scale therefore reduces cognitive load and facilitates easier, more decisive responses. Conversely, perceived risks with AI technologies, self-assessment of consumers’ ability to manage AI technologies, and prior experience with AI technologies were measured on 7-point Likert scales because these constructs require capturing a broader spectrum of nuanced perceptions and experiences. This finer granularity allows more detailed differentiation in responses, reduces central tendency bias, and enhances the reliability of the data concerning these multifaceted constructs. Content validity was assessed by three experts from the fields of marketing research, services, and retail. The questionnaire’s construction drew upon the framework established by Meuter et al. [12], ensuring that it comprehensively covered aspects such as consumers’ prior experience with AI technologies, perceived risks with AI technologies, and self-assessment of the ability to manage AI technologies. We designed the items to assess AI use readiness, which were reviewed by three academics in the fields of marketing research and services marketing, and included demographic details, such as gender. The questionnaire was designed and disseminated using the online platform 1KA, targeting consumers’ attitudes towards the use of AI in physical retail stores.

4.2. Sample

To gather our sample, we utilized a purposive sampling strategy, drawing participants from the networks and social circles of the authors, supplemented by a snowball sampling technique where participants were encouraged to disseminate the questionnaire among their acquaintances, friends, and family. This approach resulted in a total of 243 respondents.
The target population for this study consisted of consumers who had engaged in shopping activities within physical retail stores over the past year and were familiar with the concept of AI.

4.3. Validity and Reliability of Measurement Scales

We utilized confirmatory factor analysis (CFA) to test both convergent and discriminant validity of our measurements. To ensure the reliability of our measurements, we assessed their composite reliability indicators (CRs). Composite reliability represents the squared correlation between a construct and the unweighted composite of its indicators. To assess convergent validity, we calculated the average variance extracted (AVE), while for discriminant validity, we employed the Fornell–Larcker test and the heterotrait–monotrait ratio of correlations (HTMT), as recommended by Fornell and Larcker [73] and Henseler, Ringle, and Sarstedt [74], respectively.
Table 1 demonstrates that all composite reliabilities surpassed the recommended value of 0.6, ranging from 0.84 to 0.88. Furthermore, all indicator loadings exceeded 0.6, and the average variance extracted (AVE) was greater than 0.5, indicating satisfactory convergent validity. Two tests were performed to assess discriminant validity. First, the Fornell and Larcker [73] test (Table 2) revealed that the square roots of the AVE for all latent variables were higher than the correlations between latent variables. Additionally, Table 3 shows that the heterotrait–monotrait ratios of correlations (HTMT) were below the 0.85 cutoff. As per Henseler et al. [74], we can conclude that the requirements for discriminant validity were satisfied.
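To make the reliability and validity computations concrete, the short sketch below shows how composite reliability, AVE, and the Fornell–Larcker comparison can be derived from standardized indicator loadings. It is a minimal illustration only: the loading values and the correlation matrix are hypothetical placeholders, not the values reported in Tables 1–3.

```python
import numpy as np

def composite_reliability(loadings):
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances),
    where the error variance of a standardized indicator is 1 - loading^2."""
    lam = np.asarray(loadings, dtype=float)
    numerator = lam.sum() ** 2
    return numerator / (numerator + (1.0 - lam ** 2).sum())

def average_variance_extracted(loadings):
    """AVE = mean of the squared standardized loadings."""
    lam = np.asarray(loadings, dtype=float)
    return (lam ** 2).mean()

# Hypothetical standardized loadings, for illustration only (not the study's data).
constructs = {
    "AI use readiness": [0.72, 0.75, 0.78, 0.70, 0.74, 0.71],  # six items
    "Perceived risks": [0.80, 0.82, 0.79],                     # three items
    "Self-assessment": [0.74, 0.76, 0.73, 0.77, 0.75],         # five items
    "Prior experience": [0.81, 0.78, 0.80],                    # three items
}

# Convergent validity checks: CR above 0.6 and AVE above 0.5.
for name, lam in constructs.items():
    print(f"{name}: CR = {composite_reliability(lam):.2f}, "
          f"AVE = {average_variance_extracted(lam):.2f}")

# Fornell-Larcker criterion: the square root of each construct's AVE should
# exceed its correlations with the other constructs (matrix also hypothetical).
corr = np.array([
    [1.00, -0.35, 0.40, 0.38],
    [-0.35, 1.00, -0.25, -0.20],
    [0.40, -0.25, 1.00, 0.45],
    [0.38, -0.20, 0.45, 1.00],
])
sqrt_ave = [np.sqrt(average_variance_extracted(l)) for l in constructs.values()]
for i, name in enumerate(constructs):
    max_corr = np.max(np.abs(np.delete(corr[i], i)))
    print(f"{name}: sqrt(AVE) = {sqrt_ave[i]:.2f}, max |correlation| = {max_corr:.2f}")
```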

5. Results

The direct relationship between consumers’ prior experience with AI technologies and their AI use readiness in physical stores was positive, although at a lower significance level (β1 = 0.258; p < 0.01), so hypothesis H1 was confirmed. As can be observed from Table 4 and Figure 1, perceived risks regarding confidentiality had a strong negative impact on consumers’ AI use readiness in physical stores (β2 = −0.374; p < 0.001), meaning H2 was confirmed. Further analysis shows that differences exist between genders: in the case of females, the relationship is significant and strong (βfemale = −0.431; p < 0.001), while it is non-significant in the case of males (βmale = −0.074; n.s.). The results also showed that the self-assessment of consumers’ ability to manage AI technologies had a positive and significant impact on their readiness to use these technologies in physical stores (β3 = 0.257; p < 0.01). Therefore, hypothesis H3 was confirmed. Further analysis revealed differences between genders: in the case of males, the relationship is significant (p < 0.01) and stronger (βmale = 0.490) than in the case of females (βfemale = 0.204; p < 0.05).

5.1. Invariance between Groups

An invariance test was implemented to verify the uniform comprehension of constructs across the two groups. Our objective was to measure variations in the impacts of constructs, necessitating an evaluation of configural, metric, and scalar invariance. As Steenkamp and Baumgartner [75] suggested, attaining at least configural and partial metric invariance was important to facilitate valid comparisons of latent variable impacts.
We conducted several model comparisons between the female and male groups to test the invariance of the structural model and the differences in paths. First, we allowed structural weights to differ between groups within the measurement model. The resulting well-fitting model (refer to Table 5) shows that configural invariance was achieved (χ2 = 346.082, IFI = 0.938, TLI = 0.924, CFI = 0.936, and RMSEA = 0.050). Constraining the weights uniformly across groups also did not significantly deteriorate the fit (χ2 = 360.22, IFI = 0.937, TLI = 0.927, CFI = 0.936, and RMSEA = 0.049), indicating full metric invariance. The same was also true for structural covariances. Lastly, we tested full scalar invariance and constrained residuals across the groups. We did not achieve full scalar invariance, but this is not necessary for testing the differences between the latent impacts of both groups. Table 5 provides a detailed overview of the results.
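The nested-model comparisons above (and the path-by-path comparisons in Section 5.2) rest on chi-square difference tests between a less constrained and a more constrained model. The sketch below is a minimal illustration of that computation, assuming the χ2 values and degrees of freedom of both models are available from the SEM output; the numbers used are placeholders, not the fit statistics reported in Table 5.

```python
from scipy.stats import chi2

def chi_square_difference_test(chisq_constrained, df_constrained,
                               chisq_free, df_free):
    """Chi-square difference (likelihood-ratio) test for nested SEM models.

    A non-significant difference means the added equality constraints
    (e.g., loadings or structural paths held equal across the female and
    male groups) do not significantly worsen model fit."""
    delta_chisq = chisq_constrained - chisq_free
    delta_df = df_constrained - df_free
    p_value = chi2.sf(delta_chisq, delta_df)
    return delta_chisq, delta_df, p_value

# Illustrative placeholder values: a freely estimated multi-group model vs.
# one with measurement weights constrained equal across groups.
d_chisq, d_df, p = chi_square_difference_test(
    chisq_constrained=372.5, df_constrained=230,
    chisq_free=360.0, df_free=222,
)
print(f"Delta chi-square = {d_chisq:.2f}, delta df = {d_df}, p = {p:.3f}")
if p > 0.05:
    print("Equality constraints are tenable (invariance supported).")
else:
    print("Equality constraints significantly worsen fit.")
```

Constraining a single structural path to equality across the two groups adds exactly one degree of freedom, so the same computation with a difference of one degree of freedom underlies the individual path tests described in Section 5.2.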

5.2. Differences in the Latent Variable Impacts

Next, we tested the differences in the impacts of constructs between the female and male groups. Initially, the model was examined without imposing any restrictions on the latent paths. Subsequently, equality constraints were introduced to the structural paths while maintaining the constraints established in the metric invariance testing.
Table 5 presents the results of the free structural weights model, where paths were left free across both groups, and the constrained structural weights model, where equality constraints were imposed. As can be observed, there are statistically significant differences at p < 0.10 between the two models, and the free structural weights model fits better than the constrained structural weights model. Following this, individual tests were conducted for each path, partially constrained models were assessed, and statistically significant differences were found for the impact of perceived risks with AI technologies on AI use readiness and of self-assessment on AI use readiness. The partially constrained model was shown to fit just as well as the free structural weights model and better than the constrained structural weights model, resulting in the following fit indices: χ2 = 342.41, IFI = 0.934, TLI = 0.926, CFI = 0.933, and RMSEA = 0.049.
Table 4 shows differences between genders in terms of the impact of perceived risks with AI technologies and of self-assessment on AI use readiness. The study found that perceived risks with AI technologies have a significantly negative impact on AI use readiness for females (βfemale = −0.431; p < 0.001), while the impact is insignificant for males. The impact of self-assessment on AI use readiness also differs across groups, being stronger for males than for females (βfemale = 0.204; p < 0.05; βmale = 0.490; p < 0.01).

6. Discussion

6.1. Theoretical Implications

Our research extends the domain of AI use readiness theories by providing a nuanced exploration of consumer AI use readiness in physical retail stores, with a focus on the interplay of gender differences. This research contributes to theoretical advancement in several key areas.
The findings of this study challenge and expand traditional Technology Acceptance Models (TAMs) by introducing and empirically testing the moderating role of gender differences in the context of AI use readiness in retail. While prior studies have predominantly centered on the direct effects of perceived usefulness and perceived ease of use, our research enriches this narrative by demonstrating how gender-specific perceptions—particularly towards perceived risks with AI technologies and self-efficacy—modulate AI use readiness. This contributes to a more comprehensive understanding of the multifactorial nature of technology acceptance, advocating for the inclusion of gender as a critical variable in future models.
Our study empirically emphasizes the significant influence of perceived risks with AI technologies and self-efficacy on AI use readiness. By dissecting these constructs through a gendered lens, we provide evidence that supports and extends these theories within the context of emerging retail technologies. This explanation not only deepens the theoretical discourse on the psychological mechanisms supporting AI use readiness but also suggests that these mechanisms are differentially activated in men and women.
Our findings contribute to gender theories in AI use readiness, presenting gender not merely as a demographic variable but as a lens through which the complexities of AI use readiness in retail can be understood. The gender gap observed in the impact of perceived risk on AI adoption readiness raises important questions about the underlying factors driving these differences. Societal norms and gender roles may shape individuals’ perceptions of risk differently based on gender. Women, who often face greater societal pressure to prioritize safety and security, may be more attuned to potential risks associated with new technologies like AI. Key dimensions of perceived risk in AI, such as concerns about privacy, data security, and job displacement, may contribute to this gender gap. Additionally, individual experiences and exposure to technology may vary between genders, influencing their perceptions of risk and readiness to adopt AI. Women, who have historically been under-represented in STEM (science, technology, engineering, and mathematics) fields, may have less familiarity and comfort with technology compared to men. Consequently, they may perceive AI-related risks as more salient and consequential, resulting in greater hesitation or reluctance to embrace these technologies. Moreover, cultural and contextual factors may play a role in shaping gender differences in risk perception and AI adoption readiness.
The reason why self-assessment of technological competence has a stronger impact on AI adoption could be due to the fact that men, who often hold leadership roles in technology-related fields, tend to have more confidence and proficiency in managing AI technologies. As a result, they are more likely to embrace AI because they feel equipped to handle any challenges or complexities that may arise. On the other hand, women, who may have faced barriers and biases, may feel insecure about their technological skills and abilities, which weakens the link between self-assessment of competence and AI adoption readiness. This could also be related to social norms, as men are typically socialized to exhibit more confidence and assertiveness in their abilities, which could lead to overestimating their competence in managing AI technologies, thus strengthening the relationship between self-assessment of competence and AI adoption readiness. In contrast, women may feel more pressure to demonstrate humility and caution in their self-assessments, which could lead to understating their competence and a weaker association between self-assessment and readiness to adopt AI.

6.2. Managerial Implications

From a managerial standpoint, the insights gained from this study accentuate the necessity for retailers to adopt gender-sensitive approaches when integrating AI technologies in their operations. The study found that perceived risks with AI technologies negatively impact women’s AI use readiness more than men’s. Therefore, retailers should focus on creating gender-specific strategies. For women, enhancing security features and providing clear, transparent information about how AI technologies protect privacy could reduce perceived risks. For men, who showed a stronger self-assessment of the ability to manage AI technologies, retailers might focus on promoting AI features that cater to enhancing control and technical engagement. The retail staff should also be trained not only in AI technology but also in understanding gender-specific consumer concerns and responses to AI. This would enable them to provide better support and guidance to customers, addressing specific fears and enhancing the customer service experience by making it more personalized and secure.
Given that positive prior experiences with AI technologies contribute to greater AI use readiness, retailers should develop initiatives to increase AI familiarity among all consumers. This could include interactive AI experiences in-store or workshops that demonstrate the ease of use and benefits of AI, potentially reducing perceived risks and improving self-assessment scores across genders.
The findings suggest that both perceived risks and self-assessment abilities significantly influence AI readiness. Retailers should tailor their communication to address these two factors distinctly. For instance, promotion materials could emphasize robust security measures to mitigate perceived risks and showcase easy-to-use AI interfaces to boost confidence among consumers who may feel less adept at using such technologies. They should continually collect and analyze consumer feedback specifically focused on their AI interactions. This feedback should be stratified by gender to tailor future technologies and strategies more effectively, ensuring they align with evolving consumer expectations and readiness levels.

6.3. Limitations and Further Research

While our findings offer insightful contributions to the understanding of gender differences in AI readiness among retail consumers, certain limitations must be acknowledged. The cross-sectional nature of our survey methodology precludes us from making causal inferences or tracking the evolution of consumer attitudes towards AI over time. Additionally, the focus on a singular cultural and geographical context may limit the applicability of our conclusions across different regions or retail environments, suggesting a need for caution in generalizing these results. Future research endeavors could address these limitations through longitudinal studies that examine the trajectory of consumer attitudes towards AI in retail, providing a dynamic perspective on AI use readiness. Moreover, investigating the impact of other demographic and psychographic variables on AI readiness could unveil further layers of complexity in consumer technology interactions. Expanding the research to include diverse cultural settings would also enrich our understanding of the global implications of AI in retail.
Finally, this study bridges the gap between consumer psychology, marketing, and information systems by leveraging interdisciplinary theories to explain the phenomena observed. The intersection of gender psychology with technology acceptance models offers a fruitful ground for cross-disciplinary research, suggesting that the exploration of AI use readiness and consumer technology interactions can benefit from a more holistic approach that incorporates insights from multiple academic domains.
Considering these contributions, we call for further scholarly exploration into the gendered dimensions of technology acceptance, advocating for more granular, context-specific investigations that can unravel the complex tapestry of factors influencing consumer AI use readiness in retail. Future theoretical models should consider the multifaceted interplay of psychological, sociocultural, and technological factors to provide a more nuanced understanding of consumer behavior in the digital era.

Author Contributions

Conceptualization, N.K. and A.P.; methodology, N.K., A.P. and B.M.; software, B.M. and A.P.; validation, N.K., A.P. and B.M.; formal analysis, N.K., A.P. and B.M.; resources, N.K. and B.M.; writing—review and editing, N.K., A.P. and B.M.; funding acquisition, B.M. All authors have read and agreed to the published version of the manuscript.

Funding

The authors acknowledge the financial support from the Slovenian Research and Innovation Agency (research core funding No. P5-0023 (A), Entrepreneurship for Innovative Society).

Institutional Review Board Statement

The study was conducted as per the guidelines of the Declaration of Helsinki. The research questionnaire was anonymous and voluntary, and data were analyzed only on the aggregate level.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Contact the corresponding author for access to the data presented in this study.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Xu, Y.; Liu, X.; Cao, X.; Huang, C.; Liu, E.; Qian, S.; Liu, X.; Wu, Y.; Dong, F.; Qiu, C.-W.; et al. Artificial intelligence: A powerful paradigm for scientific research. Innovation 2021, 2, 100179. [Google Scholar] [CrossRef] [PubMed]
  2. Sterne, J. Artificial Intelligence for Marketing: Practical Applications; John Wiley & Sons: Hoboken, NJ, USA, 2017. [Google Scholar]
  3. Bornet, P.; Barkin, I.; Wirtz, J. Intelligent Automation: Welcome to the World of Hyperautomation: Learn How to Harness Artificial Intelligence to Boost Business & Make our World More Human; World Scientific: Singapore, 2021. [Google Scholar]
  4. Anica-Popa, I.; Anica-Popa, L.; Rădulescu, C.; Vrîncianu, M. The integration of artificial intelligence in retail: Benefits, challenges and a dedicated conceptual framework. Amfiteatru Econ. 2021, 23, 120–136. [Google Scholar] [CrossRef]
  5. Mahmoud, A.B.; Tehseen, S.; Fuxman, L. The dark side of artificial intelligence in retail innovation. In Retail Futures; Emerald Publishing Limited: Leeds, UK, 2020; pp. 165–180. [Google Scholar]
  6. Buchanan, B.G. A (very) brief history of artificial intelligence. AI Mag. 2005, 26, 53. [Google Scholar]
  7. Crittenden, W.F.; Biel, I.K.; Lovely, W.A. Embracing Digitalization: Student Learning and New Technologies. J. Mark. Educ. 2019, 41, 5–14. [Google Scholar] [CrossRef]
  8. Hansen, D.; Shneiderman, B.; Smith, M.A. Analyzing Social Media Networks with NodeXL: Insights from a Connected World; Morgan Kaufmann: Burlington, MA, USA, 2010. [Google Scholar]
  9. Nouraldeen, R.M. The impact of technology readiness and use perceptions on students’ adoption of artificial intelligence: The moderating role of gender. Dev. Learn. Organ. Int. J. 2023, 37, 7–10. [Google Scholar] [CrossRef]
  10. Martínez-López, F.J.; Casillas, J. Artificial intelligence-based systems applied in industrial marketing: An historical overview, current and future insights. Ind. Mark. Manag. 2013, 42, 489–495. [Google Scholar] [CrossRef]
  11. Grewal, D.; Guha, A.; Satornino, C.B.; Schweiger, E.B. Artificial intelligence: The light and the darkness. J. Bus. Res. 2021, 136, 229–236. [Google Scholar] [CrossRef]
  12. Meuter, M.L.; Bitner, M.J.; Ostrom, A.L.; Brown, S.W. Choosing among alternative service delivery modes: An investigation of customer trial of self-service technologies. J. Mark. 2005, 69, 61–83. [Google Scholar] [CrossRef]
  13. Mariani, M.M.; Perez-Vega, R.; Wirtz, J. AI in marketing, consumer research and psychology: A systematic literature review and research agenda. Psychol. Mark. 2022, 39, 755–776. [Google Scholar] [CrossRef]
  14. Pitardi, V.; Marriott, H.R. Alexa, she’s not human but… Unveiling the drivers of consumers’ trust in voice-based artificial intelligence. Psychol. Mark. 2021, 38, 626–642. [Google Scholar] [CrossRef]
  15. Lobschat, L.; Mueller, B.; Eggers, F.; Brandimarte, L.; Diefenbach, S.; Kroschke, M.; Wirtz, J. Corporate digital responsibility. J. Bus. Res. 2021, 122, 875–888. [Google Scholar] [CrossRef]
  16. Haleem, A.; Javaid, M.; Qadri, M.A.; Singh, R.P.; Suman, R. Artificial intelligence (AI) applications for marketing: A literature-based study. Int. J. Intell. Netw. 2022, 3, 119–132. [Google Scholar] [CrossRef]
  17. Flavián, C.; Pérez-Rueda, A.; Belanche, D.; Casaló, L.V. Intention to use analytical artificial intelligence (AI) in services–the effect of technology readiness and awareness. J. Serv. Manag. 2022, 33, 293–320. [Google Scholar] [CrossRef]
  18. Tavera-Mesías, J.F.; van Klyton, A.; Collazos, A.Z. Technology readiness, mobile payments and gender-a reflective-formative second order approach. Behav. Inf. Technol. 2023, 42, 1005–1023. [Google Scholar] [CrossRef]
  19. Tehrani, A.N.; Ray, S.; Roy, S.K.; Gruner, R.L.; Appio, F.P. Decoding AI readiness: An in-depth analysis of key dimensions in multinational corporations. Technovation 2024, 131, 102948. [Google Scholar] [CrossRef]
  20. Strube, M.J.; Lott, C.L.; Lê-Xuân-Hy, G.M.; Oxenberg, J.; Deichmann, A.K. Self-evaluation of abilities: Accurate self-assessment versus biased self-enhancement. J. Personal. Soc. Psychol. 1986, 51, 16–25. [Google Scholar] [CrossRef] [PubMed]
  21. Lin, W.W.K. AI Is Revolutionizing the Retail Industry. ResearchGate. 2023. Available online: https://www.researchgate.net/publication/370266715_AI_is_revolutionizing_the_retail_industry (accessed on 19 March 2024).
  22. Chen, J.; Chang, Y.W. How smart technology empowers consumers in smart retail stores? The perspective of technology readiness and situational factors. Electron. Mark. 2023, 33, 1. [Google Scholar] [CrossRef] [PubMed]
  23. Murugan, T.S.; Kumar, C.M. Unveiling the Evolution: Impact of Artificial Intelligence on Consumer Buying Behaviors in Online Retail Purchase. Educ. Adm. Theory Pract. 2024, 30, 1072–1078. [Google Scholar]
  24. Beyari, H.; Garamoun, H. The effect of artificial intelligence on end-user online purchasing decisions: Toward an integrated conceptual framework. Sustainability 2022, 14, 9637. [Google Scholar] [CrossRef]
  25. Moore, S.; Bulmer, S.; Elms, J. The social significance of AI in retail on customer experience and shopping practices. J. Retail. Consum. Serv. 2022, 64, 102755. [Google Scholar] [CrossRef]
  26. Gursoy, D.; Chi, O.H.; Lu, L.; Nunkoo, R. Consumers acceptance of artificially intelligent (AI) device use in service delivery. Int. J. Inf. Manag. 2019, 49, 157–169. [Google Scholar] [CrossRef]
  27. Pillai, R.; Sivathanu, B.; Dwivedi, Y.K. Shopping intention at AI-powered automated retail stores (AIPARS). J. Retail. Consum. Serv. 2020, 57, 102207. [Google Scholar] [CrossRef]
  28. Lv, X.; Luo, J.; Liang, Y.; Liu, Y.; Li, C. Is cuteness irresistible? The impact of cuteness on customers’ intentions to use AI applications. Tour. Manag. 2022, 90, 104472. [Google Scholar] [CrossRef]
  29. Ho, M.T.; Mantello, P.; Ghotbi, N.; Nguyen, M.H.; Nguyen, H.K.T.; Vuong, Q.H. Rethinking technological acceptance in the age of emotional AI: Surveying Gen Z (Zoomer) attitudes toward non-conscious data collection. Technol. Soc. 2022, 70, 102011. [Google Scholar] [CrossRef]
  30. Sharma, S.; Islam, N.; Singh, G.; Dhir, A. Why do retail customers adopt artificial intelligence (AI) based autonomous decision-making systems? IEEE Trans. Eng. Manag. 2022, 71, 1846–1861. [Google Scholar] [CrossRef]
  31. Kelly, S.; Kaye, S.A.; Oviedo-Trespalacios, O. What factors contribute to the acceptance of artificial intelligence? A systematic review. Telemat. Inform. 2023, 77, 101925. [Google Scholar] [CrossRef]
  32. Song, C.S.; Kim, Y.K. The role of the human-robot interaction in consumers’ acceptance of humanoid retail service robots. J. Bus. Res. 2022, 146, 489–503. [Google Scholar] [CrossRef]
  33. Borau, S.; Otterbring, T.; Laporte, S.; Fosso Wamba, S. The most human bot: Female gendering increases humanness perceptions of bots and acceptance of AI. Psychol. Mark. 2021, 38, 1052–1068. [Google Scholar] [CrossRef]
  34. Ameen, N.; Tarhini, A.; Reppel, A.; Anand, A. Customer experiences in the age of artificial intelligence. Comput. Hum. Behav. 2021, 114, 106548. [Google Scholar] [CrossRef]
  35. Guha, A.; Grewal, D.; Kopalle, P.K.; Haenlein, M.; Schneider, M.J.; Jung, H.; Moustafa, R.; Hegde, D.R.; Hawkins, G. How artificial intelligence will affect the future of retailing. J. Retail. 2021, 97, 28–41. [Google Scholar] [CrossRef]
  36. Sweeney, J.C.; Soutar, G.N.; Johnson, L.W. The role of perceived risk in the quality-value relationship: A study in a retail environment. J. Retail. 1999, 75, 77–105. [Google Scholar] [CrossRef]
  37. Gao, S.; Li, Y.; Guo, H. Understanding the adoption of bike sharing systems: By combining technology diffusion theories and perceived risk. J. Hosp. Tour. Technol. 2019, 10, 464–478. [Google Scholar] [CrossRef]
  38. Jacoby, J.; Kaplan, L.B. The Components of Perceived Risk. In Purdue Papers in Consumer Psychology; ACR Special Volumes SV-02; Purdue University: West Lafayette, IN, USA, 1972. [Google Scholar]
  39. Wu, L.; Chiu, M.L.; Chen, K.W. Defining the determinants of online impulse buying through a shopping process of integrating perceived risk, expectation-confirmation model, and flow theory issues. Int. J. Inf. Manag. 2020, 52, 102099. [Google Scholar] [CrossRef]
  40. Wei, Y.; Wang, C.; Zhu, S.; Xue, H.; Chen, F. Online purchase intention of fruits: Antecedents in an integrated model based on technology acceptance model and perceived risk theory. Front. Psychol. 2018, 9, 1521. [Google Scholar] [CrossRef] [PubMed]
  41. Han, M.C.; Kim, Y. Why consumers hesitate to shop online: Perceived risk and product involvement on Taobao.com. J. Promot. Manag. 2017, 23, 24–44. [Google Scholar] [CrossRef]
  42. Hasan, R.; Shams, R.; Rahman, M. Consumer trust and perceived risk for voice-controlled artificial intelligence: The case of Siri. J. Bus. Res. 2021, 131, 591–597. [Google Scholar] [CrossRef]
  43. Panadero, E.; Brown, G.T.; Strijbos, J.W. The future of student self-assessment: A review of known unknowns and potential directions. Educ. Psychol. Rev. 2016, 28, 803–830. [Google Scholar] [CrossRef]
  44. Eva, K.W.; Regehr, G. “I’ll never play professional football” and other fallacies of self-assessment. J. Contin. Educ. Health Prof. 2008, 28, 14–19. [Google Scholar] [CrossRef] [PubMed]
  45. Ross, J.A. The reliability, validity, and utility of self-assessment. Pract. Assess. Res. Eval. 2006, 11, 10. [Google Scholar]
  46. Klenowski, V. Student self-evaluation processes in student-centred teaching and learning contexts of Australia and England. Assess. Educ. 1995, 2, 145–163. [Google Scholar] [CrossRef]
  47. Epstein, R.M.; Siegel, D.J.; Silberman, J. Self-monitoring in clinical practice: A challenge for medical educators. J. Contin. Educ. Health Prof. 2008, 28, 5–13. [Google Scholar] [CrossRef] [PubMed]
  48. Varma, S.; Marler, J.H. The dual nature of prior computer experience: More is not necessarily better for technology acceptance. Comput. Hum. Behav. 2013, 29, 1475–1482. [Google Scholar] [CrossRef]
  49. Gallivan, M.J.; Spitler, V.K.; Koufaris, M. Does information technology training really matter? A social information processing analysis of coworkers’ influence on IT usage in the workplace. J. Manag. Inf. Syst. 2005, 22, 153–192. [Google Scholar] [CrossRef]
  50. Harrison, A.W.; Rainer, R.K.; Hochwarter, W.A., Jr.; Thompson, K.R. Testing the self-efficacy-performance-linkage of social-cognitive theory. J. Soc. Psychol. 1997, 137, 79–87. [Google Scholar] [CrossRef] [PubMed]
  51. Yi, M.; Choi, H. What drives the acceptance of AI technology? The role of expectations and experiences. arXiv 2023, arXiv:2306.13670. [Google Scholar]
  52. Taylor, S.; Todd, P. Assessing IT Usage: The Role of Prior Experience. MIS Q. 1995, 19, 561–570. [Google Scholar] [CrossRef]
  53. Sohn, S. Consumer perceived risk of using autonomous retail technology. J. Bus. Res. 2024, 171, 114389. [Google Scholar] [CrossRef]
  54. Abed, S.S. Literature Review of Theory-Based Empirical Research Examining Consumers’ Adoption of IoT. In Proceedings of the International Working Conference on Transfer and Diffusion of IT, Nagpur, India, 15–16 December 2023; Springer: Cham, Switzerland, 2024; pp. 3–14. [Google Scholar]
  55. Bandeira, F.; Cardoso, A.; Cairrão, Á. The wearable world in the palm of our hand: The perceived importance of augmented reality in marketing strategies. Int. J. Bus. Soc. Res. (IJBSR) 2013, 3, 117–124. [Google Scholar]
  56. Liu, R.; Balakrishnan, B.; Saari, E.M. How AR Technology is Changing Consumer Shopping Habits: From Traditional Retail to Virtual Fitting. Acad. J. Sci. Technol. 2024, 9, 140–144. [Google Scholar] [CrossRef]
  57. Chen, H.; Chan-Olmsted, S.; Kim, J.; Mayor Sanabria, I. Consumers’ perception on artificial intelligence applications in marketing communication. Qual. Mark. Res. Int. J. 2022, 25, 125–142. [Google Scholar] [CrossRef]
  58. Schepman, A.; Rodway, P. Initial validation of the general attitudes towards Artificial Intelligence Scale. Comput. Hum. Behav. Rep. 2020, 1, 100014. [Google Scholar] [CrossRef] [PubMed]
  59. Joshi, A.; Pani, A.; Sahu, P.K.; Majumdar, B.B.; Tavasszy, L. Gender and generational differences in omnichannel shopping travel decisions: What drives consumer choices to pick up in-store or ship direct? Res. Transp. Econ. 2024, 103, 101403. [Google Scholar] [CrossRef]
  60. Donepudi, P.K. Robots in Retail Marketing: A Timely Opportunity. Glob. Discl. Econ. Bus. 2020, 9, 97–106. [Google Scholar] [CrossRef]
  61. Liang, Y.; Lee, S.H.; Workman, J.E. Implementation of artificial intelligence in fashion: Are consumers ready? Cloth. Text. Res. J. 2020, 38, 3–18. [Google Scholar] [CrossRef]
  62. Wang, I.C.; Liao, C.W.; Lin, K.P.; Wang, C.H.; Tsai, C.L. Evaluate the consumer acceptance of AIoT-based unmanned convenience stores based on perceived risks and technological acceptance models. Math. Probl. Eng. 2021, 2021, 4416270. [Google Scholar] [CrossRef]
  63. Choung, H.; David, P.; Ross, A. Trust in AI and its role in the acceptance of AI technologies. Int. J. Hum. Comput. Interact. 2023, 39, 1727–1739. [Google Scholar] [CrossRef]
  64. Chang, Y.W.; Chen, J. What motivates customers to shop in smart shops? The impacts of smart technology and technology readiness. J. Retail. Consum. Serv. 2021, 58, 102325. [Google Scholar] [CrossRef]
  65. Alam, S.S.; Susmit, S.; Lin, C.Y.; Masukujjaman, M.; Ho, Y.H. Factors affecting augmented reality adoption in the retail industry. J. Open Innov. Technol. Mark. Complex. 2021, 7, 142. [Google Scholar] [CrossRef]
  66. Park, H.J.; Zhang, Y. Technology readiness and technology paradox of unmanned convenience store users. J. Retail. Consum. Serv. 2022, 65, 102523. [Google Scholar] [CrossRef]
  67. Jan, I.U.; Ji, S.; Kim, C. What (de) motivates customers to use AI-powered conversational agents for shopping? The extended behavioral reasoning perspective. J. Retail. Consum. Serv. 2023, 75, 103440. [Google Scholar] [CrossRef]
  68. Shim, H.S.; Han, S.L.; Ha, J. The effects of consumer readiness on the adoption of self-service technology: Moderating effects of consumer traits and situational factors. Sustainability 2020, 13, 95. [Google Scholar] [CrossRef]
  69. Yin, D.; Li, M.; Qiu, H. Do customers exhibit engagement behaviors in AI environments? The role of psychological benefits and technology readiness. Tour. Manag. 2023, 97, 104745. [Google Scholar] [CrossRef]
  70. Wang, Q.; Ji, X.; Zhao, N. Embracing the power of AI in retail platform operations: Considering the showrooming effect and consumer returns. Transp. Res. Part E Logist. Transp. Rev. 2024, 182, 103409. [Google Scholar] [CrossRef]
  71. Chung, H.; Park, C.; Kang, W.S.; Lee, J. Gender bias in artificial intelligence: Severity prediction at an early stage of COVID-19. Front. Physiol. 2021, 12, 778720. [Google Scholar] [CrossRef] [PubMed]
  72. Noor, N.; Rao Hill, S.; Troshani, I. Artificial intelligence service agents: Role of parasocial relationship. J. Comput. Inf. Syst. 2022, 62, 1009–1023. [Google Scholar] [CrossRef]
  73. Fornell, C.; Larcker, D.F. Evaluating structural equation models with unobservable variables and measurement error. J. Mark. Res. 1981, 18, 39–50. [Google Scholar] [CrossRef]
  74. Henseler, J.; Ringle, C.M.; Sarstedt, M. A new criterion for assessing discriminant validity in variance-based structural equation modeling. J. Acad. Mark. Sci. 2015, 43, 115–135. [Google Scholar] [CrossRef]
  75. Steenkamp, J.B.E.; Baumgartner, H. Assessing measurement invariance in cross-national consumer research. J. Consum. Res. 1998, 25, 78–90. [Google Scholar] [CrossRef]
Figure 1. Conceptual Model with Path Coefficients of AI Use Readiness in Physical Retail Stores.
Table 1. Indicators, means, standard deviations with factor loadings, CR, and AVE.
Items and Constructs | Mean | Standard Deviation | Factor Loading | CR | AVE
Prior experience with AI technologies | | | | 0.84 | 0.63
I use many products and services supported by artificial intelligence. | 4.23 | 1.67 | 0.88 | |
I do not have much experience using artificial intelligence technologies. | 4.52 | 1.78 | 0.65 | |
In everyday life, I usually use many artificial intelligence technologies. | 4.28 | 1.69 | 0.84 | |
Perceived risks with AI technologies | | | | 0.86 | 0.68
The use of artificial intelligence technologies invades my privacy. | 4.30 | 1.85 | 0.87 | |
I am afraid that the use of artificial intelligence technologies in physical stores reduces the confidentiality of my data. | 4.57 | 1.88 | 0.78 | |
In general, the use of artificial intelligence technologies is risky. | 4.45 | 1.75 | 0.82 | |
Self-assessment of consumers’ ability to manage AI technologies | | | | 0.88 | 0.59
I am confident in my ability to use artificial intelligence technologies. | 5.01 | 1.65 | 0.87 | |
I am fully capable of using artificial intelligence technologies. | 5.06 | 1.67 | 0.84 | |
I do not feel that I am skilled enough for the task of using artificial intelligence technologies. | 5.14 | 1.66 | 0.68 | |
My past experiences increase my confidence that I will be able to successfully use artificial intelligence technologies. | 4.76 | 1.71 | 0.61 | |
The use of artificial intelligence technologies is within my capabilities. | 5.24 | 1.57 | 0.81 | |
AI use readiness (for specific technologies) | | | | 0.86 | 0.50
Autonomous shopping processes | 4.76 | 2.08 | 0.78 | |
Data collection | 3.89 | 1.88 | 0.60 | |
Self-service terminals | 5.76 | 1.53 | 0.66 | |
Electronic mirrors | 4.73 | 2.05 | 0.71 | |
Chatbots | 3.96 | 2.16 | 0.72 | |
Smart shelves | 4.98 | 1.81 | 0.76 | |
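The composite reliability (CR) and average variance extracted (AVE) values in Table 1 follow directly from the standardized factor loadings. For readers who wish to verify them, the short Python sketch below applies the standard formulas to the loadings of the prior-experience construct; it is an illustrative recomputation, not the software used in the study.

```python
# Illustrative recomputation of CR and AVE from standardized loadings (Table 1).
# CR  = (sum(l))^2 / ((sum(l))^2 + sum(1 - l^2))
# AVE = mean(l^2)

def composite_reliability(loadings):
    total = sum(loadings)
    error = sum(1 - l ** 2 for l in loadings)
    return total ** 2 / (total ** 2 + error)

def average_variance_extracted(loadings):
    return sum(l ** 2 for l in loadings) / len(loadings)

# Standardized loadings of the prior-experience construct (Table 1)
prior_experience = [0.88, 0.65, 0.84]

print(round(composite_reliability(prior_experience), 2))       # 0.84
print(round(average_variance_extracted(prior_experience), 2))  # 0.63
```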
Table 2. Correlations between latent variables, and Fornell and Larcker’s test.
Construct | 1. | 2. | 3. | 4.
1. Prior experience with AI technologies | 0.795 | | |
2. Perceived risks with AI technologies | −0.362 *** | 0.822 | |
3. Self-assessment of consumers’ ability to manage AI technologies | 0.661 *** | −0.430 *** | 0.767 |
4. AI use readiness | 0.563 *** | −0.578 *** | 0.588 *** | 0.706
Significance of correlations: *** p < 0.001. Diagonal values (bolded in the original table) are the square roots of AVE.
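The Fornell–Larcker criterion [73] underlying Table 2 requires the square root of each construct’s AVE to exceed that construct’s correlations with all other constructs. A minimal sketch of this check, using the AVE values from Table 1 and the latent correlations from Table 2, is given below purely for illustration.

```python
import numpy as np

# AVE per construct (Table 1) and latent correlation matrix (Table 2).
ave = np.array([0.63, 0.68, 0.59, 0.50])
corr = np.array([
    [ 1.000, -0.362,  0.661,  0.563],
    [-0.362,  1.000, -0.430, -0.578],
    [ 0.661, -0.430,  1.000,  0.588],
    [ 0.563, -0.578,  0.588,  1.000],
])

sqrt_ave = np.sqrt(ave)                            # approx. the diagonal of Table 2
max_corr = np.abs(corr - np.diag(np.diag(corr))).max(axis=1)

# The criterion holds when every sqrt(AVE) exceeds the largest absolute
# correlation of that construct with any other construct.
print(sqrt_ave > max_corr)                         # [ True  True  True  True]
```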
Table 3. HTMT correlation matrix.
Construct | 1. | 2. | 3.
1. Prior experience with AI technologies | | |
2. Perceived risks with AI technologies | 0.366 | |
3. Self-assessment of consumers’ ability to manage AI technologies | 0.688 | 0.459 |
4. AI use readiness | 0.571 | 0.595 | 0.610
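Unlike the Fornell–Larcker test, the HTMT ratio [74] is computed from item-level correlations, which are not reported here, so the values in Table 3 cannot be reproduced from the tables alone. The sketch below only illustrates the ratio itself, using hypothetical item correlation blocks for two three-item constructs; values below the conventional 0.85 (or 0.90) threshold indicate discriminant validity.

```python
import numpy as np

def htmt(within_a, within_b, between):
    """Heterotrait-monotrait ratio: mean heterotrait-heteromethod correlation
    divided by the geometric mean of the two average monotrait-heteromethod
    correlations."""
    hetero = np.abs(between).mean()
    mono_a = np.abs(within_a[np.triu_indices_from(within_a, k=1)]).mean()
    mono_b = np.abs(within_b[np.triu_indices_from(within_b, k=1)]).mean()
    return hetero / np.sqrt(mono_a * mono_b)

# Hypothetical item correlations for two three-item constructs (not study data).
within_a = np.array([[1.00, 0.62, 0.58],
                     [0.62, 1.00, 0.55],
                     [0.58, 0.55, 1.00]])
within_b = np.array([[1.00, 0.66, 0.60],
                     [0.66, 1.00, 0.57],
                     [0.60, 0.57, 1.00]])
between  = np.array([[0.35, 0.33, 0.30],
                     [0.31, 0.36, 0.29],
                     [0.28, 0.32, 0.34]])

print(round(htmt(within_a, within_b, between), 3))  # well below the 0.85 threshold
```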
Table 4. Results of the structural model for the entire sample and single groups (females and males).
Hypothesized Path | All | Sig. | Female | Sig. | Male | Sig.
H1: Prior experience with AI technologies -> AI use readiness | 0.258 | p < 0.01 | 0.264 | p < 0.01 | 0.267 | p < 0.01
H2: Perceived risks with AI technologies -> AI use readiness | −0.374 | p < 0.001 | −0.431 * | p < 0.001 | −0.074 * | n.s.
H3: Self-assessment of consumers’ ability to manage AI technologies -> AI use readiness | 0.257 | p < 0.01 | 0.204 * | p < 0.05 | 0.490 * | p < 0.01
* Significant differences between groups exist.
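The asterisks in Table 4 mark paths whose coefficients differ significantly between the female and male groups. One common way to test such a difference is a pairwise parameter comparison, i.e., a z-test on the group-specific unstandardized estimates and their standard errors. Because standard errors are not reported here, the sketch below uses hypothetical values purely to show the mechanics of the test.

```python
from math import sqrt
from scipy.stats import norm

def path_difference_z(b1, se1, b2, se2):
    """z-test for the difference between two group-specific path coefficients."""
    z = (b1 - b2) / sqrt(se1 ** 2 + se2 ** 2)
    p = 2 * norm.sf(abs(z))
    return z, p

# Hypothetical estimates and standard errors (not the study's values),
# loosely mimicking the perceived-risk path for the female vs. male group.
z, p = path_difference_z(b1=-0.43, se1=0.08, b2=-0.07, se2=0.11)
print(round(z, 2), round(p, 3))
```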
Table 5. Invariance test results and comparisons between structural models.
Measurement Model | χ² | df | Δχ² sig. | IFI | TLI | CFI | RMSEA
Configural invariance | 346.082 | 226 | — | 0.938 | 0.924 | 0.936 | 0.050
Full metric invariance | 360.217 | 239 | p = 0.364 | 0.937 | 0.927 | 0.936 | 0.049
Structural covariances | 377.631 | 249 | p = 0.110 | 0.933 | 0.926 | 0.932 | 0.049
Full scalar invariance | 422.674 | 266 | p < 0.001 | 0.917 | 0.915 | 0.917 | 0.053
Structural Model | χ² | df | Δχ² sig. | IFI | TLI | CFI | RMSEA
Free structural weights | 370.679 | 245 | — | 0.934 | 0.926 | 0.933 | 0.049
Constrained structural weights | 377.606 | 248 | p = 0.074 | 0.932 | 0.925 | 0.931 | 0.050
Partially constrained structural weights | 372.412 | 246 | p = 0.188 | 0.934 | 0.926 | 0.933 | 0.049
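The p-values in Table 5 are consistent with likelihood-ratio (Δχ²) tests of each constrained model against its baseline (the configural model for the measurement part and the free-weights model for the structural part). Assuming nested maximum-likelihood models, such a test can be reproduced with a few lines of Python; the example below recovers the reported p = 0.074 for the fully constrained structural weights.

```python
from scipy.stats import chi2

def chi_square_difference(chi2_constrained, df_constrained, chi2_free, df_free):
    """Likelihood-ratio (chi-square difference) test for two nested SEM models."""
    delta_chi2 = chi2_constrained - chi2_free
    delta_df = df_constrained - df_free
    return delta_chi2, delta_df, chi2.sf(delta_chi2, delta_df)

# Constrained vs. free structural weights (values taken from Table 5)
d_chi2, d_df, p = chi_square_difference(377.606, 248, 370.679, 245)
print(round(d_chi2, 3), d_df, round(p, 3))  # 6.927 3 0.074
```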
