1. Introduction
The existence and transmission of health misinformation have had severe consequences for personal health, social media platform operation, and social stability [1,2]. For instance, when the Zika virus broke out, misinformation related to the virus attracted widespread attention on Facebook and was more popular than correct, reliable information [3]. The expenditure for launching unnecessary informational promotion activities to correct such misinformation grew significantly [4]. Furthermore, in the wake of the COVID-19 outbreak, more than 600 people died in Iran after they drank high levels of alcohol in the mistaken belief that it would protect them against the virus [5,6]. Hence, the public's belief in misinformation leads to more dangerous consequences than ignorance [4].
In recent years, the Internet has become the main source of information, and with a surge in demand for health information, most people choose to browse the Internet to obtain it. According to a report published by the Pew Research Center, 72% of adults in the US have searched for at least one type of health information on the Internet [7]. Because of its convenience, the Internet meets demands that would otherwise require face-to-face consultations with professional medical and nursing personnel [8]. Nonetheless, it is also responsible for the prevalence of health misinformation [9], because traditional quality-control mechanisms, such as professional editors, are excluded from the information-generation process. In particular, the development of Web 2.0 has changed Internet users from passive information consumers into users who actively generate content on websites such as Weibo, Zhihu, and YouTube [4]. Misinformation is consequently widely transmitted and starts trending on social media platforms [10]. Notably, health information is one of the information categories that attracts the most attention but is most severely affected by misinformation [11].
UNESCO, in its working documents and reports, defines misinformation as unintentionally false information disseminated with confidence in its authenticity, usually with no apparent intention of profit behind it [12,13]. The consensus of the scientific community provides a relatively clear distinction between true information and misinformation. Health misinformation, the focus of this study, is information contrary to the scientific community's cognitive consensus on a given phenomenon [14]. There are three modes of handling misinformation. The first mode is preemptive prevention, in which true information is transmitted to the public before it is exposed to misinformation; relevant studies therefore focus on the acceptance of health information [15]. The second mode relies on the withdrawal or deletion of misinformation. Some studies have prioritized identifying misinformation and discontinuing its dissemination in time [16]. Although this mode is conducive to mitigating the impact of misinformation, it does not eliminate it [17]. The third mode, often the most effective when individuals pursue accuracy as a motivational goal in processing scientific information, comprises explanation and correction; it is the focus of this study. Correction provides information about beliefs that individuals may hold, stemming from previous contact and communication [18].
Because the public's belief in misinformation is more dangerous than ignorance [4], corrective health information must convince users in order to successfully right their cognitive errors. This study differs from past studies of information adoption, because correcting misinformation means that the original beliefs held by information recipients must be changed. Misinformation rebuttals are messages telling individuals to ignore or disbelieve previous information [18]. In other words, persuasive knowledge is required for information recipients under high-level cognitive conflict. Additionally, professional knowledge is needed for such rejection of health misinformation to occur, because health information is more difficult for the public to process cognitively than other types of information. To our knowledge, no prior study has evaluated the impact of information recipients' cognitive conflicts and knowledge self-confidence on the refutation of health misinformation.
This study aims to explore an effective way to correct health misinformation and eliminate its adverse effects. Although some information on social media can be misleading, or even deceptive, the more people feel they can trust health information, the more willing they are to accept it [19,20]. We constructed a theoretical model of the credibility-oriented determinants of refuting misinformation based on the elaboration likelihood model. The partial least squares (PLS) method was used to verify the proposed model. The findings complement previous research on social media and online health. Furthermore, this study can help relevant practitioners (i.e., government officials, platform managers, medical and health service personnel, and educators) find the most effective correctional interventions for governing health misinformation.
2. Research Model and Hypotheses
The rebuttal of misinformation falls into the category of persuasive information, which aims to correct the public's misunderstandings and provide knowledge about the truth of matters. The elaboration likelihood model (ELM), proposed by psychologists Petty and Cacioppo, is one of the most authoritative theories in the field of knowledge persuasion [21]. It has been widely used to describe how people process information and form their attitudes toward behaviors. According to the ELM, persuasion can be achieved through one or both of two routes: the central route and the peripheral route. The central route of persuasion focuses on information factors; in this route, the information recipient invests substantial cognitive resources in the elaborate processing of information to form a perception of the contacted information [22]. In comparison, the peripheral route of persuasion focuses on content-irrelevant factors, such as the source and presentation of information; in this route, the recipient processes information at a low level [23]. Both routes signify that one's attitudes take shape or vary according to intrinsic information-processing capabilities [24]. Previous empirical studies have verified the application of the ELM in various fields. For example, the ELM and the technology acceptance model (TAM) can be combined to study how knowledge-based employees evaluate information and accept advice [24]. Chung et al. studied the adoption of tourist information on social media through the ELM and explored the moderating role of social presence [25]. Tseng and Wang investigated the information adoption process on tourist websites with respect to cognitive risks through an integrated model of the ELM and perceived usefulness [26].
According to the three main factors in the effectiveness of information transmission, the factors influencing the credibility of rebuttals can be divided into the information source, the information itself, and the information receiver. Based on the ELM, this study constructs a structural equation model of the relevant determinants. Specifically, information quality is used as the central route, while source credibility is regarded as the peripheral route. As for the recipient, we explore the moderating roles of the public's cognitive conflict and knowledge self-confidence in the central and peripheral routes.
2.1. Perceived Information Quality
Perceived information quality is defined as the persuasiveness of the arguments and evidence contained in information [24,27,28]. According to the application of the ELM to information adoption, information quality affects one's attitudes through the central route. Information quality affects the degree of perceived usefulness [24,27] and trust [21] in information, as well as users' attitudes and willingness [29]. High-quality information has a significant impact on the persuasion effect [30], and it plays a role in changing people's attitudes even when they are concerned about privacy [22]. Conversely, low-quality, irrational, and non-persuasive information has no significant impact on recipients' attitudes [31]. Excessive advertisements and misleading health information on social media make it more difficult for users to identify whether information is true or false. Health information with high perceived quality thus plays an increasingly vital role in determining people's trust in health information [15]. Typically, the quality of Internet health information is affected by relevance, understandability (i.e., clarity and readability), adequacy (i.e., sufficiency, completeness, and necessity), and usefulness [29,32]. The following hypotheses are put forward based on existing studies:
Hypothesis 1 (H1): Perceived information quality would have a positive relationship with the perceived credibility of rebuttals concerning health misinformation.
Hypothesis 1a (H1a): Information relevance would have a positive relationship with perceived information quality.
Hypothesis 1b (H1b): Information understandability would have a positive relationship with perceived information quality.
Hypothesis 1c (H1c): Information adequacy would have a positive relationship with perceived information quality.
Hypothesis 1d (H1d): Information usefulness would have a positive relationship with perceived information quality.
2.2. Perceived Source Credibility
Source credibility refers to the degree of credibility of the information sender as perceived by the information recipient [23]. Thus, it represents an attitude toward the information source and is independent of the information itself [33]. Questionnaire surveys and experimental studies have widely confirmed the impact of perceived source credibility on people's attitudes and information-accepting behaviors. It affects the adoption of tourist information from user-generated content on social media [25] and the evaluation of online health information [34]. Source credibility plays a vital role in improving users' experiences and enhancing their behavioral intentions in virtual communities [35]. Individuals are more inclined to believe information from a highly reliable source than from a source with low reliability [36]. Misinformation rebuttals on the Internet do not exist independently but are overshadowed by large flows of true and false information [37]. Understandably, source credibility allows people to handle information through the peripheral route rather than relying on complicated cognitive processing [21]. Many virtual communities infer the credibility of a knowledge source through user-influence systems based on historical contributions and published records [38]. A user's authority is valid for judging the source credibility of microblog information [39]. The recipient's decision-making process is more strongly affected by the provider when the knowledge provider has a high level of professional knowledge [40]. The perceived source reliability of health information mainly depends on the expertise (i.e., competence, skill, and knowledge) and authority (i.e., reputation, status, and influence) of information publishers as perceived by the public. The following hypotheses are put forward based on the existing literature:
Hypothesis 2 (H2): Perceived source credibility would have a positive relationship with the perceived credibility of health misinformation rebuttals.
Hypothesis 2a (H2a): Source expertise would have a positive relationship with perceived source credibility.
Hypothesis 2b (H2b): Source authority would have a positive relationship with perceived source credibility.
2.3. Moderating Effect of Cognitive Conflict
In the field of persuasion, many studies have shown that people accept information more easily when it is consistent with what they consider to be correct. Consensus not only enhances users' trust in provided information but also effectively influences the recipient's opinions, attitudes, and beliefs [41]. After receiving new information, people immediately evaluate whether it is compatible with the logic of other facts and cognitive beliefs. If the information conflicts with their original perception, people are more likely to resist changing their original beliefs [42,43]. Therefore, conflict with the original perception may make it less likely that misinformation is successfully corrected.
If information is inconsistent with one's beliefs, it may trigger negative emotions. Further, examining information inconsistent with one's beliefs is not as smooth as evaluating information consistent with those beliefs. Typically, easily deciphered information feels more familiar and is more easily accepted. Conversely, processing difficulty triggers negative feelings and urges people to examine the information more carefully [44,45]. This process requires more effort, motivation, and cognitive resources [4]. Consequently, such people may seek help from evaluations of the transmitter's reliability. When the information received by consumers counters their preconceived perceptions, stronger correlations between emotional trust and behavioral intentions are formed [46]. The following hypotheses are proposed based on existing studies:
Hypothesis 3a (H3a): Cognitive conflict would moderate the relationship between perceived information quality and perceived information credibility.
Hypothesis 3b (H3b): Cognitive conflict would moderate the relationship between perceived source credibility and perceived information credibility.
2.4. Moderating Effect of Knowledge Self-Confidence
The impact of perceived information quality on one's attitudes varies according to the situation and is affected by personal abilities in specific circumstances [47]. Knowledge self-confidence refers to a self-assessment of the degree to which individuals think they understand relevant scientific knowledge [48,49]. Perceived information quality is a subjective evaluation of information content and depends on the individual's previous experience and professional knowledge [21]. Both one's knowledge and one's skills can be employed to handle information. In some cases, the content of information is read, processed, and considered; in other cases, the content may be neglected entirely. Such differences may result from recipients' different interpretations of knowledge content [24,50]. In the peripheral route, impacts are mainly created through simple decision-making standards and cues such as reputation, charisma, or appeal [22]. Individuals may use such cues because they do not want to invest the necessary cognitive resources or cannot make the effort owing to limited capacities. When judging the authenticity of information through source credibility, users need not engage in complicated cognitive processing of strongly professional health information. Non-expert users are more inclined to rely on what are known as marginal cues (i.e., source credibility) [51,52]. Evaluating the credibility of health misinformation rebuttals requires more professional knowledge, and source credibility may be the most pivotal factor for non-experts in evaluating information [24]. Hence, the following hypotheses are proposed based on existing studies:
Hypothesis 4a (H4a): Knowledge self-confidence moderates the relationship between perceived information quality and perceived information credibility.
Hypothesis 4b (H4b): Knowledge self-confidence moderates the relationship between perceived source credibility and perceived information credibility.
Based on the analysis above, this study presents the research model illustrated in Figure 1.
3. Methods
This study used the example of rebuttals of health misinformation on the Sina microblogging platform. The PLS structural equation model was used to verify the hypothesized model. The model is specified as reflective and first-order.
The questionnaire comprised three parts. The first part collected respondents' personal information and probed their original understanding of one type of health information, that is, whether they held a cognitive mistake. We asked participants, "Do you think bone soup can supplement calcium?" Respondents answering "Yes" (indicating a cognitive error) were screened into the second and third parts.
In the second part, a situational questionnaire was used to describe specific response situations to the participants through pictures and text. This allowed participants to imagine themselves in the situation so that their responses could subsequently be measured. This method can reduce the influence of biases caused by factors such as memory and comprehension.
Figure 2 shows the health misinformation rebuttals to the respondents. We asked respondents to answer questions based on their actual perceptions of the experimental situation. For example, the respondents’ perceptions of source credibility were mainly their perceptions of the credibility of Sina Weibo and DX Doctor, and this differed between respondents.
The third part included the measurement items for each variable. The items were mainly adapted from mature measurement scales in the existing literature. The initial scale for this study's situation was designed based on the characteristics of health misinformation rebuttals and was improved through pre-investigation procedures (see Table 1). Each item was scored from one to seven, ranging from totally negative to totally positive.
For data collection, we utilized So Jump, an online questionnaire survey platform that randomly sends the questionnaire to users, and eventually collected 415 valid questionnaires between 22 November and 3 December 2019. To ensure the quality of the responses, we paid the respondents. There were 166 men (40%) and 249 women (60%). Regarding the age distribution, most respondents (384) were aged between 18 and 40 years, accounting for 92.53% of the total sample. Most respondents had bachelor's degrees and came from various industries. The demographics of the research sample are presented in Table 2.
4. Data Analysis and Results
4.1. Non-Response Bias
Non-response bias arises when some respondents' failure to answer the questionnaire leads to bias in the research results. Armstrong and Overton (1977) argued that late responders are more likely to resemble non-responders than early responders [55]. This study therefore compared occupation and education between early respondents (the 207 respondents who completed the questionnaire first) and late respondents (the 208 respondents who completed it later). Independent-sample t-tests showed no significant difference between early- and late-stage respondents in occupation or education (p > 0.05) [56]. This indicates that non-response bias in this study was negligible.
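The early-versus-late comparison described above is a standard independent-samples t-test. A minimal sketch follows; the ordinal codes here are synthetic stand-ins, not the study's actual data, and Welch's variant is used as a common default.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical ordinal codes (e.g., education level, 1-5) for the
# 207 early and 208 late respondents; real values would come from
# the questionnaire export.
early = rng.integers(1, 6, size=207)
late = rng.integers(1, 6, size=208)

# Welch's t-test (does not assume equal variances)
t_stat, p_value = stats.ttest_ind(early, late, equal_var=False)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
# A p-value above 0.05 indicates no significant early/late difference,
# suggesting non-response bias is not a major concern.
```

The same test would be run separately for each demographic variable being compared.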
4.2. Common Method Bias
When all data come from the same questionnaire, common method bias may arise and affect the validity of the study. This study examined the potential existence of common method bias through several procedures. First, Harman's single-factor test was used. The results showed that the first (largest) factor accounted for 35.883% of the variance, no single factor explained more than 40% of the variance, and all factors together explained 73.358% of the variance [57]. Then, the marker-variable method was used, adding a variable theoretically unrelated to the other latent variables to the model [58,59]. The test showed that the marker variable had no significant influence on the variables in the original model. Therefore, common method bias was not a key issue in this study.
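Harman's single-factor test checks how much variance a single unrotated factor captures; it is often approximated with an unrotated principal component analysis. A sketch under that assumption, on synthetic data shaped like the study's sample:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
# Synthetic stand-in for item-level data (415 respondents x 20 items,
# roughly the shape of the real questionnaire).
X = rng.normal(size=(415, 20))

pca = PCA().fit(X)
first_share = pca.explained_variance_ratio_[0]
total_share = pca.explained_variance_ratio_.sum()
print(f"first factor: {first_share:.1%} of variance (total: {total_share:.1%})")
# Common method bias is flagged when one factor dominates, e.g. when the
# first unrotated factor alone explains more than ~40-50% of the variance.
```

With real questionnaire data, `first_share` corresponds to the 35.883% figure reported above.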
4.3. Assessment of Reliability and Validity
We tested the indicator reliability, convergent validity, and discriminant validity to ensure the measurement results were reliable and valid. Reliability was verified through Cronbach's alpha (CA) and composite reliability (CR). As shown in Table 3, CA and CR were both larger than 0.7, denoting high reliability of the data [60].
Convergent validity was evaluated through item loadings, CR, and average variance extracted (AVE). As illustrated in Table 3 and Table 4, the item loadings and CR values were larger than 0.7, and the AVE values were larger than 0.5, meaning the data had satisfactory convergent validity [61].
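CR and AVE are both simple functions of the standardized factor loadings. A minimal sketch using the conventional formulas, with hypothetical loadings (not the study's actual values):

```python
import numpy as np

def composite_reliability(loadings):
    """CR = (sum lambda)^2 / ((sum lambda)^2 + sum(1 - lambda^2))."""
    lam = np.asarray(loadings, dtype=float)
    s2 = lam.sum() ** 2
    return s2 / (s2 + (1.0 - lam**2).sum())

def average_variance_extracted(loadings):
    """AVE = mean of the squared standardized loadings."""
    lam = np.asarray(loadings, dtype=float)
    return (lam**2).mean()

# Hypothetical standardized loadings for one construct's three items
loadings = [0.82, 0.88, 0.79]
cr = composite_reliability(loadings)
ave = average_variance_extracted(loadings)
print(f"CR = {cr:.3f}, AVE = {ave:.3f}")  # thresholds: CR > 0.7, AVE > 0.5
```

Applied per construct, these are the quantities reported in Table 3.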
Discriminant validity was appraised by comparing the square root of the AVE of each construct to the inter-construct correlations, and by comparing the item loadings to the cross-loadings. Moreover, the heterotrait-monotrait ratio (HTMT) should usually be no more than 0.85; when constructs are conceptually similar, the threshold can be extended to 0.90 [62]. Table 5 shows that the square root of the AVE of each construct was greater than the inter-construct correlations. Similarly, Table 4 shows that all item loadings were higher on their own factor than on any other factor. Table 6 shows that most HTMT values were no more than 0.85, and all were no more than 0.90. These results confirm discriminant validity.
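The HTMT ratio compares the average between-construct item correlations to the geometric mean of the average within-construct correlations. A sketch on simulated data (the factor structure and loadings below are illustrative assumptions, not the study's):

```python
import numpy as np

def htmt(X_i, X_j):
    """Heterotrait-monotrait ratio for two constructs' item score
    matrices, each of shape (n_observations, n_items)."""
    n_i = X_i.shape[1]
    R = np.corrcoef(np.hstack([X_i, X_j]), rowvar=False)

    def mean_offdiag(C):
        iu = np.triu_indices_from(C, k=1)
        return np.abs(C[iu]).mean()

    hetero = np.abs(R[:n_i, n_i:]).mean()   # between-construct item correlations
    mono_i = mean_offdiag(R[:n_i, :n_i])    # within construct i
    mono_j = mean_offdiag(R[n_i:, n_i:])    # within construct j
    return hetero / np.sqrt(mono_i * mono_j)

# Simulated example: two moderately related latent factors, three items each
rng = np.random.default_rng(2)
n = 500
f1 = rng.normal(size=n)
f2 = 0.3 * f1 + rng.normal(size=n)
X_i = f1[:, None] + 0.5 * rng.normal(size=(n, 3))
X_j = f2[:, None] + 0.5 * rng.normal(size=(n, 3))
htmt_value = htmt(X_i, X_j)
print(f"HTMT = {htmt_value:.3f}")
```

Values below 0.85 (or 0.90 for conceptually similar constructs) support discriminant validity, as in Table 6.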
4.4. Assessment of the Structural Model
The results of the hypothesis tests, based on t-values, confidence intervals, and f-squared values, are presented in Table 7. The path coefficients and explained variance of the structural model are shown in Figure 3. The Q² values for the perceived credibility of rebuttals, perceived information quality, and perceived source credibility were 0.469, 0.334, and 0.442, respectively, and the corresponding R² values were 0.614, 0.499, and 0.566. The structural model thus showed high prediction accuracy. Perceived information quality on the central route and perceived source credibility on the peripheral route both significantly affected the perceived credibility of the rebuttals of health misinformation; Hypotheses 1 and 2 were thus supported.
Further, information relevance, understandability, and usefulness all significantly affected perceived information quality, whereas the impact of information adequacy was insignificant. Thus, Hypotheses 1a, 1b, and 1d were supported, while Hypothesis 1c was rejected. Additionally, source expertise and source authority had significant impacts on perceived source credibility; therefore, Hypotheses 2a and 2b were supported. Cognitive conflict negatively moderated the impact of perceived information quality on perceived information credibility and positively moderated the impact of perceived source credibility on perceived information credibility; Hypotheses 3a and 3b were consequently supported. Moreover, knowledge self-confidence played a negative moderating role in the relationship between perceived information quality and perceived information credibility, but its moderating role in the relationship between source credibility and perceived information credibility was insignificant, suggesting that Hypothesis 4a was valid while Hypothesis 4b was not.
5. Discussion
The perceived credibility of health misinformation rebuttals can be improved from two aspects: the information content itself and the information source. In addition to having credibility that is more widely recognized by the public, authoritative experts have a stronger ability to produce high-quality debunking information. Under some circumstances, the influence of the information itself on information credibility may be more important than that of the information source, for example, when the harmful effects of the health misinformation are relatively weak (not enough to attract the attention of authorities and many experts), when the science involved in correcting the health information is relatively simple, or when it is difficult to verify the information through practice (e.g., measuring the calcium content of bone soup). Cho et al. (2011) found that information from different sources may have the same influence, for example, whether the recipient is told that the information comes from research funded by "ExxonMobil" or from "people like you" [63]. These findings suggest that source reliability may sometimes be overlooked. Additionally, the crux of the information is often easier to remember than its source, and compelling stories from unreliable sources may be recalled or accepted long after the sources are forgotten [4]. However, it is difficult to accurately judge whether health information is credible based only on the information content itself [64], and focusing on the information source can reduce the sharing of false information [65]. Nonetheless, the combined effect of information quality and source reliability can enhance information credibility to a greater extent. Caulfield recommended that accurate information of value to the public be shared and called on scientists to participate in science-related communication on social media [66].
Information quality can be improved by enhancing information relevance (i.e., meeting the audience's demands and attracting the public's interest), understandability (i.e., ensuring rational logic and high readability), and usefulness (i.e., helping the public solve problems practically and feasibly) [29,32]. Public demand for information adequacy is not high: information overload, like information scarcity, can be a hindrance, so a full and comprehensive exposition is not necessary in some cases. High-quality information that is accurate, easy to digest, engaging, and easy to share on mobile devices should be disseminated widely. Information must also be customized to the recipient, since different people can perceive the same things in dramatically different ways. Encouraging individuals with large followings to share corrective or high-quality information, and encouraging scientists to communicate more with the public on social media platforms, may be effective strategies for combating false health information online.
The overwhelming information deluge on social media has affected the public's understanding, making it particularly important to use expert resources to ensure that high-quality misinformation rebuttals stand out. The credibility of information sources often enhances the persuasiveness of communication. Ideally, the public should maintain a high level of trust in reliable sources, while trust in unreliable sources should be reduced. In practice, however, it is often difficult for the public to determine a source's reliability. When people are confused about who can provide accurate information, it would be helpful to provide users with clearer indicators of source reliability. Social media platforms could therefore rate information sources, for example through expert ratings of articles and user ratings of articles or information sources. Messages become popular mainly because influencers share them with their audiences [67]. Communicators with high credibility should verify the authenticity of health information before disseminating it to their followers, as individuals and businesses with large social media audiences bear a greater responsibility to verify the accuracy of any health information they share. They should draw on real medical reports and authoritative experts, forward only information from trusted health knowledge providers, and make full use of the authority effect and platform effect to improve information credibility.
Although the cognitive conflict of the information receiver significantly moderates both the information-quality path and the source-credibility path, the moderating effects run in opposite directions. When cognitive conflict is high, the influence of information quality on perceived information credibility is hampered, while the effect of source credibility on perceived information credibility is enhanced. People evaluate the logical compatibility of the information they receive with other facts and beliefs. Once information is accepted, it is highly impervious to change. From the perspective of cognitive consistency, this resistance stems from the subsequent inconsistencies that would result from admitting that previous information is false. Thus, conflict with existing knowledge reduces the likelihood that misinformation will be successfully corrected. If there is insufficient evidence that the original perception is wrong, the easiest way to resolve the conflict may be to revert to the original idea and ignore the corrective input.
In general, people tend to hold on to what they already know. Changing perceptions requires additional motivation and cognitive resources, and if the topic is not of interest to the public, it is very difficult to change predominant misconceptions [4]. Therefore, for receivers with high cognitive conflict, only high-quality persuasive arguments, such as solid evidence and eloquent logic, can convince them to change their original attitudes and beliefs. At this point, it is easier for a person or medium that the receiver trusts highly to disprove the misinformation. Evidently, inconsistent information can trigger negative emotions and increase the difficulty of processing the information. Nonetheless, trust in an information sender can supply positive emotions without any effort to process the information, and it can be easily gauged through the marginal cue of source credibility.
The knowledge self-confidence of information receivers does not significantly moderate the relationship between source reliability and perceived information credibility, but it weakens the influence of information quality on perceived information credibility. Receivers with higher knowledge self-confidence need higher-quality persuasive arguments to be convinced to change their original beliefs. This phenomenon may be attributable to the professionalism of health information: an audience with higher knowledge self-confidence trusts its original cognition or professional level and does not accept information that it believes to be inaccurate. Reliance on false knowledge is not the same as ignorance, which is a lack of relevant knowledge. Ignorance can have significant adverse effects on decisions, but these effects may not be as severe as the impact of trust in false knowledge. When people lack knowledge, they often rely on simple heuristics for making decisions, and they generally have relatively low confidence in decisions based solely on heuristics [68,69]. In other words, ignorance rarely leads to strong support for an idea, but when there is a high level of confidence in one's knowledge, such support is often powerful. For example, the individuals who oppose the scientific evidence for climate change most strongly are usually those who think they are experts on the subject [70]. Pre-existing scientific knowledge may influence the interpretation of newly received scientific knowledge [4]. When individuals are confident in their knowledge but cannot form a reasonable explanation with their original knowledge, they may be more difficult to persuade. This may be because an audience with high knowledge self-confidence has higher requirements regarding scientific explanations and information quality.
The public should be aware of the limitations of its cognition, and confidence in existing knowledge should not be an obstacle to new ideas. Social media platforms should pay close attention to the Internet "water army," use strict measures to prevent misinformation from spreading, and increase the speed at which misinformation rebuttals spread by changing how information flow is presented. With these changes, the public could receive correct information before being exposed to misinformation.
6. Conclusions
To address health misinformation and enhance public health, the results of this study suggest that the perceived credibility of health misinformation rebuttals can be improved by enhancing information quality and source reliability. Information quality has a stronger influence on information credibility than source reliability in some circumstances. Information quality can be improved by enhancing information relevance, understandability, and usefulness; source reliability can be improved by enhancing source expertise and authority. However, the cognitive conflict and knowledge self-confidence of information receivers weaken the influence of information quality on information credibility. In contrast, cognitive conflict can strengthen the influence of source credibility on information credibility. The public should be aware of the limitations of its cognition. Governments, platform managers, medical and health service personnel, and educators should combine the effects of information quality and source reliability to enhance information credibility.
This study constructed a theoretical model of the factors influencing the perceived credibility of health misinformation rebuttals on social media. First, this study analyzed the influencing factors from three perspectives: the information, the information source, and the information receiver. This not only enriches the application of ELM theory but also extends research on the governance and mitigation of misinformation on social media. Second, information quality and source credibility are cardinal to changing the public's perception. Because it is more difficult to correct existing misinformation than to accept new knowledge, the model in this study is a theoretical attempt dedicated to refuting health misinformation on social media. Lastly, the target audience of health misinformation rebuttals exhibits high cognitive conflict and widely varying knowledge self-confidence. This study focused on analyzing the moderating roles of knowledge self-confidence and cognitive conflict on correction paths. It is a theoretical complement to the persuasion field and is of great significance for understanding the public's attitudes and changing their behaviors.
In practice, this study is of significance for governments, platform managers, medical and health service personnel, and educators. First, information quality and source credibility play a prime role in changing the public's beliefs. Information publishers should provide high-quality, understandable, useful, and convincing information. In addition, the information should be transmitted by highly reliable transmitters, such as the government or authoritative medical institutions; this will help enhance the public's perception of information credibility. Second, the existence of cognitive conflict increases the difficulty of refuting misinformation. Misinformation refutation by credible communicators, such as governments and authoritative medical experts trusted by information receivers, can be far more effective. Social media platforms that share health information should control the quality of information and identify and filter unreliable information in a timely manner. Moreover, credit ratings and other markers can be used to distinguish the credibility of information publishers and reduce the difficulty of users' cognitive processing. Lastly, users differ in their understanding of professional health information, and users with high knowledge self-confidence are the groups that find it most difficult to change their attitudes. On the one hand, social media platforms can ensure that users receive correct information before they are exposed to health misinformation, for example, by changing the order in which information is presented. On the other hand, platforms need to push high-quality misinformation rebuttals from different sources more frequently to the hard-to-persuade groups.
This research has some limitations that should be overcome in future studies. First, it studied representative factors for the information, the information source, and the receiver. Notably, the effect of correcting health misinformation is influenced by numerous factors. Thus, more effort should be made to discover other factors to improve the model: it can be combined with psychological factors to explore the psychological process of the public's acceptance or rejection of belief change, and with sociological factors to explore solutions to the echo chamber phenomenon. Second, we studied a single situation (i.e., bone soup contains no calcium) through a questionnaire survey. The obtained results were based on the specific setup of the survey, and the applicability of the suggestions to other scenarios needs further consideration. Hence, other situations can be designed to study health misinformation according to different levels of risk or different social and cultural backgrounds. Further, behavioral experiments can be conducted to manipulate signals about source credibility, social identity, and the perceived importance of the information itself.
Author Contributions
Conceptualization, Y.S. and B.Z.; methodology, Y.S. and B.Z.; software, Y.S.; validation, Y.S. and B.Z.; formal analysis, Y.S.; investigation, Y.S.; resources, Y.S.; data curation, Y.S.; writing—original draft preparation, Y.S.; writing—review and editing, Y.S. and B.Z.; visualization, Y.S.; supervision, B.Z.; project administration, B.Z.; funding acquisition, B.Z. All authors have read and agreed to the published version of the manuscript.
Funding
This research was funded by the Major Program of National Fund of Philosophy and Social Science of China, grant number 18VZL010.
Institutional Review Board Statement
Ethical review and approval were waived for this study because it does not involve moral or ethical issues and falls within the field of economics and management.
Informed Consent Statement
Informed consent was obtained from all subjects involved in the study.
Conflicts of Interest
The authors declare no conflict of interest.
References
- Sabbagh, C.; Boyland, E.; Hankey, C.; Parrett, A. Analysing Credibility of UK Social Media Influencers’ Weight-Management Blogs: A Pilot Study. Int. J. Environ. Res. Public Health 2020, 17, 9022. [Google Scholar] [CrossRef]
- Li, Y.; Twersky, S.; Ignace, K.; Zhao, M.; Purandare, R.; Bennett-Jones, B.; Weaver, S.R. Constructing and communicating COVID-19 stigma on Twitter: A content analysis of Tweets during the early stage of the COVID-19 outbreak. Int. J. Environ. Res. Public Health 2020, 17, 6847. [Google Scholar] [CrossRef]
- Sharma, M.; Yadav, K.; Yadav, N.; Ferdinand, K.C. Zika virus pandemic: Analysis of Facebook as a social media health information platform. Am. J. Infect. Control. 2017, 45, 301–302. [Google Scholar] [CrossRef]
- Lewandowsky, S.; Ecker, U.K.H.; Seifert, C.M.; Schwarz, N.; Cook, J. Misinformation and its correction: Continued influence and successful debiasing. Psychol. Sci. Public Int. 2012, 13, 106–131. [Google Scholar] [CrossRef]
- Trew, B. Coronavirus: Hundreds Dead in Iran from Drinking Methanol Amid Fake Reports It Cures Disease. Independent. Available online: https://www.independent.co.uk/news/world/middle-east/iran-coronavirus-methanol-drink-cure-deaths-fake-a9429956.html (accessed on 29 April 2020).
- Duplaga, M. The determinants of conspiracy beliefs related to the COVID-19 pandemic in a nationally representative sample of Internet users. Int. J. Environ. Res. Public Health 2020, 17, 7818. [Google Scholar] [CrossRef] [PubMed]
- Pew Internet Research. The Social Life of Health Information. Available online: https://www.pewresearch.org/fact-tank/2014/01/15/the-social-life-of-health-information/ (accessed on 15 January 2014).
- Yun, G.W.; Morin, D.; Park, S.; Joa, C.Y.; Labbe, B.; Lim, J.; Lee, S.; Hyun, D. Social media and flu: Media twitter accounts as agenda setters. Int. J. Med. Inform. 2016, 91, 67–73. [Google Scholar] [CrossRef] [PubMed]
- Adams, S.A. Revisiting the online health information reliability debate in the wake of “web 2.0”: An inter-disciplinary literature and website review. Int. J. Med. Inform. 2010, 79, 391–400. [Google Scholar] [CrossRef] [PubMed]
- Balmas, M. When fake news becomes real: Combined exposure to multiple news sources and political attitudes of inefficacy, alienation, and cynicism. Commun. Res. 2014, 41, 430–454. [Google Scholar] [CrossRef]
- Le, H.T.; Nguyen, D.N.; Beydoun, A.S.; Le, X.T.T.; Nguyen, T.T.; Pham, Q.T.; Ta, N.T.K.; Nguyen, Q.T.; Nguyen, A.N.; Hoang, M.T.; et al. Demand for Health Information on COVID-19 among Vietnamese. Int. J. Environ. Res. Public Health 2020, 17, 4377. [Google Scholar] [CrossRef]
- Salaverría, R.; Buslón, N.; López-Pan, F.; León, B.; Erviti, M.C. Disinformation in times of pandemic: Typology of hoaxes on Covid-19. El Prof. De La Inf. 2020, 29, e290315. [Google Scholar]
- UNESCO. Journalism, ‘Fake News’ & Disinformation: Handbook for Journalism Education and Training; Unesco Publishing: Paris, France, 2018; Available online: https://en.unesco.org/sites/default/files/journalism_fake_news_disinformation_print_friendly_0_0.pdf (accessed on 3 September 2019).
- Swire-Thompson, B.; Lazer, D. Public Health and Online Misinformation: Challenges and Recommendations. Annu. Rev. Public Health 2020, 41, 433–451. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Chou, C.H.; Wang, Y.S.; Tang, T.I. Exploring the determinants of knowledge adoption in virtual communities: A social influence perspective. Int. J. Inform. Manag. 2015, 35, 364–376. [Google Scholar] [CrossRef]
- Krittanawong, C.; Narasimhan, B.; Virk, H.U.H.; Narasimhan, H.; Tang, W.H.W. Misinformation dissemination in twitter in the covid-19 era. Am. J. Med. 2020, 133, 1367–1369. [Google Scholar] [CrossRef]
- Ecker, U.K.H.; Lewandowsky, S.; Swire, B.; Chang, D. Correcting false information in memory: Manipulating the strength of misinformation encoding and its retraction. Psychon. Bull. Rev. 2011, 18, 570–578. [Google Scholar] [CrossRef] [PubMed]
- Bolsen, T.; Druckman, J.N. Counteracting the Politicization of Science. J. Commun. 2015, 65, 745–769. [Google Scholar] [CrossRef]
- Lim, S.H.; Kim, D. The role of trust in the use of health infomediaries among university students. Inform. Health Soc. Care 2012, 37, 92–105. [Google Scholar] [CrossRef]
- Karlova, N.; Fisher, K.E. A social diffusion model of misinformation and disinformation for understanding human information behaviour. Inf. Res. 2013, 18, 1–17. [Google Scholar]
- Huo, C.; Ma, F.; Qiu, Y.; Wang, Y. Exploring the determinants of health knowledge adoption in social media: An intention-behavior-gap perspective. Inform. Dev. 2018, 34, 346–363. [Google Scholar]
- Angst, C.M.; Agarwal, R. Adoption of electronic health records in the presence of privacy concerns: The elaboration likelihood model and individual persuasion. MIS Quart. 2009, 33, 339–370. [Google Scholar] [CrossRef] [Green Version]
- Petty, R.E.; Schumann, C.D. Central and peripheral routes to advertising effectiveness: The moderating role of involvement. J. Consum. Res. 1983, 10, 135–146. [Google Scholar] [CrossRef] [Green Version]
- Sussman, S.W.; Siegal, W.S. Informational influence in organizations: An integrated approach to knowledge adoption. Inform. Syst. Res. 2003, 14, 47–65. [Google Scholar] [CrossRef] [Green Version]
- Chung, N.; Han, H.; Koo, C. Adoption of travel information in user-generated content on social media: The moderating effect of social presence. Behav. Inform. Technol. 2015, 34, 902–919. [Google Scholar] [CrossRef]
- Tseng, S.Y.; Wang, C.N. Perceived risk influence on dual-route information adoption processes on travel websites. J. Bus. Res. 2016, 69, 2289–2296. [Google Scholar] [CrossRef]
- Bhattacherjee, A.; Sanford, C. Influence processes for information technology acceptance: An elaboration likelihood model. MIS Quart. 2006, 30, 805–825. [Google Scholar] [CrossRef] [Green Version]
- Yoo, D.K.; Vonderembse, M.A.; Ragu-Nathan, T.S. Knowledge quality: Antecedents and consequence in project teams. J. Know. Manag. 2011, 15, 329–343. [Google Scholar]
- Zahedi, F.; Song, J. Dynamics of trust revision: Using health infomediaries. J. Manag. Inform. Syst. 2008, 24, 225–248. [Google Scholar] [CrossRef]
- Mak, B.; Schmitt, B.H.; Lyytinen, K. User participation in knowledge update of expert systems. Inform. Manag. 1997, 32, 55–63. [Google Scholar] [CrossRef]
- Luo, C.; Luo, X.; Schatzberg, L.; Sia, C.L. Impact of informational factors on online recommendation credibility: The moderating role of source credibility. Decis. Support Syst. 2013, 56, 92–102. [Google Scholar] [CrossRef]
- Laugesen, J.; Hassanein, K.; Yuan, Y. The impact of Internet health information on patient compliance: A research model and an empirical study. J. Med. Internet Res. 2015, 17, e143. [Google Scholar] [CrossRef] [Green Version]
- Gunther, A.C. Biased press or biased public: Attitudes toward media coverage of social groups. Public Opin. Quart. 1992, 56, 147–167. [Google Scholar] [CrossRef]
- Kim, H.; Park, S.Y.; Bozeman, I. Online health information search and evaluation: Observations and semi-structured interviews with college students and maternal health experts. Health Inf. Libr. J. 2011, 28, 188–199. [Google Scholar] [CrossRef] [PubMed]
- Hsu, H.Y.; Tsou, H.T. Understanding customer experiences in online blog environments. Int. J. Inf. Manag. 2011, 31, 510–523. [Google Scholar] [CrossRef]
- Zhang, W.; Watts, S.A. Capitalizing on content: Information adoption in two online communities. J. Assoc. Inf. Syst. 2008, 9, 73–94. [Google Scholar]
- Tormala, Z.L.; Clarkson, J.J. Assimilation and contrast in persuasion: The effects of source credibility in multiple message situations. Pers. Soc. Psychol. B. 2007, 33, 559–571. [Google Scholar] [CrossRef] [PubMed]
- Cheung, M.Y.; Luo, C.; Sia, C.L.; Chen, H. Credibility of electronic word-of-mouth: Informational and normative determinants of on-line consumer recommendations. Int. J. Electron. Commer. 2009, 13, 9–38. [Google Scholar] [CrossRef]
- Zhang, L.; Peng, T.Q.; Zhang, Y.P.; Wang, X.H.; Zhu, J.J.H. Content or context: Which matters more in information processing on microblogging sites. Comput. Hum. Behav. 2014, 31, 242–249. [Google Scholar] [CrossRef]
- Chu, S.; Kim, Y. Determinants of consumer engagement in electronic word-of-mouth (eWOM) in social networking sites. Int. J. Advert. 2011, 30, 47–75. [Google Scholar] [CrossRef]
- Flanagin, A.J.; Metzger, M.J. Trusting expert-versus user-generated ratings online: The role of information volume, valence, and consumer characteristics. Comput. Hum. Behav. 2013, 29, 1626–1634. [Google Scholar] [CrossRef]
- Kappes, A.; Harvey, A.H.; Lohrenz, T.; Montague, P.R.; Sharot, T. Confirmation bias in the utilization of others’ opinion strength. Nat. Neurosci. 2020, 23, 130–137. [Google Scholar] [CrossRef]
- Thornhill, C.; Meeus, Q.; Peperkamp, J.; Berendt, B. A digital nudge to counter confirmation bias. Front. Big Data 2019, 2, 11. [Google Scholar]
- Schwarz, N.; Sanna, L.J.; Skurnik, I.; Yoon, C. Metacognitive experiences and the intricacies of setting people straight: Implications for debiasing and public information campaigns. Adv. Exp. Soc. Psychol. 2007, 39, 127–161. [Google Scholar]
- Song, H.; Schwarz, N. Fluency and the detection of distortions: Low processing fluency attenuates the Moses illusion. Soc. Cogn. 2008, 26, 791–799. [Google Scholar] [CrossRef]
- Zhang, K.Z.K.; Cheung, C.M.K.; Lee, M.K.O. Examining the moderating effect of inconsistent reviews and its gender differences on consumers’ online shopping decision. Int. J. Inf. Manag. 2014, 34, 89–98. [Google Scholar] [CrossRef]
- Kruglanski, A.W.; Thompson, E.P. Persuasion by a single route: A view from the unimodal. Psychol. Inq. 1999, 10, 83–109. [Google Scholar] [CrossRef]
- Jiang, S.; Beaudoin, C.E. Health literacy and the internet: An exploratory study on the 2013 HINTS survey. Comput. Hum. Behav. 2016, 58, 240–248. [Google Scholar] [CrossRef]
- Tracey, J.; Arroll, B.; Barham, P.; Richmond, D. The validity of general practitioners’ self assessment of knowledge: Cross sectional study. BMJ Clin. Res. 1997, 315, 1426–1428. [Google Scholar] [CrossRef] [Green Version]
- Chaiken, S.; Eagly, A.H. Communication modality as a determinant of message persuasiveness and message comprehensibility. J. Pers. Soc. Psychol. 1976, 34, 605–614. [Google Scholar] [CrossRef]
- Lord, K.R.; Lee, M.S.; Sauer, P.L. The combined influence hypothesis: Central and peripheral antecedents of attitude toward the Ad. J. Advert. 1995, 24, 73–85. [Google Scholar] [CrossRef]
- Petty, R.E.; Cacioppo, J.T.; Goldman, R. Personal involvement as a determinant of argument-based persuasion. J. Pers. Soc. Psychol. 1981, 41, 847–855. [Google Scholar] [CrossRef]
- Wu, P.; Wang, Y. The influences of electronic word-of-mouth message appeal and message source credibility on brand attitude. Asia Pac. J. Market. Logist. 2011, 23, 448–472. [Google Scholar] [CrossRef] [Green Version]
- Langfred, C.W. The downside of self-management: A longitudinal study of the effects of conflict on trust, autonomy, and task interdependence in self-managing teams. Acad. Manag. J. 2007, 50, 885–900. [Google Scholar] [CrossRef] [Green Version]
- Armstrong, J.S.; Overton, T.S. Estimating nonresponse bias in mail surveys. J. Mark. Res. 1977, 14, 396–402. [Google Scholar] [CrossRef] [Green Version]
- Park, T.; Ryu, D. Drivers of technology commercialization and performance in SMEs. Manag. Decis. 2015, 53, 338–353. [Google Scholar] [CrossRef]
- Farivar, S.; Turel, O.; Yuan, Y. A trust-risk perspective on social commerce use: An examination of the biasing role of habit. Internet Res. 2017, 27, 586–607. [Google Scholar] [CrossRef]
- Chin, W.W.; Thatcher, J.B.; Wright, R.T. Assessing common method bias: Problems with the ULMC technique. MIS Quart. 2012, 36, 1003–1019. [Google Scholar] [CrossRef] [Green Version]
- Shiau, W.L.; Yuan, Y.; Pu, X.; Ray, S.; Chen, C.C. Understanding fintech continuance: Perspectives from self-efficacy and ECT-IS theories. Ind. Manag. Data Syst. 2020, 120, 1659–1689. [Google Scholar] [CrossRef]
- Fornell, C.; Larcker, D.F. Evaluating structural equation models with unobservable variables and measurement error. J. Mark. Res. 1981, 18, 39–50. [Google Scholar] [CrossRef]
- Falk, A.; Kosfeld, M. The hidden costs of control. Am. Econ. Rev. 2006, 96, 1611–1630. [Google Scholar]
- Henseler, J.; Ringle, C.M.; Sarstedt, M. A new criterion for assessing discriminant validity in variance-based structural equation modeling. J. Acad. Mark. Sci. 2015, 43, 115–135. [Google Scholar] [CrossRef] [Green Version]
- Cho, C.H.; Martens, M.L.; Kim, H.; Rodrigue, M. Astroturfing global warming: It isn’t always greener on the other side of the fence. J. Bus. Ethics 2011, 104, 571–587. [Google Scholar] [CrossRef]
- Del Vicario, M.; Bessi, A.; Zollo, F.; Petroni, F.; Scala, A.; Caldarelli, G.; Stanley, H.E.; Quattrociocchi, W. The spreading of misinformation online. Proc. Natl. Acad. Sci. USA 2016, 113, 554–559. [Google Scholar] [CrossRef] [Green Version]
- Kim, A.; Dennis, A.R. Says who? The effects of presentation format and source rating on fake news in social media. MIS Quart. 2019, 43, 1025–1039. [Google Scholar] [CrossRef]
- Caulfield, T. Pseudoscience and COVID-19—We’ve had enough already. Nature. 2020. Available online: https://www.nature.com/articles/d41586-020-01266-z (accessed on 27 April 2020).
- Goel, S.; Anderson, A.; Hofman, J.; Watts, D.J. The structural virality of online diffusion. Manag. Sci. 2016, 62, 180–196. [Google Scholar]
- Glöckner, A.; Bröder, A. Processing information and additional cues: A model-based analysis of choice, confidence, and response time. Judgm. Decis. Mak. 2011, 6, 23–42. [Google Scholar]
- Neys, W.D.; Cromheeke, S.; Osman, M. Biased but in doubt: Conflict and decision confidence. PLoS ONE 2011, 6, e15954. [Google Scholar]
- Leiserowitz, A.; Maibach, E.; Roser-Renouf, C.; Hmielowski, J.D. Politics and Global Warming: Democrats, Republicans, Independents, and the Tea Party; Yale Project on Climate Change Communication, Yale University and George Mason University: New Haven, CT, USA. Available online: http://environment.yale.edu/climate/files/politicsglobalwarming2011.pdf (accessed on 5 June 2019).
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).