Method for Detecting Far-Right Extremist Communities on Social Media
Abstract
1. Introduction
2. State-of-the-Art
2.1. The Peculiarities of Far-Right Extremist Communities
- belief in the inferiority of certain individuals and groups and the superiority of others; promotion of the segregation principle: the separation of people into groups considered “superior” and groups considered “inferior” on various grounds: gender, age, status, place of residence, race, and others;
- though various far-right communities and movements differ in many ways, they share and promote “national preferences” (hence, nationalism);
- rejection of egalitarianism: the far-right regards social inequalities and the corresponding social hierarchies as inevitable, natural, or even desirable;
- the broad landscape we call the far-right relies on supremacism and nativism;
- promotion of oppressive policies, genocide, xenophobia, authoritarianism, anti-immigration and anti-integration attitudes;
- many far-right groups believe in conspiracy theories that depict a severe threat to national sovereignty and/or personal freedom, and they maintain the conviction that their personal and/or national way of life is under threat.
2.2. Far-Right Online Radicalization: Specifics of the Study
2.3. Key Opportunities and Limitations in the Creation of Automated Online Radicalization Research Tools
- Data extraction–level limitations. The primary way to extract raw social media data is through the application programming interfaces (APIs) provided by social media owners. The rules for API use are set by the platforms themselves, including the permissible request frequency, the amount of data returned per request, and others. Using several social media platforms as primary data sources entails developing a multi-agent acquisition subsystem; apart from being technically challenging, especially for small research teams, this leaves researchers entirely dependent on the platform owners (a minimal data-extraction sketch for the VKontakte API is given after this list);
- Data processing–level limitations. Despite the extraordinary amount of publicly available social media data, that data remains incomplete for research purposes: there is no explicit information about the nature of the connections between users and communities, and there is no way to verify the available information (consequently, the accuracy of models based on machine learning methods cannot be fully evaluated) (Tang and Liu 2010). Thus, we cannot ignore the available online information, as it can potentially improve the accuracy of the scientific worldview, but we also cannot base decisions solely on online data;
- Data interpretation–level limitations. The development and adoption of artificial intelligence methods raise the qualification requirements for researchers. Additional competence-development programs and new educational trajectories are therefore necessary, while a qualitative formalization of accumulated experience and knowledge allows the transition to algorithm development.
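As a concrete illustration of the data extraction level, the sketch below pulls community wall posts through the VKontakte API (`wall.get`). It is a minimal example under stated assumptions rather than the acquisition subsystem described later in this paper: the access token is a placeholder, and the paging and pacing values are assumptions chosen to respect the request-frequency rules set by the platform.

```python
import time
import requests

API_URL = "https://api.vk.com/method/wall.get"
ACCESS_TOKEN = "..."   # placeholder: a valid VK API token is assumed
API_VERSION = "5.131"


def fetch_community_posts(group_id: int, max_posts: int = 500) -> list[dict]:
    """Download up to max_posts wall posts of a VK community (group_id > 0)."""
    posts, offset = [], 0
    while offset < max_posts:
        resp = requests.get(API_URL, params={
            "owner_id": -group_id,                   # communities use negative owner ids
            "count": min(100, max_posts - offset),   # the API caps a single request at 100 posts
            "offset": offset,
            "access_token": ACCESS_TOKEN,
            "v": API_VERSION,
        }).json()
        if "error" in resp:                          # e.g., a closed community or an exceeded quota
            break
        items = resp["response"]["items"]
        if not items:
            break
        posts.extend(items)
        offset += len(items)
        time.sleep(0.4)                              # stay under the permissible request frequency
    return posts
```

Each returned item carries the publication date and the total likes, reposts, comments, and views counters, which downstream modules can aggregate into community activity.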
2.4. Far-Right Extremist Communities in Russia
- There was a decline in criminal activity but a growing share of more dangerous violence;
- Hate crimes have become even more concealed;
- At least 45 people suffered from racist and other ideologically motivated violence;
- The number of right-wing attacks on political, ideological, or “stylistic” opponents was significantly lower than the year before;
- The number of attacks on ideological sites decreased;
- The proportion of dangerous acts—explosions and arson—increased during the year;
- The theme of threats from the far-right remained topical: photos and personal data of anti-fascists, left-wing activists, independent journalists, and law enforcement officers, along with threats against them, appeared on the social media pages of these organizations and groups.
2.5. Specific Markers for the Promotion of Far-Right Extremist Ideology in the Online Environment
- The construction of a collective identity to maintain group cohesion and attract new members;
- Extrapolation of radical prejudices (e.g., racism) into “rational” claims focused on ethnic, national, linguistic, and religious minorities;
- Fostering of individual values and motives that can stimulate active involvement in far-right communities;
- Seeking significance and status;
- Networking with like-minded individuals for offline and online mobilization and recruiting new members;
- The crucial role of ideology in justifying violent action;
- Charismatic leadership as a stimulus to increase organizational strength;
- Background conditions—social, political, economic, and others.
3. The Architecture of the VKontakte Social Network Analysis System
4. The Calendar-Correlation Analysis (CCA) Algorithm of Social Network Community Activity
- (a) It is not possible to extract retrospective data on community activity; one can only estimate the total number of views, likes, reposts, and comments accumulated since the publication date of a post;
- (b) The community may show abnormal activity, compared to its regular activity, both before and after a significant date;
- (c) Community activity on significant dates can also be random or ordinary, and this holds not only for radical communities.
- Aabs—absolute community activity, which is a superposition of views, likes, comments, and reposts;
- d—an important date for the far-right ideological platforms;
- k—a variable denoting the boundaries of the time interval in question (a minimal computational sketch follows this list).
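The symbols above suggest a simple window-based comparison: absolute activity Aabs is aggregated over the k days around a significant date d and contrasted with an ordinary period of the same width. The sketch below only illustrates that idea and is not the registered CCA implementation (Vilnin et al. 2021); the equal weighting of views, likes, comments, and reposts, the choice of baseline window, and the assumption that posts follow the VK data layout are all illustrative.

```python
from datetime import date, timedelta


def absolute_activity(posts: list[dict], day: date) -> int:
    """A_abs for one day: superposition (here, a plain sum) of views, likes,
    comments, and reposts of all posts published on that day."""
    total = 0
    for p in posts:
        if date.fromtimestamp(p["date"]) == day:     # VK posts carry a Unix timestamp
            for counter in ("views", "likes", "comments", "reposts"):
                total += p.get(counter, {}).get("count", 0)
    return total


def activity_around(posts: list[dict], d: date, k: int) -> int:
    """Total A_abs over the interval [d - k, d + k] around a significant date d."""
    return sum(absolute_activity(posts, d + timedelta(days=i)) for i in range(-k, k + 1))


def activity_ratio(posts: list[dict], d: date, k: int, baseline: date) -> float:
    """Contrast activity around the significant date d with an ordinary baseline
    period of the same width; ratios well above 1 flag anomalous activity."""
    base = activity_around(posts, baseline, k) or 1  # avoid division by zero
    return activity_around(posts, d, k) / base
```

Because counters are only available as totals accumulated since publication (limitation (a) above), per-day activity is approximated here by attributing each post's counters to its publication date.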
- (a) The expert user builds a “knowledge base” (a list of keywords, expressions, and dates, including their relationships);
- (b) The user launches the primary keyword search function;
- (c) The results are pre-processed (closed, inactive, and “empty” communities are deleted);
- (d) The calendar-correlation analysis method is used to refine the list of identified far-right communities;
- (e) The expert analyzes the results and, if necessary, enters them into the knowledge base;
- (f) The user builds a set of groups for the subsequent search for related communities (satellites);
- (g) The results of the satellite-community search are analyzed, and the expert refines the information in the knowledge base if necessary (a schematic code sketch of this workflow follows the list).
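Taken together, steps (a)–(g) form an expert-in-the-loop pipeline. The skeleton below restates them as code for orientation only: the helper callables (keyword search, expert review, satellite search), the knowledge-base keys, and the CCA window and threshold are illustrative placeholders, not the system's actual interfaces.

```python
from datetime import timedelta
from typing import Callable


def detect_far_right_communities(
    knowledge_base: dict,                                      # (a) expert-built keywords, expressions, dates
    keyword_search: Callable[[list[str]], list[dict]],
    activity_ratio: Callable[..., float],                      # e.g., the CCA sketch above
    expert_review: Callable[[list[dict], dict], list[dict]],
    find_satellites: Callable[[list[dict]], list[dict]],
) -> list[dict]:
    """Expert-in-the-loop workflow (a)-(g); the callables stand in for subsystems."""
    # (b) primary keyword search driven by the knowledge base
    candidates = keyword_search(knowledge_base["keywords"])
    # (c) pre-processing: drop closed, inactive, or "empty" communities
    candidates = [c for c in candidates
                  if c.get("is_open") and c.get("posts_last_year", 0) > 0]
    # (d) refine the list with calendar-correlation analysis around significant dates
    refined = [c for c in candidates
               if any(activity_ratio(c["posts"], d, k=3, baseline=d - timedelta(days=30)) > 1.5
                      for d in knowledge_base["dates"])]
    # (e) expert review; confirmed findings are written back into the knowledge base
    confirmed = expert_review(refined, knowledge_base)
    # (f)-(g) build the satellite search set, find related communities, review again
    satellites = expert_review(find_satellites(confirmed), knowledge_base)
    return confirmed + satellites
```

In this framing, the expert-review callable is where steps (e) and (g) feed confirmed findings back into the knowledge base, closing the loop the list describes.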
5. Experimental Testing of CCA and Results Discussion
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- AIVD. 2019. AIVD-Jaarverslag 2019. Available online: https://www.aivd.nl/documenten/jaarverslagen/2020/04/29/jaarverslag-2019 (accessed on 15 March 2022).
- Anderson, Chris. 2012. Towards a sociology of computational and algorithmic journalism. New Media & Society 15: 1005–21. [Google Scholar] [CrossRef]
- Barhamgi, Mahmoud, Abir Masmoudi, Raul Lara-Cabrera, and David Camacho. 2018. Social networks data analysis with semantics: Application to the radicalization problem. Journal of Ambient Intelligence and Humanized Computing. Available online: https://link.springer.com/article/10.1007/s12652-018-0968-z (accessed on 15 March 2022). [CrossRef]
- Bjørgo, Tore, and Jacob A. Ravndal. 2019. Extreme-Right Violence and Terrorism: Concepts, Patterns, and Responses. ICCT. Available online: https://icct.nl/app/uploads/2019/09/Extreme-Right-Violence-and-Terrorism-Concepts-Patterns-and-Responses.pdf (accessed on 15 March 2022).
- Borum, Randy. 2004. Psychology of Terrorism. Office of Justice Programs. Available online: https://www.ojp.gov/sites/g/files/xyckuh241/files/media/document/208552.pdf?height=921.6&q=psychology-of-terrorism%3FTB_iframe%3Dtrue&width=921.6 (accessed on 15 March 2022).
- Borum, Randy. 2011. Radicalization into violent extremism I: A review of social science theories. Journal of Strategic Security 4: 7–36. [Google Scholar] [CrossRef] [Green Version]
- Botometer. 2021. Botometer an OSoMe Project. Available online: https://botometer.osome.iu.edu (accessed on 15 March 2022).
- Braniff, William. 2017. Recasting and Repositioning CVE as a Grand Strategic Response to Terrorism. START. Available online: https://www.start.umd.edu/news/recasting-and-repositioning-cve-grand-strategic-response-terrorism (accessed on 15 March 2022).
- Cherniy, Vasilii. 2021. Social Networks in Russia: Figures and Trends, Fall 2021. Brand Analytics. Available online: https://br-analytics.ru/blog/social-media-russia-2021/ (accessed on 15 March 2022).
- Conway, Maura. 2017. Determining the Role of the Internet in Violent Extremism and Terrorism: Six Suggestions for Progressing Research. Studies in Conflict & Terrorism 40: 77–98. [Google Scholar] [CrossRef] [Green Version]
- Forelle, Michelle, Phil Howard, Andrés Monroy-Hernández, and Saiph Savage. 2015. Political Bots and the Manipulation of Public Opinion in Venezuela. Available online: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2635800 (accessed on 15 March 2022).
- Garcet, Serge. 2021. Understanding the Psychological Aspects of the Radicalisation Process: A Sociocognitive Approach. Forensic Sciences Research. Available online: https://www.tandfonline.com/doi/full/10.1080/20961790.2020.1869883.pdf (accessed on 15 March 2022). [CrossRef]
- Gaudette, Tiana, Ryan Scrivens, and Vivek Venkatesh. 2020. The Role of the Internet in Facilitating Violent Extremism: Insights from Former Right-Wing Extremists. Terrorism and Political Violence. Available online: https://www.tandfonline.com/doi/full/10.1080/09546553.2020.1784147?scroll=top&needAccess=true (accessed on 15 March 2022). [CrossRef]
- Gilani, Zafar, Ekaterina Kochmar, and Jon Crowcroft. 2017. Classification of Twitter Accounts into Automated Agents and Human Users. Paper presented at the 2017 IEEE/ACM International Conference on Advances in Social Networks Analysis and Mining (ASONAM), Sydney, Australia, July 31–August 3. [Google Scholar] [CrossRef] [Green Version]
- Golikov, Leonid. 2021. Utilitarian Necessity and Desire in the Russian Nationalist Discourse (Based on the Content from VKontakte Social Group “Sputnik i Pogrom”). Available online: https://www.gramota.net/articles/phil210511.pdf (accessed on 15 March 2022).
- GTI. 2019. Global Terrorism Index 2019. Available online: https://www.visionofhumanity.org/wp-content/uploads/2020/11/GTI-2019-web.pdf (accessed on 15 March 2022).
- GTI. 2020. Global Terrorism Index 2020. Available online: https://visionofhumanity.org/wp-content/uploads/2020/11/GTI-2020-web-1.pdf (accessed on 15 March 2022).
- Hall, Margeret, Michael Logan, Gina S. Ligon, and Douglas C. Derrick. 2019. Do Machines Replicate Humans? Toward a Unified Understanding of Radicalizing Content on the Open Social Web. Policy & Internet 12: 109–38. [Google Scholar] [CrossRef]
- Hamm, Mark, and Ramon Spaaij. 2015. Lone Wolf Terrorism in America: Using Knowledge of Radicalization Pathways to Forge Prevention Strategies. Available online: https://www.ojp.gov/pdffiles1/nij/grants/248691.pdf (accessed on 15 March 2022).
- Hashemi, Mahdi, and Margeret Hall. 2019. Detecting and Classifying Online Dark Visual Propaganda. Image and Vision Computing 89: 95–105. [Google Scholar] [CrossRef]
- Heide, Liesbeth, Charlie Winter, and Shiraz Maher. 2018. The Cost of Crying Victory: Policy Implications of the Islamic State’s Territorial Collapse. ICCT. Available online: https://icsr.info/wp-content/uploads/2019/01/ICSR-ICCT-Feature_The-Cost-of-Crying-Victory-Policy-Implications-of-the-Islamic-State’s-Territorial-Collapse.pdf (accessed on 15 March 2022).
- Hofmann, David C. 2018. How “Alone” are Lone-Actors? Exploring the Ideological, Signaling, and Support Networks of Lone-Actor Terrorists. Journal Studies in Conflict & Terrorism 43: 657–78. [Google Scholar] [CrossRef]
- Holt, Thomas J., Joshua D. Freilich, and Steven M. Chermak. 2020. Examining the Online Expression of Ideology among Far-Right Extremist Forum Users. Terrorism and Political Violence 34: 364–84. [Google Scholar] [CrossRef]
- Jasser, Greta, Megan Kelly, and Ann-Kathrin Rothermel. 2020. Male Supremacism and the Hanau Terrorist Attack: Between Online Misogyny and Extreme Right Violence. ICCT. Available online: https://icct.nl/publication/male-supremacism-and-the-hanau-terrorist-attack-between-online-misogyny-and-far-right-violence/ (accessed on 15 March 2022).
- Jensen, Michael A., Anita A. Seate, and Patrick A. James. 2018. Radicalization to Violence: A Pathway Approach to Studying Extremism. Terrorism and Political Violence 32: 1067–90. [Google Scholar] [CrossRef]
- Johnson, Joseph. 2021. Global Digital Population as of January 2021. Statista. Available online: https://www.statista.com/statistics/617136/digital-population-worldwide/ (accessed on 15 March 2022).
- Karpova, Anna Yu, Aleksei O. Savelev, Alexandr D. Vilnin, and Denis V. Chaykovskiy. 2019. New technologies to identify alt-right extremist communities in social media. Vestnik Tomskogo Gosudarstvennogo Universiteta. Filosofiya. Sotsiologiya. Politologiya (Tomsk State University Journal of Philosophy, Sociology and Political Science) 52: 138–46. [Google Scholar] [CrossRef]
- Karpova, Anna Yu, Aleksei O. Savelev, Alexandr D. Vilnin, and Denis V. Chaykovskiy. 2020. Studying Online Radicalization of Youth through Social Media (Interdisciplinary Approach). Monitoring of Public Opinion: Economic and Social Changes 3: 159–81. [Google Scholar] [CrossRef]
- Kozitsin, Ivan, Alexander Chkhartishvili, Artemii Marchenko, Dmitrii Norkin, Sergei Osipov, Ivan Uteshev, Vyacheslav Goiko, Roman Palkin, and Michail Myagkov. 2020. Modeling Political Preferences of Russian Users Exemplified by the Social Network Vkontakte. Mathematical Models and Computer Simulations 12: 185–94. [Google Scholar] [CrossRef]
- Kutner, Samantha. 2020. Swiping Right: The Allure of Hyper Masculinity and Cryptofascism for Men Who Join the Proud Boys. ICCT. Available online: https://icct.nl/app/uploads/2020/05/Swiping-Right-The-Allure-of-Hyper-Masculinity-and-Cryptofascism-for-Men-Who-Join-the-Proud-Boys.pdf (accessed on 15 March 2022).
- Kuznetsov, Sergei A., Anna Yu Karpova, and Aleksei O. Savelev. 2021. Automated detection of ultra-right communities’ cross-links in a social network. Vestnik Tomskogo Gosudarstvennogo Universiteta. Filosofiya. Sotsiologiya. Politologiya (Tomsk State University Journal of Philosophy, Sociology and Political Science) 59: 156–66. [Google Scholar]
- LaFree, Gary. 2013. Lone-Offender Terrorists. Criminology and Public Policy 12: 59–62. [Google Scholar] [CrossRef]
- Lewis, Rebecca. 2018. Alternative Influence: Broadcasting the Reactionary Right on YouTube. Available online: https://datasociety.net/wp-content/uploads/2018/09/DS_Alternative_Influence.pdf (accessed on 15 March 2022).
- McCauley, Clark, and Sophia Moskalenko. 2008. Mechanisms of political radicalisation: Pathways toward terrorism. Terrorism and Political Violence 20: 415–33. [Google Scholar] [CrossRef]
- McCauley, Clark, and Sophia Moskalenko. 2017. Understanding political radicalization: The two-pyramids model. American Psychologist 72: 205–16. [Google Scholar] [CrossRef]
- Minjust. 2021. The Ministry of Justice of the Russian Federation Federal List of Extremist Materials. Available online: https://minjust.gov.ru/ru/extremist-materials/ (accessed on 15 March 2022).
- MSC. 2020. Munich Security Report 2020. Available online: https://securityconference.org/assets/user_upload/MunichSecurityReport2020.pdf (accessed on 15 March 2022).
- NAC. 2021. List of Public and Religious Associations, and Other Non-Profit Organizations. Available online: http://nac.gov.ru/zakonodatelstvo/sudebnye-resheniya/perechen-nekommercheskih-organizaciy-v.html (accessed on 15 March 2022).
- Neumann, Peter R. 2009. Old and New Terrorism. Cambridge: Polity Press. Available online: https://www.wiley.com/en-us/Old+and+New+Terrorism-p-9780745643755 (accessed on 15 March 2022).
- Pipiya, Karina. 2019. Monitoring Xenophobic Attitudes. Available online: https://bit.ly/3xJFpHN (accessed on 15 March 2022).
- Poupin, Perrine. 2021. Social media and state repression: The case of VKontakte and the anti-garbage protest in Shies, in Far Northern Russia. First Monday 26. Available online: https://journals.uic.edu/ojs/index.php/fm/article/view/11711 (accessed on 15 March 2022). [CrossRef]
- PST. 2020. Nasjonal Trusselvurdering. Available online: https://www.pst.no/globalassets/artikler/utgivelser/2020/nasjonal-trusselvurdering-2020-print.pdf (accessed on 15 March 2022).
- Rahwan, Iyad, Jacob W. Crandall, and Jean-Francois Bonnefon. 2020. Intelligent machines as social catalysts. Proceedings of the National Academy of Sciences 117: 7555–57. [Google Scholar] [CrossRef] [Green Version]
- Ravndal, Jacob A., and Tore Bjørgo. 2018. Investigating Terrorism from the Extreme Right: A Review of Past and Present Research. Perspectives on Terrorism 12: 5–22. Available online: https://www.universiteitleiden.nl/binaries/content/assets/customsites/perspectives-on-terrorism/2018/issue-6/a1-ravndal-and-bjorgo.pdf (accessed on 15 March 2022).
- Reuter, Ora, and David Szakonyi. 2015. Online social media and political awareness in authoritarian regimes. British Journal of Political Science 45: 29–51. [Google Scholar] [CrossRef] [Green Version]
- Sanovich, Sergey, Denis Stukal, and Joshua A. Tucker. 2018. Turning the virtual tables: Government strategies for addressing online opposition with an application to Russia. Comparative Politics 50: 435–82. [Google Scholar] [CrossRef] [Green Version]
- Savelev, Aleksei O., Anna Yu Karpova, Denis V. Chaykovskiy, Alexandr D. Vilnin, Anastasia Yu Kaida, Sergei A. Kuznetsov, Lev O. Igumnov, and Nataliya G. Maksimova. 2021. The high-level overview of social media content search engine. IOP Conference Series: Materials Science and Engineering 1019: 012097. [Google Scholar] [CrossRef]
- Siebel, Thomas M. 2019. Digital Transformation: Survive and Thrive in an Era of Mass Extinction. New York: RosettaBooks. [Google Scholar]
- Statista Research Department. 2022. Social media—Statistics & Facts. Statista. January. Available online: https://www.statista.com/topics/1164/social-networks/#topicHeader__wrapper (accessed on 15 March 2022).
- Stella, Massimo, Emilio Ferrara, and Manlio De Domenico. 2018. Bots Increase Exposure to Negative and Inflammatory Content in Online Social Systems. Available online: https://www.pnas.org/content/115/49/12435 (accessed on 15 March 2022).
- Stieglitz, Stefan, Milad Mirbabaie, Björn Ross, and Christoph Neuberger. 2018. Social media analytics—Challenges in topic discovery, data collection, and data preparation. International Journal of Information Management 39: 156–68. [Google Scholar] [CrossRef]
- Sureka, Ashish, and Swati Agarwal. 2014. Learning to classify hate and extremism promoting tweets. Paper presented at the 2014 IEEE Joint Intelligence and Security Informatics Conference, The Hague, The Netherlands, September 24–26. [Google Scholar] [CrossRef]
- Tang, Lei, and Huan Liu. 2010. Community Detection and Mining in Social Media. In Synthesis Lectures on Data Mining and Knowledge Discovery. Morgan & Claypool. Available online: https://www.morganclaypool.com/doi/abs/10.2200/S00298ED1V01Y201009DMK003 (accessed on 15 March 2022). [CrossRef]
- Turner, Matthew D. 2011. A Simple Ontology for the Analysis of Terrorist Attacks. Technical Report. Available online: https://digitalrepository.unm.edu/ece_rpts/41/ (accessed on 15 March 2022).
- Urman, Aleksandra. 2019. News Consumption of Russian Vkontakte Users: Polarization and News Avoidance. International Journal of Communication 13: 5158–82. [Google Scholar]
- Varol, Onur, Emilio Ferrara, Clayton A. Davis, Filippo Menczer, and Alessandro Flammini. 2017. Online Human-Bot Interactions: Detection, Estimation, and Characterization. Paper presented at the International AAAI Conference on Web and Social Media, Montreal, QC, Canada, May 15–18; Available online: https://arxiv.org/pdf/1703.03107.pdf (accessed on 15 March 2022).
- Vilnin, Alexandr D., Anastasia Yu Kaida, Anna Yu Karpova, Sergei A. Kuznetsov, Nataliya G. Maksimova, Aleksei O. Savelev, and Denis V. Chaykovskiy. 2021. Calendar-Correlation Analysis of the Activity of Social Network Communities. Certificate of State Registration of the Computer Program 2021662860 Dated 6 August 2021. Available online: https://elibrary.ru/download/elibrary_46484745_77252480.PDF (accessed on 15 March 2022).
- Wadhwa, Pooja, and M. P. S. Bhatia. 2013. Tracking on-line radicalization using investigative data mining. Paper presented at the 2013 National Conference on Communications (NCC), New Delhi, India, February 15–17. [Google Scholar] [CrossRef]
- Weiler, Andreas, Michael Grossniklaus, and Marc H. Scholl. 2016. Situation monitoring of urban areas using social media data streams. Information Systems 57: 129–41. [Google Scholar] [CrossRef] [Green Version]
- Wendelberg, Linda. 2021. An Ontological Framework to Facilitate Early Detection of ‘Radicalization’ (OFEDR)-A Three World Perspective. Journal of Imaging 7: 60. [Google Scholar] [CrossRef]
- Whiting, Tim, Alvika Gautam, Jacob Tye, Michael Simmons, Jordan Henstrom, Mayada Oudah, and Jacob W. Crandall. 2021. Confronting barriers to human-robot cooperation: Balancing efficiency and risk in machine behavior. iScience 24: 101963. [Google Scholar] [CrossRef]
- Xie, Daniel, Jiejun Xu, and Tsai-Ching Lu. 2016. Automated Classification of Extremist Twitter Accounts Using Content-Based and Network-Based Features. Paper presented at the 2016 IEEE International Conference on Big Data (Big Data), Washington, DC, USA, December 5–8. [Google Scholar] [CrossRef]
- Yudina, Natalia. 2020. Report: Ultra-Right Criminal Activity. Hate Crimes and Countering Them in Russia in 2019. Available online: https://bit.ly/3jVHvMJ (accessed on 15 March 2022).
Subclass | β | 1 − β | Number of Communities within the Dataset |
---|---|---|---|
Alt-Right | 0.67 | 0.33 | 6 |
Nazis | 0.5 | 0.5 | 12 |
Neo-Pagans | 0.91 | 0.09 | 67 |
Manosphere | 0.95 | 0.05 | 19 |
Far-Right | 0.78 | 0.22 | 155 |
№ | Subscribers | Age, Median | Age, Average | Gender: Not Specified | Gender: Male | Gender: Female | Posts over the Year | Posts, Daily Average
---|---|---|---|---|---|---|---|---
1 | 735,113 | 26 | 29.94 | 66.49% | 50.20% | 49.80% | 9195 | 24.71 |
2 | 293,537 | 31 | 33.57 | 66.27% | 66.24% | 33.76% | 2336 | 6.27 |
3 | 89,134 | 21 | 27.97 | 62.34% | 28.24% | 71.76% | 1786 | 4.80 |
4 | 20,485 | 32 | 35.67 | 66.49% | 87.68% | 12.32% | 110 | 0.29 |
5 | 17,828 | 33 | 37.85 | 65.61% | 82.89% | 17.11% | 5152 | 13.85 |
6 | 17,281 | 35 | 39.05 | 61.77% | 82.10% | 17.90% | 1944 | 5.22 |
7 | 28,999 | 22 | 33.09 | 57.56% | 90.67% | 9.33% | 2323 | 6.24 |
8 | 8792 | 36 | 39.67 | 62.82% | 79.80% | 20.20% | 1324 | 3.55 |
9 | 10,215 | 26 | 33.13 | 61.82% | 85.16% | 14.84% | 1947 | 5.23 |
10 | 6590 | 25 | 36.74 | 61.96% | 88.98% | 11.02% | 316 | 0.84 |
11 | 6317 | 34 | 39.5 | 70.06% | 85.88% | 14.12% | 62 | 0.16 |
12 | 3516 | 34 | 38.66 | 74.29% | 85.86% | 14.14% | 18 | 0.04 |
13 | 5985 | 23 | 34.3 | 57.59% | 90.69% | 9.31% | 712 | 1.91 |
14 | 4066 | 28 | 37.23 | 61.78% | 87.16% | 12.84% | 45 | 0.12 |
15 | 1344 | 40 | 43.53 | 55.06% | 84.15% | 15.85% | 236 | 0.63 |
16 | 8673 | 34 | 38.79 | 66.67% | 86.15% | 13.85% | 469 | 1.25 |
17 | 7233 | 25 | 33.98 | 58.72% | 83.49% | 16.51% | 416 | 1.11 |
18 | 14,591 | 33 | 37.55 | 62.99% | 82.75% | 17.25% | 180 | 0.48 |
19 | 9433 | 36 | 39.97 | 60.03% | 87.33% | 12.67% | 589 | 1.58 |
20 | 47,652 | 25 | 31.92 | 56.64% | 75.67% | 24.33% | 2013 | 5.41 |
21 | 1155 | 36 | 39.39 | 57.14% | 62.77% | 37.23% | 830 | 2.23 |
22 | 16,448 | 35 | 39.03 | 53.34% | 86.89% | 13.11% | 37 | 0.09 |
23 | 3300 | 37 | 40.84 | 68.06% | 74.64% | 25.36% | 2629 | 7.06 |
24 | 24,219 | 33 | 37.87 | 59.81% | 85.05% | 14.95% | 1765 | 4.74 |
25 | 80,236 | 37 | 39.51 | 60.67% | 76.04% | 23.96% | 3221 | 8.65 |
26 | 85,574 | 36 | 39.35 | 67.74% | 62.59% | 37.41% | 2002 | 5.38 |
27 | 60,125 | 38 | 40.89 | 61.59% | 63.00% | 37.00% | 3143 | 8.44 |
28 | 80,484 | 38 | 40.92 | 61.91% | 65.92% | 34.08% | 4669 | 12.55 |
29 | 66,489 | 32 | 35.14 | 64.36% | 72.40% | 27.60% | 807 | 2.16 |
30 | 21,209 | 31 | 34.81 | 64.35% | 80.81% | 19.19% | 28 | 0.07 |
31 | 576 | 33 | 36.09 | 77.95% | 52.60% | 47.40% | 575 | 1.54 |
32 | 50,355 | 38 | 41.99 | 63.22% | 69.00% | 31.00% | 3016 | 8.10 |
33 | 245,091 | 39 | 41.06 | 63.18% | 45.70% | 54.30% | 5487 | 14.74 |
34 | 63,106 | 33 | 34.89 | 71.78% | 55.35% | 44.65% | 933 | 2.50 |
35 | 58,156 | 24 | 27.45 | 50.95% | 59.78% | 40.22% | 5171 | 13.90 |
36 | 4543 | 32 | 38.1 | 57.45% | 85.41% | 14.59% | 724 | 1.94 |
37 | 9850 | 35 | 39.51 | 68.36% | 81.40% | 18.60% | 439 | 1.18 |
38 | 3054 | 21 | 35.59 | 54.78% | 86.25% | 13.75% | 171 | 0.45
39 | 4114 | 29 | 36.76 | 53.33% | 83.50% | 16.50% | 323 | 0.86 |
40 | 51,149 | 37 | 40.41 | 61.50% | 80.78% | 19.22% | 6942 | 18.66 |
41 | 30,091 | 20 | 32.87 | 55.24% | 79.85% | 20.15% | 807 | 2.16 |
42 | 18,017 | 22 | 32.91 | 54.00% | 89.09% | 10.91% | 6728 | 18.08 |
43 | 19,634 | 36 | 39.53 | 59.88% | 57.57% | 42.43% | 2134 | 5.73 |
44 | 51,204 | 39 | 41.34 | 55.77% | 63.79% | 36.21% | 4921 | 13.22 |
45 | 143,657 | 39 | 41.61 | 66.19% | 42.87% | 57.13% | 744 | 2 |