Social Bots As an Instrument of Influence in Social Networks: Typologization Problems

Abstract

Research on social bots is currently undergoing a shift from technology-centered to sociology-centered interpretations. This shift opens new perspectives for sociology: social bots are no longer considered only as an efficient manipulative technology but in a wider sense, as new communicative technologies that exert an informational impact on the space of social networks. The objective of this research is to assess new approaches to the established typologies of social bots (based on their fields of use, their objectives, and their degree of imitation of human behavior) and to examine the ambiguity and inconsistency that arise when such typologies are applied, using the example of botnets operating in the VKontakte social network. Botnets were identified with a comprehensive methodology developed by the authors, which includes frequency analysis of published messages, botnet profiling, statistical analysis of content, analysis of the structural organization of botnets, division of content into semantic units, formation of content clusters, content analysis within the clusters, and identification of extremes, that is, the maximum number of unique texts published by a botnet in a particular cluster over a given period. The methodology was applied to the botnet space of the Russian online social network VKontakte in February and October 2018. The survey found that, among the ten most active botnets, three demonstrate the ambiguity and inconsistency of their typologization: the botnet "Defrauded shareholders of LenSpetsStroy" with respect to its field of use, the botnet "Political news in Russian and Ukrainian languages" with respect to its objectives, and the botnet "Ksenia Sobchak" with respect to its level of imitation of human behavior. The authors outline the prospects for sociological analysis of different types of bots in a situation of growing accessibility and routinization of bot technologies in social networks.


Keywords: social bots, botnets, classification, VKontakte social network
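The abstract describes the identification pipeline only in outline. The following is a minimal, illustrative sketch, not the authors' implementation, of two of the steps named above: frequency analysis of published messages and identification of "extremes" (the maximum number of unique texts a botnet published within one content cluster over a period). It assumes the VKontakte messages have already been collected and assigned to botnets and semantic clusters; all field names, cluster labels, and sample records are hypothetical.

```python
# Illustrative sketch only (hypothetical data and field names), assuming messages
# have already been collected and labeled with a botnet and a semantic cluster.
from collections import defaultdict
from datetime import datetime

# Each record: (account_id, botnet_id, semantic_cluster, text, posted_at)
messages = [
    ("id101", "botnet_A", "housing", "Shareholders demand their apartments!", datetime(2018, 2, 3, 9, 0)),
    ("id101", "botnet_A", "housing", "Shareholders demand their apartments!", datetime(2018, 2, 3, 9, 5)),
    ("id102", "botnet_A", "housing", "Picket against the developer today", datetime(2018, 2, 3, 10, 0)),
    ("id201", "botnet_B", "politics", "Election news digest", datetime(2018, 2, 4, 12, 0)),
]

def posts_per_account_per_day(msgs):
    """Frequency analysis: how many messages each account publishes per day.
    Abnormally high or strictly regular posting rates are one bot indicator."""
    counts = defaultdict(int)
    for account, _, _, _, ts in msgs:
        counts[(account, ts.date())] += 1
    return dict(counts)

def extreme_unique_texts(msgs):
    """Count unique texts for every (botnet, cluster) pair and return the pair
    with the maximum, i.e. the 'extreme' in the abstract's terms."""
    unique_texts = defaultdict(set)
    for _, botnet, cluster, text, _ in msgs:
        unique_texts[(botnet, cluster)].add(text)
    return max(unique_texts.items(), key=lambda kv: len(kv[1]))

if __name__ == "__main__":
    print(posts_per_account_per_day(messages))
    (botnet, cluster), texts = extreme_unique_texts(messages)
    print(f"Extreme: {botnet} in cluster '{cluster}' with {len(texts)} unique texts")
```

Counting unique texts, rather than all posts, distinguishes botnets that generate varied content in a cluster from those that merely repost a single message many times.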
