Research Project: Human Communication Beyond Words: Russian Media, Dr Anna Wilson (formerly Pleshakova)

Multimodal analysis is the analysis of the verbal, sound, and visual aspects of human communication. It combines cognitive linguistics, discourse analysis, communication studies, and computational analysis, and is a ground-breaking new area of research. It is also extremely challenging, owing to the complex nature of human communication and the interdisciplinary, collaborative, creative and labour-intensive approach it requires.

This project conducts research on Russian multimodal communication within the framework of the Red Hen Lab, a unique virtual data science laboratory at UCLA which hosts a number of multimodal datasets and coordinates a range of networking activities with a global consortium of researchers working on multimodal communication. Red Hen Lab researchers work on identifying challenges, and offering solutions, related to the systematic analysis of multimodal datasets. The Lab creates very large international media datasets in a variety of languages, analytical and statistical tools, high-performance computing cluster resources, and multimodal processing pipelines. It also creates and hosts NewsScape, a digital collection of around 420,000 television news programs in multiple languages (English, Spanish, French, both European and Brazilian Portuguese, Russian, Polish, Czech, Italian, Norwegian, Swedish, Danish, German, Arabic, Mandarin), with a total of more than 4 billion words in metadata files, supported by the aforementioned pipelines of computational tools (www.redhenlab.org).

Since January 2016, Dr Wilson and the Red Hen Lab team have been developing large datasets of Russian TV news shows and news-related talk shows, customizing existing computational tools and developing new ones for Russian data (including natural language processing tools, optical character recognition, and computer vision tools for facial expression and gesture recognition and identification), within the framework of one past and one current US National Science Foundation grant, a current five-year German Humboldt Foundation grant, and Google Summer of Code (GSoC) over the last four years.

In 2019, working together with Dr Peter Uhrig (Erlangen) and Dr Hale (OII), Dr Wilson created a new Russian multimodal dataset incorporating news-related broadcasts from all Russian national TV channels. The new corpus contains 55 million words from 7,500 hours of video and is more than three times the size of the previous Russian holdings in the Red Hen Lab. The videos have been automatically subtitled, and the lexical content of the subtitles has been lemmatized and morphologically annotated. The verbal corpus can be displayed with precise video links shown in an overlay window, and annotated using Rapid Annotator, a breakthrough tool that allows videos and the corresponding text to be annotated much faster than with previously used technologies.
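
To give a concrete sense of the lemmatization and morphological annotation step applied to the subtitle text, the minimal sketch below processes a line of Russian text with the open-source pymorphy2 analyzer. This is purely illustrative: pymorphy2 and the sample sentence are assumptions chosen for the example, not the specific tools or data used in the Red Hen pipeline.

```python
# Illustrative sketch only: pymorphy2 is assumed here as an example Russian
# morphological analyzer; the Red Hen pipeline relies on its own toolchain.
import pymorphy2

morph = pymorphy2.MorphAnalyzer()

# A hypothetical line of subtitle text.
subtitle = "Ведущие обсуждали новые экономические санкции"

for token in subtitle.split():
    analysis = morph.parse(token)[0]  # most probable analysis
    # Print the surface form, its lemma, and the morphological tag.
    print(f"{token}\t{analysis.normal_form}\t{analysis.tag}")
```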

 

The ground for this project was laid through a number of smaller research projects, which resulted in workshops and conferences: “Cognitive Linguistic Methods in Cultural Analysis: Interdisciplinary Perspective” (June 2011); “Languages, Media and Politics: Cognitive Linguistic Methods in Discourse Analysis” (September 2013); the 2015 Slavic Cognitive Linguistics Conference (December 2015); the KE workshop “Russian Media, Multimodal Communication, and (Critical) Discourse Analysis” (January 2016); “Red Hen Lab and Oxford Project on Multimodal Communication and Discourse Analysis: Current Project and Future Collaborations” (April 2017); “Disinformation Research and Policy - Bringing the community together” (January 2019); and “A Multi- and Inter-Disciplinary Approach to Disinformation Research and Policy” (March 2019), as well as in a number of publications.

List of selected publications and work in progress:

  1. Wilson, A. Multimodal Viewpoint Blending in Russian Media: A Right to Voice One’s Opinion (article to be submitted to Multimodal Communication, De Gruyter Mouton).
  2. Wilson, A. Ambiguity in Russian News Parody (article to be submitted to Journal of Communication, OUP).
  3. (2018). Toward an Infrastructure for Data-driven Multimodal Communication Research. Linguistics Vanguard: A Multimodal Journal for the Language Sciences, 4(1). De Gruyter. Co-authors: Francis Steen, Cristóbal Pagán Cánovas, Anders Hougaard, Jungseock Joo, Inés Olza, Anna Pleshakova, Soumya Ray, Peter Uhrig, Javier Valenzuela, Jacek Woźny, & Mark Turner. https://www.degruyter.com/view/j/lingvan.2018.4.issue-1/lingvan-2017-0041/lingvan-2017-0041.xml?format=INT
  4. Pleshakova, A. (2018). Cognitive Approaches: Media, Mind, and Culture. In C. Cotter and D. Perrin (eds.), The Routledge Handbook of Language and Media. London and New York: Routledge, pp. 77-93.
  5. Pleshakova, A. (2016). Meta-parody in contemporary Russian media: viewpoint blending behind Dmitry Bykov’s 2009 poem “Infectious”. Lege Artis. Language yesterday, today, tomorrow, 1(1), June 2016, pp. 202-274. De Gruyter Open. Available at https://www.degruyter.com/view/j/lart.2016.1.issue-1/issue-files/lart.2016.1.issue-1.xml
  6. Pleshakova, A. (2014). Strike, Accident, Risk, and Counterfactuality: Hidden Meanings of the Post-Soviet Russian News Discourse of the Nineties via Conceptual Blending. Language and Cognition: An Interdisciplinary Journal of Language and Cognitive Science, 6(3), September 2014, pp. 301-306. Cambridge: Cambridge University Press.
  7. Pleshakova, A. (2010). Werewolves in Epaulettes. In F. Parrill, V. Tobin, & M. Turner (eds.), Meaning, Form, and Body. Stanford: CSLI Publications.