Should we trust Russian surveys?

Eemil Mitikka,
Doctoral Researcher,
Aleksanteri Institute, University of Helsinki,
Finland

Social surveys have been an important part of the social sciences ever since George Gallup successfully predicted the re-election of Franklin D. Roosevelt. They aim to represent the opinions of a population, so that politicians, scholars and ordinary citizens can get a grasp of what is going on in society.

With the development of statistical methods and the survey industry, the demand for social surveys has grown substantially. However, whereas surveys are thought to represent public sentiment fairly accurately in democratic contexts, there are more doubts about their reliability in undemocratic settings. Some critics maintain that surveys in non-democratic societies are unreliable and biased, because an undemocratic political atmosphere is believed to distort the opinion climate. As a researcher who studies Russia and uses survey data, I face the claim about biased and unreliable Russian surveys regularly, both in everyday and professional discussions.

In most cases, Russian surveys are criticized for fabricated or exaggerated numbers, preference falsification and unrepresentative samples. Firstly, some critics maintain that certain sensitive questions, such as “do you trust Vladimir Putin?”, lead to fabricated figures, because Russian polling agencies face pressure to fake the numbers in the Kremlin’s favor to bolster its legitimacy. Secondly, a state-controlled media environment is thought to prime the Russian public to express exaggerated support for the establishment. Thirdly, survey respondents are believed to misrepresent their true opinions out of fear of adverse consequences for giving socially undesirable answers. Finally, general social apathy and distrust in the polling industry are hypothesized to lower participation in surveys, which leads to the under-representation of some social groups in Russian survey data.

The above-mentioned concerns are relevant, but they do not establish adequate grounds for labelling Russian survey data as unreliable and unusable. It seems unlikely that survey numbers are made up, because data from the major Russian polling agencies (FOM, VTsIOM and the Levada Center) often paint rather similar pictures on many social questions. Although FOM and VTsIOM have close ties with the Kremlin, the Levada Center is known for co-operating with respected international organizations and scholars. Public opinion data from the aforementioned organizations also seem to follow political trends and events in Russia quite consistently. This indicates that it is unlikely that these numbers are simply pulled out of thin air.

Moreover, although the Kremlin has the means to mold public opinion, it is unclear how successful these efforts are. The state does own 90% of the mass media in Russia, but its efforts to take control of the internet have been far less successful. At the same time, internet penetration is widespread, and every third Russian receives information about world events from online sources. Thus, although the Kremlin seeks to shape public sentiment through the mass media, it cannot fully control how citizens consume this information and how their perceptions are formed.

Concerns about preference falsification in Russian surveys are relevant. An undemocratic societal context causes a particularly gnawing doubt that respondents might hide their true opinions when answering survey questions in order to conform to social norms. Yet, as already noted, Russian public opinion data seem to go quite consistently hand in hand with political trends. In 2018, for example, large-scale protests erupted across Russia in response to the government’s pension reform, which suggests that Russians are not afraid to speak out against policies they dislike. Moreover, a recent experimental study by Timothy Frye and others (2017) suggests that preference falsification is limited in contemporary Russia.

Regarding the problem of unrepresentative samples, it is true that low response rates may cause the under-representation of some socio-economic groups; if only certain kinds of people respond to surveys, the validity of the results becomes questionable. However, it is important to note that survey response rates are low globally, and missing data can be replaced to some extent by imputation techniques. Furthermore, many alternative data sources for studying public opinion – such as social media data – also suffer from the under-representation problem, and they can actually be even less representative than traditional survey data.

To answer the question posed in the title: obviously, we should not trust Russian surveys blindly. Yet, since alternative ways to study mass attitudes are limited, surveys retain their usefulness and relevance in public opinion studies. Naturally, it is possible that better methods for studying public sentiment will emerge in the future. In the meantime, however, traditional surveys serve as valuable tools for analyzing societies – including contemporary Russia.

Expert article 2668

Baltic Rim Economies 1/2020