
Are Meaningful Public Opinion Polls Possible in Today’s Russia?

Denis Volkov

Ever since Vladimir Putin announced the start of a special military operation (SMO), public opinion polls in Russia have been criticized with renewed vigor. Critics usually give the following reasons why Russian polls cannot be trusted. Some argue that in 2022 the response rate declined sharply and people became far more reluctant to participate in surveys. Others say that respondents break off interviews as soon as the conversation turns to events in Ukraine. Some claim that only supporters of the government now take part in the polls, while opposition-minded citizens prefer to abstain. Others point to survey list experiments, which appear to yield lower support rates, and argue that these reveal the “real mood” of the people. Finally, the most radical critics say that polls are irrelevant because they do not show what people “really think.” Let us examine each of these allegations in turn.

At the Levada Center, we use the AAPOR recommendations to calculate the response rate for each survey we conduct. In 2022, the average response rate in our regular all-Russian door-to-door survey was 27 percent. This is slightly lower than in 2021 (31 percent on average), but higher than the averages of previous years (25 percent in 2020, 20 percent in 2019). For now, let us leave aside the debate about what response rate should be considered sufficient (in the United States, for example, 9 percent has been considered acceptable for telephone surveys). What matters is that the response rate has not changed much in the past year, just as attitudes toward the surveys themselves have not changed. If it were otherwise, surveys using the usual methodology would have become impossible: interviewers would not take on an obviously hopeless task, or the cost of their work would have become prohibitive. Neither has happened.
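
For readers unfamiliar with the AAPOR standard, the arithmetic itself is simple. Below is a minimal sketch of AAPOR Response Rate 1 (RR1); the disposition counts are hypothetical, chosen only so that the result lands near the 27 percent figure above, and are not Levada Center data.

```python
# Minimal sketch of AAPOR Response Rate 1 (RR1):
#   RR1 = I / (I + P + R + NC + O + UH + UO)
# I = complete interviews, P = partials, R = refusals and break-offs,
# NC = non-contacts, O = other non-interviews,
# UH + UO = cases of unknown eligibility.
# All disposition counts below are hypothetical, not Levada Center data.

def aapor_rr1(complete, partial, refusals, non_contacts, other, unknown):
    """AAPOR RR1: completes over all eligible plus unknown-eligibility cases."""
    return complete / (complete + partial + refusals +
                       non_contacts + other + unknown)

rr1 = aapor_rr1(complete=1600, partial=40, refusals=2800,
                non_contacts=1200, other=150, unknown=130)
print(f"RR1 = {rr1:.1%}")  # ~27% with these hypothetical counts
```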

We looked into the problem of interrupted interviews at the end of last year. The analysis showed that this indicator is stable and has barely changed over time. In each survey on questions related to Ukraine and the “special operation” conducted in 2022, only 2 to 7 interviews were broken off, an insignificant number on the scale of the entire survey. Moreover, “Ukrainian” questions do not differ in this respect from questions on other topics. In most cases, if respondents have agreed to answer at all, they will see the survey through to the end, especially in a face-to-face interview. There are therefore no grounds for questioning the quality of survey data on the basis of interrupted interviews.
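
To put those numbers in perspective, here is a minimal sketch; the sample size of 1,600 respondents is an assumption for illustration (a common size for national omnibus surveys), not a figure from this article.

```python
# Break-off rate: interrupted interviews as a share of all started
# interviews in one survey. The sample size of 1,600 is an assumption
# for illustration, not a figure from the article.

SAMPLE_SIZE = 1600  # assumed number of interviews per survey

for interrupted in (2, 7):  # the range reported above
    rate = interrupted / SAMPLE_SIZE
    print(f"{interrupted} break-offs out of {SAMPLE_SIZE}: {rate:.2%}")
# 2 -> 0.12%, 7 -> 0.44%: negligible on the scale of the whole survey
```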

Nor have claims that only supporters of the authorities take part in the polls been confirmed to date. Looking at the results of a panel survey of respondents who had previously taken part in Levada Center surveys, we were unable to confirm either the assumption that people who repeatedly participate in telephone surveys assess events more positively, or the assumption that respondents who disapprove of the country’s leadership refuse to participate more often (and thus that public opinion polls capture the views only of those who are prepared to make contact and answer questions). In other words, the increased support for the authorities and their decisions in 2022 reflects actual changes in public sentiment rather than shortcomings of the survey instrument.
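
One way such a claim can be tested, sketched below on invented data, is to compare re-contact rates between wave-one approvers and disapprovers: if disapprovers dropped out no more often, differential nonresponse is not driving the trend. The two-proportion z-test and all counts here are illustrative assumptions, not the Levada Center’s actual analysis.

```python
# Sketch of a differential-nonresponse check on panel data: do wave-1
# "disapprovers" drop out of the follow-up wave more often than wave-1
# "approvers"? Two-proportion z-test; all counts are invented.
from math import sqrt, erf

def two_prop_ztest(x1, n1, x2, n2):
    """Two-sided z-test for the difference between two proportions."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal CDF via erf
    return z, p_value

# Hypothetical: 520 of 700 wave-1 approvers and 210 of 290 wave-1
# disapprovers agreed to be re-interviewed in wave 2.
z, p = two_prop_ztest(520, 700, 210, 290)
print(f"re-contact rates {520/700:.1%} vs {210/290:.1%}: z = {z:.2f}, p = {p:.3f}")
# a large p-value here would mean no evidence of differential dropout
```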

As for survey list experiments, which seem to indicate lower support for the authorities and the SMO, their results cannot always be interpreted unambiguously. Researchers who conducted such experiments on mass support for Vladimir Putin in Russia in 2015–2021 warn against this reading of their results. The figures obtained in such experiments coincide with the indicators of unconditional support for military action and for the authorities. But this does not mean that doubters, whose support is weak or conditional, do not support the SMO at all. A whole set of factors encourages doubters to side with the majority; reducing everything to a fear of answering pollsters would be a gross oversimplification.
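
For context, the logic of a list experiment is as follows: a control group reports how many items on a neutral list apply to them, a treatment group sees the same list plus the sensitive item, and the difference in mean counts estimates the sensitive item’s prevalence without anyone answering it directly. Below is a minimal sketch of the standard difference-in-means estimator on invented data; it is not the specific design of the 2015–2021 studies mentioned above.

```python
# Difference-in-means estimator for a list experiment (item-count
# technique). Respondents report only HOW MANY items they endorse; the
# treatment list adds the sensitive item, so the gap between group means
# estimates its prevalence. All responses below are invented.
from math import sqrt
from statistics import mean, variance

control = [1, 2, 2, 3, 1, 2, 0, 3, 2, 1] * 30    # counts over 3 neutral items
treatment = [2, 3, 2, 4, 2, 3, 1, 3, 2, 2] * 30  # same 3 items + sensitive one

estimate = mean(treatment) - mean(control)
se = sqrt(variance(treatment) / len(treatment) +
          variance(control) / len(control))
print(f"estimated prevalence: {estimate:.1%} +/- {1.96 * se:.1%} (95% CI)")
```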

Finally, there is the thesis that under repressive Russian conditions, respondents will never say what they “really think.” But surveys are not conducted with a polygraph: we record only what people are willing to share with an interviewer. Pollsters thus obtain information not about people’s innermost thoughts, but about their public attitudes. That, however, should be sufficient to understand and explain their public behavior. One can hardly dispute that the pressure of the Russian state on the individual has recently increased. The obvious goal of such pressure is to change people’s behavior, discouraging them from criticizing the authorities or joining protests. And it works. But the poll results show exactly the same thing.

Besides, when changes in public opinion occur, they are, as a rule, systemic: shifts in the level of support for the authorities are accompanied by shifts in answers to questions about mood, hopes, and economic behavior. Such changes are unlikely to be driven by a fear of answering the questionnaire (provided that the share of those participating in surveys remains more or less constant). Thus, the growth in the approval ratings of the president and government in February–March 2022 and the high levels of support for the SMO were accompanied by a rise in general optimism, enthusiasm, and jingoism.

Moreover, an analysis of long-term trends in public opinion showed already at the end of 2021 and the beginning of 2022 that if a military conflict with the West were to break out, the majority of Russian society would side with the president and the government. By that time, the main contours of Russian society’s attitude toward the conflict had already taken shape: three-quarters were sure that the United States and Ukraine were to blame for the escalation, while only one-third expressed sympathy for Ukraine. Vladimir Putin’s approval rating already stood at 71 percent in mid-February (in March it rose to 83 percent). The main gaps were already visible: between the largest cities and the rest of the country, between young and old, and between TV viewers and Internet users. Polls showed that although Russian society was afraid of the conflict, it was internally ready for it.

Furthermore, even in the spring there were early signs of people’s adaptation to the situation. This showed up first in focus groups and then in surveys (there is no need to pit qualitative and quantitative methods against each other; they are best used in combination). Drawing on materials from earlier studies, it was possible to describe society’s reaction to the mobilization accurately immediately after its announcement. By the end of September, one could already say that Russian society had come to terms with the first wave of mobilization; this was abundantly clear by the end of the year, when the mood largely returned to “pre-mobilization” levels.

All of the above allows us to say that doubts about the quality of polls in today’s Russia are largely unfounded. Analysis and forecasts based on regular sociological research have proven effective; indeed, they have been far more accurate than some of the most widely cited journalistic speculations, which very often fail to come true. Of course, one must be careful with survey data: the survey projects of political activists and obscure outfits may do more to confuse than to illuminate the situation. But it is fair to say that if we dismiss opinion polls wholesale, we deprive ourselves of one of the few proven tools for understanding Russian society.
