RECSM regularly organizes seminars on methodological issues and substantive social science research at Universitat Pompeu Fabra.

 

Upcoming 2021 webinars 

 

May 18, 2021, at 11h (CET).

Mobile optimization strategies: effects on web survey participation

Presenter: Marc Asensio (RECSM-Universitat Pompeu Fabra)

Abstract: 
In recent years, smartphone penetration has continuously increased around the world, and a growing part of the population relies exclusively on smartphones to access the internet. Coverage rates for web surveys are better than ever, but to maximize response rates, efforts must be made to adapt survey designs to accommodate ‘smartphone-dependent’ participants. Furthermore, to capitalize on the new data collection opportunities offered by mobile devices, there is an interest in actively encouraging and normalizing responding to surveys on mobile devices in the general population. A range of mobile optimization strategies are available for this purpose, but little is known about their relative effectiveness and their impact on survey costs and errors. We address this question in the present study through a comparison of three strategies to optimize the mobile device experience in web surveys, implemented in a probability-based, three-wave election study conducted in Switzerland in 2019: (1) the standard approach of providing a URL to a browser-based survey and optimizing the display of the questionnaire on smartphones (N = 8,000); (2) adapting the invitation to promote mobile response and providing a QR code to access the survey (N = 1,088); and (3) providing a QR code to download a smartphone application and participate through it (N = 1,087). We compare the three mobile optimization strategies to assess their relative impact on a) response rates and sample composition, overall and on mobile devices; b) estimates for target survey variables; and c) the progression of fieldwork, in order to draw conclusions about which strategy provides the best balance in terms of cost efficiency and representation of the target population.

Join this webinar via Zoom:
https://zoom.us/j/96393526919?pwd=UnlBcWpaU3ZPVEQ0VUc0V25zMGY3Zz09

Meeting ID: 963 9352 6919
Passcode: 094854

 

Past 2021 webinars 

January 19, 2021, at 11h.
Open question formats: Comparing the suitability of requests for text and voice answers in smartphone surveys

Presenter: Jan Karem Höhne (University of Mannheim, RECSM-Universitat Pompeu Fabra)
Co-authors: Annelies Blom (University of Mannheim), Konstantin Gavras (University of Mannheim), Melanie Revilla (RECSM-Universitat Pompeu Fabra), Leonie Rettig (University of Mannheim)

Abstract:

While surveys provide important standardized data about the population with large samples, they are limited regarding the depth of the data provided. Although surveys can offer open answer formats, the completeness of and detail provided in these formats are often limited, particularly in self-administered web surveys, for several reasons: On the one hand, respondents find it difficult to express their attitudes in open answer formats by keying in their answers. Respondents also keep their answers short or skip such questions altogether. On the other hand, survey designers seldom encourage respondents to elaborate on their open answers, because the ensuing coding and analysis have long been conducted manually. This makes the process time-consuming and expensive, reducing the attractiveness of such formats. However, technological developments for surveys on mobile devices, particularly smartphones, enable the collection of voice answers instead of text answers, which may facilitate answering questions with open answer formats and provide richer data. Additionally, new developments in automated speech-to-text transcription and text coding and analysis allow the proper handling of open answers from large-scale surveys. Given these new research opportunities, we address the following research question: How do requests for voice answers, compared to requests for text answers, affect response behavior and survey evaluations in smartphone surveys? We conducted an experiment in a smartphone survey (N = 2,400) using the opt-in Omninet Panel (Forsa) in Germany in December 2019 and January 2020. From its panel, Forsa drew a quota sample based on age, education, gender, and region (East and West Germany) to match the German population on these demographic characteristics. To collect respondents’ voice answers, we developed the JavaScript- and PHP-based “SurveyVoice (SVoice)” tool, which records voice answers via the microphone of smartphones.
We randomly assign respondents to answer format conditions (i.e., text or voice) and ask them six questions dealing with the perception of the most important problem in Germany as well as attitudes towards the current German Chancellor and several German political parties. In this study, we compare requests for text and voice answers in smartphone surveys with respect to several aspects: First, we investigate item nonresponse (i.e., item missing data) as an indicator of low data quality. Second, we investigate response times (i.e., the time elapsed between the presentation of the question on the screen and the submission of the survey page) as an indicator of respondent burden. Finally, we investigate respondents’ survey evaluations (i.e., level of interest and level of difficulty stated by respondents) as an indicator of survey satisfaction. This experiment aims to test the feasibility of collecting voice answers to open-ended questions as an alternative data source in contemporary smartphone surveys. In addition, it explores whether and to what extent voice answers collected through the built-in microphones, compared to open answers entered via the keyboard of smartphones, represent a sound methodological substitute.

 

February 23, 2021, at 11h via Zoom.

Are you paying for or with quality? Survey participation due to monetary incentives and measurement quality – Evidence from the GESIS Panel

Presenter: Hannah Schwarz (RECSM-Universitat Pompeu Fabra)

Abstract:
In times of decreasing response rates, monetary incentives are increasingly used to motivate individuals to participate in surveys. Receiving an incentive can affect respondents’ motivation to take a survey and, consequently, their survey-taking behaviour. On the one hand, the resulting extrinsic motivation might undermine intrinsic motivation, thus leading respondents to invest less effort into answering a survey. On the other hand, monetary incentives could make respondents more eager to invest effort into answering a survey, as they feel they are compensated for doing so. This study aims to assess whether there are differences in measurement quality between respondents who are motivated to take surveys by the received incentive and respondents for whom this is not a reason for participation. We implemented two Multitrait-Multimethod (MTMM) experiments in the probability-based GESIS Panel in Germany (2019) to estimate the measurement quality of 18 questions asked to panelists. By coding panelists’ open answers to a question about their reasons for participation, we distinguish panelists who state that they are motivated by the incentive from those who do not. We analyse the MTMM experiments for these two groups separately and compare the resulting measurement quality estimates.

 

March 16, 2021, at 13h (CET) via Zoom.

Affective polarization: its measurement in multi-party contexts and its relationship with ideology

Presenter: Josep Maria Comellas (RECSM - Universitat Pompeu Fabra)

Co-author: Mariano Torcal (RECSM Director, Universitat Pompeu Fabra)

Abstract:

Affective polarization broadly refers to the extent to which individuals feel sympathy towards in-groups and antagonism towards out-groups. While this topic has been extensively studied in the United States, affective polarization has increasingly received comparative attention in an attempt to study this phenomenon in multi-party settings. In the first part of the presentation, we review some of the main indices proposed in the literature to measure affective polarization, and we explain the ones that we have implemented using different datasets (CNEP, CSES, E-DEM). Then, in the second part, we present a paper focused on the relationship between ideology and affective polarization. Specifically, we test the predominance of identity over issues in explaining affective polarization in a multi-party system, taking advantage of an original panel dataset (E-DEM, 2018-2019) collected in Spain. The main results show that ideological identity and affective polarization strongly reinforce each other over time, polarizing society in identity terms but not so much due to conflicts emerging from issue positioning and sorting. Issue-based ideology exerts more modest affective polarizing effects, and only among those individuals whose positions on concrete issues are closely in line with their ideological identity.

 

April 6, 2021, at 15h (CET) via Zoom.

MCSQ: The Multilingual Corpus of Survey Questionnaires

Presenter: Danielly Sorato (RECSM-Universitat Pompeu Fabra)

Abstract: 
The Multilingual Corpus of Survey Questionnaires (MCSQ) is the first publicly available corpus of international survey questionnaires, comprising survey items from the European Social Survey (ESS), the European Values Study (EVS), and the Survey of Health, Ageing and Retirement in Europe (SHARE). The recently released Version 2.0 (entitled Mileva Marić-Einstein) is composed of questionnaires from the aforementioned studies in the (British) English source language and their translations into eight languages, namely Catalan, Czech, French, German, Norwegian, Portuguese, Spanish, and Russian, as well as 29 language varieties (e.g. Swiss French). The MCSQ is a relevant digital artefact that allows researchers in the fields of social sciences and linguistics to quickly search and compare survey items in a myriad of languages.
The MCSQ was developed within the SSHOC (Social Sciences & Humanities Open Cloud) project, which forms part of the EU Horizon 2020 Research and Innovation Programme (2014-2020) and is conducted under Grant Agreement No. 823782.
The digitalized survey items are an interesting resource for survey research, translation studies, and lexicology, among other fields. In this seminar, we present the corpus characteristics and showcase applications of the MCSQ.

Previous years