Response time efficiency and data quality in online surveys

Abstract

The use of digital surveys intensified during the lockdowns of the SARS-CoV-2 pandemic, facilitating data collection in scientific research. However, the inclusion of a larger number of questions has lengthened response times, without consideration of the effect on the quality of the data obtained. This study examines the relationship between response time and data quality in digital surveys used in leadership research. Using a quantitative, non-experimental design, we analyzed a sample of 224 participants who completed a 32-item questionnaire adapted from the MLQ. Multiple linear regression, with gender and age as control variables, showed a positive relationship between response time and data quality, although this trend reverses once responses exceed 20 minutes. Age and gender were also found to significantly influence the relationship analyzed. We conclude that longer response times improve data quality in short surveys, and we recommend further research on this effect in longer questionnaires.
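The curvilinear pattern the abstract describes (quality improving with response time, then reversing past roughly 20 minutes) is the signature of a quadratic term in a multiple regression. The sketch below illustrates that modeling idea on simulated data; the variable names, data-generating process, and coefficient values are illustrative assumptions, not the study's actual dataset or results.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 224  # sample size reported in the abstract

# Hypothetical simulated data (illustrative only)
time = rng.uniform(2, 35, n)        # response time in minutes
gender = rng.integers(0, 2, n)      # binary control variable
age = rng.uniform(18, 65, n)        # control variable
# Inverted-U relation peaking near 20 minutes, plus noise
quality = (50 + 4.0 * time - 0.10 * time**2
           + 1.5 * gender + 0.05 * age + rng.normal(0, 2, n))

# OLS design matrix: intercept, time, time squared, and the controls
X = np.column_stack([np.ones(n), time, time**2, gender, age])
beta, *_ = np.linalg.lstsq(X, quality, rcond=None)

# A negative coefficient on time^2 gives the inverted U; quality stops
# improving at the turning point -b_time / (2 * b_time_sq)
turning_point = -beta[1] / (2 * beta[2])
print(f"estimated turning point: {turning_point:.1f} minutes")
```

Under these simulated parameters the recovered turning point lands near the 20-minute threshold reported in the abstract; with real survey data one would also inspect the significance of the quadratic term before interpreting the reversal.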

Author Biographies

Jorge Andrés Izaguirre Olmedo

Master's in Finance and Corporate Projects; doctoral student, Graduate School, Universidad San Ignacio de Loyola, Lima, Peru; Email: jorge.izaguirre@epg.usil.pe; ORCID: https://orcid.org/0000-0001-5178-8641

Dennys Patricia Jordán Correa

Master's in Education; Bachelor's in Social Communication; lecturer, Universidad Internacional del Ecuador, Guayaquil, Ecuador; Email: dejordanco@uide.edu.ec; ORCID: https://orcid.org/0000-0001-6962-0855

Tania Yolanda Palacios Sarmiento

Doctoral candidate in Projects; Master's in International Business and Foreign Trade; lecturer, Universidad Internacional del Ecuador, Ecuador; Email: tapalaciossa@uide.edu.ec; ORCID: https://orcid.org/0000-0001-5111-6319

References

Andreadis, I., & Kartsounidou, E. (2020). The impact of splitting a long online questionnaire on data quality. Survey Research Methods, 14(1), 31–42. https://doi.org/10.18148/srm/2020.v14i1.7294

Avolio, B., & Bass, B. (2004). Multifactor Leadership Questionnaire: Instrument (Leader and Rater Form) and Scoring Guide (Form 5X-Short). English and Spanish versions. Mind Garden, Inc.

Bauer, I., Kunz, T., & Gummer, T. (2025). Plain language in web questionnaires: effects on data quality and questionnaire evaluation. International Journal of Social Research Methodology, 28(1), 57–69. https://doi.org/10.1080/13645579.2023.2294880

Caria, A., Delogu, M., Meleddu, M., & Sotgiu, G. (2024). People inflows as a pandemic trigger: Evidence from a quasi-experimental study. Economics and Human Biology, 52, 101341. https://doi.org/10.1016/j.ehb.2023.101341

Cernat, A., Sakshaug, J., Christmann, P., & Gummer, T. (2024). The impact of survey mode design and questionnaire length on measurement quality. Sociological Methods & Research, 53(4), 1873–1904. https://doi.org/10.1177/00491241221140139

Chauliac, M., Willems, J., Gijbels, D., & Donche, V. (2023). The prevalence of careless response behaviour and its consequences on data quality in self-report questionnaires on student learning. Frontiers in Education, 8. https://doi.org/10.3389/feduc.2023.1197324

Cornesse, C., & Blom, A. G. (2023). Response quality in nonprobability and probability-based online panels. Sociological Methods & Research, 52(2), 879–908. https://doi.org/10.1177/0049124120914940

Creswell, J. W., & Creswell, J. D. (2018). Research Design: Qualitative, Quantitative, and Mixed Methods Approaches (5th ed.). SAGE Publications.

De Rada, V. D. (2022). Strategies to improve response rates to online surveys. Papers: Revista de sociología, 107(4), e3073. https://doi.org/10.5565/rev/papers.3073

Décieux, J. P. (2024). Sequential on-device multitasking within online surveys: A data quality and response behavior perspective. Sociological Methods & Research, 53(3), 1384–1411. https://doi.org/10.1177/00491241221082593

Décieux, J. P., & Sischka, P. E. (2024). Comparing data quality and response behavior between smartphone, tablet, and computer devices in responsive design online surveys. SAGE Open, 14(2). https://doi.org/10.1177/21582440241252116

Ganassali, S. (2008). The influence of the design of web survey questionnaires on the quality of responses. Survey Research Methods, 2(1), 21–32. https://doi.org/10.18148/srm/2008.v2i1.598

Garbarski, D., Dykema, J., Yonker, J. A., Bae, R. E., & Rosenfeld, R. A. (2025). Improving the measurement of gender in surveys: Effects of categorical versus open-ended response formats on measurement and data quality among college students. Journal of Survey Statistics and Methodology, 13(1), 18–38. https://doi.org/10.1093/jssam/smae043

Gibson, A. M., & Bowling, N. A. (2020). The effects of questionnaire length and behavioral consequences on careless responding. European Journal of Psychological Assessment, 36(2), 410–420. https://doi.org/10.1027/1015-5759/a000526

Goerres, A., & Höhne, J. K. (2023). Evaluating the response effort and data quality of established political solidarity measures: a pre-registered experimental test in an online survey of the German adult resident population in 2021. Quality & Quantity, 57(6), 1–17. https://doi.org/10.1007/s11135-022-01594-4

Gummer, T., Bach, R., Daikeler, J., & Eckman, S. (2021). The relationship between response probabilities and data quality in grid questions. Survey Research Methods, 15(1), 65–77. https://doi.org/10.18148/srm/2021.v15i1.7727

Hernández, R., Fernández, C., & Baptista, M. (2014). Metodología de la investigación. McGraw Hill Education.

Ito, D., & Todoroki, M. (2021). Evaluating the quality of online survey data collected in 2018 in the USA: Univariate, bivariate, and multivariate analyses. International Journal of Japanese Sociology, 30(1), 140–162. https://doi.org/10.1111/ijjs.12117

Korytnikova, N. V. (2021). Paradata as indicators of online survey data quality: Classification experience. Sotsiologicheskie issledovaniia, 3, 111–120. https://doi.org/10.31857/s013216250010298-0

Kunz, T., & Gummer, T. (2025). Effects of objective and perceived burden on response quality in web surveys. International Journal of Social Research Methodology, 28(4), 385–395. https://doi.org/10.1080/13645579.2024.2393795

Lawlor, J., Thomas, C., Guhin, A. T., Kenyon, K., Lerner, M. D., UCAS Consortium, & Drahota, A. (2021). Suspicious and fraudulent online survey participation: Introducing the REAL framework. Methodological Innovations, 14(3), 205979912110504. https://doi.org/10.1177/20597991211050467

Meitinger, K., Behr, D., & Braun, M. (2019). Using apples and oranges to judge quality? Selection of appropriate cross-national indicators of response quality in open-ended questions. Social Science Computer Review, 089443931985984. https://doi.org/10.1177/0894439319859848

Nur, A. A., Leibbrand, C., Curran, S. R., Votruba-Drzal, E., & Gibson-Davis, C. (2024). Managing and minimizing online survey questionnaire fraud: Lessons from the triple C project. International Journal of Social Research Methodology, 27(5), 613–619. https://doi.org/10.1080/13645579.2023.2229651

Peytchev, A., & Peytcheva, E. (2017). Reduction of measurement error due to survey length: Evaluation of the split questionnaire design approach. Survey Research Methods, 11(4), 361–368. https://doi.org/10.18148/srm/2017.v11i4.7145

Schneider, S., Lee, P.-J., Hernandez, R., Junghaenel, D. U., Stone, A. A., Meijer, E., Jin, H., Kapteyn, A., Orriens, B., & Zelinski, E. M. (2024). Cognitive functioning and the quality of survey responses: An individual participant data meta-analysis of 10 epidemiological studies of aging. Journals of Gerontology: Series B, Psychological Sciences and Social Sciences, 79(5). https://doi.org/10.1093/geronb/gbae030

Sischka, P. E., Décieux, J. P., Mergener, A., Neufang, K. M., & Schmidt, A. F. (2022). The impact of forced answering and reactance on answering behavior in online surveys. Social Science Computer Review, 40(2), 405–425. https://doi.org/10.1177/0894439320907067

Staliunaite, I. R., Valvoda, J., & Satoh, K. (2024). Comparative study of explainability methods for legal outcome prediction. Proceedings of the Natural Legal Language Processing Workshop 2024 (NLLP 2024), 243–258.

Vehovar, V., Couper, M. P., & Čehovin, G. (2023). Alternative layouts for grid questions in PC and mobile web surveys: An experimental evaluation using response quality indicators and survey estimates. Social Science Computer Review, 41(6), 2122–2144. https://doi.org/10.1177/08944393221132644

Wang, Y., Chen, X., & Zhou, X. (2024). A new method for identifying low-quality data in perceived usability crowdsourcing tests: Differences in questionnaire scores. International Journal of Human-Computer Interaction, 40(22), 7297–7313. https://doi.org/10.1080/10447318.2023.2263694

Wind, S. A., & Lugu, B. (2024). Combining nonparametric and parametric item response theory to explore data quality: Illustrations and a simulation study. Applied Measurement in Education, 37(2), 109–131. https://doi.org/10.1080/08957347.2024.2345592
Published
2025-07-28
How to Cite
Izaguirre Olmedo, J. A., Jordán Correa, D. P., & Palacios Sarmiento, T. Y. (2025). Eficiencia en el tiempo de respuesta y calidad de los datos en encuestas en línea. Revista Venezolana De Gerencia, 30(13), 407-419. https://doi.org/10.52080/rvgluz.30.especial13.27