Many low-stakes assessments, such as international large-scale surveys, are administered in time-limited testing sessions, and some test-takers are unable to respond to the last items of the test, resulting in not-reached (NR) items. However, because the test has no consequences for the respondents, NR items can also stem from quitting the test. Using mixture modeling, this article investigates heterogeneity in the onset of NR items in the PISA 2015 reading assessment. Test-taking behavior, assessed through response times on the first items of the test, and the risk of NR item onset are modeled simultaneously in a three-class model that distinguishes rapid, slow, and typical respondents. Results suggest that NR items can arise from a lack of time or from disengaged behavior, and that the relationship between the number of NR items and ability estimates can be affected by such non-effortful NR responses.
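To make the two ingredients of the abstract concrete, the following minimal Python sketch (not the authors' code; the function names nr_onset and early_speed and the choice of the first five items are illustrative assumptions) shows how one might derive, per respondent, the item position at which the trailing run of not-reached items begins and a response-time-based engagement proxy computed on the first items. In the article these quantities feed a three-class mixture model; here they are only extracted from toy data.

# Illustrative sketch only: derive the NR-onset position and an
# early-response-time engagement proxy from toy respondent data.
import numpy as np

def nr_onset(responses):
    """Return the 1-based position where the trailing run of missing
    responses starts, or None if the respondent reached the last item."""
    onset = None
    for j in range(len(responses) - 1, -1, -1):
        if np.isnan(responses[j]):
            onset = j + 1
        else:
            break
    return onset

def early_speed(times, n_first=5):
    """Mean log response time on the first n_first items
    (hypothetical indicator of test-taking behavior)."""
    return float(np.mean(np.log(times[:n_first])))

# Toy respondent: answers items 1-8; items 9-10 are not reached.
resp = np.array([1, 0, 1, 1, 0, 1, 1, 0, np.nan, np.nan])
rt = np.array([42.0, 55.0, 37.0, 60.0, 48.0, 33.0, 51.0, 40.0, np.nan, np.nan])
print(nr_onset(resp), round(early_speed(rt), 2))  # -> 9 3.86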
Disciplines :
Education & instruction
Author, co-author :
Pools, Elodie ; Université de Liège - ULiège > Evaluation et qualité de l'enseignement (EQUALE)
Language :
English
Title :
Not-Reached Items: An Issue of Time and of Test-Taking Disengagement? The Case of PISA 2015 Reading Data