Validation of the progression of chemical identity thinking using Rasch analysis: A response to challenges in educational measurement

Jonathan M. Barcelo (jmbarcelo@slu.edu.ph), School of Nursing, Allied Health, and Biological Sciences, Saint Louis University (Philippines)
Summary: 
Objective measurement of students' reasoning competencies in science contexts is relevant to promoting reforms in science education. Over the years, the development of progression-based assessments has been advanced by applying Rasch analysis to research instruments. As a contribution to this reform, this study describes the development and validation of the Chemical Identity Thinking Instrument, which is anchored in a hypothesized progression of chemical identity thinking among pre-medical students. Item content validity indices were determined to inform item restructuring and revision. The instrument was administered to 362 first-year to third-year pre-medical students from five higher education institutions in the Philippines. A scoring rubric was used to evaluate the accuracy of students' answers, the accuracy of their explanations, and the type of explanation they provided; based on these parameters, each student's level of chemical identity thinking was determined. By applying the Rasch rating scale model, evidence of reliability and construct validity was obtained. The applications of the instrument and the scoring rubric are discussed in the context of assessment and evaluation in pre-medical programs, and the interpretation of changes in student abilities along the hypothesized levels of chemical identity thinking across all items is also described.
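As a rough illustration of the two quantitative steps named above, the sketch below computes an item-level content validity index (I-CVI) from expert relevance ratings and the category probabilities of the Andrich rating scale model, the Rasch variant applied in the study. It is a minimal sketch for orientation only, not the authors' analysis (which would typically be run in dedicated Rasch software); the panel size, rating data, item difficulty, threshold values, and function names are all hypothetical.

```python
# Minimal sketch (hypothetical data): the two quantitative steps named in the
# summary, expressed in plain Python for orientation only.
import math

def item_cvi(ratings, relevant_categories=(3, 4)):
    """Item-level content validity index: the proportion of expert panelists
    who rate an item as relevant (3 or 4 on a 4-point relevance scale)."""
    relevant = sum(1 for r in ratings if r in relevant_categories)
    return relevant / len(ratings)

def rating_scale_probs(theta, delta, thresholds):
    """Category probabilities P(X = k), k = 0..m, under the Andrich rating
    scale model for a person of ability `theta` (logits) on an item of
    difficulty `delta` with shared thresholds tau_1..tau_m."""
    taus = [0.0] + list(thresholds)          # tau_0 is fixed at 0
    numerators, cumulative = [], 0.0
    for tau in taus:
        cumulative += theta - delta - tau    # running sum of (theta - delta - tau_j)
        numerators.append(math.exp(cumulative))
    total = sum(numerators)
    return [n / total for n in numerators]

def expected_score(theta, delta, thresholds):
    """Model-expected item score, useful when comparing observed responses
    with model predictions."""
    return sum(k * p for k, p in enumerate(rating_scale_probs(theta, delta, thresholds)))

if __name__ == "__main__":
    # Hypothetical expert panel: six raters judging one item on a 1-4 relevance scale.
    print("I-CVI:", item_cvi([4, 4, 3, 4, 2, 3]))          # 5/6 ~ 0.83

    # Hypothetical 4-category rubric (scores 0-3) -> three shared thresholds.
    thresholds = [-1.2, 0.1, 1.4]
    for theta in (-2.0, 0.0, 2.0):
        probs = rating_scale_probs(theta, delta=0.5, thresholds=thresholds)
        print(f"theta={theta:+.1f}",
              [round(p, 3) for p in probs],
              "E[X] =", round(expected_score(theta, 0.5, thresholds), 2))
```

In practice, the difficulty and threshold estimates come from fitting the model to the full response matrix in dedicated Rasch software; a sketch like this only helps in reading category probability curves and expected scores once those estimates are available.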
Keywords: 
Chemical identity
chemical identity thinking
pre-medical students
Rasch analysis
reasoning patterns