Nivetha Kumar1,*, WONG Lik Wei2, Inthrani Raja Indran1, Gavin Stewart DAWE1, and Judy SNG Chia Ghee1
1Department of Pharmacology, Yong Loo Lin School of Medicine (YLLSOM), National University of Singapore (NUS)
2Department of Physiology, YLLSOM, NUS
Sub-Theme
Building Technological and Community Relationships
Keywords
PeerWise, question creation, constructivism, learning
Category
Paper Presentation
Introduction
The co-creation process in curricula allows students to become active collaborators in their learning (Geurts et al., 2024), paving the way for a more meaningful and effective education. PeerWise is an online platform that enables students to create questions and answer their peers’ questions (Denny et al., 2008a, 2008b). It also encourages dialogue among students, who can comment on and rate one another’s questions. Built upon the concepts of active learning, the tool embodies a constructivist approach to education by enabling students to conduct self- and peer-assessment on the platform (Guilding et al., 2021). As such, it is an innovative tool for developing an active learning community.
Rationale of Study
By integrating technology with pedagogy, PeerWise benefits students and enriches academic teaching strategies. This study evaluated the effectiveness of PeerWise in enhancing academic performance among Life Science and medical students at the National University of Singapore.
Methodology
PeerWise has been used by Life Science students in the LSM3219 Neuropharmacology course since 2018 and by medical students in the pharmacology components of the MD2140 course since 2020. Academic performance was compared between students who used PeerWise and those who did not. In addition, the effects of both the number of questions created and the number of questions answered were examined to assess how these activities influenced students’ academic performance. An analysis of questions based on Bloom’s Taxonomy was conducted to compare the 2024 cohort (with AI integration) and the 2019 cohort (without AI integration).
Key Findings
Overall, for the LSM3219 course, cohorts that used PeerWise achieved a higher mean score on the combined quizzes (86.3±10.9%) than cohorts that did not (73.7±13.7%; p=0.0002).
In the MD2140 course, 32% (96/300) of Year II undergraduate medical students used PeerWise, creating 147 questions and submitting 1,714 answers. The mean assessment score for students who used PeerWise (26.0%) was significantly higher (p=0.0434) than that for students who did not (24.7%). Notably, students who engaged actively in question authoring achieved higher exam scores than those with lower authoring activity: the mean scores were 27.6±4.9, 23.7±4.9, 27.5±4.2, 24.0±7.1, and 30.5±0.7 for students who authored 0, 1, 2, 3, and 4 questions, respectively. In contrast, differences in question-answering activity had no significant effect on performance.
Cohorts from 2019 and earlier were classified as having low AI usage, while cohorts from 2020 onward were classified as having high AI usage. Comparing the 2024 cohort with the 2019 cohort in the LSM3219 course, both the number and proportion of questions categorized under “Evaluate” were higher in the cohort with AI usage (60/206, or 29.1%) than in the cohort without AI usage (13/160, or 8.1%).
Significance of the Study
PeerWise appears to be an effective educational tool for promoting academic success, especially when students are encouraged to engage in question creation. Integrating such platforms into classroom practice may foster deeper learning and complement traditional instructional methods. Learners are propelled to engage actively in the learning process, resulting in higher-order learning outcomes: they not only answer questions but also create questions and provide feedback on their peers’ questions (Chen et al., 2022). These positive learning outcomes are evidenced by the higher assessment scores attained by learners who used PeerWise compared with those who did not. Long-term reliance on AI for such tasks may reduce cognitive and critical thinking skills (Kosmyna et al., 2025); tools such as PeerWise instead promote active critical thinking by having students construct questions and critique their peers’ questions.
References
Chen, L., Howitt, S., Higgins, D. & Murray, S. (2022). Students’ use of evaluative judgement in an online peer learning community. Assessment & Evaluation in Higher Education, 47(4), 493-506. https://doi.org/10.1080/02602938.2021.1933378
Denny, P., Hamer, J., Luxton-Reilly, A. & Purchase, H. (2008a). PeerWise: Students sharing their multiple-choice questions. In Proceedings of the Fourth International Workshop on Computing Education Research, 51-58. https://doi.org/10.1145/1404520.1404526
Denny, P., Luxton-Reilly, A. & Hamer, J. (2008b). The PeerWise system of student contributed assessment questions. In Proceedings of the Tenth Conference on Australasian Computing Education, 78, 69-74.
Geurts, E. M. A., Reijs, R. P., Leenders, H. H. M., Jansen, M. W. J., & Hoebe, C. J. P. A. (2024). Co-creation and decision-making with students about teaching and learning: A systematic literature review. Journal of Educational Change, 25, 103-125. https://doi.org/10.1007/s10833-023-09481-x
Guilding, C., Pye, R. E., Butler, S., Atkinson, M., & Field, E. (2021). Answering questions in a co-created formative exam question bank improves summative exam performance, while students perceive benefits from answering, authoring, and peer discussion: A mixed methods analysis of PeerWise. Pharmacology Research & Perspectives, 9, e00833. https://doi.org/10.1002/prp2.833
Kosmyna, N., Hauptmann, E., Yuan, Y. T., Situ, J., Liao, X-H., Beresnitzky, A. V., Braunstein, I., & Maes, P. (2025). Your brain on ChatGPT: Accumulation of cognitive debt when using an AI assistant for essay writing task. arXiv. https://doi.org/10.48550/arXiv.2506.08872