CHAI Jing Wen1,*, TAY Shan Ren2, Joshua Le Chuen TSU2, and LEE Li Neng2,3
1Department of English, Linguistics and Theatre Studies, Faculty of Arts and Social Sciences (FASS), National University of Singapore (NUS)
2Department of Psychology, FASS, NUS
3NUS Teaching Academy
Sub-Theme
Interdisciplinary Education
Keywords
Interdisciplinary learning, general education, common curriculum, student outcomes, longitudinal assessment
Category
Paper Presentation
Context
In recent years, higher education has shifted towards interdisciplinary learning to address complex real-world challenges and the need for innovative problem-solving by integrating knowledge across disciplines (Newell, 2010). At the National University of Singapore (NUS), the establishment of the College of Humanities and Sciences (CHS) and the College of Design and Engineering (CDE), as well as the redesign of the General Education (GE) framework, represent the university's strategic initiatives to address these real-world needs. While these initiatives are in their early stages of implementation and set the path for students to engage in flexible and transformative learning, it is timely and purposeful to examine whether students acquire the intended learning outcomes as they progress through their undergraduate journey and beyond graduation. Addressing this, we propose an evaluative framework to track the development of student outcomes unique to NUS, using self-report questionnaires administered at multiple time points from matriculation to post-graduation.
Methodology
Our proposed framework extends the empirical model of interdisciplinary learning (Schijf et al., 2023) by incorporating the dimension of character into the existing dimensions of knowledge and skills (see Figure 1). Together, the Knowledge, Skills, and Character (KSC) triadic framework provides a holistic assessment of interdisciplinary learning and development among students in NUS.
The self-report questionnaire will be developed according to the KSC framework through a systematic two-step process:
(A) classifying the student learning outcomes in NUS under their respective KSC dimensions (sketched illustratively below), and
(B) developing the questionnaire from empirically validated scales and measures, with items shortlisted and refined to be contextually relevant to the educational landscape in NUS.
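To make step (A) concrete, the sketch below (in Python) represents the classification as a simple mapping from learning outcomes to KSC dimensions. The outcome labels mirror Figure 1 and Table 1; the data structure and function are a hypothetical illustration, not part of the proposed instrument.

```python
# Minimal sketch of step (A): mapping NUS student learning outcomes to the
# Knowledge, Skills, and Character (KSC) dimensions. The outcome labels follow
# Table 1; the dictionary itself is illustrative only.
KSC_CLASSIFICATION = {
    "Knowledge": [
        "Investigating complex problems",
        "Devising interdisciplinary solutions",
        "Critical thinking",
        "Interdisciplinary thinking",
    ],
    "Skills": [
        "Self-reflection",
        "Communication",
        "Recognising disciplinary perspectives",
        "Collaboration",
        "Civic and social responsibility",
        "Digital and data literacy",
    ],
    "Character": [
        "Curiosity",
        "Openness",
        "Respect and empathy",
        "Perseverance and grit",
        "Self-esteem and humility",
        "Tolerance of ambiguity",
        "Sensitivity to bias",
        "Personality traits",
    ],
}

def dimension_of(outcome: str) -> str:
    """Return the KSC dimension under which a learning outcome is classified."""
    for dimension, outcomes in KSC_CLASSIFICATION.items():
        if outcome in outcomes:
            return dimension
    raise KeyError(f"Outcome not classified: {outcome}")
```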
Interdisciplinary student learning outcomes that are common across disciplines in NUS were first identified and classified under the relevant dimension (see Figure 1A). Next, a literature search was performed to identify and shortlist scales and measures with strong psychometric properties and empirical relevance to the KSC dimensions. The shortlisted scales were then screened item by item, and items were selected and edited to be contextually relevant to the current education landscape in NUS. Table 1 below presents the scales and measures selected for each dimension. An ongoing evaluation is further trimming these items to keep the questionnaire concise and to ensure that every item is pertinent to the existing undergraduate curricula in NUS.
Table 1. Scales and measures for the proposed KSC framework
| Student Outcomes | Scales & Measures | Source | Internal Consistencies |
| --- | --- | --- | --- |
| Knowledge: Investigating complex problems; Devising interdisciplinary solutions; Critical thinking; Interdisciplinary thinking | Interdisciplinary Understanding Questionnaire (IUQ) | Schijf et al. (2023) | 0.40 – 0.98 |
| | Measure of Interdisciplinary Competence (MIC) | Lattuca et al. (2012) | 0.76 |
| | Critical Thinking Toolkit (CriTT) | Stupple et al. (2017) | 0.60 – 0.92 |
| Skills: Self-reflection; Communication; Recognising disciplinary perspectives; Collaboration; Civic and social responsibility; Digital and data literacy | Self-Reflection and Insight Scale (SRIS) | Grant et al. (2002) | 0.71 – 0.91 |
| | Communication Skills Attitude Scale (CSAS) | Rees et al. (2002) | 0.81 – 0.87 |
| | Measure of Interdisciplinary Competence (MIC) | Lattuca et al. (2012) | 0.76 – 0.79 |
| | Interprofessional Collaborative Competency Attainment Survey (ICCAS) | Archibald et al. (2014) | 0.94 – 0.98 |
| | Reflective Thinking Questionnaire (RTQ) | Leon & Prudente (2018) | 0.77 |
| | Ethical Student Scale (ESS) | Rua et al. (2024) | 0.86 |
| | Computational Thinking Scale (CTS) | Tsai et al. (2020) | 0.81 |
| | Self-Efficacy in Data Literacy Scale (SEDLS) | Kim et al. (2024) | 0.86 – 0.97 |
| | Generalized Professional Responsibility Assessment (GPRA) | Schiff et al. (2024) | 0.51 – 0.77 |
| | Youth Civic and Character Measures Toolkit (YCCMT) | Syvertsen et al. (2015) | 0.84 |
| Character: Curiosity; Openness; Respect and empathy; Perseverance and grit; Self-esteem and humility; Tolerance of ambiguity; Sensitivity to bias; Personality traits | Intellectual Virtues for Interdisciplinary Research Scale (IVIRS) | Vanney et al. (2024) | 0.83 – 0.95 |
| | Virtuous Intellectual Character Scale (VICS) | Mesurado & Vanney (2024) | 0.57 – 0.83 |
| | Short Grit Scale (GRIT-S) | Duckworth & Quinn (2009) | 0.73 – 0.83 |
| | Intellectual Humility Scale (IHS) | Krumrei-Mancuso & Rouse (2016) | 0.70 – 0.89 |
| | Multiple Stimulus Types Ambiguity Tolerance Scale (MSTAT-II) | McLain (2009) | 0.83 |
| | Melbourne Decision-Making Questionnaire (MDQ) | Filipe et al. (2020) | 0.74 – 0.87 |
| | Revised Need for Closure Scale (NFC-R) | Roets & Van Hiel (2011) | 0.87 |
| | Scale of Cognitive Bias in the Context of Analytical Thinking Skills (CBCATS) | Baysal & Ocak (2022) | 0.70 – 0.85 |
| | Big Five Inventory-2 Short Form (BFI-2-S) | Soto & John (2017) | 0.73 – 0.91 |
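For context on the Internal Consistencies column and the ongoing item-trimming exercise, the sketch below shows one conventional way such reliability estimates (Cronbach's alpha, and alpha with an item deleted) can be computed from item-level responses. The sample data and function names are hypothetical and are not drawn from the study.

```python
# Minimal sketch (hypothetical data): computing Cronbach's alpha for a set of
# questionnaire items, and alpha-if-item-deleted to guide item trimming.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                            # number of items
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

def alpha_if_deleted(items: np.ndarray) -> list[float]:
    """Alpha recomputed with each item removed in turn; a value higher than the
    full-scale alpha flags that item as a candidate for trimming."""
    k = items.shape[1]
    return [cronbach_alpha(np.delete(items, j, axis=1)) for j in range(k)]

# Hypothetical responses: 5 respondents x 4 Likert-type items.
responses = np.array([
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 4, 5],
])
print(f"alpha = {cronbach_alpha(responses):.2f}")
print("alpha if item deleted:", [round(a, 2) for a in alpha_if_deleted(responses)])
```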
Discussion and Significance
The KSC triadic framework brings together a comprehensive set of self-report measures to capture the holistic development of interdisciplinary learning outcomes in terms of knowledge, skills, and character, which are vital for engaging in complex problem-solving today. What is unique about the framework is its measurement of civic and social responsibility, digital and data literacy, and character traits that reflect the keen intellect and attitude of an NUS graduate.
Although interdisciplinary learning outcomes are tracked through self-report measures that are subjective in nature, we expect the findings to offer valuable insights into students' perceived cognitive, skill-based, and character development over time in relation to their undergraduate education in NUS. These insights will help improve the design of curricula to strengthen the professional readiness of our graduates and their keenness for lifelong learning.
References
Archibald, D., Trumpower, D., & MacDonald, C. J. (2014). Validation of the interprofessional collaborative competency attainment survey (ICCAS). Journal of Interprofessional Care, 28(6), 553-558. https://doi.org/10.3109/13561820.2014.917407
Baysal, E. A., & Ocak, G. (2022). University students’ cognitive bias in the context of their analytical thinking skills: A reliability and validity study. International Journal of Progressive Education, 18(3), 205-225. https://eric.ed.gov/?id=EJ1352232
Duckworth, A. L., & Quinn, P. D. (2009). Development and validation of the Short Grit Scale (GRIT–S). Journal of Personality Assessment, 91(2), 166-174. https://doi.org/10.1080/00223890802634290
Filipe, L., Alvarez, M-J., Roberto, M. S., & Ferreira, J. A. (2020). Validation and invariance across age and gender for the Melbourne decision-making questionnaire in a sample of Portuguese adults. Judgment and Decision Making, 15(1), 135-148. https://doi.org/10.1017/S1930297500006951
Grant, A. M., Franklin, J., & Langford, P. (2002). The self-reflection and insight scale: A new measure of private self-consciousness. Social Behavior and Personality: An International Journal, 30(8), 821-835. https://doi.org/10.2224/sbp.2002.30.8.821
Kim, J., Hong, L., & Evans, S. (2024). Toward measuring data literacy for higher education: Developing and validating a data literacy self‐efficacy scale. Journal of the Association for Information Science and Technology, 75(8), 916-931. https://doi.org/10.1002/asi.24934
Krumrei-Mancuso, E. J., & Rouse, S. V. (2016). The development and validation of the comprehensive intellectual humility scale. Journal of Personality Assessment, 98(2), 209-221. https://doi.org/10.1080/00223891.2015.1068174
Lattuca, L. R., Knight, D. B., & Bergom, I. M. (2012). Developing a measure of interdisciplinary competence for engineers. In 2012 ASEE Annual Conference & Exposition, 25-415. https://doi.org/10.18260/1-2–21173
Leon, J. L. D., & Prudente, M. S. (2018). Development and validation of reflective thinking questionnaire for senior high school students. Advanced Science Letters, 24(11), 8072-8075. https://doi.org/10.1166/asl.2018.12494
McLain, D. L. (2009). Evidence of the properties of an ambiguity tolerance measure: The multiple stimulus types ambiguity tolerance scale–II (MSTAT–II). Psychological Reports, 105(3), 975-988. https://doi.org/10.2466/PR0.105.3.975-988
Mesurado, B., & Vanney, C. E. (2024). Assessing intellectual virtues: The Virtuous Intellectual Character Scale (VICS). International Journal of Applied Positive Psychology, 9, 1803-1826. https://doi.org/10.1007/s41042-024-00193-y
Newell, W. H. (2010). Educating for a complex world: Integrative learning and interdisciplinary studies. Liberal Education, 96(4), 6-12. https://link.gale.com/apps/doc/A255538024/AONE?u=anon~1578b75&sid=googleScholar&xid=86a1fa33
Rees, C., Sheard, C., & Davies, S. (2002). The development of a scale to measure medical students’ attitudes towards communication skills learning: The Communication Skills Attitude Scale (CSAS). Medical Education, 36(2), 141-147. https://doi.org/10.1046/j.1365-2923.2002.01072.x
Roets, A., & Van Hiel, A. (2011). Item selection and validation of a brief, 15-item version of the Need for Closure Scale. Personality and Individual Differences, 50(1), 90-94. https://doi.org/10.1016/j.paid.2010.09.004
Rua, T., Lawter, L., & Andreassi, J. (2024). The ethical student scale: Development of a new measure. Organization Management Journal, 21(3), 117-128. https://doi.org/10.1108/OMJ-03-2023-1831
Schiff, D. S., Lee, J., Borenstein, J., & Zegura, E. (2024). The impact of community engagement on undergraduate social responsibility attitudes. Studies in Higher Education, 49(7), 1151-1167. https://doi.org/10.1080/03075079.2023.2260414
Schijf, J. E., van der Werf, G. P. C., & Jansen, E. P. W. A. (2023). Measuring interdisciplinary understanding in higher education. European Journal of Higher Education, 13(4), 429-447. https://doi.org/10.1080/21568235.2022.2058045
Soto, C. J., & John, O. P. (2017). Short and extra-short forms of the Big Five Inventory–2: The BFI-2-S and BFI-2-XS. Journal of Research in Personality, 68, 69-81. https://doi.org/10.1016/j.jrp.2017.02.004
Stupple, E. J. N., Maratos, F. A., Elander, J., Hunt, T. E., Cheung, K. Y. F., & Aubeeluck, A. V. (2017). Development of the Critical Thinking Toolkit (CriTT): A measure of student attitudes and beliefs about critical thinking. Thinking Skills and Creativity, 23, 91-100. https://doi.org/10.1016/j.tsc.2016.11.007
Syvertsen, A. K., Wray-Lake, L., & Metzger, A. (2015). Youth civic and character measures toolkit. Minneapolis, MN: Search Institute. http://www.search-institute.org/downloadable/Youth-Civic-Character-Measures-Toolkit.pdf
Tsai, M-J., Liang, J-C., & Hsu, C-Y. (2020). The computational thinking scale for computer literacy education. Journal of Educational Computing Research, 59(4), 579-602. https://doi.org/10.1177/0735633120972356
Vanney, C. E., Mesurado, B., & Sáenz, J. I. A. (2024). Measuring qualities needed for interdisciplinary work: The Intellectual Virtues for Interdisciplinary Research Scale (IVIRS). PLoS ONE, 19(11), e0312938. https://doi.org/10.1371/journal.pone.0312938