Developing a test to measure design thinking

Publication: Contribution to book/anthology/report/proceedings › Conference article in proceedings › Research › peer-reviewed


  • Anna Rusmann, Department of Communication and Psychology, Aalborg Universitet
  • Jeppe Bundsgaard

Like other progressive learning approaches, game-based learning is expected to develop a number of advanced competences in students (Gee, 2007). These competences are often referred to as 21st century skills (Partnership for 21st Century Skills, 2002; Griffin and Care, 2015). Various frameworks of 21st century skills have been proposed, and what they typically have in common is the inclusion of competences such as collaboration, critical thinking and ICT skills. Two ongoing Danish research projects, Game-Based Learning in the 21st Century and Community Drive, involve school-based interventions based on principles from design studies. Students learn to collaboratively investigate real-world problems, conceive ideas, build prototypes of solutions and present them to external parties. The aim is to develop students’ design thinking skills, which can be seen as a subset or special category of 21st century skills. To measure the effects of the interventions, we have developed a computer-based test of design thinking. The test is a performance test, which means that, in addition to being asked to answer factual or procedural questions, students are given the opportunity to engage in activities such as building models, conceiving ideas and reflecting on ethical and social conflicts. To date, no other performance-based assessment of design skills has been developed (Razzouk and Shute, 2012). Our test consists of four test modules. Within each module, students solve different types of tasks within a simulated, authentic narrative. The tasks measure various aspects of four design competences, which we argue are relevant for primary and lower secondary school students. The Rasch model is used to test the dimensionality of the data and scale the responses on the four dimensions. A number of tasks can be scored automatically, while others require human evaluation.
In this paper, we present the test, the underlying design thinking theory, the design decisions and the statistical Rasch model used to scale the responses and validate the test. The principles behind the task scoring are also outlined, and the limitations of the test format are discussed in relation to the measurement of design thinking.
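The abstract's scaling approach rests on the dichotomous Rasch model, in which the probability of a correct response depends only on the difference between a student's ability and an item's difficulty. A minimal sketch of that model and of joint maximum-likelihood estimation by gradient ascent is shown below; the function names, learning rate, and tiny synthetic response matrix are illustrative assumptions, not material from the paper, and real analyses would use dedicated IRT software.

```python
import numpy as np

def rasch_prob(theta, b):
    """P(correct) for a student with ability theta on an item of
    difficulty b, under the dichotomous Rasch model."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

# Tiny synthetic response matrix: rows = students, columns = items,
# 1 = correct. Purely illustrative data.
rng = np.random.default_rng(0)
true_theta = np.array([-1.0, 0.0, 1.0])   # student abilities
true_b = np.array([-0.5, 0.5])            # item difficulties
responses = (rng.random((3, 2)) <
             rasch_prob(true_theta[:, None], true_b[None, :])).astype(int)

# Joint maximum-likelihood estimation by simple gradient ascent:
# the log-likelihood gradient is (observed - expected) response counts.
theta = np.zeros(3)
b = np.zeros(2)
for _ in range(500):
    p = rasch_prob(theta[:, None], b[None, :])
    resid = responses - p
    theta += 0.1 * resid.sum(axis=1)   # ability step
    b -= 0.1 * resid.sum(axis=0)       # difficulty step
    b -= b.mean()                      # identify the scale by centering
```

A multidimensional test like the one described would fit one such scale per design competence and then examine whether the four dimensions are statistically distinguishable.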

Title: Proceedings of the European Conference on Games-based Learning
Editors: Lars Elbaek, Gunver Majgaard, Andrea Valente, Saifuddin Khalid
Number of pages: 9
Publisher: Dechema e.V.
ISBN (electronic): 9781912764389
Status: Published - 2019
Event: 13th International Conference on Game Based Learning, ECGBL 2019 - Odense, Denmark
Duration: 3 Oct 2019 – 4 Oct 2019


Conference: 13th International Conference on Game Based Learning, ECGBL 2019


ID: 173218174