Using Portable Custom Interactions to Measure 21st Century Skills

Teaching mathematics and science requires much more than simply imparting core knowledge and ideas. To succeed in a world that increasingly relies on science and technology, students must immerse themselves in practices that develop cross-curricular skills such as calculating, modeling, problem-solving, and reasoning. Historically, it has been difficult to measure 21st Century Skills, like problem-solving and collaboration, because traditional math and science assessments contain test items that are scored based on a student's final answer.

Although scoring rubrics allow points to be assigned based on the degree to which a response, or part of a response, is correct (fully, partially, or not at all), this type of score does not provide the rich data that would enable educators to determine how their students arrive at their answers. In addition, as assessment transitions to a digital environment, the math and science item types most commonly used today remain multiple-choice and constructed-response: item types that are relatively straightforward to develop and easy to administer at scale.

The Solution: Portable Custom Interactions

By deploying digital assessments that include interactive items built using Portable Custom Interactions (PCIs), it is possible to open a window onto more creative and immersive testing experiences and to see a student's thought process throughout an entire problem-solving activity. Not long ago, Technology-Enhanced Items (TEIs) required a large investment of time, money, and programming expertise just to be considered as a component of a learning or assessment environment. That is no longer the case. Today, PCI items are simply technology-enhanced items built to the "PCI open standard."

Assessment and learning software applications that incorporate the PCI standard give users the ability to create problem-solving items, with the flexibility to include an unlimited number and variety of QTI-based interactions, such as hot-spot, drag-and-drop, text entry, and graph, in a single item. These same standards ensure that the item's content and data are interoperable within an institution's digital environment. In practical terms, item authoring modules that are compliant with the QTI and PCI standards, such as TAO's Item Creator module, give item authors the freedom to conceptualize problem-solving tasks that combine any number of interaction templates from the entire range.
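
To make the idea of a "portable" interaction more concrete, here is a minimal TypeScript sketch of the kind of contract a PCI module might expose to its host delivery engine: the runtime hands the interaction a DOM node, configuration, and any previously saved state, and later asks it for a response and a serializable state. The interface and method names below are illustrative assumptions for this post, not the normative IMS PCI specification or TAO's exact API.

```typescript
// Illustrative sketch only: names and signatures are assumptions for this post,
// not the normative IMS PCI specification or TAO's exact API.

/** Configuration the host passes when it instantiates the interaction. */
interface PciConfig {
  responseIdentifier: string;          // which QTI response variable this feeds
  properties: Record<string, string>;  // author-defined settings from the item
}

/** What the host can ask of a running interaction instance. */
interface PciInstance {
  getResponse(): unknown;   // the candidate's current answer
  getState(): string;       // serializable state, so the item can be resumed
  destroy(): void;          // clean up DOM nodes and listeners
}

/** The module a PCI package registers with the delivery engine. */
interface PciModule {
  typeIdentifier: string;   // unique name of this custom interaction type
  getInstance(dom: HTMLElement, config: PciConfig, state?: string): PciInstance;
}

// A toy "numeric estimate" interaction implementing that contract:
const estimateInteraction: PciModule = {
  typeIdentifier: "numericEstimate",
  getInstance(dom, config, state) {
    let estimate: number | null = state ? JSON.parse(state).estimate : null;
    const input = document.createElement("input");
    input.type = "number";
    input.addEventListener("change", () => { estimate = Number(input.value); });
    dom.appendChild(input);
    return {
      getResponse: () => ({ [config.responseIdentifier]: estimate }),
      getState: () => JSON.stringify({ estimate }),
      destroy: () => { dom.removeChild(input); },
    };
  },
};
```

The intent of the standard is that any delivery engine supporting PCIs can load a module like this without bespoke integration work, which is what makes the interaction "portable" in the first place.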

These rich and seemingly complex items can be used to measure how students interact in a given situation to analyze and solve a problem. For example, the Assessment, Forecasting and Performance Directorate at the French Ministry of Education (Direction de l’évaluation, de la prospective et de la performance [DEPP]) has released a number of model interactive items using the PCI standard.

One of these items assesses knowledge and skills related to the Relativity of Movement, and allows the student to interact with the on-screen characters to see the movement in the scenario from each character’s perspective. 

The exercises and questions included in interactive items engage students on multiple levels, capturing not just their answers but their thought process as well. In fact, the rich log data (learning events) captured by these items, such as the times at which students start and stop their work, mouse movements, the use of different on-screen tools, idle time, and a screenshot of their last actions, allow educators to gain deep insight into how students approach a problem and to identify areas that might require additional focus. In other words, PCIs make it possible to collect much more than just the final answers, providing data that feeds the feedback loop between teaching, learning, and assessment.
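
The exact event schema used by TAO and the DEPP is not reproduced in this post, but the hypothetical TypeScript types below illustrate the kinds of learning events described above, along with one simple measure, time on task, that could be derived from such a log.

```typescript
// Hypothetical event types for the kinds of log data described above.
// The actual schema used by TAO or the DEPP may differ.

type LearningEvent =
  | { type: "item-start"; timestamp: string }
  | { type: "item-stop"; timestamp: string }
  | { type: "mouse-move"; timestamp: string; x: number; y: number }
  | { type: "tool-used"; timestamp: string; tool: string }   // e.g. ruler, calculator
  | { type: "idle"; timestamp: string; durationMs: number }
  | { type: "final-screenshot"; timestamp: string; imageRef: string };

interface ItemLog {
  studentId: string;   // pseudonymous identifier
  itemId: string;
  events: LearningEvent[];
}

/** One simple derived measure: total time on task, from the start/stop events. */
function timeOnTaskMs(log: ItemLog): number {
  const start = log.events.find(e => e.type === "item-start");
  const stop = log.events.find(e => e.type === "item-stop");
  if (!start || !stop) return 0;
  return Date.parse(stop.timestamp) - Date.parse(start.timestamp);
}
```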

Learning Impact Outcomes

With this in mind, the DEPP, the Luxembourg Ministry of Education, and their contractors Vretta and Wiquid collaborated to create a number of multi-step PCI items using TAO. Vretta developed the math PCI items, and Wiquid developed the science PCI items. The goal was to introduce these items into CEDRE (cycle des évaluations disciplinaires réalisées sur échantillons, the Cycle of Sample-Based Subject-Specific Assessments), a set of annual low-stakes tests given in France across a range of subjects, including history, geography, science, math, foreign languages, and French.

The CEDRE exams were first introduced in 2003 as paper-and-pencil assessments, migrated to a digital format in 2016, and are now administered on desktop computers. The science and math exams each contain between 50 and 60 items drawn from a 300-item bank. In May 2016, the DEPP introduced three math and two science PCIs into a nationwide standardized test for the first time, administering them to 8,000 9th-grade students. The DEPP then worked with Capgemini to analyze the resulting log data from the three math PCIs, an exercise that yielded rich observations about the strategies students used to solve the interactive questions.

For instance, one question asked students to estimate the circumference of a lake. Judging by the steps they took to solve the problem, the log data showed that students often confused “circumference” with “area”. Below are samples of log data gathered from four student responses.

The data collected for each response shows the lines each student traced while attempting to solve the problem, the percentage of the traced lines that fell within the border (or circumference) of the lake, and the student’s estimate of the measurement.
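
The precise trace format is not shown here, but assuming a student's trace is recorded as a list of (x, y) points and the lake border as a polygon, a measure like "percentage of the trace within the border" could be derived along the lines of the sketch below. This is a hypothetical illustration, not the DEPP's or Capgemini's actual analysis code.

```typescript
// Hypothetical post-processing of the lake item's trace data.
// The data format is assumed for illustration; the real CEDRE logs may differ.

interface Point { x: number; y: number; }

/** Ray-casting test: is a point inside a simple polygon (the lake boundary)? */
function insidePolygon(p: Point, polygon: Point[]): boolean {
  let inside = false;
  for (let i = 0, j = polygon.length - 1; i < polygon.length; j = i++) {
    const a = polygon[i], b = polygon[j];
    const crosses =
      (a.y > p.y) !== (b.y > p.y) &&
      p.x < ((b.x - a.x) * (p.y - a.y)) / (b.y - a.y) + a.x;
    if (crosses) inside = !inside;
  }
  return inside;
}

/** Share of a student's traced points that fall within the lake border. */
function fractionInsideLake(trace: Point[], lakeBoundary: Point[]): number {
  if (trace.length === 0) return 0;
  const inside = trace.filter(p => insidePolygon(p, lakeBoundary)).length;
  return inside / trace.length;
}

/** Total length of the traced path, in the item's drawing units,
 *  which could be compared against the student's stated estimate. */
function traceLength(trace: Point[]): number {
  let length = 0;
  for (let i = 1; i < trace.length; i++) {
    length += Math.hypot(trace[i].x - trace[i - 1].x, trace[i].y - trace[i - 1].y);
  }
  return length;
}
```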

In the spring of 2017, the DEPP took the practice a step further and increased the scale, both in the number of PCIs administered and in the number of student participants. This time, 12 science PCIs were administered to 10,000 students, while 25 math PCIs were administered to 11,000 students.

Return on Investment

PCIs and their log data can provide the educational community, including teachers, researchers, and policy-makers, with excellent insight into students' level of engagement, their understanding of the subject areas, and how they approach problem-solving. And because TAO is open source, the DEPP can leverage additional resources contributed by the user community.