How adaptive testing can fill post-pandemic learning gaps
A growing number of edtech platforms are using adaptive learning to get personal with their learning loss recovery plans. Here are four customer examples we’ve seen.
Adaptive learning is about meeting learners where they are: providing them with content to fill in what they don’t know and reinforcing what they do. This involves assessing “knowledge components”, the skills and competencies required to meet a specific learning outcome or goal.
For example, for a learner to complete a physics problem about momentum, they must understand Newton’s laws of motion, systems of equations, and vectors. To understand Newton’s laws, they must be able to draw a free-body diagram and solve an equation for a specific variable.
Drilling down, we’d eventually identify more basic competencies: for example, understanding a coordinate system, parallel vs. perpendicular lines, and fractions and ratios. Unlike traditional learning, where a student without an understanding of the Cartesian plane would struggle through a lesson on free-body diagrams, an adaptive learning system aims to identify, via assessment, the areas in which a student needs support and to strengthen their skills in those areas before building on them.
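The dependency structure described above can be sketched as a small prerequisite graph. This is a minimal illustration, not any product’s actual data model; the skill names and graph shape are invented for this example.

```python
# Illustrative knowledge-component graph: each skill maps to the skills
# it depends on. Names and structure are hypothetical.
PREREQUISITES = {
    "momentum_problems": ["newtons_laws", "systems_of_equations", "vectors"],
    "newtons_laws": ["free_body_diagrams", "solve_for_variable"],
    "free_body_diagrams": ["coordinate_system", "parallel_vs_perpendicular"],
    "vectors": ["coordinate_system", "fractions_and_ratios"],
}

def skills_to_assess(target, known=frozenset()):
    """Return the skills a learner should be assessed on, deepest
    prerequisites first, skipping any the learner has already mastered."""
    ordered = []
    def visit(skill):
        for prereq in PREREQUISITES.get(skill, []):
            visit(prereq)
        if skill not in known and skill not in ordered:
            ordered.append(skill)
    visit(target)
    return ordered
```

A depth-first walk like this guarantees that foundational competencies (the coordinate system, say) surface before the skills built on top of them.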
This kind of personalization has been critical as education systems look to accommodate a diverse range of student needs (and combat lack of engagement) post-pandemic.
However, building such a system isn’t trivial. It requires:
- A large bank of content
- A method of reliably identifying learners’ competencies
- The ability to drill an assessment down to distinct knowledge components
- In-depth, skill-focused reports that are valuable to both learners and educators/curriculum planners
Though many edtech organizations already had the content required for this personalization, they needed to develop a reliable, repeatable way of determining which learners would benefit most from what content. Here, we look at four use cases, anonymized to maintain customer IP, of how Learnosity customers used our adaptive offering to personalize learning recovery.
Use case A: Intelligent online tutoring
Client A knew that demand for their product would grow post-pandemic as learners and parents recognized the gaps in their learning upon their return to schooling. Users would no longer be satisfied with grade- or age-based tailoring, so a more personalized, formative test was needed, not only to identify the learner’s knowledge gaps but also to help families choose the right tutoring plan.
Client A did not have the calibrated item difficulties needed to implement a fully computer-adaptive test and wanted to leverage their existing formative entry tests. Furthermore, as their target market is K12 home-based learning, they knew parents were the decision-makers. Parent buy-in, through hyper-personalized data and comprehensive reports, was key.
Their solution: Build a custom learning pathway using Learnosity
An initial assessment of the learner’s skills produces a comprehensive report for parents, breaking down the knowledge components required for each question and showing the learner’s mastery of each. On receiving the report, families are directed to the tutoring plan most suitable for their child, with a tailored learning pathway ready to go.
- Learnosity’s tagging allows scores to be reported by tag, in this case by skill. Metadata ensures there are clear explanations of how a learner was evaluated on each skill, with reported feedback customized to the learner’s performance.
- New content and formative assessment can be selected and auto-scheduled based on data returned from Learnosity.
- On-the-fly assessment allows for the creation of a new personalized assessment, instantly digging deeper into the area in which the learner is struggling while requiring no additional author involvement.
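The tag-to-follow-up flow in the bullets above can be sketched roughly as follows. The report shape, threshold, and item bank here are invented for illustration; a real integration would pull tag-level scores from the platform’s reporting data rather than a hard-coded dictionary.

```python
# Hypothetical item bank: each item carries skill tags.
ITEM_BANK = [
    {"reference": "frac-001", "tags": {"skill": ["fractions"]}},
    {"reference": "frac-002", "tags": {"skill": ["fractions"]}},
    {"reference": "vec-001", "tags": {"skill": ["vectors"]}},
    {"reference": "coord-001", "tags": {"skill": ["coordinates"]}},
]

def weak_skills(tag_scores, mastery_threshold=0.7):
    """Skills where the learner scored below the mastery threshold."""
    return {s for s, score in tag_scores.items() if score < mastery_threshold}

def build_followup(tag_scores, max_items=5):
    """Select item references targeting the learner's weakest skills,
    forming an on-the-fly follow-up assessment."""
    targets = weak_skills(tag_scores)
    matches = [item for item in ITEM_BANK
               if targets & set(item["tags"]["skill"])]
    return [item["reference"] for item in matches[:max_items]]
```

For example, a learner scoring 0.4 on fractions and 0.9 on vectors would receive a follow-up built only from fractions-tagged items, with no author involvement needed.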
Use case B: Teacher support for recovering learning loss (supplemental intervention)
Client B’s original product was an in-classroom aide: supplemental intervention that prepared students for state testing.
During the pandemic, they developed a product to mitigate COVID-related learning loss. While much of the content could be reused, determining learner need was drastically different: unlike state testing, students could have vastly different degrees of unfinished learning and learning loss. The client’s goal was to determine which building blocks of a broader knowledge competency a learner understood (as in our earlier example, checking that a learner understands coordinates and forces before assessing their knowledge of free-body diagrams). They wanted to test multiple building blocks without frustrating learners at the more basic level, boring learners with high competency, or requiring anyone to sit through six different full-length tests, one for each skillset.
Their solution: A branching assessment using Learnosity
- If a student answers the first few questions on a learning block skill correctly, they can be branched to deeper questions or a different learning block—whereas students struggling with basics will be evaluated on questions closer to their competency-level.
- Tags were again used to design the branching test and enable insightful reporting.
- A personalized learning map was produced to bring the student back to grade level.
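A branching rule like the one described above can be expressed very simply. This is a conceptual sketch, not Learnosity’s branching configuration; the pool names and the three-correct threshold are assumptions made for illustration.

```python
def next_branch(probe_results, advance_at=3):
    """Route a learner after a short probe on one learning block.

    probe_results: list of booleans, one per probe question answered.
    Returns the name of the item pool to branch into next.
    """
    correct = sum(probe_results)
    if correct >= advance_at:
        return "deeper_questions"    # mastery shown: go deeper or to the next block
    if correct == 0:
        return "prerequisite_block"  # struggling: drop back to more basic skills
    return "same_block_easier"       # partial mastery: stay, at lower difficulty
```

The point of the rule is exactly the trade-off in the text: confident learners skip ahead instead of being bored, while struggling learners are met at their competency level instead of being frustrated.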
Use case C: Repeatable diagnostics for decision-maker planning
Client C offers school districts both out-of-the-box and customizable diagnostic tests. Most schools need to report on skills and knowledge domains in math and English at the school and district level to show student progress over time. For fairness, reliability, and repeatability, they needed a test of the same difficulty, content coverage, and validity that could be administered multiple times a year.
Their solution: Computerized adaptive assessments
- The client’s psychometricians were able to input the calibrated difficulty of their items into their Learnosity item banks.
- By using a set of psychometrician-defined hyperparameters (initial ability, difficulty tolerance, difficulty offset, termination criteria, etc.), learners can take the same activity at different times in the year while answering different questions than they were asked previously.
- An out-of-the-box Rasch model calculates an ability estimate for the user over the course of the test, which can then be directly compared with past or future estimates to demonstrate improvement.
- Computer adaptive testing (CAT) provides huge value for schools. If they want a custom assessment for their school or district, they only need to design a single assessment. This saves time and money while still ensuring item exposure is low and learners are seeing (mostly) new questions to test their knowledge.
- As in previous examples, educators can see in-depth reports and make decisions based on the given data for further in-person instruction.
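To make the Rasch-based flow above concrete, here is a minimal sketch of the two core calculations in a CAT: re-estimating ability from responses so far, and picking the most informative next item. This is textbook Rasch machinery, not Learnosity’s implementation; note that a pure maximum-likelihood estimate diverges for all-correct or all-incorrect response patterns, which real engines handle with priors or bounds.

```python
import math

def rasch_p(theta, b):
    """Probability of a correct response under the Rasch model,
    given ability theta and item difficulty b."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def estimate_ability(responses, theta=0.0, iterations=20):
    """Newton-Raphson maximum-likelihood ability estimate.

    responses: list of (difficulty, correct) pairs for items seen so far.
    Assumes a mixed response pattern (not all correct / all wrong).
    """
    for _ in range(iterations):
        ps = [rasch_p(theta, b) for b, _ in responses]
        gradient = sum(int(c) - p for (_, c), p in zip(responses, ps))
        information = sum(p * (1.0 - p) for p in ps)  # Fisher information
        theta += gradient / information
    return theta

def next_item(theta, unseen_difficulties):
    """Maximum-information selection: under the Rasch model, the most
    informative item is the one whose difficulty is closest to theta."""
    return min(unseen_difficulties, key=lambda b: abs(b - theta))
```

Because the estimate depends only on item difficulties and responses, two sittings of the same activity with disjoint questions still yield ability estimates on the same scale, which is what makes year-over-year comparison possible.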
Use case D: Adaptive assessment in standardized and high-stakes testing
Client D provides benchmark assessments for schools and districts in reading and writing with their unique method of ability estimation. Adaptive testing is a key part of their value proposition.
Their solution: A custom algorithm through Learnosity’s self-hosted adaptive engine
- Utilized their proprietary algorithm for adaptive assessment within the Learnosity assessment player.
- Removed the need for a separate assessment system while retaining full control of their value-enhancing item selection algorithm.
- Gained the flexibility to customize scoring—for example, returning two distinct ability estimates from one adaptive assessment.
Adaptive assessment systems can both guide machine-led personalization and be an invaluable tool to educators as they foster social, creative, and collaborative learning opportunities to re-engage students both online and off.
To learn more about the transformative potential of adaptive testing, download the slide deck below. 👇