How adaptive testing helped Open English boost language learning outcomes

Leading language instruction supported by modern digital assessment

Open English empowers its students to lead better lives by giving them the language and digital skills they need to succeed. The innovative platform guides students towards language fluency by offering 24/7 access to live instruction for a convenient in-home or on-the-go experience, complemented by thousands of hours of proprietary content, interactive lessons, and level assessments.

“Our platform democratizes access to high-quality English learning with native teachers,” says Thais Oliveira, SVP of Marketing for Open English. “We empower students to expand their social and economic opportunities by helping them get a new job and further their career or getting them ready for international trips.” 

Requirements

- A robust, flexible adaptive testing solution

- Support for a tagging taxonomy

- Easy integration and an intuitive user interface

The Challenge

Adapt, attract, and engage

Before language learners could start using Open English’s platform, they first had to take a placement test to establish their level of English proficiency. While Open English had long been a leader in English language learning, its placement test, a self-assessment built in-house, wasn’t performing the way the company had hoped and was no longer setting the right expectations for modern users.

“The old self-assessment was very glitchy and often required a lot of work,” says Thais. “And so we had a lot of downtime, which wasn’t great for us. Because the placement test is a requirement for users to start using the platform, when the previous placement test was having issues it would stop people from using our platform.”

And because the placement test was the user’s first point of contact with Open English, that unsatisfactory assessment was giving users a poor first impression of the platform.

There was also misalignment between the placement test (which was a self-assessment) and the level tests (which used branching and pooling models), so the learner experience lacked consistency. Due to the limitations of those types of assessments, they were also incapable of providing actionable insights that could help refine Open English’s language learning content.

We wanted a very credible adaptive testing solution… It had to give us flexibility, be easy to use for the end customer, and assess learners’ current understanding of English more accurately.

Thais Oliveira

SVP of Marketing for Open English

The problem was becoming harder and harder to ignore, because the demand for English language learning was growing all the time—and Open English was becoming increasingly aware that they were losing customers because of an inability to deliver a high-quality placement test. Open English realized what needed to be done: adopt an adaptive testing solution that would give learners a smoother user experience and measure their English proficiency with a higher degree of accuracy.

Open English also believed an adaptive testing solution would help them build an additional test, which could be used for capturing the growing demand for English language learning while also being closely aligned with their platform’s content.

“We needed a top of funnel strategy to attract potential students, those people that are not necessarily ready to commit to the course,” says Thais. “We needed a free tool to engage them, to attract them, to help them become a little bit more familiar with the platform by offering them a free English test that could determine their level.”

The Solution

Turning to the leader in assessment solutions for rock-solid adaptive testing

If Open English was going to abandon its in-house self-assessment, it needed certainty that the replacement could meet its new requirements.

“We wanted a very credible adaptive testing solution, because we’re talking about users with different languages, from different levels, and different backgrounds,” says Thais. “It had to give us flexibility, be easy to use for the end customer, and assess learners’ current understanding of English more accurately.”

While researching possible replacements for the self-assessment, Open English’s academic team landed on Learnosity. Learnosity was chosen because it could both provide an adaptive testing solution for the placement test and offer powerful tagging that would allow Open English to measure learning outcomes, competencies, and criteria for level tests with greater precision.

“Not only was implementation easy and fast, but it also provided a very seamless learner experience,” says Thais. “From the moment they take a placement test as a prospective customer through to later level tests as a student, the users now get consistent assessments throughout.”

Learnosity allowed us to align the assessment cycle and ecosystem within one tool, while also creating a process to use the resulting data to improve the program as a whole.

Jessica Buch Ed.D

Director of Academic Quality & Development at Open English

A simple way to deliver complex adaptive tests

To deliver the modern assessment experience they had envisioned, Open English needed an adaptive testing solution that was robust without being rigid. 

“One of the main reasons why we chose Learnosity was flexibility,” says Thais. “The variety of test options—adaptive, branching, linear—and the capability to report on taxonomy, according to what was needed for the company. The biggest problem that Learnosity helped us solve was the ability to measure student learning by level, outcome, competency, and criteria for the very first time, due to the complexity of the tagging that you offer.”

And even though Learnosity’s solutions were highly sophisticated, they remained user-friendly thanks to an intuitive interface and clear documentation.

“Learnosity’s adaptive testing solution was also easy to manage from our end, which was especially helpful because we planned to use it for multiple brands that each required different reports,” says Thais. “And the integration process was very smooth, because we had really good documentation from the start.”

The Outcome

Data-powered assessment experiences that attract new (and old) customers

Within two months, Learnosity’s integration with Open English was complete and their adaptive placement test was ready to go live. Since then, the switch to Learnosity’s adaptive testing solution has had a significant impact on the platform.

Jessica Buch Ed.D, Open English’s Director of Academic Quality & Development, says: “The big win for Open English was the deep insights we now have from our level tests—we can now measure learning outcomes and competencies in a way we were unable to do before.”

Powering its placement test with the same modern assessment engine already used in its level tests, one that users could rely on from the get-go, allowed Open English to improve the learner journey and earn the trust of more customers.

“Learnosity allowed us to align the assessment cycle and ecosystem within one tool,” says Jessica, “while also creating a process to use the resulting data to improve the program as a whole.”

The biggest problem that Learnosity helped us solve was the ability to measure student learning by level, outcome, competency, and criteria for the very first time, due to the complexity of the tagging that you offer.

Thais Oliveira

SVP of Marketing for Open English

And because this adaptive testing solution was reliable and easy to use, Open English saw the opportunity to build another test in-house with the potential to be a highly effective (and cost-effective) lead generation tool.

By offering users a valuable sample test to give them a taste of this new assessment experience, Open English could use the Learnosity-powered adaptive test to both attract new customers and re-engage old customers who had become inactive on the platform.

“We’ve seen a lift in organic traffic for the specific keywords ‘free English test’,” says Thais. “We use our lead-gen test as a tool to nurture and improve conversion of our existing database. So for people that were interested in learning English at some point, we use this test as something new and exciting for them to re-engage with the brand. So this new adaptive test became a closing tool for us.”

About Open English

Open English has been a leader in online English language learning in Latin America since 2007. To date, more than 2 million students and more than 10,000 companies have learned English with its online method. Open English’s interactive technology delivers live online classes for adults and businesses, taught by native English speakers 24 hours a day, 7 days a week.