
Why user research adds more value to products than user assumptions


The best way of building tools that support learning is by bringing end-users into the process sooner rather than later.

Because user behaviors vary from person to person and product to product, the design challenges that every company faces are unique. This means that even the most long-standing design principles act as guidelines rather than dogma.

Even though in-house design principles are important in guiding our work at Learnosity, we believe that good design starts with looking at a problem from as many angles as possible before identifying a solution. This means paying close attention to users’ needs – or, as our Design Principle No. 3 puts it:

Know our users
Listen and understand to get to know our users and why they do what they do. Putting the user’s goals at the center of what we do allows us to build and design products that solve real problems. Use real feedback to build a complete picture.

While refining our drawing tool, one of our newer features, we put this principle into action and brought user research into the process.

Bringing users into the picture

The drawing tool gives learners a range of options when answering questions. They can construct responses using a compass, straight-edge ruler, or scribble freehand. It’s a feature we designed with extensibility in mind, meaning the long-term vision for it was that we’d add new capabilities down the line to increase its overall impact and usefulness.

Since the main users of the drawing tool are likely to be high school students, the group we worked with had an age range of 12 to 15.

[Animated demo of Learnosity's digital compass tool]
Learnosity’s drawing question lets students give answers using a variety of tools, such as a compass, straightedge (ruler), and text annotations.


Our sample size of six might seem small, but it worked well for our usability testing (some practitioners suggest using an even smaller sample size). What you’re really looking for is specific, deep-dive information that challenges assumptions and deepens your understanding of how users interact with your product.

Context is key

To put the students at ease, we made the session casual and conversational.

Under the supervision of their parents, we opened with some general questions about what a regular school day looked like and what motivated them in their studies.


By offering details of their day-to-day schedules, exam preparation, and device usage, the students gave us valuable context we might not otherwise have been privy to.

Support motivation, drive action

It was clear from early on that students have unique motivators and learning styles. We’ve long known that it’s a mistake to lump “learners” into some homogenous category, but it was still eye-opening to see the fault lines so clearly.  

Whether it’s a friendly competition to beat a classmate’s best score, a long-term goal to start a business, or a passion for sports, every student has their own story to tell.

And through those stories, we caught a glimpse of their unique motivations and drivers. 

The same can be said for each individual’s preferred learning style. While one student liked to type notes on their laptop for greater legibility, another preferred to jot things down by hand in a notebook to better control the layout of their study material.


This served as a reminder of how important it is to develop tools that are flexible enough to support needs that are often vastly different from one user to the next.

Study tools need to be streamlined

When asked what digital tools they typically use, each student gave us a different answer. The range differed from school to school and class to class, but the overriding challenge remained the same: learning how to master – and then toggle between – a multitude of tools while remembering when to use each one.

Additionally, some described using a “painful mixture” of digital and physical resources. While students admitted to using the resource type they felt most comfortable with, the assessment formats and submission methods they used were chosen by teachers.

Students learn shortcuts and workarounds quickly

As the session progressed, we noted just how quickly students learned to use shortcuts and find workarounds in the drawing tool.

This led us to wonder whether users were interacting with the tool as we’d intended. And if not, why not? 

A possible reason came up during post-session feedback where a number of students mentioned other tools or technologies they were familiar with. One participant referred to apps such as Snapchat and Instagram when commenting on the expected behavior of a feature. 


What we took from this was that popular tech has created a kind of common language when it comes to interacting with other tools or products. But can user expectations based on familiarity with other tools be leveraged to design a more fluid user experience? Or would trying to do so run the risk of imposing prescriptive limits on their behaviors?

This is where the real value of usability testing lies. Paying close attention to what users do during sessions and what they say in their feedback yields the most valuable payoffs.

Small, easily overlooked details often offer high-impact insights for refining your design in a way that significantly improves the overall experience.

Unexpected feedback can be disruptive – in a bad way

We try to make using our tools as intuitive as possible. If there’s friction for users, then there’s room for improvement.

We were reminded just how important it is to give even small details the utmost care and attention after one student pointed out a glitch in the prototype that resulted in a confusing user experience.

Distraction and multitasking are stress amplifiers that hinder concentration. In a high-stakes exam environment where the pressure is already intense, a poor user experience during an assessment only adds fuel to the fire.

Unable to tell whether the glitch was intentional or not, the student couldn’t tell if they were at fault for the unexpected behavior. This was an eye-opening observation for us. The glitch not only interrupted their experience – it undermined their confidence in their ability to use the tool.


It’s our job to eliminate these hurdles so students can focus on learning without distractions or additional stress.

Closing thoughts

Learnosity’s products are part of the learning journey for millions of learners every month. That’s a humbling reminder – if we needed one – of why we always need to take our responsibilities as UX designers so seriously. 

It’s also a good reason to frame best practice design principles within the context of the users you’re trying to have a positive impact on. The more clearly you can understand things from their perspective, the more effective you’ll be in making something valuable to them. 


Our main purpose in conducting user research was to gain a better understanding of who we design for, and to test and validate the usability of one of our newly designed tools.

Putting even the smallest features under close scrutiny helps us identify potential issues and feed solutions into our next design sprint as part of a dynamic, iterative process.

Kimberly Wong

Product Designer
