In the last 12 months, Learnosity has delivered online assessments to over 30 million users. That kind of scale brings with it several responsibilities.
Foremost among them is inclusivity: ensuring that the product is accessible to as many people as possible.
Given that 3 percent of the population of the United States experiences some form of visual impairment, accessibility is a hugely important issue.
If that same percentage were applied to Learnosity’s user base, then roughly 900,000 learners would be affected by this impairment.
In part one of this series, I wrote about how high-contrast interfaces add clarity for every person, how we prioritized our efforts to evaluate color contrast in Learnosity’s Questions API, and how we addressed crucial issues in our design process.
Here in part two, I will examine the important next steps: how we defined, implemented, and tested an accessible color palette, and how we went beyond the initial delivery.
In earlier evaluations, we identified low-contrast colors in the bars of the Histogram Question type.
We needed to replace the low-contrast colors with high-contrast colors so that all learners could see and perceive them.
Where would we get new accessible colors from? There are many great color resources on the web; we turned to Color Safe, a tool that generates color palettes that meet WCAG contrast guidelines.
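Under the hood, tools like these apply the WCAG 2.0 contrast-ratio formula: linearize each sRGB channel, combine the channels into a relative luminance, then compare the two luminances. A minimal sketch in Python (the gray hex value below is just an illustrative example near the AA threshold):

```python
# WCAG 2.0 contrast ratio between two sRGB hex colors.

def _linearize(channel):
    """Convert an 8-bit sRGB channel to its linear-light value."""
    c = channel / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(hex_color):
    """Relative luminance per WCAG 2.0 (0.0 = black, 1.0 = white)."""
    hex_color = hex_color.lstrip("#")
    r, g, b = (int(hex_color[i:i + 2], 16) for i in (0, 2, 4))
    return 0.2126 * _linearize(r) + 0.7152 * _linearize(g) + 0.0722 * _linearize(b)

def contrast_ratio(color_a, color_b):
    """Contrast ratio from 1:1 (identical) up to 21:1 (black on white)."""
    lighter, darker = sorted(
        (relative_luminance(color_a), relative_luminance(color_b)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

# WCAG 2.0 AA requires at least 4.5:1 for normal-size text.
print(round(contrast_ratio("#000000", "#ffffff"), 1))  # 21.0
print(contrast_ratio("#767676", "#ffffff") >= 4.5)     # True
```

The same function works for any foreground/background pair, which is what makes batch-checking a whole palette practical.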
Because of our earlier efforts to trace key components during testing, I could apply the new colors to mockups of the Histogram Question type quickly.
The design team critiqued my color choices and compared them with the current version of the Histogram Question type.
The new colors, while higher in contrast, were darker than the colors we’d implemented before. They were beginning to make the Question type in the mockups feel dark and heavy, whereas we wanted it to feel light and engaging for students.
So I returned to the Color Safe website and chose lighter colors spaced further apart across the color spectrum.
Feeling more confident in the new color choices, we specified the colors for development and annotated the details.
Additionally, we specified an increase in color contrast for the handle elements across all charting Question types, so that more learners would be able to perceive and adjust the bars.
We visualized the color palette and annotated it with name, token value, and hexadecimal color codes.
We also thought about how a content author could utilize the accessible color palette. So we asked the engineering team to configure the default values in the Histogram Question template to use one color after another.
As a result, whenever a content author added new bars, each bar would be given an accessible color straight out of the box.
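The idea can be sketched as cycling through the palette in its order of preference, wrapping around when the bars outnumber the colors. The palette names and hex values below are illustrative only, not Learnosity’s actual tokens:

```python
from itertools import cycle

# Hypothetical accessible palette, listed in order of preference.
# Each entry pairs a token name with its hexadecimal color code.
ACCESSIBLE_PALETTE = [
    ("histogram-1", "#0b5394"),
    ("histogram-2", "#b45f06"),
    ("histogram-3", "#38761d"),
    ("histogram-4", "#741b47"),
]

def default_bar_colors(num_bars):
    """Assign each new bar the next palette color, wrapping around,
    so every bar gets an accessible color by default."""
    colors = cycle(hex_code for _, hex_code in ACCESSIBLE_PALETTE)
    return [next(colors) for _ in range(num_bars)]
```

With four palette entries, a fifth bar simply starts the cycle again, so authors never fall back to an untested color.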
QA engineers tested to make sure the changes in the code base matched the design specification.
One particular engineer was also using the NoCoffee vision simulator plugin to simulate how persons with color vision deficiency (a.k.a. color blindness) might see and perceive Question types in their browser.
I received a Slack message from the engineer flagging a potential issue.
I followed up and took a look at the result of the simulator.
Even though the colors we specified for the bars were high in contrast, the contrast between the bars was low. How would a learner with color vision deficiency differentiate between bars in the chart?
We weren’t making it easy for them with the current order of colors.
The simulation tool disproved our assumption that fixing the color contrast of the individual bars would, on its own, be enough to make the Question type accessible.
We decided to rearrange the colors for maximum contrast. The color simulator in the Stark Sketch plugin made it easy to try orderings and proof them directly in our design tools.
We ordered the colored bars in a way that would provide maximum contrast between each color.
The desired order of colors in the Histogram Question type also became the new order of preference in the accessible color palette.
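One way to derive such an ordering is a greedy pass: keep the first preferred color in place, then repeatedly pick the remaining color with the highest WCAG contrast ratio against the previous bar. This is a sketch of the idea, not Learnosity’s actual implementation, and the colors in the comment are illustrative:

```python
def _linearize(channel):
    """Convert an 8-bit sRGB channel to its linear-light value."""
    c = channel / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(hex_color):
    """Relative luminance per WCAG 2.0."""
    hex_color = hex_color.lstrip("#")
    r, g, b = (int(hex_color[i:i + 2], 16) for i in (0, 2, 4))
    return 0.2126 * _linearize(r) + 0.7152 * _linearize(g) + 0.0722 * _linearize(b)

def contrast_ratio(color_a, color_b):
    """WCAG 2.0 contrast ratio between two colors."""
    lighter, darker = sorted(
        (relative_luminance(color_a), relative_luminance(color_b)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

def order_for_adjacent_contrast(colors):
    """Greedily reorder colors so neighboring bars differ as much as possible."""
    remaining = list(colors)
    ordered = [remaining.pop(0)]  # first preferred color stays first
    while remaining:
        best = max(remaining, key=lambda c: contrast_ratio(ordered[-1], c))
        remaining.remove(best)
        ordered.append(best)
    return ordered

# e.g. three grays and a near-white end up interleaved dark/light
# rather than stepping gradually from dark to light.
```

A greedy ordering is not guaranteed to be globally optimal, but for a palette of a handful of colors it reliably separates similar hues from one another.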
At this point we had covered the color contrast in the top 10 Learnosity Question types and the color contrast of textual information across all of Learnosity’s Question types.
We turned our focus to the Question types with more advanced functionality.
Upon doing so, we discovered more colors deeper in the fundamental and communicative skeleton planes. Input components such as text inputs and selectors had interaction states – hover, focus, selected – and those states brought with them new colors to test.
Being able to see and perceive the interaction states would be critical, particularly for any visual Question types.
We identified an issue with the Shading Question type, where low-contrast colors were being shown for the normal, hover, and selected interaction states.
We realized that in this scenario the low-contrast colors would have left students with a visual impairment with no visible feedback at all.
Making the foreground and background colors high in contrast, in all interaction states, was going to add further scope.
Even though we were nearing the finish line, we didn’t want to let what we’d found in the Shading Question type go unaddressed, so we reviewed our priorities again.
In the international Web Content Accessibility Guidelines (WCAG) 2.0, there are guidelines for making textual information high enough in contrast, but there are no clear guidelines for making interface components high in contrast.
To us, the perception of the Shading Question type’s user interface was just as important as the information in it.
If we were simply “checking a box for accessibility,” we could easily have stopped there and ignored the issue at this skeleton plane.
But we chose not to. Instead, we decided to give ourselves a stretch goal of enhancing the color contrast in these deeper layers of the skeleton plane, even though this would increase the current scope of our work.
We captured screenshots of the interaction states in the Shading Question type and moved the screenshots into our Sketch file.
We repeated the same color evaluation and specification process as before.
We increased the color contrast in the cell’s normal and hover interaction states.
As this Question type was selectable, we made sure the color contrast of the cell’s selected interaction state was also visibly different from the normal and hover interaction states.
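A check like this can be automated by comparing every state color against the background and against the other states, flagging any pair below 3:1 – the ratio WCAG 2.0 uses for large text, and the minimum WCAG 2.1 later formalized for non-text interface components. The state colors below are illustrative stand-ins, not the actual Shading Question type values:

```python
def _linearize(channel):
    """Convert an 8-bit sRGB channel to its linear-light value."""
    c = channel / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(hex_color):
    """Relative luminance per WCAG 2.0."""
    hex_color = hex_color.lstrip("#")
    r, g, b = (int(hex_color[i:i + 2], 16) for i in (0, 2, 4))
    return 0.2126 * _linearize(r) + 0.7152 * _linearize(g) + 0.0722 * _linearize(b)

def contrast_ratio(color_a, color_b):
    """WCAG 2.0 contrast ratio between two colors."""
    lighter, darker = sorted(
        (relative_luminance(color_a), relative_luminance(color_b)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

def audit_interaction_states(states, background, min_ratio=3.0):
    """Return every pair of colors whose contrast falls below min_ratio."""
    issues = []
    names = list(states)
    for name in names:
        if contrast_ratio(states[name], background) < min_ratio:
            issues.append((name, "background"))
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            if contrast_ratio(states[a], states[b]) < min_ratio:
                issues.append((a, b))
    return issues

# Illustrative cell colors: a pale normal state, a light-blue hover,
# and a dark-blue selected state on a white background.
STATES = {"normal": "#f0f0f0", "hover": "#9fc5e8", "selected": "#0b5394"}
problems = audit_interaction_states(STATES, "#ffffff")
```

With these example values, the audit flags the normal and hover states against the white background (and against each other), while the dark selected state passes – exactly the kind of invisible-feedback problem we found in the Shading Question type.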
We continued to evaluate more interactive Question types and tested color contrast in the interaction states.
Thankfully, the majority of colors present in the interaction states of the remaining Question types were high enough in contrast and implemented consistently.
We initially estimated that the design approach would take 4 days. After the scope increases, the evaluation process took twice that long to complete – a total of 8 days.
I led the process with the support of an additional designer, a product manager, a product owner, and a QA engineer, all of whom ramped up at certain points to make the delivery a success.
What the design team achieved:
What the design team did next:
Color contrast testing is complex, but not impossible.
With clear priorities, an eye for detail, the right tools, a positive attitude, and a willingness to look deeper than what you first see, I believe a product can be made more inclusive for more people.
At Learnosity, color testing is a crucial part of the process of designing an accessible and usable product.
I am proud of the way the design team responded to the call, the breadth and depth we traversed, and what we achieved together.
Today, the product development team continues to ensure that every student can see, perceive, and fully engage with Learnosity’s products.
The journey to learner empowerment goes nowhere without first recognizing learner diversity.