
Look beyond what you see: Color testing for accessibility – Part 1


In part one of a two-part series, Learnosity’s User Experience Designer James Santilli traces the team’s journey toward color testing the product for accessibility.

It was 2016 when I joined the product team at Learnosity.

For the first month I did what most product designers would do: dive head-first into the core product and start evaluating its effectiveness.

For Learnosity, the core product is online assessment.

I was evaluating the Questions API, which helps our customers deliver interactive learning content to learners.

The more I used the different Question types in Questions API, the more I began to feel my designer spidey-sense tingling.

I began noticing that some of the colors implemented in each Question type were slightly inconsistent. I found myself asking questions like: “Why were we using this shade of green, but not that one? Why did we apply this particular color here, but not over there?”

I wondered whether students were noticing the same thing I was.

How did the design team get involved?

Around the same time, the engineers were working their way through a major project which had them optimizing Question types for accessibility.

They were focused on making the product more accessible for students who would be using assistive technology such as screen readers.

They were ensuring that ARIA states and properties were built in, that ARIA live regions updated content automatically, and that keyboard paths took students on a logical journey towards hearing, perceiving, and responding to questions.
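To make that work concrete: an ARIA live region is an element whose text updates are announced by assistive technology automatically, without moving the user’s focus. Here is a minimal TypeScript sketch of the pattern; the feedback element and option button are hypothetical, not Learnosity’s actual implementation:

```typescript
// A live region: screen readers announce changes to its text content
// automatically, without the user having to move focus to it.
const feedback = document.createElement("div");
feedback.setAttribute("role", "status");       // a polite live region role
feedback.setAttribute("aria-live", "polite");  // announce after current speech
document.body.appendChild(feedback);

function announce(message: string): void {
  feedback.textContent = message; // updating the text triggers the announcement
}

// A keyboard-accessible answer option: a native <button> is reachable
// via Tab and activatable via Enter/Space, and its selected state is
// exposed to assistive technology through aria-pressed.
const option = document.createElement("button");
option.textContent = "Option A";
option.setAttribute("aria-pressed", "false");
option.addEventListener("click", () => {
  const selected = option.getAttribute("aria-pressed") === "true";
  option.setAttribute("aria-pressed", String(!selected));
  announce(selected ? "Option A deselected" : "Option A selected");
});
document.body.appendChild(option);
```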

*Knock-knock*.

A Slack message came through to the design team channel.

“Some of these colors look weird =/”.

It was one of our engineers. He had attached a screenshot.

The design team followed up and soon learned that the engineering team had noticed inconsistency with how colors were applied across some Question types.

They’d also noticed that some text was not quite visible on particular desktop monitors in the office.

We initialized a few more Question types on different mobile devices. It was clear that some components had poor visibility. The engineering team’s optimization for accessibility had brought color contrast into focus for the wider business.

The engineers and product owner turned to the design team for help. They needed us to investigate the issue further.


It was the perfect opportunity for us to bring our attention to detail and our experience with color management to the accessibility testing process.

How did the design team respond?

The first step the team took was to group together and plan a response to the issue. We discussed some of the potential approaches with the product owner.

The approach we settled on was simple enough. It looked like this:

  1. Evaluate colors in Question type interfaces
  2. Test for color contrast
  3. Document issues and fixes

We estimated that it would take us about four days.

Why test for color?

Testing for color isn’t some ‘nice to have’ luxury here. It’s a cornerstone of what Learnosity offers. The company’s mission is to provide rich, online assessment to as many people as possible around the globe.

This includes students who have a temporary or permanent visual disability.


This includes color blindness, corneal clouding, cataracts, and refractive errors that may cause blurry, distorted, or weak vision.

In the United States, approximately 3% of the total population suffers from some form of visual impairment. That means around 9.7 million people experience some form of vision loss. In fact, it’s estimated that red-green color blindness affects approximately 1 in 12 men (it affects roughly 1 in 200 women).

Visual disability impacts how a user sees and perceives information, whether in the physical world or when using a digital interface.

Color contrast is a key characteristic of human visual perception.


Contrast helps make things stand out.

An accessible interface will utilize light and dark colors to create contrast between foreground and background elements.

For example: if the color of a text label is too similar to the color of its container, the two colors blend together and the label becomes hard to distinguish.

For users with a visual disability, this could mean they misinterpret the text label, or overlook it entirely.


High-contrast interfaces are not only important for users with a visual disability, they add clarity for all users.

If you’ve ever viewed your mobile device’s interface outside under the glare of the sun and could still clearly perceive the display, it’s because the display brightened and the interface had enough built-in contrast to stay legible.


In the context of interactive learning, students could easily find themselves viewing information from a distance, blurrily projected onto a wall, or displayed on poorly lit monitors.

In a high-stakes, pressurized environment such as that of a test, this puts students at a major disadvantage – not to mention an unnecessary one.

What were our priorities?

Product mantra: Most users, most of the time, for most impact

With dozens of pre-built Question types to evaluate and a fixed deadline with the development sprint already underway, we wanted to conduct a broad sweep of Question types to avoid getting bogged down in the details.

We agreed to evaluate the colors in Learnosity’s most-used Question types first.

With a high number of elements present in even the simplest of interfaces, we needed to prioritize our efforts. We agreed on the parts of the interface we were going to focus on.

The skeleton plane, in Chapter 2 of Jesse James Garrett’s Elements of User Experience, provided a conceptual framework that helped break down the work.

We wanted to ensure that the colors applied to the informational skeleton plane were high enough in contrast for all students to see clearly.

There were two reasons for this:

  1. To ensure textual information is high enough in contrast (per WCAG 2.0, an AA rating requires a contrast ratio of at least 4.5:1 for normal text, or 3:1 for large text – see the sketch after this list), and;
  2. The essence of a Learnosity Question type is its content. Question types are designed to deliver content to which the student then responds.
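Since that 4.5:1 figure drives everything that follows, it helps to see where it comes from. WCAG 2.0 defines contrast as the ratio of the relative luminances of the lighter and the darker color. A minimal TypeScript sketch of the calculation (the example hex values are illustrative only):

```typescript
// Linearize an 8-bit sRGB channel, per the WCAG 2.0 definition.
function linearize(channel: number): number {
  const c = channel / 255;
  return c <= 0.03928 ? c / 12.92 : ((c + 0.055) / 1.055) ** 2.4;
}

// Relative luminance of a "#rrggbb" color.
function luminance(hex: string): number {
  const n = parseInt(hex.replace("#", ""), 16);
  const r = (n >> 16) & 0xff;
  const g = (n >> 8) & 0xff;
  const b = n & 0xff;
  return 0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b);
}

// Contrast ratio: (lighter + 0.05) / (darker + 0.05), ranging from
// 1:1 (identical colors) to 21:1 (black on white).
function contrastRatio(foreground: string, background: string): number {
  const [hi, lo] = [luminance(foreground), luminance(background)].sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05);
}

console.log(contrastRatio("#000000", "#ffffff")); // 21
console.log(contrastRatio("#777777", "#ffffff")); // ~4.48, just misses AA for normal text
```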


We planned to evaluate fundamental, communicative, and foundational skeletons too, but only if time permitted.

At this point, the informational skeleton plane was the priority.

How we evaluated colors

Product mantra: Use your own product

We wanted to see and perceive colors in the same way that students would when interacting with our product.

Using Learnosity’s Question Editor, we replaced the default content in the Learnosity Question template with new content that was more contextual to an assessment.

For our testing, we leveraged content from our customers’ item banks to ensure the Question types we were evaluating were as robust and effective as they would be in the real world.

We wanted to test the extremes of each Question type.

We turned on the advanced options in each Question type to display as much of the skeleton in the interface as we could. We wanted to use the product in a scenario that was as close to a real-world use case as possible.

We stepped away from our MacBook Pros, with their bright and color-rich display panels, and opted for devices with different display panels, sizes, and color profiles.


We viewed the interfaces on a Chromebook, the most widely used device at the time, and on an Apple iPad, whose display brightness could be adjusted quickly. The iPad’s portability also made it easy to place in different environments.

What we documented

Product mantra: Document as you go

In Questions API, we knew a student would generally take the following journey:

  1. Read and understand the question asked
  2. Formulate a response
  3. Learn how the Question type functions
  4. Respond to the Question type

We took this journey for each of the top 10 Question types, and documented screenshots of the interface in a Sketch file.


As far as we could tell, the majority of the labels, headings, lists, and options appeared high in contrast, apart from some elements in the charting Question types, which looked slightly low in contrast.

Using Sketch, we inspected colors of the informational and fundamental skeleton planes in the screenshots and extracted hexadecimal color codes.

We put the foreground and background colors into WebAIM’s color contrast checker to test them.
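WebAIM’s checker reports each color pair as a pass or fail against the WCAG thresholds. Expressed in code, that classification might look like the sketch below; the thresholds are the published WCAG 2.0 values, and the ratio could come from the contrastRatio function sketched earlier:

```typescript
// Classify a contrast ratio against the WCAG 2.0 thresholds.
// "Large" text is roughly 18pt and up, or 14pt and up if bold.
function wcagRating(ratio: number, largeText = false): "AAA" | "AA" | "Fail" {
  if (ratio >= 7) return "AAA";
  if (ratio >= 4.5) return largeText ? "AAA" : "AA";
  if (ratio >= 3 && largeText) return "AA";
  return "Fail";
}

console.log(wcagRating(4.6));       // "AA"
console.log(wcagRating(3.2, true)); // "AA" (passes only as large text)
```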


Colors from the informational skeleton plane passed testing and were high contrast.

[Figure: elements that did not pass color contrast testing]

However, the colored bars in the charting Question types were low in contrast and failed the test.

Because inaccurate color reproduction can occur between monitors, software, color profiles, and color spaces, we wanted to be certain that the colors we extracted from our screenshots were the same colors that existed in the production environment.

We opened the browser developer tools, identified the hexadecimal color codes in the component, and compared colors from the CSS to what our design tools were showing us.

There were no inaccuracies between the two.
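That cross-check can also be scripted. The sketch below reads a rendered color back out of the CSS via the browser’s computed styles and converts it to a hex code for comparison; the selector and expected value here are hypothetical:

```typescript
// Read the rendered text color of an element and convert the
// "rgb(r, g, b)" string that getComputedStyle returns into "#rrggbb".
function renderedHex(selector: string): string | null {
  const el = document.querySelector(selector);
  if (!el) return null;
  const channels = getComputedStyle(el).color.match(/\d+/g);
  if (!channels) return null;
  return "#" + channels
    .slice(0, 3)
    .map((c) => Number(c).toString(16).padStart(2, "0"))
    .join("");
}

// Compare against the hex code extracted from the Sketch screenshots.
console.log(renderedHex(".question-label") === "#333333");
```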

How we documented our findings

Product mantra: Show the bits and the bytes

In our Sketch file, we created a new artboard that visualized the colors we extracted.

We drew a color palette and marked which hexadecimal color codes passed and failed color contrast testing.
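In data form, that artboard boiled down to a palette annotated with results, something like the sketch below. All names, hex values, and outcomes here are illustrative, not Learnosity’s actual palette:

```typescript
// Hypothetical palette audit: each entry records the color, the
// background it was tested against, and whether it passed.
const paletteAudit = [
  { name: "text-grey",  hex: "#333333", against: "#ffffff", passes: true },
  { name: "link-blue",  hex: "#0066cc", against: "#ffffff", passes: true },
  { name: "chart-teal", hex: "#6fcfcf", against: "#ffffff", passes: false }, // chart bars failed
] as const;
```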


This visualization of colors provided value for the team in a few different ways:

  • It focused the team; we had a baseline to reference, compare, and test colors.
  • It aligned the team; we could talk about the same ‘grey’ and ‘blue’, with something visual to refer to.
  • It humbled the team; colors in the charting Question types were not as high in contrast as we’d come to assume.

We would need to choose more effective colors to fix those that had failed color contrast testing.

But first, how did we get there?

Evaluation and insight

Before jumping in and adding new colors to the product, we wanted to analyze the specifications in past design deliverables to better understand how we had arrived at the current solution.

We sourced related design work from the design folder and placed it into our Sketch file.

Placing the past design deliverables in proximity to one another revealed some slight inconsistencies in color. To be sure, we began the process of redrawing the interface components – accurately – in Sketch.

As we redrew the interface components, we used the Stark Sketch plugin to help us check color contrast. This confirmed that some of the input fields and navigation elements were indeed slightly lower in contrast, especially in the deeper layers of the skeleton plane.

Evaluating the past design deliverables in proximity to each other was insightful:

  • We could see how the interface design of Learnosity’s Question types had evolved over the previous 12 months and gained a shared understanding.
  • The side-by-side nature of viewing the deliverables with the current state made it easy to identify which components in the interface were shared, and see which colors were shared too.
  • We saw there were slightly inconsistent colors specified across the design deliverables.
  • When we compared the design deliverables to the screenshots we had documented earlier, we realized the design team was the root cause of some of the inconsistencies in the production environment.

Working toward a collective vision

We wanted to have an open discussion about how we chose colors, how we specified them, and how we were ensuring they were accurately implemented.

  1. We discovered we were not consistently specifying colors, or in the same formats, on each design deliverable.
    Moving forward, we agreed to visualize color, specify color in a consistent format, and review each other’s design deliverables to uphold quality.
  2. We discovered we had not made the importance of color clear in the specification, or in the development ticket’s acceptance criteria.
    Moving forward, we agreed to annotate intentions clearly on the design deliverables and check colors were made a priority on development tickets.
  3. We discovered we were lacking any documentation, or guidelines, for color style and color best practice (beyond our own knowledge).
    Not only had the evaluation process revealed slightly inconsistent colors in the interface, it had revealed a gap in our design process and design documentation.

In the current scope of work, we agreed to produce a lightweight document that communicated a baseline set of colors for Learnosity’s Question types.

The first iteration of this document would guide the design team in making color choices for interface design.

We planned to expand on this document later, making further iterations to help answer when to apply these colors and give more guidance.

First and foremost, if we were going to fix the color contrast issues we identified, then we would need an effective and accessible color palette.


In part two of the series, we discuss how we defined an accessible color palette and collaborated with engineers to implement new colors.