Thumbnail cover.jpg

Making reading feel possible again: The story of building a literacy assessment

 

Role:

Responsibilities:

Industry:

Team:

Timeline: 2 months

Overview

This project is an AI-driven digital reading platform designed to support students who struggle with reading, particularly dyslexic learners. I was brought on as the Product Designer to lead the creation of its first literacy assessment: an engaging, speech-enabled benchmark that would personalize the app experience and set the visual and functional tone for the entire platform.

The Challenge

Reading can feel frustrating and inaccessible for dyslexic students. Many prefer drawing or visual activities and often disengage from traditional reading tasks. The goal was to design a literacy assessment that:


  • Gauges each student’s reading level

  • Personalizes their future learning journey

  • Is screen-reader accessible and feels playful and encouraging from the start


This first-time experience also had to be inclusive, intuitive, and set the UI foundation for the rest of the LUCA app.

Design Goal

Create an assessment experience that accurately measures reading ability while being intuitive, accessible, and engaging for dyslexic learners.

iPad Pro (Landscape).png
iPad Pro (Landscape) cc.png
final mockup 1.png

What We Achieved

92% 

Phonetic error accuracy 

 

83%

Educator satisfaction
(with phonetic progress report)

73%

Task completion rate

Research and Explorations 

Understanding dyslexia

I began with primary research to understand how dyslexic students experience reading. Dyslexia affects fluency, decoding, and comprehension, often making traditional reading tools feel inaccessible and overwhelming.

Identifying user needs

From there, I focused on core user pain points:


  • Many students felt anxious or frustrated by long instructions and dense text

  • They often lost focus when navigating multi-step tasks or cluttered interfaces

  • There was a strong preference for listening and visual cues over reading

These insights led to design choices like voice-guided interactions, minimal text, and simplified navigation.

Applying the Orton-Gillingham Approach

I drew from the Orton-Gillingham approach, a structured, evidence-based reading method designed for people with dyslexia. Its emphasis on multisensory learning (sight, sound, movement), phonemic awareness, and a step-by-step instructional sequence directly shaped how we designed the assessment flow, feedback mechanics, and interaction patterns of LUCA’s reading assessment. The approach also helped ensure that the experience was accessible, consistent, and aligned with how dyslexic students learn best.

Mapping the Assessment Journey

I reviewed a detailed Learning Assessment (LA) document created by our Learning Innovation Manager and translated it into a clear, actionable user flow for the prototype. This helped ensure that each step accurately measured students’ reading ability while keeping the experience intuitive, easy to follow, and age-appropriate for young learners.

User Flow Part 1.png

Design Principles

Accessibility

Voice-First

Personalization

Engagement

Design Constraints

Designing the assessment meant balancing multiple priorities across teams:


  • CEO prioritized a fast launch and consistent branding.

  • Accessibility Experts required refinements for screen-reader and TTS support.

  • Learning Innovation Manager emphasized pedagogical accuracy and reading-level integrity.


These real-world priorities directly shaped how I approached interaction patterns, accessibility features, and user autonomy, especially for first-time readers.

Design Decisions

After understanding the needs of our users and priorities from the team, I made several key design decisions to make the assessment easy to use, accessible, and voice-first. Each decision was guided by real questions that came up during the process, helping us create an experience that felt clear, supportive, and student-friendly.





As part of the early design process, I spoke with accessibility experts to better understand how dyslexic students interact with digital tools. Through those conversations, I uncovered several key challenges, like how screen readers handle instructions, how font size and contrast affect readability, and how much control students need over audio playback. These insights raised important questions that directly shaped our design decisions, starting with one big one...

♿ Accessibility Audit

The Big Question

How will kids with lower reading proficiency read the written instructions on the screen on their own?

1. Voice-first Interaction

💭 The Problem

Students with dyslexia may struggle to even begin a task if they can't read on-screen instructions. This raised an immediate need for a supportive, accessible interaction model that didn’t rely solely on text.

✅ Design Decision

Text-to-speech (TTS) was implemented throughout the app, and multiple small speaker icons were replaced with a single, large animated speaker button in the header. This allowed students to control when to hear instructions, whether from the support robot, story narration, or UI, making the experience feel less overwhelming and more empowering.

BD Before.png

Before: No Unified Audio Control System

BD After.png

After: Unified Audio Control

The Second Big Question

Should the support robot start speaking automatically, or should students have control over when to listen?

2. Accessibility Meets Interaction Design

💭 The Problem

Originally, the support robot began speaking automatically when a screen loaded. While intended to be helpful, this raised accessibility concerns: unexpected audio could catch students off guard or disrupt their focus.

✅ Design Decision

Manual control was added by linking the robot’s narration to the speaker button as well as the robot character itself. This gave students the ability to start instructions on their own terms, either by tapping the button or the robot, reducing surprise, preventing audio overlap, and creating a calmer, more inclusive experience.

BD 2.png
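The single-control, no-overlap behavior described above could be sketched roughly as follows. This is a minimal illustration only; the `SpeakerButton` class, source names, and logging are assumptions for clarity, not LUCA's actual code:

```python
# Minimal sketch of a unified audio control: one button (or a tap on the
# robot) starts narration, and starting a new source stops the current one
# so audio never overlaps. All names here are illustrative.

class SpeakerButton:
    """Single control that plays at most one audio source at a time."""

    def __init__(self):
        self.current = None  # name of the source currently "playing"
        self.log = []        # ordered record of play/stop events

    def play(self, source):
        # Stop whatever is playing first, so narration never overlaps.
        if self.current:
            self.stop()
        self.current = source
        self.log.append(("play", source))

    def stop(self):
        if self.current:
            self.log.append(("stop", self.current))
            self.current = None


# Example: the student taps the robot, then the speaker button mid-narration.
button = SpeakerButton()
button.play("robot intro")
button.play("story page 1")  # robot intro is stopped before the story starts
```

Tapping either the robot character or the speaker button would route through the same `play` call, which is what keeps the experience calm and predictable.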

The Third Big Question

How do we reduce cognitive load through visual hierarchy and layout?

3. Simplifying the Control Panel

💭 The Problem

An overloaded control panel made it harder for students to focus and understand the next step, creating unnecessary cognitive load during the assessment.

✅ Design Decision

The control panel was streamlined by removing non-essential actions and simplifying the layout. While tap targets were already accessible, the redesign emphasized visual clarity and clear hierarchy, helping students stay focused and better align with the task’s mental model.

BD 3 Before.png

Before

BD 3 Afterr.png

After

The Final Big Question

How can we keep students motivated and emotionally engaged during a multi-step assessment?

4. Gamification for Motivation

💭 The Problem

Literacy assessments can feel long, high-pressure, or even boring, especially for young readers with learning challenges. Without encouragement, students might disengage before completing both parts of the assessment.

✅ Design Decision

To boost motivation, we introduced small wins through visual rewards, like celebration screens, badges, and encouraging messages after each milestone. These moments helped break up the experience, making it feel achievable and playful while nudging students to keep going.

BD 4.png

Final Designs & Prototypes

With key design decisions finalized, I translated them into a polished, developer-ready prototype that captured both the interaction flow and visual tone of the assessment.

 

The following highlights show how each part of the assessment came together, along with subtle accessibility refinements, to support a smooth design-to-dev handoff and an inclusive student experience.

Part 1: Grapheme-Phoneme Mastery

Students first read individual letters to demonstrate basic letter-sound recognition, then progress to reading real and pseudo-words aloud.

The platform uses speech recognition to detect phoneme-level errors and adapt difficulty in real time. This ensures each student is placed at the right level while keeping the experience playful, accessible, and fast-moving.

Voice recordings are saved automatically when users press ‘Next’ to continue. They can tap ‘Retry’ to re-record or use ‘Pause’ to take a break.
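As a rough illustration of the phoneme-level scoring and real-time difficulty adaptation described above, the sketch below compares the expected phoneme sequence against what speech recognition heard, using an edit distance, and nudges the level up or down. The function names, thresholds, and one-level steps are assumptions for illustration, not the platform's actual algorithm:

```python
# Hypothetical sketch: count phoneme-level errors with a Levenshtein
# distance, then adjust difficulty from the resulting error rate.
# Phonemes are represented as lists of strings, e.g. ["k", "a", "t"].

def phoneme_errors(expected, heard):
    """Minimum insertions, deletions, and substitutions between sequences."""
    m, n = len(expected), len(heard)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i
    for j in range(n + 1):
        dp[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if expected[i - 1] == heard[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # deletion
                           dp[i][j - 1] + 1,        # insertion
                           dp[i - 1][j - 1] + cost)  # substitution
    return dp[m][n]


def next_level(level, expected, heard):
    """Move the student up or down one level based on the error rate."""
    error_rate = phoneme_errors(expected, heard) / max(len(expected), 1)
    if error_rate <= 0.1:
        return level + 1           # mastered: increase difficulty
    if error_rate >= 0.5:
        return max(1, level - 1)   # struggling: ease off
    return level                   # mixed result: stay at current level
```

In the real system the recognized phonemes would come from the speech recognition service after the student's recording is saved.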

Part 2: Fluency in Stories

This prototype simulates a personalized story experience, tailored to each student's skill level based on Part 1.
The system tracks accuracy, fluency (WPM), and prosody, using only mastered phonemes.

Pages are read aloud in sequence to mimic a book-like flow, reducing friction, boosting confidence, and keeping the experience playful yet focused.
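A minimal sketch of how the per-page accuracy and WPM figures mentioned above might be computed. The function names, rounding, and report shape are illustrative assumptions, not the platform's actual scoring pipeline:

```python
# Illustrative fluency metrics: words per minute and word-level accuracy,
# combined into a simple per-page summary.

def words_per_minute(words_read, seconds_elapsed):
    """Oral reading rate, scaled from elapsed seconds to a per-minute figure."""
    return words_read * 60.0 / seconds_elapsed


def word_accuracy(correct_words, total_words):
    """Fraction of words on the page read correctly."""
    return correct_words / total_words


def page_summary(correct_words, total_words, seconds_elapsed):
    """Combine both metrics into one per-page report entry."""
    return {
        "wpm": round(words_per_minute(total_words, seconds_elapsed), 1),
        "accuracy": round(word_accuracy(correct_words, total_words), 2),
    }


# Example: a 50-word page read in 30 seconds with 45 correct words.
summary = page_summary(correct_words=45, total_words=50, seconds_elapsed=30.0)
```

Prosody is harder to reduce to a formula and would come from the speech analysis service rather than simple arithmetic like this.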

Extras

Font size and speech speed controls were moved into a settings button in the header, making them easier to access while keeping the main interface clean and focused.

Future Improvements

In the future, the accessibility settings could offer more choices to support different needs, like adjusting text spacing, switching to high-contrast mode, or changing the TTS voice's tone. A read-along pointer could also help students follow along more easily. These updates would make the experience even more comfortable and personalized for each learner.

Child on Tablet

What I learned


Designing this assessment experience taught me how to design with care for students with dyslexia. It meant rethinking familiar UX patterns and focusing on accessibility from the start, not as a layer on top.

 

Voice-first interaction was especially eye-opening: crafting something that doesn't rely on visuals felt like designing for the future. Collaborating closely with educators and accessibility specialists also reminded me that great design often comes from shared perspectives, not just a single skillset.
