Helene Giraud, UX & UI digital art director

— AGENCY

Cloud Academy


— DATE

2022

Back in 2019, Cloud Academy launched Knowledge Check, a series of five questions at the end of a course to help users validate what they have learned.

01. PROBLEM STATEMENT

Over 50% of NPS detractors complain about a lack of guidance during the learning process.


One of the main reasons for the lack of guidance is that after a knowledge check the user is not offered an option to go to the next step of the learning path.

There are plenty of other issues, such as:  


• The user interface replicates the wrong entity color  

• Inconsistent experience

• Information overload

• Waste of space

• The feedback on the answers is misleading


02. USERS

We have two types of users: individual and enterprise.

Both turn to Cloud Academy to bolster their skills and get educated on emerging tech, either on their own initiative or on their organization’s.

03. ROLES AND RESPONSIBILITIES

My role as the only Product Designer on this project was to hit two main objectives:


  1. Create one consistent experience that fully integrates the knowledge check as part of the learning path.

  2. Reduce design and technical debt by setting up foundations.

I was responsible for designing, prototyping and testing the end-to-end design flow. I also worked closely with the tech team to build out the final experience.


04. SCOPE AND CONSTRAINTS

Over the years, little consideration had been given to the mobile experience or its best practices.


The scope of work includes:

  • Giving access to and showing the result explanation
  • Improving answer feedback to differentiate correct/wrong answers from selected ones
  • Removing the timer
  • Following iOS best practices


The scope of work does not include a rebuild of the flow.

PROCESS & CONTRIBUTION

01. DISCOVERY

I gathered information about our main competitors in order to gain insight into their quiz features.


What did I find?

  • They have a strong CTA guiding users to the next step
  • They removed all unnecessary information
  • The feedback given is very clear
  • Most of them include a positive word of encouragement


02. DESIGN

Why skip the wireframes?

Firstly, the results feedback requires interface design to be tested properly: users need to be presented with a functional prototype close to the real experience in order to perform the tasks correctly.


Moreover, the scope of work doesn’t allow for a flow change.



The first step is to update the current design to make it compliant with our design system.

BEFORE

AFTER

The first prototype is ready to be tested

03. USER TESTING

I wrote a scenario with 4 tasks for users to complete.


The goal of the test is simple: we want to validate that users can understand their quiz results when multiple answers are required, and that they know what their next step is from there.


Each task is scored from 1 to 3 (1: successfully completed the task; 2: completed the task with some doubts; 3: failed to complete the task).



I was initially optimistic, thinking I would only need to make many small changes to the current design to improve it, but testing showed that even though usability was good, understanding the answer results at a quick glance was far more complicated than I imagined!



The biggest challenge came from differentiating the answers a user selected that were correct from the ones that were correct but not selected.

One user commented: "The fact that there is 2 outline answers, 1 grey and 1 green, makes you work to understand what the different types meant." 


More than 70% of our users did not understand the feedback icon.

04. ITERATE

Together with the project manager, we decided to change the color of the entity to remove the idea of a quiz (and the pressure it can put on the student) and reinforce the idea of knowledge validation.


Each of our learning entities has its own color system (courses are blue, labs are green, quizzes are orange, and so on), so I wanted the knowledge check to follow the course color system.

It took another couple of rounds of iteration and testing to finally reach a state where it was ready to be released to our users.

OUTCOMES & LESSONS

I really enjoyed this project because I was able to have multiple rounds of user testing and discovered a bit more about the brain and its learning process. 


Working within tight constraints, such as technical and design debt or limited access to users, led me to find creative solutions. The first round of users were people around me, recruited based on the persona; they didn’t know the platform.


The outcomes proved to be good: the platform registered an increased rate of learning path completion and an increased rate of exam success.