Michigan Online

Increasing the transparency of ratings and reviews for an online course catalog

I performed comparative analyses, ideated design solutions, and conducted usability tests to increase user satisfaction with online course ratings and reviews.

  • UX Designer

  • UX Researcher

  • February 2020

View Live Site

Michigan Online is your destination for global, lifelong, and engaged learning with the University of Michigan.

The Problem

Course pages did not display rating and review information in sufficient detail.

Why It Matters

A well-designed review section contributes to an organization's or product's overall reputation and increases consumers' trust in the quality of their purchase. Visibility into the full range of reviews (not just the most positive ones) also improves the credibility of the reviews themselves.

The Goals

To address this challenge, I needed to:

  1. Provide a fuller picture of the range of ratings

  2. Make reviews more browsable

How I Made It Happen

Competitive Analysis

I first examined how three other online learning sites present ratings and reviews for their content.

Prototyping

I then explored two approaches: one that kept all rating and review information on the main course page, and one that separated the exploration experience into two pages (offering surface-level information on the main page and a list of full reviews on a different page).

The first design minimized clicks but increased scrolling on the page as a whole. The second design provided a more progressive exploration of the reviews, but obscured the full set of reviews behind an additional click that some users might miss.

Usability Testing

After that, I conducted five usability tests with undergraduate students experienced with online learning. Findings included:

  • Participants were comfortable going to a second page to see all reviews, since it made the main course page less overwhelming.

  • Participants were sometimes unaware that the histogram and star ratings were interactive. One person also suggested displaying the percentage of total ratings each level represented.

  • Some participants expressed interest in knowing more about the people who wrote reviews, in order to determine if the feedback was coming from a person similar to them.

Final Iteration

In the final iteration, I incorporated this feedback to better communicate each rating level's interactivity and its proportion of the total ratings.
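
As a concrete illustration of the proportion display, here is a minimal TypeScript sketch of how the percentage shown beside each histogram bar could be derived from raw ratings. The data shape and names are hypothetical, not Michigan Online's production code.

```typescript
// Hypothetical sketch: compute the share of total ratings at each star level,
// the kind of figure the final histogram surfaces next to each bar.
type StarLevel = 1 | 2 | 3 | 4 | 5;

interface HistogramRow {
  stars: StarLevel;
  count: number;
  percent: number; // share of all ratings at this level, rounded to 1 decimal
}

function buildHistogram(ratings: StarLevel[]): HistogramRow[] {
  const counts = new Map<StarLevel, number>([[1, 0], [2, 0], [3, 0], [4, 0], [5, 0]]);
  for (const r of ratings) {
    counts.set(r, (counts.get(r) ?? 0) + 1);
  }
  const total = ratings.length;
  // List the five-star row first, matching the usual top-down histogram layout.
  return ([5, 4, 3, 2, 1] as StarLevel[]).map((stars) => {
    const count = counts.get(stars) ?? 0;
    const percent = total === 0 ? 0 : Math.round((count / total) * 1000) / 10;
    return { stars, count, percent };
  });
}

// Example: with 10 ratings, the 5-star row reads "6 ratings (60%)".
console.log(buildHistogram([5, 5, 5, 5, 5, 5, 4, 4, 3, 1]));
```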

Unexpected Constraint

I experimented with adding more biographical information about each reviewer, but data limitations prevented me from providing these details: Michigan Online sources its reviews from third-party platforms (e.g., Coursera) that don't pass that information along.

The Result

The outcome of this effort was a more interactive, transparent course rating section that allowed prospective learners to explore the full set of reviews for a given learning experience.

Now that the design is implemented, I'd like to collect quantitative measurements of user satisfaction and monitor click-through rates between the course and review pages to gauge usage.