Gutenberg Technology CMS

Redesigning a complex authoring experience to improve usability, reduce friction, and support scalable content creation through data-driven insights.
SERVICE

UX Evaluation & Design

CLIENT

Gutenberg Technology

DURATION

3 months, Sep - Dec 2025

MY ROLE

UX Designer & Researcher, Project Manager

TOOLS

Tobii, Figma, & Private Panel

TEAM

Gloria Y, Atharva N, Grace H, Karla S


Project Overview

Reducing Onboarding Friction in a Complex CMS

New authors struggled to onboard because the system’s workflows didn’t match how they expected it to work.

I led usability research using Tobii eye-tracking across 9 moderated sessions to uncover where users hesitated, misunderstood structure, and relied on trial-and-error. I translated these behavioral insights into high-fidelity design solutions that clarified system logic, streamlined project setup, and improved authoring efficiency.

Impact
↓ Reduced onboarding hesitation and scanning
↑ Faster time-to-first meaningful action
↓ Fewer backtracking behaviors during setup
↑ Higher task confidence reported in RTA & SUS
Defining the Challenge

Uncovering the Mental Model Gap: Why Authors Struggled to Get Started

The Initial Assumption vs. The Reality

The initial assumption was that onboarding friction was caused by interface complexity. However, early observations revealed a deeper issue: users struggled not because the system was complex, but because it didn’t behave the way they expected. Instead of feeling guided, they hesitated, misinterpreted structure, and relied on trial-and-error to move forward.

Validating Through Research

To understand where and why breakdowns occurred, I conducted an eye-tracking usability study with nine moderated sessions, combining behavioral observation with post-task reflection.

Screenshot of Participant Eye-Tracking RTA Session

Moderated Usability Testing (9 Sessions)

Observed how new authors approached project setup and content creation, identifying moments of hesitation, confusion, and repeated actions across key workflows

→ Revealed breakdowns in onboarding, misinterpretation of system structure, and reliance on trial-and-error to complete tasks

Tobii Eye-Tracking Analysis

Analyzed gaze patterns, gaze replays, scan paths, areas of interest (AOIs), and heatmaps to understand how users visually navigated the interface

→ Revealed where attention was misdirected, where users expected actions, and where visual hierarchy failed

Retrospective Think-Aloud (RTA)

Captured user reasoning after tasks to uncover gaps between user expectations and system behavior

System Usability Score (SUS)

Measured perceived usability, with a score of 60 (below the industry benchmark of 68), reinforcing behavioral findings of friction and low confidence
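For context on how the 60 above is derived: SUS is computed from ten 5-point Likert items with a standard scoring rule. A minimal sketch (the function name and example inputs are illustrative, not data from this study):

```typescript
// Compute a System Usability Scale (SUS) score from ten 1–5 Likert responses.
// Odd-numbered items are positively worded (contribution = response - 1);
// even-numbered items are negatively worded (contribution = 5 - response).
// The summed contributions (0–40) are scaled by 2.5 onto a 0–100 scale.
function susScore(responses: number[]): number {
  if (responses.length !== 10) {
    throw new Error("SUS requires exactly 10 responses");
  }
  const sum = responses.reduce(
    (acc, r, i) => acc + (i % 2 === 0 ? r - 1 : 5 - r),
    0
  );
  return sum * 2.5;
}
```

A fully positive response pattern (5s on odd items, 1s on even items) scores 100; uniform neutral responses (all 3s) score 50, which is why 60 sits well below the commonly cited benchmark of 68.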

Research: Key Insights & Recommendations

Where the System Breaks Down

Across testing, breakdowns were not isolated usability issues, but consistent patterns where the system’s structure, actions, and feedback were not clearly communicated.

These gaps caused users to hesitate, rely on trial-and-error, and struggle to build a reliable mental model of how the system worked.

INSIGHT 1 – INTERACTION AFFORDANCE GAP

Drag-and-drop in the TOC aligned with user expectations but lacked clear affordances

8 of 9 participants successfully reordered pages using drag-and-drop, confirming that the interaction matched their mental model. However, hesitation patterns and participant comments revealed uncertainty due to missing visual cues indicating what was draggable and where items could be dropped.

The interaction was conceptually intuitive, but visually under-signaled.

“It’s pretty intuitive to reorder things by dragging them, but there isn’t an icon here to mention that it is draggable.”

During testing, three inconsistent cursor states appeared, none aligning with standard drag indicators. This inconsistency introduced momentary confusion and reduced user confidence.

RECOMMENDATION 1

Standardize drag cues and reinforce interaction feedback

To reduce hesitation and increase user confidence, we recommend:

  1. Adopting platform-standard drag cursors for both Mac and Windows

  2. Adding a “Drag to reorder” tooltip on hover

These consistent cues make the interaction immediately recognizable, predictable, and aligned with user expectations.

Adding Windows and Mac standard dragging cursors and tooltips
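One lightweight way to realize these cues in a web UI is to map each interaction state to a platform-standard CSS cursor plus tooltip text. This is a sketch under my own assumptions, not GT's implementation; the helper name and states are hypothetical:

```typescript
type DragState = "idle" | "hover" | "dragging";

// Map each interaction state to the platform-standard CSS cursor keyword
// and the tooltip text to show. "grab" and "grabbing" render as the native
// drag cursors on both Mac and Windows browsers.
function dragCue(state: DragState): { cursor: string; tooltip: string | null } {
  switch (state) {
    case "hover":
      return { cursor: "grab", tooltip: "Drag to reorder" };
    case "dragging":
      return { cursor: "grabbing", tooltip: null }; // hide the tooltip mid-drag
    default:
      return { cursor: "default", tooltip: null };
  }
}

// Applying a cue to a TOC row (browser context, element name hypothetical):
// row.style.cursor = dragCue("hover").cursor;
// row.title = dragCue("hover").tooltip ?? "";
```

Centralizing the mapping in one place is what prevents the inconsistent cursor states observed in testing: every draggable element reads from the same table.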

INSIGHT 2 – STRUCTURE NOT LEGIBLE

Users were confused when editing in the TOC, leading to disorganization in the TOC

3 of 9 participants misinterpreted the TOC hierarchy, frequently adding a page instead of a section.

Confusion stemmed from inconsistent terminology, unclear visual hierarchy, and labels that did not align with users’ mental models. Participants struggled to distinguish structural elements from content-level items, leading to misplaced pages and fragmented organization.

“I’m not sure if I’m adding a new page or something inside this page. These labels don’t really tell me what’s happening.”

The gaze replay below showed hesitation around the Add dropdown and menu options, indicating uncertainty during structural decisions. The TOC did not clearly communicate hierarchy, nesting rules, or the relationship between modules, sections, and pages.

“What bothered me was the fact that section was in all caps and the rest of stuff was just lower case.”

Gaze replay shows that Participant 3 opened the menu and selected Pages despite seeing Sections

RECOMMENDATION 2

Make Structure Visible and Align Terminology with User Mental Models

I redesigned the dropdown to reflect a clear, familiar content hierarchy:

  • Introduced visual nesting (indentation + grouping)

  • Standardized terminology to match user expectations

  • Reduced visual ambiguity between headings and actions

  • Emphasized primary action (“Add”) for clarity

Additional improvements:

  • Added color labels to support content organization

  • Introduced starred items for quick access to key pages

“Adding stars or colored tags to the TOC could be helpful if people have a lot of content.”

INSIGHT 3 – FORCED EARLY DECISION

Users were confused during project setup, creating early friction

Users encountered immediate confusion at the start of the authoring flow. The label “create a project from scratch” conflicted with the mandatory template selection step, leaving participants unsure why they had to choose a template, especially when only one option was available. This mismatch disrupted onboarding, increased cognitive load, and made the CMS feel less intuitive from the very first interaction.

8 out of 9 participants were new to GT CMS, and 6 struggled to understand why template selection was required.

“Select template but there’s just 1 template so… i just went ahead with it because i didn't know otherwise”

The onboarding flow signaled flexibility, but enforced constraint — undermining user confidence early in the experience.

Gaze Replay from Participant 5 shows confusion caused by the mismatch between “create a project from scratch” and being required to choose the only available template.

RECOMMENDATION 3

Adding a "Start from Scratch" option to reduce confusion and allow users to create a project without selecting a template

I introduced a “Start from Scratch” option to support a more flexible entry point:

  • Allows users to begin without selecting a predefined template

  • Includes supportive copy: “Begin with no predefined layout or content”

  • Reduces early decision-making pressure

  • Aligns with user expectation of starting from a blank state

INSIGHT 4 – LACK OF SYSTEM TRANSPARENCY

Users misinterpreted the system-generated default page, slowing the authoring workflow

Users frequently misunderstood the purpose of the system-generated default page. Many interpreted its heading blocks as actual sections of their project rather than placeholder content. This misinterpretation led users to begin editing the page directly instead of structuring their project in the Table of Contents (TOC), which significantly slowed progress and disrupted the intended workflow.

5 out of 9 participants mistook the system-generated default page for the actual project structure, using its heading blocks as sections.

“At first, I was confused with as to what this page was. Then I realized it must have come from the template that I was using.”

Task 1 Success Rate: 0.7 (lowest across all tasks)

Task 1 Avg. Completion Time: 4.6 minutes (highest across all tasks)

Gaze Replay from Participant 2 shows direct editing of the system-generated default page

RECOMMENDATION 4

Relabeling the system-generated page and adding an info tooltip to clarify its purpose

I made the system’s behavior explicit through labeling and guidance:

  • Renamed the page to “Default Template Page” to reflect its origin

  • Added a contextual tooltip: “This page is automatically generated from your template. Use the Table of Contents to build your structure.”

  • Positioned the tooltip near the page title to provide guidance at the moment of confusion

INSIGHT 5 – HIDDEN INTERACTION MODEL

Drag-and-Drop to add content was not discoverable, making organizing work difficult

5 out of 9 participants clicked content blocks instead of dragging, showing that the intended drag-and-drop interaction was not discoverable.

“On other websites they write something to tell the user to drag the content, but nothing mentioned here.”

Gaze Replay from Participant 4 shows repeated clicking on content blocks instead of dragging.

RECOMMENDATION 5

Add instruction and hover tooltip to clarify “Drag to Add” function in the right panel

I introduced lightweight guidance to make the interaction immediately understandable:

  • Added instructional text: “Drag blocks into the canvas to add content”

  • Introduced a hover tooltip (“Drag to Add”) on content blocks

  • Reinforced drag behavior through cursor and hover states

These changes clarify interaction without adding friction or clutter.
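The "Drag to Add" pattern described above can be wired with a pure state update plus standard HTML5 drag-and-drop events. This is an illustrative sketch only; the block and canvas names are hypothetical and the real CMS internals are unknown:

```typescript
// Pure state update: dropping a block appends its type to the canvas contents.
// Keeping this separate from the DOM wiring makes the behavior testable.
function addBlock(canvas: string[], blockType: string): string[] {
  return [...canvas, blockType];
}

// Browser wiring (HTML5 drag-and-drop), shown as comments since it needs a DOM:
// block.draggable = true;
// block.title = "Drag to Add";  // hover tooltip from the recommendation
// block.addEventListener("dragstart", (e) =>
//   e.dataTransfer?.setData("text/plain", block.dataset.type ?? ""));
// canvasEl.addEventListener("dragover", (e) => e.preventDefault());
// canvasEl.addEventListener("drop", (e) => {
//   e.preventDefault();
//   state = addBlock(state, e.dataTransfer?.getData("text/plain") ?? "");
// });
```

Because the instructional text and tooltip are declarative attributes rather than new interaction steps, they clarify the affordance without adding friction.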

Outcome & Impact

From Guesswork to Guided Authoring

The redesigned experience shifted the CMS from a system users had to interpret into one that supports clear, predictable workflows.

Experience Improvements

  • Reduced hesitation during onboarding workflows

  • Improved understanding of content structure and relationships

  • Increased confidence in completing authoring tasks

Product Impact (Behavior-Based Metrics)

↓ Reduced scanning and hesitation before action
↓ Fewer backtracking behaviors during setup
↑ Faster time-to-first meaningful action
↑ Higher task confidence reported in RTA & SUS
Bringing the Results Back to Our Client

We presented our findings and recommendations to the GT team during a final readout, walking them through the key usability breakdowns, supporting evidence, and proposed design improvements. After the presentation, we shared a follow-up email that included all final deliverables—our slide deck, session recordings, highlight reels, rainbow spreadsheets, recommendation mockups, and research documentation—ensuring the team had everything needed to continue the work internally.

The client expressed that the insights were timely and valuable, especially as they prepare to refactor the authoring flow in the coming year.

“Great to have a fresh view on something we’re so accustomed to, especially because we’re hoping to refactor our creation flow next year. This is going to be very useful for us for our upcoming work.”

— Gutenberg Technology Team

Let's make something good :)

© 2026 Designed by Gloria Yang
