Unlocking insights in classroom observation reporting data

Case Study
Observation Report interface

Overview

The main objective of this project was to address low utilization of and engagement with the company’s reports for learning walks: facilitated events where school leaders observe classrooms to gather evidence for a set of indicators.

This effort was an integral part of a six-month project to refine and expand the existing learning walks platform. The goal was to increase utilization of the reporting functionality among school leaders by improving the user experience and enhancing the actionable insights gathered from observation data.

Solution Highlights

To resolve the identified pain points, the solution focused on three main enhancements:

  • Interactive features that enable school leaders to tailor data views and comparisons for deeper, relevant insights.
  • A dual-report system to support needs for event-specific observation reporting and for observation reporting over time, enabling the identification of trends and patterns.
  • Improvements to data visualization so that all of the collected data is accessible and clearly represented to enable informed, data-driven decisions.

Discovery

Screenshot of existing page

Existing learning walks report page

Quantitative data analysis showed that school leaders rarely returned to reports over time. In discovery, I documented several usability issues deterring continued use:

  • Data Comparison: It was difficult for school leaders to compare results across learning walk events or to view data over time in the platform. Impact: Inhibited the ability to measure impact and track growth.
  • Data Visualization Limitations: The report only displayed data charts for each indicator individually, lacking summarized data, percentage breakdowns, or high-level aggregates. Evidence data was displayed in pie charts, which aren't optimal for the data type and consumed significant page space, preventing users from viewing data for multiple indicators at once. Impact: Users struggled to understand high-level insights without manual data synthesis.
  • Filter Misalignment: Filtering options were sparse and reflected the internal company learning taxonomy rather than user needs, so they often went unused. Impact: Users could not easily manipulate data based on their needs.
  • Information Architecture: Styling was applied inconsistently in the interface and the layout was overly complex, making the data difficult to parse. Impact: Increased cognitive load for users.
  • Report Sharing: Reporting pages could only be accessed by users with specific permissions managed by the customer service team; sharing relied on PDF downloads. Impact: Limited ability to share data.

Internal teams had an additional workflow pain point: the customer service team was manually summarizing data and identifying trends and patterns from report data across learning walk events, spending considerable time and effort across accounts.

Planning

In the planning phase, I worked closely with the product manager and the software engineer responsible for greenfield projects to synthesize improvements that addressed the identified pain points and increased the usability of the report.

I outlined a design solution that introduced new features and usability improvements into the platform, prioritized clear data visualization, and defined differentiated experiences based on user roles.

User Flows

I designed user flows for observers (users observing classrooms), school leaders (users with access to all reports), and facilitators (company employees who facilitate learning walk events). The flows consisted of learning walk event creation, email notifications, event delivery, and reporting. Detailed permissions were necessary to ensure school leaders and facilitators had comprehensive data access while maintaining privacy and ensuring observations were non-evaluative for the educators who were observed.

User flows

User flows for observers and facilitators

Reporting Structure

The existing platform could only combine data from multiple events; it had no functionality to compare event data or view data over time. Another limitation was that observations could only be submitted during a facilitated event. As part of the larger project to rebuild the learning walks platform, the ability to submit observations at any time was added. This unlocked additional data and highlighted the need to design for longitudinal data, which led me to create a dual-report design supporting two distinct reporting needs for school leaders:

  • Event-Based Report: Includes data from a single learning walk event.
  • Indicator-Based Report: Includes all data for a set of indicators, allowing leaders to view and compare data across events and individually submitted observations.

These reporting views ensured that the platform supported both in-the-moment analysis and long-term trend tracking.

Overview tab

Event-based report

Over Time tab

Indicator-based report

Design Validation and Iteration

To validate the design before engineering implementation, I built high-fidelity prototypes of the reports with data from recent events for customers selected for alpha testing. These prototypes were shared with the customers and the solution designer overseeing the pilot for initial feedback, enabling validation of the core information hierarchy and data visualization.

Alpha testing was conducted on the initial build with three key customers. Based on feedback, I made refinements to data comparison features, comment organization, and filtering options, which were implemented in the final solution.

In-App Guidance

I planned and built in-app guidance in the Userpilot platform for each user role. This consisted of tours to introduce new features, tooltips to highlight updates to existing functionality, and surveys as a mechanism to continuously gather insights.

Screenshots of in-app guidance screens

Planned in-app guidance

Solution

The final solution addressed all of the identified pain points, transforming learning walk reporting into a robust platform:

  • Data Comparison: The new indicator-based report allows school leaders to choose how to compare data gathered over time:
      • Event Comparison: Enables comparison of data from two or more selected events by indicator.
      • Time Comparison: Enables comparison of data aggregated by week or by month.
  • Expanded Data Inclusion: Display of event data, a data summary, and percentage breakdowns for all values. Stacked bar charts replaced pie charts, using page space more efficiently so that data for multiple indicators can be viewed at once.
  • Advanced Filtering: All report data dynamically updates based on selected filters. Filters populate based on classroom details entered during observations.
  • Interface Styling: Improved organization of elements, visual hierarchy, and intentional color usage in the layout.
  • AI Data Summary: A summary of the report data generated by the company's native AI, reducing the time spent by internal teams manually summarizing the data.
  • Share Report: Share a link to the report with other school leaders, with the ability to display or hide the filtering capabilities.
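To make the advanced filtering behavior concrete, here is a minimal TypeScript sketch of the idea described above. The data shapes and field names are assumptions for illustration (the case study does not show the platform's data model): every view derives from one filtered list, so all charts update together, and filter options populate from classroom details entered during observations rather than a fixed taxonomy.

```typescript
// Hypothetical shapes; the real platform's data model is not shown in the case study.
type Observation = {
  indicator: string;     // e.g. an instructional indicator being observed
  evidenceSeen: boolean; // whether evidence of the indicator was observed
  gradeLevel: string;    // classroom detail entered by the observer
  subject: string;       // classroom detail entered by the observer
  observedAt: Date;
};

type Filters = Partial<Pick<Observation, "gradeLevel" | "subject">>;

// All report data derives from one filtered list, so every chart and
// summary updates together when a filter changes.
function applyFilters(observations: Observation[], filters: Filters): Observation[] {
  return observations.filter((o) =>
    Object.entries(filters).every(
      ([key, value]) => o[key as keyof Filters] === value
    )
  );
}

// Filter options populate from the observation data itself, rather than
// from an internal taxonomy users don't recognize.
function filterOptions(observations: Observation[], key: keyof Filters): string[] {
  return Array.from(new Set(observations.map((o) => o[key]))).sort();
}
```

Deriving filter options from the data also means empty options never appear, which addresses the original filter-misalignment pain point.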
Overview tab

Overview tab on the Report page

Responses tab on the Report page

Responses tab

Responses tab on the Report page, with filters applied

Responses tab - filtered

Interface to share the report page

Share Report functionality

Mobile learning walk report

Responses tab on the Report page on a mobile device

Mobile responses tab

Filter drawer on the Report page on a mobile device

Mobile filter drawer

Outcome

This project was an integral step in the company's strategic shift from a service-centric model to a scalable SaaS provider and informed the product roadmap for the upcoming year. The utility of historical and future observation data was unlocked by the introduction of the dual-report system, data comparison views, and improved data visualization.

I created page event tracking and in-app surveys for the following metrics to track the success of the new reports post-launch:

  • User Engagement: Increased frequency of school leaders returning to report pages over time.
  • Feature Adoption: Increased utilization of advanced filtering options and report sharing functionality.
  • User Experience: Improved NPS from school leaders due to enhanced data utility.
  • Operational Efficiency: Reduction in the time the customer service team spends manually synthesizing learning walk summaries and insights from reporting data.
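As a sketch of what page event tracking for the User Engagement metric involves: the case study used Userpilot, whose API is not shown, so the function names and in-memory store below are illustrative assumptions only. The core idea is recording page views per user and counting distinct return days.

```typescript
// Illustrative only: the real implementation used Userpilot's tracking,
// not this in-memory store. Names here are hypothetical.
type PageEvent = { userId: string; page: string; at: Date };

const events: PageEvent[] = [];

// Record a page view (e.g. fired when a report page loads).
function trackPageView(userId: string, page: string, at: Date = new Date()): void {
  events.push({ userId, page, at });
}

// Engagement signal: on how many distinct days did this leader
// return to the given report page?
function returnDays(userId: string, page: string): number {
  const days = events
    .filter((e) => e.userId === userId && e.page === page)
    .map((e) => e.at.toISOString().slice(0, 10)); // YYYY-MM-DD
  return new Set(days).size;
}
```

Counting distinct days rather than raw views avoids inflating the metric when a leader reloads a report several times in one sitting.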

This effort transformed classroom observation reports from an underutilized data archive into a dynamic resource, enabling the platform's evolution into an essential analytical tool to support school districts in teacher development and classroom growth.

