The Challenge
In today’s fast-paced academic environment, students face mounting pressure to learn effectively, yet many lack access to personalized support. To address this, we introduced Ace Study Assistant, an AI-powered chat feature offering tailored, on-demand learning assistance. Despite its potential, adoption was slower than expected.
Objective
Our goal for this project was to increase adoption of Ace Study Assistant by demonstrating its value in augmenting students’ learning experiences. 
High-level Goals: 
- Showcase the tangible benefits of AI-powered learning to students. 
- Design seamless pathways for integrating Ace into daily study habits. 
- Leverage data-driven insights to refine Ace’s features, aligning them with user needs. 
- Foster a culture of experimentation to enable continual growth and improvement.
My Role
I led the design initiatives for Ace Study Assistant's adoption strategy, collaborating with a multidisciplinary team, including a product manager, tech lead, user researcher, data scientist, and specialists in learning science and product marketing. I also worked closely with the engineering team during development and iteration to ensure alignment with user needs and technical feasibility.
User Research
With limited prior insights on students’ interaction with AI-powered tools, I partnered with our user researcher to explore how students perceive and use such technologies.
Table displaying some of the findings from user research
Our research revealed key pain points and preferences:
- Students valued tools that simplified complex concepts and offered quick, reliable assistance.
- Many expressed hesitation around AI, citing concerns about accuracy and reliability.
- Students wanted a clear demonstration of how Ace could help them achieve their academic goals.
These insights guided our approach, enabling us to quickly prototype and test targeted experiments to enhance and refine Ace’s user experience.
Design Experiments
To tackle the challenges uncovered during user research, I led a series of product design experiments aimed at making Ace Study Assistant more approachable, useful, and seamlessly integrated into students’ study workflows. Each experiment was designed to test specific hypotheses about what would drive adoption and improve user engagement.

Experiment 1: Smart Defaults
To address the needs identified during user research, we introduced four quick-action options on Ace’s interface, such as “Question Help.” These pre-defined options helped students immediately understand how Ace could assist them. We refined these defaults iteratively based on user behavior. For instance, our initial implementation included a “Fun Fact” option, but it saw low usage. Observing that students often used Ace to quiz themselves on course content, we replaced “Fun Fact” with a “Quiz Me” option, which significantly increased engagement.
Comparison of Ace Study Assistant before and after implementing smart defaults, designed to highlight the tool’s value to users.
Ace Study Assistant in context within Top Hat
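To make the iteration loop concrete, here is a minimal sketch of how quick actions like these can be modeled as a small, swappable config. Everything below is an assumption for illustration (only the "Question Help" and "Quiz Me" labels come from the experiment); the real implementation is internal to Top Hat:

```typescript
// Hypothetical data model for the quick-action "smart defaults".
// The ids and prompt templates are invented for illustration.
interface QuickAction {
  id: string;
  label: string;          // button text shown on Ace's start screen
  promptTemplate: string; // message sent to Ace when clicked
}

const smartDefaults: QuickAction[] = [
  {
    id: "question-help",
    label: "Question Help",
    promptTemplate: "Help me work through this question step by step.",
  },
  {
    id: "quiz-me", // replaced the low-usage "Fun Fact" option
    label: "Quiz Me",
    promptTemplate: "Quiz me on the key concepts from this page.",
  },
];

// Stub for the chat API; the real send function is internal to Top Hat.
function sendMessageToAce(message: string): void {
  console.log("To Ace:", message);
}

function onQuickActionClick(action: QuickAction): void {
  sendMessageToAce(action.promptTemplate);
}
```

Modeling the defaults as data rather than hard-coded buttons is what made iterations like the "Fun Fact" to "Quiz Me" swap cheap to ship and measure.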
Experiment Results
The results showed that 6% of users had Smart Defaults as their first interaction with Ace, and there was a 30% increase in messages sent per user when Smart Defaults were available. However, while Smart Defaults boosted engagement, they had little impact on activation, as 94% of users who interacted with them were already familiar with Ace. This underscored the ongoing challenge of low discoverability, prompting our next experiment to focus on improving Ace’s visibility.
Experiment 2: Ask Ace from Highlights
Highlighting is one of the most frequently used features in Top Hat, so we leveraged it as a natural entry point into Ace. By enabling students to interact with Ace based on their highlighted content, we seamlessly connected their existing study habits to Ace's capabilities. Additionally, we surfaced smart options, such as "Simplify Language," directly from the highlighted text for quicker assistance.
Highlight toolbar with the Ask Ace action menu expanded
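As a rough illustration of the mechanics, a browser-side entry point like this can hang off the standard DOM Selection API; the toolbar function below is a hypothetical stand-in, not Top Hat's actual code:

```typescript
// Sketch of a highlight-driven entry point using the DOM Selection API.
const HIGHLIGHT_ACTIONS = ["Ask Ace", "Simplify Language"] as const;

function showAskAceToolbar(text: string, actions: readonly string[]): void {
  // In the product this renders the Ask Ace menu anchored to the
  // highlight; here we just log what would be offered.
  console.log(`Highlighted: "${text}"`, actions);
}

document.addEventListener("selectionchange", () => {
  const text = window.getSelection()?.toString().trim() ?? "";
  if (text.length > 0) {
    showAskAceToolbar(text, HIGHLIGHT_ACTIONS);
  }
});
```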
Experiment Results
This experiment proved highly effective in driving adoption, resulting in a 40% increase in Ace usage. It underscored the value of optimizing high-traffic areas of the app to improve discoverability and conversion, ultimately helping us reach our usage goals.
Experiment 3: Improving Accessibility and Onboarding
Initially, Ace was one of many tools in the toolbar, which limited its visibility. To address this, we introduced a dedicated button for Ace with a clear call-to-action (CTA), making it more prominent and easier for students to discover. This small but impactful change significantly increased engagement by drawing attention to the tool.
In addition to improving visibility, we streamlined Ace’s start chat screen to make onboarding as simple and intuitive as possible. The redesigned interface enabled students to start a conversation quickly and with minimal effort, effectively removing barriers to first-time use. Together, these enhancements made Ace more accessible and approachable, driving both discovery and engagement.

Comparison of Ace Study Assistant's entry point before and after.

Ace in context, showing the dedicated button and improved start chat screen.

Experiment Results
Adoption increased by approximately 35% with the new dedicated button compared to the original toolbar placement.
Experiment 4: Just-In-Time Nudges
To proactively engage students, we implemented contextual nudges that expanded from Ace's entry point.
These nudges, whose trigger logic is sketched after the list below, included:
- Due Date Nudge: Encouraging students to ask Ace for help when an assignment was due soon.
- Incorrect Answer Nudge: Offering assistance when a student submitted an incorrect answer.
- Page Summarization Nudge: Suggesting Ace’s help with summarizing textbook pages.
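A simplified sketch of how these triggers might be prioritized follows; the data shape, the ordering, and the 24-hour window are assumptions made for illustration, not the production rules:

```typescript
// Simplified trigger logic for the three just-in-time nudges.
interface StudyContext {
  assignmentDueInHours?: number; // undefined when nothing is due
  lastAnswerCorrect?: boolean;   // undefined before any submission
  onTextbookPage: boolean;
}

type Nudge = "incorrect-answer" | "due-date" | "page-summarization";

function pickNudge(ctx: StudyContext): Nudge | null {
  // A wrong answer is the clearest "moment of need", so it wins.
  if (ctx.lastAnswerCorrect === false) return "incorrect-answer";
  // Assumed threshold: surface the nudge within 24 hours of a deadline.
  if (ctx.assignmentDueInHours !== undefined && ctx.assignmentDueInHours <= 24) {
    return "due-date";
  }
  if (ctx.onTextbookPage) return "page-summarization";
  return null; // no clear reason to interrupt, so stay quiet
}
```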

Incorrect Answer Nudge

Incorrect Answer Nudge in context

Experiment Results
The Incorrect Answer Nudge proved especially effective at driving engagement because it met students at a direct moment of need. During the experiment, the nudge produced an 18.18% increase in engagement and a 19.92% boost in conversion rates. Notably, 24.15% of new Ace users during this period were attributed to the nudge, and 63.75% of users who converted through it had never used Ace before.
Iterative Approach
Each of these experiments was designed and shipped within a single sprint. If an experiment didn't yield the desired results, we quickly iterated or pivoted to refine our approach. This nimble methodology allowed us to learn and adapt in real time, delivering meaningful improvements to the user experience.
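For context on how sprint-sized experiments like these are commonly gated, here is a generic sketch of deterministic variant assignment; this is a textbook pattern assumed for illustration, not a description of Top Hat's actual experimentation stack:

```typescript
// Deterministic bucketing: hashing the user and experiment names keeps
// each student in the same variant across sessions, so results from a
// one-sprint test are comparable end to end.
function assignVariant(userId: string, experiment: string, variants = 2): number {
  let hash = 0;
  for (const ch of `${experiment}:${userId}`) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0; // 32-bit rolling hash
  }
  return hash % variants; // 0 = control, 1 = treatment
}

const inTreatment = assignVariant("student-123", "smart-defaults") === 1;
```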
Impact
Before these improvements, Ace Study Assistant had an activation rate of 1.81%, with 5,859 active students (defined as those who sent Ace two or more messages). After the enhancements, the activation rate increased significantly to 10.0%, representing 32,448 active students.
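As a quick sanity check on these figures (assuming activation rate is simply active students divided by eligible students, a definition the case study implies but does not state):

```typescript
// Back-solving the eligible population from the reported numbers.
const eligibleBefore = 5859 / 0.0181; // ≈ 323,700 students
const eligibleAfter = 32448 / 0.100;  // ≈ 324,500 students
// Both imply a base of roughly 324,000 students, so the move from
// 1.81% to 10.0% reflects about 5.5x more active students on a
// stable base rather than a shrinking denominator.
```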
Next Steps
With adoption at an all-time high, we are now focused on expanding Ace’s functionality to offer even greater value to students. Our next initiative is to enable Ace to create adaptive study guides tailored to students’ unique strengths and weaknesses in their courses. This feature will further personalize the learning experience, helping students master challenging material and achieve their academic goals.

Storyboard I developed to secure stakeholder buy-in for our proposed next steps.
