Quantitative & Qualitative Product Research


[Overview]
The Context
Across both enterprise SaaS and AI-enabled coaching products, I rely on rigorous mixed-methods research to justify design decisions and shape product strategy. In this case study, I demonstrate how I validated the desirability and financial viability of a User Management Tool using quantitative service data, and how I used deep qualitative research to define the design direction for the Instructor Connect feature in the Coachi ecosystem — ensuring every design decision was defensible, traceable, and grounded in data.
Problem Statement
Design solutions often fail when they are built on assumptions rather than evidence. Without quant data, teams risk building products with no business case; without qualitative depth, teams risk building solutions misaligned to human needs. I needed research methods that not only surfaced insights, but also created a clear through-line from data → implications → product decisions.
Skills Demonstrated
UX Research Strategy
Mixed-Methods Research
Service Blueprinting
Qualitative Thematic Analysis
Evidence-Led Product Design
Data Analysis
[Impact]
Quant Insights Proved Financial Viability & Product Demand
Service blueprinting revealed engineers spent two hours daily on low-value user management tasks — a clear case for an automated User Management Tool that would free engineers for billable work and drive profitability.
Qual Research Uncovered Instructor Mental Models & Pedagogical Intent
Interviews, observations, and thematic analysis revealed how instructors gather and use learner information, directly shaping Instructor Connect so AI-generated summaries aligned with real coaching behaviours and pedagogical goals.
End-to-End Research Created a Defensible, Evidence-Backed Design Blueprint
By linking raw data to themes, interpretations, design implications, user stories, and workflows, I created a transparent evidence chain that justified every design decision and aligned the entire team around the same insights.
[My Process] - 1. Quantitative Validation: User Management Tool Viability
1.1 Creating a Current-State Service Blueprint
• Led a research project analysing end-to-end IT support operations.
• Mapped processes, roles, pain points, and data gaps across the helpdesk environment.
• Identified lack of reporting granularity for fixed-fee support tickets.

1.2 Establishing Performance Reporting
• Worked with business leaders to define reporting needs for service performance.
• Identified which workflows could reliably provide the required metrics.
• Integrated findings into the company’s service performance scorecard.

1.3 Analysing Time-on-Task Data
• Discovered engineers spent ~2 hours/day on repeated, low-value user management tasks.
• Quantified the financial impact: reduced profitability and reduced capacity for billable work.

1.4 Translating Data Into Product Strategy
• Used the data to justify the development of the User Management Tool.
• Demonstrated that simplifying user management tasks would free up engineers for high-value work.
• Provided the business case for eventual zero-touch customer self-service, increasing long-term scalability.


1.5 Measuring Early Impact
• First release resulted in a 40% increase in engineer efficiency.
• Data provided evidence for continued investment into future releases targeting customer self-service.

[My Process] - 2. Qualitative Depth: Instructor Connect Feature

Coachi App - Instructor Connect Feature
Instructor Connect is a key feature within the Coachi platform designed to help human instructors deliver personalised, learner-centred coaching informed by AI-generated performance insights. The feature aggregates data from learners’ sessions with the AI coach and transforms it into structured, transparent summaries that instructors can review, interpret, and use to plan both asynchronous feedback and in-person lessons. Its goal is to strengthen the instructor–learner relationship, support repeat bookings, and ensure instructors have the right information at the right time to make informed coaching decisions.

2.1 Literature Review: Designing AI to Augment, Not Replace
Reviewed Human-Computer Interaction and AI design research, identifying foundational principles:
AI must augment, not replace human expertise.
In education, AI should align with pedagogical intent.
Trust is built through performance, process transparency, and purpose alignment.
Designs must communicate capabilities, limitations, explainability, interpretability, controllability, and context awareness.
These principles formed the theoretical backbone against which all design decisions would later be justified.

2.2 Qualitative Data Collection
Purpose: Understand how ski instructors gather, interpret, and apply learner information.
Methods:
5 semi-structured depth interviews
4 contextual enquiry observations
4 follow-up interviews post-observation
Explored:
What learner information instructors value most
How they interpret information to plan lessons
How digital summaries could support asynchronous coaching
How learner insights should be structured to align with their pedagogical goals

2.3 Systematic Thematic Analysis Process
1. Insight Generation
Read all transcripts and field notes.
Captured early patterns and analytical directions.
2. Semantic Coding
Broke data into descriptive units (behaviours, cues, decisions).
Ensured codes reflected explicit meanings from participants.
3. Collating Codes Into Themes
Grouped related codes to identify emerging patterns.
4. Defining Themes & Narrative Statements
Produced semantic narratives explaining what each theme meant and why it mattered.
5. Latent Interpretation
Applied Kolb’s experiential learning model, focusing on abstract conceptualisation.
Interpreted deeper cognitive processes behind instructor decision-making, beyond what was explicitly said.
2.4 Workflow & Journey Mapping
6. Instructor Action Mapping
Ordered observable instructor behaviours chronologically.
7. Current-State Journey Map
Mapped stages, goals, emotions, decisions, and pain points across the instructor workflow.
8. Instructor Goals & Empathy Mapping
Translated insights into clear instructor goals and emotional drivers.
Created an empathy map reflecting what instructors say, think, feel, and do regarding learner information.

2.5 From Insights → Design Implications → User Stories
9. Linking Themes to Design Opportunities
Tagged opportunities across the dataset.
Connected themes to design implications to ensure all decisions were evidence-based.
10. Generating User Stories
Transformed design implications into actionable product requirements:
“As an instructor, I need to understand a learner’s movement history so that I can tailor the first lesson effectively.”
11. Affinity Grouping User Stories
Clustered stories into capability areas forming the conceptual structure of the product.
12. Future-State Journey Map
Mapped the new experience instructors would have using Instructor Connect.

2.6 Using Research to Justify Design
13. Prototype Development
Designed flows and screen concepts grounded entirely in the research chain.
Validated decisions with instructors and against academic principles.
Ensured every UI element had a direct connection to:
A theme
A latent insight
A pedagogical requirement
Or an AI-trust principle
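The rule that every UI element must connect to a theme, latent insight, pedagogical requirement, or AI-trust principle can also be enforced mechanically: keep an evidence register per element and flag anything without a link. The element names and tags below are hypothetical.

```python
# Hypothetical evidence register: each UI element lists the research
# artefacts (theme, latent insight, pedagogical requirement, or
# AI-trust principle) that justify its presence.
evidence_register = {
    "session_summary_card": ["theme:learner_history", "ai_trust:process_transparency"],
    "confidence_indicator": ["ai_trust:communicate_limitations"],
    "lesson_plan_notes": [],    # no evidence yet -> should be flagged
}

def unjustified_elements(register: dict[str, list[str]]) -> list[str]:
    """Return UI elements with no link back to the research chain."""
    return [element for element, links in register.items() if not links]

print(unjustified_elements(evidence_register))
```

Running a check like this before each design review keeps the evidence chain honest as the prototype grows.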
[Key Learnings]
1. Quant Data Creates Business Confidence
Service data provided indisputable evidence for building the User Management Tool and clearly demonstrated its value to the business. Quant research is often the most powerful lever for gaining leadership buy-in.
2. Qualitative Depth Reveals How to Design for Experts
The instructor interviews and observations revealed the tacit knowledge, mental models, and decision-making processes that no quantitative dataset could surface — enabling design that truly supports expert judgement.
3. A Full Evidence Chain Strengthens Design Credibility
By linking insights → themes → interpretations → implications → user stories → prototype decisions, I created a transparent reasoning trail. This level of rigour builds trust with stakeholders and ensures the final product is grounded in genuine user needs.


Research Insights
Led end-to-end research bridging academic rigour and product goals: defined the scope, built a theory-informed process, and applied Human-AI Collaboration principles to uncover actionable design opportunities, while collaborating cross-functionally with engineering.