Research that Shapes Strategy


[Overview]
The Context
During early discovery for two projects, an academic project for my Master's in UX Design and a commercial project on which I was the lead designer, I led research activities designed to align teams, clarify knowledge gaps, and ground product decisions in evidence.
By combining Question Board workshops with competitive benchmarking, I established a structured foundation for learning, prioritisation, and eventual design direction.
Problem Statement
Teams often begin with assumptions, scattered questions, and little clarity about what they need to learn to design the right product. At the same time, they may lack awareness of how competitors solve similar problems. I needed a repeatable research approach that aligned teams, revealed unknowns, shaped interview scripts, and ensured we understood the competitive landscape before committing to solutions.
Skills Demonstrated
UX Research Strategy
Qualitative Research
Competitive Benchmarking
Workshop Facilitation
Cross-Functional Collaboration
Product Strategy & Prioritisation
[Impact]
A Clear, Team-Aligned Research Roadmap
Using Julia Cowing’s Question Board method, I facilitated a cross-functional session that clarified what we know, don’t know, and need to learn, enabling the team to converge on a prioritised set of research questions and the correct methodologies (qual vs quant, attitudinal vs behavioural).
High-Quality Interview Scripts Rooted in Real Needs
For my Masters project, I translated thematically clustered team questions into a semi-structured interview script. This ensured interviews stayed aligned to strategic learning objectives — especially user attitudes, motivations, and behavioural drivers — and provided consistent, high-quality data for synthesis.
Competitive Insights That Informed Product Direction
Through structured benchmarking for the User Management Tool, I identified gaps, best practices, and opportunities across competing IT management tools. Using a scoring framework focused on speed, productivity, and automation, I produced insights that influenced our early hypotheses, feature direction, and usability standards.
[My Process]

1. Aligning the Team Using the Question Board
• Facilitated a workshop following Julia Cowing’s method to identify knowns, unknowns, and assumptions.
• Guided the team through generating as many research questions as possible without constraint.
• Categorised questions into attitudinal/behavioural and qual/quant to determine the right methodologies.
• Grouped related questions using affinity mapping and created theme titles that summarised user needs.

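The categorisation step above follows the common attitudinal/behavioural × qual/quant research landscape. A minimal sketch of that mapping is below; the specific method pairings are illustrative defaults, not the team's actual selections.

```python
# Sketch of the 2x2 methodology mapping used when categorising research
# questions. The method suggestions are illustrative examples only.

METHOD_MAP = {
    ("attitudinal", "qual"): "semi-structured interviews",
    ("attitudinal", "quant"): "surveys",
    ("behavioural", "qual"): "usability testing / field observation",
    ("behavioural", "quant"): "analytics / A-B testing",
}

def suggest_method(dimension: str, data_type: str) -> str:
    """Look up a candidate research method for a categorised question."""
    return METHOD_MAP[(dimension, data_type)]

print(suggest_method("attitudinal", "qual"))  # semi-structured interviews
```

Framing each question this way made the jump from "what we want to learn" to "which method to use" mechanical rather than debatable.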
2. Turning Question Themes into Interview Scripts
• Used a combination of manual synthesis and AI-assisted drafting to create a semi-structured interview script covering each theme.
• Ensured scripts were neutral, open-ended, and designed to uncover latent attitudes and motivations.
• Validated the script with the team to ensure it reflected their learning needs and broader product goals.
• Used the script as a flexible guide during interviews, enabling natural conversation without losing focus.
3. Conducting Competitive Benchmarking (User Management Tool)
• Identified a relevant set of competing tools targeting similar IT support workflows.
• Analysed UI patterns, automation features, workflow complexity, navigation structures, and user-permission models.
• Created a scoring system prioritising faster support, increased productivity, and automation that reduces repetitive tasks.
• Scored each competitor feature (1–5) using criteria aligned to business outcomes and user pain points.
• Highlighted strengths, weaknesses, and “anti-patterns” to avoid in our own design.
• Translated insights into early hypotheses for how our tool could streamline support flows and remove friction.
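The scoring roll-up described above can be sketched as a simple weighted matrix. The tool names, weights, and 1–5 scores below are hypothetical placeholders, not the actual benchmark data.

```python
# Illustrative roll-up of per-criterion 1-5 benchmark scores into one
# weighted total per competitor. All names and numbers are hypothetical.

WEIGHTS = {"faster_support": 0.4, "productivity": 0.35, "automation": 0.25}

# Each competitor receives a 1-5 score per criterion.
scores = {
    "Tool A": {"faster_support": 4, "productivity": 3, "automation": 2},
    "Tool B": {"faster_support": 2, "productivity": 4, "automation": 5},
}

def weighted_score(criteria: dict) -> float:
    """Combine per-criterion 1-5 scores into a single weighted total."""
    return round(sum(WEIGHTS[c] * s for c, s in criteria.items()), 2)

# Rank competitors from strongest to weakest overall.
ranking = sorted(scores, key=lambda t: weighted_score(scores[t]), reverse=True)
for tool in ranking:
    print(tool, weighted_score(scores[tool]))
```

Anchoring the weights to the three business priorities kept the comparison honest: a competitor strong on polish but weak on automation could not quietly outscore one that actually reduced repetitive work.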

4. Synthesising Insights Into Strategic Direction
• Mapped competitor gaps to user needs emerging from early research.
• Recommended opportunities where our product could meaningfully differentiate (e.g., automation, simplified permissions workflows).
• Identified usability pitfalls to avoid — such as cluttered dashboards or multi-step modifications.
• Provided early guidance to engineering and product on where technical exploration (spikes) would be required.
[Key Learnings]
1. Collaborative Question Generation Strengthens Research Quality
By involving the whole team in defining what they want to learn, research becomes shared, not siloed, leading to better buy-in and more relevant findings.
2. Thematic Mapping Accelerates Insightful Interview Design
Clustering questions into themes reveals deeper patterns about what the team really wants to understand — enabling more targeted, strategic interview scripts.
3. Benchmarking Is Most Powerful When Tied to Business Goals
A scoring system anchored in productivity and speed framed competitive insights in a way that spoke directly to product leadership, shaping early decisions and technical priorities.


[Persona]
Jhon Roberts
Marketing Manager
Content
Age: 29
Location: New York City
Tech Proficiency: Moderate
Gender: Male
[Goal]
Quickly complete purchases without interruptions.
Trust the platform to handle her payment securely.
Access a seamless mobile shopping experience.
[Frustrations]
Long or confusing checkout processes.
Error messages that don’t explain the issue.
Poor mobile optimization that slows her down.