6.2 Stakeholder feedback

Goal: Translate authentic user experience into product and operational excellence

By translating firsthand experiences directly into a structured improvement backlog, you enhance sDHT usability and significantly reduce trial burden for future protocols.

Overview

Capture insights & eliminate friction

Trial participants and site staff are your ultimate users, and their feedback is gold. This step captures the qualitative insights from participants and site staff—such as pain points related to comfort, battery life, or app syncing—that objective adherence data may not fully capture, and translates them into protocol and usability improvements that reduce friction, improve adherence, and strengthen the quality of future datasets.

Stakeholder benefits

Adopters: You reduce future trial burden and attrition by acting on complaints, which improves your ability to achieve target recruitment and retention metrics for the next trial.

Site staff: You can expect reduced operational friction, leading to faster setups, lower administrative burden, and smoother workflows in future trials, which enables more efficient resource allocation.

Developers: You receive detailed, validated usability data that informs user interface (UI)/user experience (UX) changes and reduces the reliance on costly downstream technical support.

In practice

Step‑by‑step actions: Closing the loop with users and sites

1. Collect and analyze feedback

Deploy qualitative tools, such as the Study Participant Feedback Questionnaire (SPFQ; see industry spotlight below), at trial closeout.

Participant feedback: Assess subjective detail on ease of use, comfort, battery life, and overall trial burden.

Site feedback: Systematically collect feedback from site staff regarding operational friction, technical support quality, and experiences with data monitoring and device management.

Correlation: Analyze the qualitative feedback alongside the adherence data captured during the trial. Look for correlations between specific pain points reported by participants and data trends—for example, reports that “charging was confusing” correlating with low overnight wear time.
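The correlation step above can be sketched in a few lines. The data below is entirely hypothetical (coded feedback themes and overnight wear hours are illustrative, not from any real trial); the idea is simply to compare adherence between participants who did and did not report a given pain-point theme.

```python
from statistics import mean

# Hypothetical closeout data: each participant's feedback coded into
# pain-point themes, plus their mean overnight wear time in hours.
feedback = {
    "P01": {"themes": {"charging_confusing"}, "overnight_wear_hrs": 2.1},
    "P02": {"themes": set(), "overnight_wear_hrs": 7.8},
    "P03": {"themes": {"charging_confusing", "app_sync_delay"}, "overnight_wear_hrs": 1.5},
    "P04": {"themes": {"app_sync_delay"}, "overnight_wear_hrs": 6.9},
    "P05": {"themes": set(), "overnight_wear_hrs": 8.2},
}

def wear_time_by_theme(data, theme):
    """Mean overnight wear time for participants who reported a theme
    vs. those who did not."""
    reported = [d["overnight_wear_hrs"] for d in data.values() if theme in d["themes"]]
    others = [d["overnight_wear_hrs"] for d in data.values() if theme not in d["themes"]]
    return mean(reported), mean(others)

with_theme, without_theme = wear_time_by_theme(feedback, "charging_confusing")
print(f"charging_confusing: {with_theme:.1f} h vs {without_theme:.1f} h without")
```

A gap like this (1.8 h vs 7.6 h in the toy data) is the kind of signal that would justify promoting “clarify charging instructions” onto the improvement backlog.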

2. Transform feedback into actionable items

Create a continuous improvement backlog that prioritizes fixes and enhancements based on the feedback. Consider tagging each item based on effort and impact using a simple 2×2 matrix (e.g., high-impact/low-effort as “Quick Wins” like updating a training manual; high-impact/high-effort as “Major Projects” like redesigning device hardware; low-impact items may be deprioritized).

Developers: Own product improvements (e.g., UI updates, bug fixes).

Adopters: Own tasks related to operational improvements (e.g., streamlining patient onboarding or site selection criteria).
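The 2×2 effort/impact tagging described above can be expressed as a simple lookup. The field names (`impact`, `effort`) and the backlog items are illustrative assumptions, not a prescribed schema.

```python
def triage(items):
    """Tag backlog items using a 2x2 impact/effort matrix.
    Each item carries hypothetical 'impact' and 'effort' fields
    scored "high" or "low" during backlog grooming."""
    labels = {
        ("high", "low"): "Quick Win",
        ("high", "high"): "Major Project",
        ("low", "low"): "Fill-In",
        ("low", "high"): "Deprioritize",
    }
    for item in items:
        item["tag"] = labels[(item["impact"], item["effort"])]
    return items

backlog = triage([
    {"item": "Update charging instructions in training manual", "impact": "high", "effort": "low"},
    {"item": "Redesign device enclosure for comfort", "impact": "high", "effort": "high"},
    {"item": "Rename a settings menu entry", "impact": "low", "effort": "low"},
])
```

The tags then drive ownership: a “Quick Win” on training materials might go to the adopter team, while a “Major Project” on hardware goes to the developer backlog.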

Examples

Patient Voices Network (PVN) – “Closing the Loop”

A real-world example comes from the Patient Voices Network (PVN) in British Columbia, Canada, which supports patient engagement in healthcare, including research and clinical initiatives. PVN’s “Closing the Loop” process involves sending a prompt “What We Heard” summary (typically within 1-2 weeks of engagement) that recaps key concerns, ideas, or feedback from participants, allowing participants to validate it and see how their input was heard. This is often followed by updates on actions implemented (e.g., protocol changes or experience enhancements) and initiative impacts.

For instance, in a healthcare redesign engagement, a “What We Heard” summary might highlight patient-noted issues like confusing instructions, paired with actions like “We’ve revised the onboarding materials based on your input, with rollout in the next phase.”

PVN provides a downloadable template for these summaries, emphasizing their role in fostering ongoing trust and participation. This approach is adaptable to clinical trials, where it helps address retention challenges and refine study designs.

The SPFQ toolkit from TransCelerate (see industry spotlight) includes a dedicated “Results Sharing” section in its Implementation User Guide, encouraging sponsors to communicate aggregated feedback and resulting improvements back to participants and sites.

In the INTERLINK-1 phase 3 oncology trial (a real implementation of SPFQ), feedback on aspects like information provision and daily impact led to actionable recommendations, such as earlier survey administration and question rephrasing to reduce bias, though the study focused on internal use for future trials rather than on explicit participant sharing.

Key deliverable

“You Spoke—We Acted” summary

This report is shared with participants and sites to demonstrate responsiveness, validate their contribution, and build trust for future engagement.

The “You Spoke—We Acted” summary is a critical relationship-building tool that transforms participants and sites from passive data sources into active partners. By formally acknowledging how stakeholder feedback influenced the evolution of your digital health tool, you validate their effort and prevent the transaction from feeling like simple data extraction. This transparency builds the trust required for long-term engagement and serves as auditable evidence of a patient-centric quality assurance process.

To be effective, this document must be jargon-free and highly scannable. For example, it might include highlights like: “Participants noted frequent app syncing delays—we’ve implemented a firmware update reducing sync time by 30%, with rollout planned for Q2.” This concise summary (1-2 pages) should feature categorized feedback themes, corresponding actions taken or planned, timelines, and owners, formatted for easy sharing via email or a trial portal.

Consider structuring the summary into three distinct sections: 

The insight (mirroring specific friction points back to the user),

The resolution (demonstrating concrete technical or operational fixes), and

The commitment (outlining longer-term solutions in your continuous improvement backlog).

This structured approach creates a reusable blueprint for scaling your strategy, ensuring you reduce future trial burden by systematically eliminating the sources of friction identified in your retrospective.
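As a reusable blueprint, the insight/resolution/commitment structure can be generated directly from the improvement backlog. The record fields and sample entry below are hypothetical; a sponsor would substitute their own backlog schema and wording.

```python
def build_summary(entries):
    """Render a "You Spoke—We Acted" summary as plain text, one block
    per feedback theme, following the three sections described above:
    the insight, the resolution, and the commitment."""
    lines = ["You Spoke—We Acted"]
    for e in entries:
        lines += [
            "",
            f"Theme: {e['theme']}",
            f"  You said:    {e['insight']}",
            f"  We did:      {e['resolution']} (owner: {e['owner']})",
            f"  Coming next: {e['commitment']} ({e['timeline']})",
        ]
    return "\n".join(lines)

summary = build_summary([{
    "theme": "App syncing",
    "insight": "Frequent syncing delays made daily check-ins frustrating.",
    "resolution": "Shipped a firmware update that shortens sync time.",
    "owner": "Developer team",
    "commitment": "Background sync redesign is in the improvement backlog.",
    "timeline": "next release cycle",
}])
print(summary)
```

Keeping the renderer trivial (plain text, one block per theme) preserves the jargon-free, scannable quality the deliverable calls for, and the same records can feed an email, a trial portal page, or a PDF.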

Library resources to guide you

The sDHT roadmap library gathers 200+ external resources to support the adoption of sensor-based digital health technologies. To help you apply the concepts in this section, we’ve curated specific spotlights that provide direct access to critical guidance and real-world examples, helping you move from strategy to implementation.

The Industry spotlight gathers real-world examples, case studies, best practices, and lessons learned from peers and leaders in the field relevant to this section. Use these insights to accelerate your work and avoid common pitfalls.

Open Industry spotlight
