Industry spotlight
PUBLICATION
The V3 framework foundational paper. Digital medicine is an interdisciplinary field, drawing together stakeholders with expertise in engineering, manufacturing, clinical science, data science, biostatistics, regulatory science, ethics, patient advocacy, and healthcare policy, to name a few. Although this diversity is undoubtedly valuable, it can lead to confusion regarding terminology and best practices. There are many instances, as we detail in this paper, where a single term is used by different groups to mean different things, as well as cases where multiple terms are used to describe essentially the same concept. Our intent is to clarify core terminology and best practices for the evaluation of Biometric Monitoring Technologies (BioMeTs), without unnecessarily introducing new terms. We focus on the evaluation of BioMeTs as fit-for-purpose for use in clinical trials. However, our intent is for this framework to be instructional to all users of digital measurement tools, regardless of setting or intended use. We propose and describe a three-component framework intended to provide a foundational evaluation framework for BioMeTs. This framework includes (1) verification, (2) analytical validation, and (3) clinical validation. We aim for this common vocabulary to enable more effective communication and collaboration, generate a common and meaningful evidence base for BioMeTs, and improve the accessibility of the digital medicine field.
BEST PRACTICES FROM DIME
Outlines the V3+ Framework, which includes step-by-step guidance to ensure that sensor-based digital health technologies (DHTs) meet rigorous engineering standards and reliably deliver accurate and consistent data. The playbook emphasizes the importance of documenting technical testing and verification against predefined specifications, ensuring that DHTs are “fit for purpose” and ready for use in clinical study settings.
BEST PRACTICES FROM CTTI
The processes of verification and validation are complementary but independent of one another. This is illustrated, in part, by the example of a recently developed neural network algorithm that detects scratching movements from an accelerometry signal, where the technology was produced by one company and the algorithm was developed and validated by another.
PUBLICATION
When available, verification evidence should include demonstration that the sensor technology provides raw data with adequate technical performance characteristics, such as accuracy, reliability, precision, consistency over time, uniformity across mobile sensor generations and/or technologies, and consistency across different environmental conditions.
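For illustration, the minimal sketch below computes a few such technical performance characteristics, bias, mean absolute error, and repeatability, for a hypothetical wearable heart-rate sensor tested against a reference instrument. The simulated data and the tolerance thresholds are assumptions for demonstration only, not values taken from the publication.

```python
import numpy as np

def verification_metrics(sensor: np.ndarray, reference: np.ndarray) -> dict:
    """Compare raw sensor output against a reference instrument.

    Both arrays hold simultaneous measurements (e.g., heart rate in bpm)
    from the device under test and a reference device.
    """
    error = sensor - reference
    return {
        "bias": float(np.mean(error)),                 # systematic offset
        "mae": float(np.mean(np.abs(error))),          # accuracy
        "rmse": float(np.sqrt(np.mean(error ** 2))),   # overall error magnitude
        "precision_sd": float(np.std(error, ddof=1)),  # repeatability of the error
    }

# Hypothetical bench-test data: 60 paired readings from the device under test
# and a reference ECG-derived heart rate (simulated for illustration).
rng = np.random.default_rng(0)
reference = rng.normal(loc=75, scale=8, size=60)
sensor = reference + rng.normal(loc=1.0, scale=2.5, size=60)

metrics = verification_metrics(sensor, reference)
print(metrics)

# Verification compares such metrics against predefined specifications;
# the thresholds below (|bias| <= 2 bpm, MAE <= 5 bpm) are illustrative only.
meets_spec = abs(metrics["bias"]) <= 2.0 and metrics["mae"] <= 5.0
print("meets illustrative specification:", meets_spec)
```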
COURSE BY DIGITAL MEDICINE ACADEMY (DIME)
Whether you work in clinical trials, healthcare delivery, public health, or wellness, using sensor-based digital health technologies (sDHTs) that are fit for purpose is a must. This crash course teaches you how to develop, evaluate, and deploy a digital measurement product that meets the requirements of regulators and payers, as well as your buyers and adopters.
Verification, analytical validation, and clinical validation (V3): the foundation of determining fit-for-purpose for Biometric Monitoring Technologies (BioMeTs)
The term “clinically validated” is frequently used in marketing but lacks a clear, standardized meaning, leading to confusion. The rapid development of BioMeTs has outpaced the creation of systematic, evidence-based evaluation frameworks, creating a knowledge gap. Existing validation standards from software, hardware, and clinical development are often applied in silos and are not fully sufficient for modern BioMeTs. Evaluating a BioMeT requires assessing the entire “data supply chain,” from the sensor hardware (verification) and data processing algorithms (analytical validation) to its performance against a meaningful clinical concept (clinical validation).
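To make the “data supply chain” idea concrete, here is a minimal sketch that traces a hypothetical BioMeT through the three V3 components: sample-level sensor data checked against a bench reference (verification), an algorithm that converts those samples into a measurement (analytical validation), and the relationship between that measurement and a clinical concept of interest (clinical validation). The toy step-count algorithm, function names, and numbers are illustrative assumptions, not part of the published framework.

```python
import numpy as np

# 1) Verification: sample-level sensor output evaluated against a bench reference.
def verify_sensor(raw_samples: np.ndarray, reference: np.ndarray, tolerance: float) -> bool:
    """Check that raw accelerometer magnitudes agree with a reference rig."""
    return float(np.mean(np.abs(raw_samples - reference))) <= tolerance

# 2) Analytical validation: the algorithm that converts samples into a metric.
def count_steps(raw_samples: np.ndarray, threshold: float = 1.2) -> int:
    """Toy step-count algorithm: count upward threshold crossings."""
    above = raw_samples > threshold
    return int(np.sum(above[1:] & ~above[:-1]))

# 3) Clinical validation: relate the metric to a clinical concept of interest,
#    e.g., correlation of daily step count with a mobility score in the target population.
def clinical_association(step_counts: np.ndarray, mobility_scores: np.ndarray) -> float:
    return float(np.corrcoef(step_counts, mobility_scores)[0, 1])

# Illustrative, simulated data only.
rng = np.random.default_rng(1)
raw = rng.normal(1.0, 0.4, size=1000)                 # acceleration magnitudes (g)
bench_reference = raw + rng.normal(0, 0.02, size=1000)

print("verified:", verify_sensor(raw, bench_reference, tolerance=0.05))
print("steps in window:", count_steps(raw))

daily_steps = rng.normal(6000, 1500, size=30)
mobility = 0.004 * daily_steps + rng.normal(0, 5, size=30)
print("correlation with mobility score:", round(clinical_association(daily_steps, mobility), 2))
```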
Recommendations
The digital medicine field should adopt the V3 (Verification, Analytical Validation, Clinical Validation) framework as a foundational evaluation standard for all BioMeTs to ensure they are fit-for-purpose. Technology manufacturers, clinical trial sponsors, and researchers should transparently report their V3 processes and results to overcome “black box” approaches and build a common evidence base. Technology manufacturers are primarily responsible for verification, while the entity developing the algorithm (e.g., manufacturer or sponsor) is responsible for analytical validation. The sponsor or clinical team using the BioMeT for a specific purpose is responsible for clinical validation in that context of use.
Regulatory Considerations
The V3 framework is designed to inform and align with the current regulatory landscape, although the regulatory pathway for a specific BioMeT depends on its intended use and marketing claims, not just its underlying technology. The 21st Century Cures Act and the concept of Software as a Medical Device (SaMD) have created new regulatory paradigms that decouple software from specific hardware. BioMeTs used to support drug development may follow a tool qualification pathway, while those marketed as standalone medical devices are subject to device clearance or approval processes. Stakeholders should engage with regulatory agencies early to determine appropriate validation approaches.
Some summaries are generated with the help of a large language model; always view the linked primary source of a resource you are interested in.
The Playbook: Digital Clinical Measures
Successful deployment of digital clinical measures requires a shared foundation of standardized methodologies, terminology, and best practices.
The selection of digital measures must prioritize patient-centered outcomes and align with meaningful aspects of health.
Technology validation processes, including the Verification, Analytical Validation, and Clinical Validation (V3) framework, are crucial to ensuring data accuracy and reliability.
Interoperability, data security, and governance remain key challenges for digital health technologies in both research and clinical applications.
Case studies demonstrate the real-world utility of digital clinical measures in clinical research, patient care, and public health initiatives.
Recommendations
Stakeholders should follow a structured, stepwise approach to selecting and validating digital clinical measures, starting with identifying meaningful health aspects.
Digital health tools must undergo rigorous verification and validation to ensure they are fit-for-purpose and meet clinical and regulatory standards.
Patient engagement should be integrated into every stage of digital measure development to ensure the relevance and usability of selected endpoints.
Regulatory and payer engagement should occur early in the process to streamline market access and reimbursement pathways.
Organizations should adopt a proactive approach to data privacy, security, and governance, ensuring compliance with regulations such as HIPAA and GDPR.
Regulatory Considerations
The FDA and other regulatory bodies emphasize the need for clinical validation of digital measures before they can be used as primary endpoints in trials.
Standardization of digital health technologies is critical to regulatory approval, requiring alignment with frameworks such as HL7 and ISO standards.
Data security and privacy regulations must be strictly adhered to, particularly in decentralized clinical trials where remote monitoring is used.
Digital endpoint validation must include real-world evidence (RWE) to support regulatory decision-making and post-market surveillance.
Organizations must consider the evolving regulatory landscape for AI-driven health technologies, ensuring compliance with best practices for algorithmic transparency and bias mitigation.
Case Example: Verification and Validation Processes in Practice
Verification involves testing the accelerometer’s technical specifications (e.g., accuracy and precision), with the results documented in peer-reviewed studies.
Validation of the algorithm relies on “ground truth” data, gathered through infrared video recordings and manual scoring of movements.
Cross-validation was used to assess the algorithm’s performance, with additional validation in independent samples planned.
The separation of verification and validation allows greater flexibility, enabling the algorithm’s use with multiple accelerometer devices that meet minimum standards.
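A minimal sketch of this kind of algorithm validation workflow is shown below, assuming epoch-level accelerometry features and ground-truth labels from manually scored video. The classifier, features, and simulated data are illustrative assumptions and not the actual algorithm described in this case example.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score

# Illustrative data: one row per accelerometry epoch, with summary features
# (e.g., mean magnitude, variance, dominant frequency power), and a ground-truth
# label per epoch from manually scored infrared video (1 = scratching, 0 = not).
rng = np.random.default_rng(42)
n_epochs = 500
features = rng.normal(size=(n_epochs, 3))
labels = (features[:, 0] + 0.5 * features[:, 2] + rng.normal(0, 0.8, n_epochs) > 0).astype(int)

# Cross-validation estimates how well the algorithm reproduces the ground truth
# on held-out epochs; validation in independent samples would follow the same
# pattern with data from a separate cohort and, ideally, other verified devices.
classifier = RandomForestClassifier(n_estimators=100, random_state=0)
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(classifier, features, labels, cv=cv, scoring="roc_auc")

print(f"Cross-validated AUC: {scores.mean():.2f} +/- {scores.std():.2f}")
```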
Recommendations
Conduct separate verification and validation processes to ensure the reliability of both the device and the algorithm.
Use peer-reviewed publications to document the performance of DHTs and their limitations.
Ensure validation includes testing with representative populations to confirm the algorithm’s utility across diverse contexts.
Promote industry-wide standards to facilitate scalability and regulatory acceptance of DHTs in clinical trials.
Regulatory Considerations
Ensure DHTs undergo rigorous verification to meet accuracy and precision standards documented in peer-reviewed studies.
Validate algorithms using empirical “ground truth” data to demonstrate their ability to measure clinically meaningful outcomes.
Align the design and validation of DHTs with regulatory expectations for reliable and transferable performance across devices.
Considerations for development of an evidence dossier to support the use of mobile sensor technology for clinical outcome assessments in clinical trials
Mobile sensors provide unique opportunities for objective, real-world data collection but face challenges in achieving regulatory acceptance due to a lack of standardization and validation frameworks.
A comprehensive evidence dossier must address three key components: verification, analytical validation, and clinical validation, to ensure endpoints are fit-for-purpose.
Demonstrating content validity is critical, especially when endpoints do not directly measure meaningful aspects of health but instead infer them through related concepts.
Early engagement with regulatory bodies (e.g., FDA, EMA) is recommended to align expectations and address evidentiary gaps.
Usability and feasibility research are vital to ensure patient compliance and data quality in real-world applications.
Recommendations
Develop Comprehensive Dossiers: Include sections on endpoint definition, concept of interest, content validity, clinical validation, analytical validation, and implementation details to support regulatory review.
Ensure Content Validity: Demonstrate a clear relationship between sensor-derived endpoints and meaningful health outcomes, supported by literature, patient interviews, and expert consensus.
Engage with Regulators Early: Discuss the proposed endpoint and its context of use with regulatory agencies to ensure alignment and identify potential challenges.
Standardize Validation Processes: Use rigorous methods for verification, analytical validation, and construct validation to establish the reliability and accuracy of sensor technologies.
Promote Collaboration: Share validation data and methodologies across stakeholders to reduce redundancy and accelerate the adoption of mobile sensor endpoints.
Regulatory Considerations
Verification of Sensor Technologies: Demonstrate that sensors produce accurate, reliable, and consistent raw data under various conditions, including environmental variability.
Analytical Validation: Show that firmware and algorithms used to process raw data maintain high technical performance and align with regulatory standards.
Clinical Validation: Provide evidence that sensor-derived data reliably measure the concept of interest and are responsive to meaningful clinical changes.
Context of Use: Clearly define the intended application of the endpoint, including target populations, trial design, and labeling claims, to guide regulatory evaluation.
Data Security and Privacy: Ensure compliance with data protection regulations, such as 21 CFR Part 11, to secure patient data during collection, transmission, and storage.
Building Fit-for-Purpose Sensor-based Digital Health Technologies: A Crash Course
Usability gaps in sDHTs remain a barrier to adoption, with many technologies failing to prioritize ease of use, accessibility, and diverse user needs
Human-centered design is critical for ensuring that digital health solutions are intuitive, functional, and scalable across different healthcare environments
Standardized usability metrics for evaluating digital health technologies are lacking, leading to inconsistent reporting and validation of usability outcomes
Use-related risk analysis is essential to identifying and mitigating risks associated with user errors, ensuring the safety and effectiveness of sDHTs
The V3+ framework provides a structured approach to integrating usability validation into digital health technology development, aligning with global regulatory expectations
Recommendations
Developers should incorporate human-centered design principles from the outset, ensuring that usability, accessibility, and user needs are central to sDHT development
Usability validation should be standardized, with clear methodologies for measuring usability, including satisfaction, ease of use, efficiency, and error mitigation (a scoring sketch for one standardized instrument follows this list)
Regulatory and clinical stakeholders should collaborate on defining best practices for usability evaluation, ensuring that digital endpoints are both meaningful and scalable
Risk analysis should be iterative, with developers continuously refining their technologies based on real-world user feedback and testing
The usability validation component of V3+ should be widely adopted to ensure that digital clinical measures meet patient-centered, regulatory, and technical expectations
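As one example of the kind of standardized usability measurement these recommendations call for, the sketch below scores the System Usability Scale (SUS), a widely used ten-item questionnaire. Its use here is an assumption for illustration, not an instrument prescribed by the course.

```python
def sus_score(responses: list[int]) -> float:
    """Score a single System Usability Scale (SUS) questionnaire.

    `responses` holds the ten item ratings in order, each on a 1-5 scale.
    Odd-numbered items are positively worded and contribute (rating - 1);
    even-numbered items are negatively worded and contribute (5 - rating).
    The summed contributions are multiplied by 2.5 to give a 0-100 score.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS requires ten responses on a 1-5 scale")
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)  # i is 0-based, so even i = odd-numbered item
        for i, r in enumerate(responses)
    ]
    return sum(contributions) * 2.5

# Hypothetical participant ratings for a wearable's companion app.
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # -> 85.0
```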
Regulatory Considerations
Regulators are emphasizing the need for usability validation to ensure that digital endpoints are both clinically relevant and patient-friendly
sDHTs must comply with human factors engineering guidelines, aligning with global regulatory frameworks such as ISO 9241-210 and FDA usability requirements
Data security, privacy, and interoperability must be ensured, particularly as sDHTs become integrated into remote monitoring and decentralized clinical trials
Real-world evidence (RWE) should support usability validation, helping to bridge the gap between regulatory approval and real-world adoption
Regulatory bodies should work toward standardizing usability testing methodologies, ensuring consistency across clinical research, digital endpoints, and medical device evaluations
For reference: review the relevant regulatory guidances
Regulatory spotlight
Features essential guidance, publications, and communications from regulatory bodies relevant to this section. Use these resources to inform your regulatory strategy and ensure compliance.