The Integrated MDR Compliance Architecture for AI-Enabled Medical Device Families: A Comprehensive Framework

Executive Summary: The Strategic Context

Medical device manufacturers developing AI-enabled products face an unprecedented regulatory convergence in Europe. Four major regulatory pillars—EU MDR and IVDR (the Medical Devices and In Vitro Diagnostic Regulations), the EU AI Act, GDPR, and EHDS—create an interlocking ecosystem of requirements that cannot be addressed through traditional, siloed compliance approaches.

This challenge is amplified by three critical realities:

  1. Manufacturers don’t produce standalone devices, but rather device families and product lines built on shared AI platforms.
  2. Modern portfolios often include both MDR devices (medical devices) and IVDR devices (in vitro diagnostics) requiring parallel but distinct regulatory pathways.
  3. The MDCG 2019-13 sampling procedures mean notified bodies assess only representative devices from each generic group, making platform consistency essential.

Figure 1: Create Once, Use Everywhere 

When combined, these factors create a strategic imperative: manufacturers must adopt an integrated, modular compliance architecture that addresses all regulations simultaneously while supporting efficient scaling across device portfolios—whether those devices fall under MDR, IVDR, or both.

The Core Thesis:

The only sustainable path to regulatory compliance for AI-enabled medical device families is a three-tier modular architecture:

  1. Platform-level governance addressing cross-cutting requirements,
  2. Generic device group documentation demonstrating family consistency (with a clear MDR vs IVDR distinction),
  3. Device-specific supplements providing unique context—all integrated across MDR/IVDR, AI Act, GDPR, and EHDS requirements.

Critical Distinction:

This framework applies to both:

  • MDR devices (Regulation 2017/745) – Medical devices such as imaging systems, surgical instruments, diagnostic software.
  • IVDR devices (Regulation 2017/746) – In vitro diagnostic devices such as laboratory analyzers, pathology systems, genetic testing platforms.

While the modular architecture principles apply universally, the framework recognizes and addresses the distinct requirements, classification systems, and evidence standards between these two regulatory pathways.

Part 1: The European Regulatory Landscape – Understanding the Terrain

1.1 The Four Pillars of Compliance 

Modern medical device manufacturers operate within four intersecting regulatory domains:

Pillar 1: EU MDR & IVDR – The Dual Safety Foundation 

The European Union has established two parallel but distinct regulatory frameworks for medical products:

  • EU MDR (Regulation 2017/745) – Medical Devices
    • Scope: Devices used for diagnosis, prevention, monitoring, treatment, or alleviation of disease (e.g., imaging systems, surgical instruments, patient monitoring devices, diagnostic software).
    • Classification System: 4 risk classes.
      • Class I: Low risk (bandages, examination gloves, non-powered instruments).
      • Class IIa: Medium risk (powered surgical instruments, diagnostic imaging software).
      • Class IIb: Higher risk (ventilators, advanced imaging with AI, life-supporting devices).
      • Class III: Highest risk (implantable defibrillators, heart valves, AI tools for direct clinical decision-making).
  • EU IVDR (Regulation 2017/746) – In Vitro Diagnostic Medical Devices
    • Scope: Devices used for examination of specimens derived from the human body to provide information for diagnosis, monitoring, or prognosis (e.g., blood glucose meters, pregnancy tests, genetic testing systems, laboratory analyzers).
    • Classification System: 4 risk classes.
      • Class A: Low risk (specimen receptacles, general-purpose laboratory equipment).
      • Class B: Medium risk (self-testing pregnancy tests, infectious agent detection without high propagation risk).
      • Class C: High risk (HLA typing, HIV screening tests).
      • Class D: Highest risk (blood screening for transfusion, life-threatening transmissible agents with high propagation risk).

Critical Distinction – Is Your Device MD or IVD? 

According to the decision tree in the MDCG guidance on qualification of software (MDCG 2019-11), a device qualifies as an IVD if:

  1. It provides information within the scope of the IVD definition (diagnosis, monitoring, prognosis based on examination of specimens).
  2. It is based on data obtained by IVD medical devices only.
  3. Its intended purpose is substantially driven by IVD data sources.

If none of these conditions are met, the device falls under MDR as a medical device.

For AI-enabled software:

  • IVD Software: Analyzes laboratory test results, genetic sequences, pathology images from tissue samples.
  • MD Software: Analyzes medical images (CT, MRI, X-ray), patient physiological data, clinical decision support based on patient observations.
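One plausible reading of the qualification criteria above can be sketched as a small decision helper. This is a simplified illustration with hypothetical function and parameter names; the authoritative test is the full MDCG decision tree.

```python
def regulatory_pathway(provides_ivd_scope_info: bool,
                       data_solely_from_ivds: bool,
                       purpose_driven_by_ivd_data: bool) -> str:
    """Return 'IVDR' or 'MDR' per a simplified reading of the MDCG criteria.

    A device is treated as an IVD when it provides information within the
    IVD definition AND its inputs come solely or substantially from IVD
    medical devices; otherwise it falls under the MDR.
    """
    if provides_ivd_scope_info and (data_solely_from_ivds or purpose_driven_by_ivd_data):
        return "IVDR"
    return "MDR"

# AI that grades pathology images from tissue samples -> IVD software
print(regulatory_pathway(True, True, True))     # IVDR
# AI that analyzes CT/MRI images of the patient -> MD software
print(regulatory_pathway(False, False, False))  # MDR
```

The same helper applied to the software examples above reproduces the IVD/MD split the text describes.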

Common Core Requirements (Both MDR & IVDR): 

  • Technical documentation (Annex II for MDR, Annex III for IVDR) proving GSPR compliance.
  • ISO 13485:2016 Quality Management System (applicable to both).
  • Risk management (ISO 14971) throughout lifecycle.
  • Clinical evaluation (MDR) / Performance evaluation (IVDR).
  • Post-market surveillance and vigilance reporting.
  • EUDAMED database registration with UDI tracking.
  • Notified body involvement for medium to higher-risk classes.

Key Regulatory Difference: 

  • MDR devices require clinical evaluation demonstrating clinical benefits and safety.
  • IVDR devices require performance evaluation demonstrating analytical and clinical performance.

Pillar 2: EU AI Act (2024/1689) – The Algorithmic Trust Framework

  • Primary Focus: AI transparency, data governance, human oversight.
  • Application: Applies to BOTH MDR and IVDR devices that incorporate AI systems.
  • Core Requirements for High-Risk Medical AI (both MD and IVD):
    • Article 9: AI-specific risk management (bias, drift, explainability).
    • Article 10: Data governance (quality, representativeness, bias mitigation).
    • Article 11: Technical documentation (integrated with MDR/IVDR, not separate).
    • Article 13: Transparency and interpretability of AI outputs.
    • Article 14: Human oversight mechanisms.
    • Article 15: Accuracy, robustness, cybersecurity.

Note: Per MDCG 2025-6, AI Act requirements apply equally to both MDR and IVDR devices. The high-risk classification under the AI Act follows the device classification under MDR or IVDR.

Pillar 3: GDPR (2016/679) – The Privacy Baseline 

  • Primary Focus: Personal data protection and individual rights.
  • Application: Applies to BOTH MDR and IVDR when processing personal health data.
  • Core Requirements:
    • Lawful basis for processing health data (Articles 6 + 9).
    • Data subject rights (access, rectification, erasure, portability).
    • Data Protection by Design and by Default.
    • Data Protection Impact Assessments for high-risk processing.
    • Breach notification to the supervisory authority within 72 hours.
    • Consent management where applicable.

Pillar 4: EHDS (European Health Data Space) – The Interoperability Infrastructure 

  • Primary Focus: Standardized, secure health data exchange.
  • Application: Applies to BOTH MDR and IVDR devices.
  • Core Requirements:
    • Technical Standards: EEHRxF, HL7 FHIR, SNOMED CT, LOINC, ICD.
    • Primary Use: MyHealth@EU platform, patient-controlled data exchange.
    • Secondary Use: Secure research access (2029-2031 timeline).
    • Security: ISO 27001, GDPR-compliant, NIS2 Directive alignment.
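To make the interoperability layer concrete, here is a minimal, illustrative HL7 FHIR Observation resource using a LOINC coding and UCUM units. The values are hypothetical, and real EEHRxF payloads carry far more metadata; this only shows the shape of a standards-based exchange record.

```python
import json

# Minimal illustrative HL7 FHIR R4 Observation (hypothetical values).
observation = {
    "resourceType": "Observation",
    "status": "final",
    "code": {
        "coding": [
            {"system": "http://loinc.org", "code": "2339-0",
             "display": "Glucose [Mass/volume] in Blood"}
        ]
    },
    "valueQuantity": {
        "value": 95, "unit": "mg/dL",
        "system": "http://unitsofmeasure.org", "code": "mg/dL"
    },
}
print(json.dumps(observation, indent=2))
```

Devices that emit records in this form can plug into MyHealth@EU-style primary-use exchange without bespoke adapters.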

Figure 2: The European regulatory landscape for AI-enabled medical devices

1.2 The Critical Intersections – Where Regulations Overlap

These regulations don’t operate in isolation—they create a regulatory nexus where requirements reinforce and extend each other. This nexus applies to BOTH MDR and IVDR devices.

The Regulatory Nexus: Where Requirements Intersect

The following chart illustrates the convergence of four major EU regulatory frameworks, with all compliance efforts anchored in the data and culminating in a unified compliance architecture.

1. Data Foundation

  • Input: CLINICAL DATA

2. The Core Nexus

  • The four regulatory pillars interlock to form the REGULATORY NEXUS
| Pillar | Focus |
|---|---|
| MDR | Safety |
| AI Act | Trust |
| GDPR | Privacy |
| EHDS | Data Exchange |

3. Strategic Output

  • Goal: INTEGRATED COMPLIANCE ARCHITECTURE

Figure 3: AI-enabled medical devices' regulatory nexus – key intersections

To further emphasize the integration imperative, Table 1 summarizes the key mandatory connections.

Table 1: Key Intersection Points

| Intersection | Overlapping Requirements | Integration Imperative |
|---|---|---|
| MDR ∩ AI Act | Risk management, technical documentation, post-market surveillance | Single integrated technical file; MDCG 2025-6 mandates this |
| MDR ∩ GDPR | Clinical data collection, patient consent, data security | Clinical evaluation plans must establish lawful basis |
| AI Act ∩ GDPR | Training data processing, bias detection, data minimization | AI Act Article 10(5) permits exceptional processing with safeguards |
| EHDS ∩ GDPR | Data access controls, patient rights, cross-border transfers | EHDS builds on GDPR with sector-specific mechanisms |
| EHDS ∩ MDR | Clinical evidence data, PMS data, device databases | EHDS provides standardized infrastructure for MDR data |
| EHDS ∩ AI Act | Training data quality, performance monitoring, secondary use | EHDS enables compliant access to diverse datasets |

Critical Regulatory Guidance:

  • MDCG 2025-6 confirms that:
    • AI Act requirements must be "integrated into existing MDR/IVDR conformity assessments," not treated as separate compliance tracks.
    • This applies equally to medical devices (MDR) and in vitro diagnostic devices (IVDR).
  • MDCG 2019-13 establishes that:
    • Notified bodies will "sample representative devices" from generic device groups rather than assess every individual device.
    • It defines different sampling approaches for MDR vs IVDR (see Part 2).

Part 2: The Device Family Reality – What Manufacturers Actually Build

2.1 From Standalone Devices to Integrated Ecosystems

The traditional regulatory model assumed manufacturers produced individual, discrete devices. Modern reality is fundamentally different for both MDR and IVDR manufacturers, as the following illustrative yet realistic examples show.

2.1.1 The Three-Tier Modular Compliance Architecture

The solution organizes all documentation and evidence into three distinct tiers, moving from a broad platform-level foundation to narrow device-specific details.

| Tier | Scope / Applicability | Key Purpose | Documents Generated (Create Once) | Key Regulations Addressed |
|---|---|---|---|---|
| Tier 1: Platform | Universal (applies to ALL devices using the core AI system) | Addresses shared, cross-cutting requirements to enable evidence reuse | Platform Technical File (PTF-AI-DIAG-vX.X); Master AI Algorithm Library; Platform Risk Management File (algorithm hazards); Training Data Governance System (AI Act Art. 10) | AI Act (Art. 9, 10, 11); GDPR; EHDS; ISO 13485 |
| Tier 2: Generic Device Group | Group-specific (common to a predefined group of similar devices—distinct for MDR vs. IVDR) | Justifies the use of MDCG 2019-13 sampling by proving consistency and equivalence across the device family | Group Definition and Inclusion Criteria; Representative Device Selection Justification; Equivalence Demonstration Matrix; Group-Level Clinical Evaluation Strategy (MDR-specific) or Performance Evaluation Strategy (IVDR-specific) | MDR/IVDR (classification & sampling); MDCG 2019-13 |
| Tier 3: Device-Specific Supplement | Unique (specific to each individual commercial device model) | Documents unique hardware, device-specific risks, and integration details, referencing Tiers 1 and 2 | Device Technical File Supplement; Device-Specific Risk Assessment (e.g., use errors, hardware failures); Instructions for Use (IFU); Integration Testing Results | MDR/IVDR (Annex II/III device description); AI Act (Art. 13 transparency); IEC 62366 (usability/IFU) |

2.1.2. The Integration Principle: "Create Once, Use Everywhere"

This model ensures that complex, high-effort documentation (such as algorithm validation and data governance) is completed only once, at the Platform (Tier 1) level, achieving a significant reduction in time and cost compared to the traditional, device-by-device approach.
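A minimal sketch of the tier-resolution idea (hypothetical document names, not a prescribed file layout): a Tier 3 device file aggregates Tier 1 and Tier 2 documents by reference instead of duplicating them.

```python
# Three-tier document resolution sketch (hypothetical names).
PLATFORM_DOCS = [  # Tier 1 – created once, referenced everywhere
    "Platform Technical File (PTF-AI-DIAG-vX.X)",
    "Master AI Algorithm Library",
    "Platform Risk Management File",
    "Training Data Governance System (AI Act Art. 10)",
]
GROUP_DOCS = {  # Tier 2 – one set per generic device group
    "Radiology Workstations": [
        "Group Definition and Inclusion Criteria",
        "Representative Device Selection Justification",
        "Equivalence Demonstration Matrix",
        "Group-Level Clinical Evaluation Strategy (MDR)",
    ],
}

def device_technical_file(group: str, device_docs: list[str]) -> list[str]:
    """Tier 3 = references to Tier 1 + Tier 2, plus device-specific supplements."""
    return PLATFORM_DOCS + GROUP_DOCS[group] + device_docs

tf = device_technical_file(
    "Radiology Workstations",
    ["Device Technical File Supplement", "Device-Specific Risk Assessment",
     "Instructions for Use (IFU)", "Integration Testing Results"],
)
print(len(tf))  # 12 documents, only 4 of which are unique to this device
```

The point of the design is visible in the last line: most of the file is shared references, so a platform-level update propagates to every device automatically.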

Example: The Modern Medical Device Portfolio Structure

MANUFACTURER: AI Diagnostics Corp (Mixed MD/IVD Portfolio) 

The manufacturer builds their product line on a shared Platform Layer:

Platform Layer: Shared AI Infrastructure
├─ Core AI Algorithms (reusable across products)
│  ├─ Image Analysis Engine v3.2
│  ├─ Automated Measurement Suite v2.1
│  ├─ Diagnostic Decision Support v1.8
│  └─ Natural Language Processing v2.5
├─ Data Governance Platform (AI Act Article 10 compliance)
│  ├─ Training dataset registry (12 disease-specific datasets)
│  ├─ Bias monitoring system (continuous demographic tracking)
│  ├─ Data provenance tracking (complete lineage)
│  └─ EHDS-compliant data exchange infrastructure
├─ Security Architecture (ISO 27001 module)
│  ├─ Cloud infrastructure security
│  ├─ Edge device security protocols
│  ├─ Cybersecurity risk management
│  └─ GDPR technical measures
└─ Quality Management System (ISO 13485 foundation)
   ├─ Design controls
   ├─ Risk management (ISO 14971)
   ├─ Post-market surveillance
   └─ Change control procedures

▼ Platform supports multiple device families ▼

Device Family 1: Radiology Workstations (Generic Device Group 1)
├─ Entry-level CT Analysis Workstation (Class IIa)
├─ Advanced MRI Analysis Workstation (Class IIb) ← Representative
├─ Premium Multi-Modality Workstation (Class IIb)
└─ Mobile Radiology Review App (Class IIa)

Device Family 2: Point-of-Care Ultrasound (Generic Device Group 2)
├─ Handheld Cardiac Ultrasound (Class IIa)
├─ Portable General Ultrasound (Class IIa) ← Representative
├─ Advanced Obstetric Ultrasound (Class IIb)
└─ Emergency Department Ultrasound (Class IIb)

Device Family 3: Cloud Diagnostic Services (Generic Device Group 3)
├─ Teleradiology AI Platform (Class IIb) ← Representative
├─ Remote Consultation Suite (Class IIa)
└─ AI-Assisted Triage System (Class IIa)

Figure 4: Modern medical device portfolio structure, highlighting the shared platform layer and device families

Common Characteristics Across ALL Devices (MD + IVD):

  • All devices use shared AI algorithms from the platform.
  • All comply with EHDS interoperability standards.
  • All implement the same data governance approach (AI Act Article 10).
  • All follow the integrated MDR/IVDR + AI Act technical documentation structure.
  • All share the security architecture and QMS foundation.

Key Insight: Modern manufacturers increasingly develop hybrid portfolios containing both MDR devices (medical devices) and IVDR devices (in vitro diagnostics). The platform architecture approach works for both, but regulatory documentation must clearly distinguish which regulatory framework applies to each device family.

2.2 The Sampling Reality – Notified Body Assessment 

2.2.1. MDCG 2019-13 + MDCG 2025-6

Notified bodies do NOT assess every device individually. However, sampling rules differ between MDR and IVDR:

For MDR Devices (Medical Devices): 

  • Sampling Requirements:
    • Class IIb & Class III: Assess at least one representative device per generic device group.
    • Class IIa: Assess at least one representative device per category of devices (MDA/MDN codes).
    • Class I (sterile, measuring, reusable surgical): Requires notified body involvement.
    • Class I (non-sterile, non-measuring, non-reusable): No notified body required.
  • Example – MDR Portfolio: 

Your MDR Portfolio: 12 devices across 3 generic device groups 

Notified Body Assessment: 

MDR Group 1: Radiology Workstations (4 devices)         

  • EMDN 4th level: Z180301                                 
  • NB assesses: Advanced MRI Workstation (Class IIb)    
  • You must prove: Other 3 devices equivalent           

MDR Group 2: Point-of-Care Ultrasound (4 devices)       

  • EMDN 4th level: Z181203                               
  • NB assesses: Portable General Ultrasound (Class IIa)  
  • You must prove: Other 3 devices equivalent           

MDR Group 3: Cloud Diagnostic Services (4 devices) 

  • EMDN 4th level: Z180302                            
  • NB assesses: Teleradiology AI Platform (Class IIb)   
  • You must prove: Other 3 devices equivalent            

Result: the NB assesses 3 devices; you certify 12. This means Notified Body cost savings and faster market entry for derivative devices.
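The sampling arithmetic above can be sketched as follows, using the illustrative figures from this example portfolio.

```python
# MDCG 2019-13 sampling arithmetic sketch: one representative device is
# assessed per generic device group; the rest ride on equivalence.
groups = {
    "Radiology Workstations":    {"devices": 4, "representative": "Advanced MRI Workstation"},
    "Point-of-Care Ultrasound":  {"devices": 4, "representative": "Portable General Ultrasound"},
    "Cloud Diagnostic Services": {"devices": 4, "representative": "Teleradiology AI Platform"},
}
assessed = len(groups)                         # one per generic device group
certified = sum(g["devices"] for g in groups.values())
equivalence_burden = certified - assessed      # devices the manufacturer must prove equivalent
print(assessed, certified, equivalence_burden)  # 3 12 9
```

The `equivalence_burden` figure is exactly what the Tier 2 documentation exists to discharge.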

For IVDR Devices (In Vitro Diagnostics):

  • Generic Device Group Definition:
    • 3rd level of EMDN + IVP code (In Vitro Diagnostic Product code).
    • Format: 1 letter + 4 digits + IVP code (e.g., Y0201 + IVP-XXX).
    • More granular than MDR due to the IVP code requirement.
    • If the 3rd level is not specific enough, use the next lower level if available.
  • Sampling Requirements:
    • Class D & Class C: Assess at least one representative device per generic device group.
    • Class B: Assess at least one representative device per category of devices (IVR codes).
    • Class A (sterile): Requires notified body involvement.
    • Class A (non-sterile): No notified body required.

Example – IVDR Portfolio:

Your IVDR Portfolio: 9 Devices Across 3 Generic Device Groups

Notified Body Assessment:

IVDR Group 1: Digital Pathology (4 devices)

  • EMDN 3rd level + IVP: Y0201 + IVP-PATH-01
  • Notified Body Assessment Focus: AI-Assisted Histopathology (Class C)
  • Your Responsibility: Prove equivalence for the other 3 devices.

IVDR Group 2: Laboratory Data Analytics (3 devices)

  • EMDN 3rd level + IVP: Y0105 + IVP-DATA-02
  • Notified Body Assessment Focus: Clinical Chemistry Interpreter (Class C)
  • Your Responsibility: Prove equivalence for the other 2 devices.

IVDR Group 3: Genetic Testing Platforms (3 devices)

  • EMDN 3rd level + IVP: Y0808 + IVP-GEN-03
  • Notified Body Assessment Focus: Next-Gen Sequencing Suite (Class C)
  • Your Responsibility: Prove equivalence for the other 2 devices.

Result: the NB assesses 3 devices; you certify 9. This means Notified Body cost savings and faster market entry for derivative devices.

2.2.2. AI Act Follows Device Regulation Sampling (MDCG 2025-6 Q13): 

Critical Consideration: When an AI system is integrated into a medical device (MDR) or IVD device (IVDR), the AI Act conformity assessment follows the same sampling rules as the device regulation:

  • For MDR + AI Act:
    • If your AI-enabled Class IIb device is the representative device for its MDR generic device group, the notified body assesses BOTH MDR and AI Act compliance in a single integrated assessment.
    • Other devices in the group are certified based on equivalence demonstration.
  • For IVDR + AI Act:
    • If your AI-enabled Class C IVD device is the representative device for its IVDR generic device group, the notified body assesses BOTH IVDR and AI Act compliance in a single integrated assessment.
    • Other devices in the group are certified based on equivalence demonstration.

Key Principle: ONE INTEGRATED ASSESSMENT, NOT TWO SEPARATE TRACKS 

This means:

  • ✓ Single technical file covering MDR/IVDR + AI Act requirements.
  • ✓ Single conformity assessment procedure.
  • ✓ Single certificate (not separate MDR and AI Act certificates).
  • ✓ Sampling applies to the combined assessment.

2.2.3. What This Means in Practice: Efficiency Through Grouping

For example, consider a portfolio of 15 AI-enabled devices clustered across 3 generic device groups. The core regulatory strategy is to minimize Notified Body (NB) assessment time by proving equivalence between devices that share a common platform.

Notified Body Assessment Plan 

The strategy focuses the NB’s attention on only one representative device per group, shifting the burden of proving consistency for the others back to the manufacturer.

| Generic Device Group | Total Devices in Group | NB Assesses (Representative Device) | Manufacturer Must Prove |
|---|---|---|---|
| Group 1: Radiology Workstations | 6 devices | Advanced MRI Workstation | Equivalence of 5 other devices |
| Group 2: Ultrasound Devices | 5 devices | Portable General Ultrasound | Equivalence of 4 other devices |
| Group 3: Cloud Services | 4 devices | Teleradiology Platform | Equivalence of 3 other devices |

Total NB assessment: 3 devices.

Critical Success Factors

| Your Responsibility | Manufacturer's Leverage |
|---|---|
| Demonstrate the equivalence of 12 non-assessed devices to the 3 assessed representatives. | Integrated platform documentation is the single tool to prove consistency across the portfolio. |

The Stakes: Why Consistency is Non-Negotiable 

WARNING: ONE SHOT ONLY

Time Constraint: Notified Body capacity constraints mean you have ONE chance to get this right.

The Risk of Failure: Inconsistent or fragmented documentation is the primary cause of rejection, leading to a 12-24 month market delay.

The Solution: Platform-level consistency (shared QMS, AI algorithms, security, etc.) is essential for demonstrating the required technical and clinical equivalence.

Traditional (Failed) Approach 

The traditional approach is characterized by treating each major regulation as a standalone compliance silo. This fragmentation creates immense complexity, duplication, and gaps.

Part 3: The Traditional Approach Fails – Why Separate Systems Don’t Work

3.1 The Fragmentation Trap

| Compliance Silo | Key Activities | Issues & Impact |
|---|---|---|
| Separate QMS for MDR | ISO 13485 implementation; MDR technical files (one per device); risk management files (ISO 14971); clinical evaluation reports | Technical files that are legally compliant but not easily auditable across the product line; duplicated effort for common platform elements; inconsistent application of risk controls across devices |
| Separate AI Governance System for AI Act | AI Act compliance program; data governance procedures; AI-specific risk assessments; transparency documentation | Isolation prevents leveraging common AI infrastructure (algorithms, data); risks missed connections between AI data and clinical data |
| Separate Privacy Program for GDPR | Data protection policies; DPIAs for each processing activity; consent management system; data subject rights procedures | Policies difficult to enforce uniformly across cloud, edge, and workstation devices; redundant Data Protection Impact Assessments (DPIAs) |
| Separate EHDS Compliance Initiative | Interoperability standards implementation; EEHRxF compliance; data exchange protocols; secondary use governance | Data exchange that is compliant but disconnected from the core QMS and AI governance |

The Result: Siloed Systems → Regulatory Inconsistency

This fragmented approach makes it nearly impossible to prove the consistency and equivalence of the 12 non-assessed devices (as required by the grouping strategy) because the foundational documentation (QMS, AI Governance, etc.) lives in separate, non-integrated places.

For example, we could be in the position of having:

  • 4× the documentation volume
  • Conflicting requirements
  • Inconsistent evidence
  • Multiple audit streams
  • Exponential complexity with device families
  • Inevitable non-conformities

3.2 The Device Family Multiplication Problem 

When separate systems meet device families, complexity becomes unmanageable:

Example: Algorithm Update Across Product Line 

Scenario: Update "Nodule Detection Algorithm" from v2.1 → v2.2

Algorithm used in: 8 devices across 3 generic device groups

TRADITIONAL APPROACH (DISASTER):

1. Update MDR/IVDR technical files for 8 devices separately

→ 8 × 60 hours = 480 hours 

2. Update AI Act compliance documentation for 8 devices 

→ 8 × 40 hours = 320 hours 

3. Update GDPR DPIAs for algorithm change in 8 devices

→ 8 × 20 hours = 160 hours 

4. Update EHDS data exchange documentation for 8 devices

→ 8 × 15 hours = 120 hours

5. Perform 8 separate change control evaluations

→ Risk of inconsistent conclusions

6. Submit 8 separate change notifications to notified body

→ High risk of non-alignment

Total: 1,080 hours, 6-12 months, high audit failure risk 

ACTUAL CONSEQUENCE: 

• Different risk conclusions for same algorithm in different devices 

• Inconsistent performance claims across product line 

• Failed audit due to conflicting documentation 

• Market delays for entire product line 
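The effort estimate above can be re-derived in a few lines; the per-device hours are the illustrative figures from this scenario.

```python
# Effort model for the traditional, per-device update approach
# (illustrative figures from the scenario above).
devices = 8
hours_per_device = {
    "MDR/IVDR technical file update": 60,
    "AI Act documentation update":    40,
    "GDPR DPIA update":               20,
    "EHDS data-exchange docs update": 15,
}
total_hours = devices * sum(hours_per_device.values())
print(total_hours)  # 1080
```

The modular approach collapses the `devices` multiplier: one platform-level change record is referenced by every affected device file instead of being rewritten eight times.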

3.3 The Audit Failure Pattern

A common non-conformity example could read as follows:

Auditor Finding: NCR-2025-047

Régulation: MDR/IVDR Article 10(9) + AI Act Article 11(2) 

State of Non-Conformity: "When reviewing the medical device file, it appeared that the manufacturer maintains separate risk management files for MDR/IVDR (device safety) and the AI Act (algorithm risks) for the same device. The MDR/IVDR risk file documents hazard H-017 (false negative diagnosis) with residual risk 'acceptable.' The AI Act risk file documents the same hazard with residual risk 'requires additional mitigation.' These conflicting assessments indicate inadequate risk management integration."

Root Cause: Separate compliance systems for MDR/IVDR and AI Act

Impact: Certificate suspended for entire device family

Time to Remediate: 8-12 months (requires complete risk management system redesign)

Cost: NB re-assessment fees, lost sales, and remediation costs
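The failure mode behind this NCR can be caught early with a simple cross-check of residual-risk conclusions for the same hazard ID across files. This is a hedged sketch with hypothetical data structures, not a prescribed risk-management tool.

```python
# Hypothetical representation of two siloed risk files keyed by hazard ID.
mdr_risk_file = {"H-017": "acceptable"}
ai_act_risk_file = {"H-017": "requires additional mitigation"}

def conflicting_hazards(file_a: dict, file_b: dict) -> list[str]:
    """Hazard IDs whose residual-risk conclusions disagree between two files."""
    return sorted(h for h in file_a.keys() & file_b.keys() if file_a[h] != file_b[h])

print(conflicting_hazards(mdr_risk_file, ai_act_risk_file))  # ['H-017']
```

In an integrated risk management system this check is unnecessary by construction: there is a single residual-risk conclusion per hazard, referenced by both regulatory views.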

Part 4: The Modular Solution – Integrated Architecture 

4.1 The Three-Tier Modular Architecture

The solution is a layered, modular system where:

  1. Foundation addresses universal requirements
  2. Modules address specialized domains
  3. Tiers organize by scope (platform → group → device)

TIER 1: Platform Documentation (Create Once)

Scope: Applies to ALL devices using the platform

Foundation: Integrated Quality Management System (IQMS)

  • ISO 13485:2016 (medical device QMS)            
  • Quality policy addressing all four regulations 
  • Management responsibility and review 
  • Resource management

Core Regulatory Modules (Specialized Silos Consolidated)

| Module Pillar | Supporting Standard / Regulation | Key Areas Addressed |
|---|---|---|
| AI Module | ISO 42001 (AI Management) | AI risk assessment; data quality (for AI training); transparency documentation; human oversight procedures |
| Security Module | ISO 27001 (Information Security) | Information security controls; cybersecurity risk management; privacy controls (technical measures) |
| Data Governance Module | GDPR, EHDS, AI Act (Article 10) | GDPR compliance (data protection); EHDS interoperability; data quality assessment; data provenance tracking |

Figure 5: Platform components shared across all devices

Platform Components (Shared Across All Devices)

├─ Master AI Algorithm Library
│  ├─ Algorithm technical files (design, validation)
│  ├─ Training dataset documentation
│  ├─ Bias assessment methodology
│  └─ Performance metrics and testing
├─ Platform Risk Management File
│  ├─ Algorithm-level hazards (bias, drift, opacity)
│  ├─ Data security risks
│  ├─ Privacy risks
│  └─ Cybersecurity risks
├─ Platform Data Governance System (AI Act Art. 10)
│  ├─ Data provenance tracking
│  ├─ Data quality assessment procedures
│  ├─ Bias monitoring protocols
│  └─ EHDS interoperability standards
└─ Platform Clinical Evidence Base
   ├─ Algorithm validation studies
   ├─ Performance benchmarking data
   └─ Subpopulation analysis

The next tier, Tier 2, applies the platform documentation to specific groups of devices, enabling the "equivalence" argument required by the Notified Body.

TIER 2: Generic Device Group Documentation (Adapt and Specify)

Scope: This tier adapts the Tier 1 Platform documentation to a specific clinical domain, enabling the critical equivalence argument for Notified Body assessment.

Common Structure for All Groups

Each Generic Device Group (GDG) requires a technical file that follows a standardized structure, proving consistency within that group.

| Generic Device Group | Key Regulatory Focus | Core Requirement |
|---|---|---|
| Group 1: Radiology Workstations | MDR/MDCG equivalence | Must prove 5 devices are equivalent to 1 assessed device |
| Group 2: Ultrasound Devices | MDR/MDCG equivalence | Must prove 4 devices are equivalent to 1 assessed device |
| Group 3: Cloud Services | MDR/MDCG equivalence | Must prove 3 devices are equivalent to 1 assessed device |

Detailed Content for Each Group (e.g., Group 1: Radiology Workstations)

The following seven items represent the specialized file required for each group (Group 1, Group 2, and Group 3).

1. Group Definition & Scope

  • Group definition and inclusion criteria: Precise rules defining which specific products (e.g., CT, MRI, Multi-Modality Workstations) belong to this group based on technology, intended purpose, and safety profile.

2. Clinical and Performance Baseline

  • Common intended use statement: A single, overarching clinical purpose that covers the range of all devices within the group, yet remains specific to the domain (e.g., "AI-assisted diagnostic image review in a clinical setting").
  • Group-level clinical evaluation strategy: Documentation justifying the use of a single Clinical Evaluation Report (CER) or Performance Evaluation Report (PER) to cover all devices in the group.

3. Technical Consistency

  • Shared GSPR compliance matrix: A document showing that the Group has met all applicable General Safety and Performance Requirements (GSPRs), citing the shared Platform documentation (Tier 1) wherever possible.
  • Common hardware architecture specifications: Detailed documentation of any hardware or software components that are identical or functionally equivalent across all devices in the group.

4. The Equivalence Argument (The Critical Step)

  • Representative device selection justification: Detailed reasoning why the specific chosen device (e.g., Advanced MRI Workstation) is the most appropriate, highest-risk/most-complex member of the group to undergo the Notified Body assessment.
  • Equivalence demonstration for non-assessed devices: A comprehensive technical and clinical matrix proving that all other devices in the group are safe and perform equivalently to the representative device, primarily by referencing the shared platform components (Tier 1).

Key Purpose: Equivalence and Representativeness

This documentation is essential for:

  1. Justifying Equivalence: Providing the technical and clinical bridge to show that non-assessed devices in the group are equivalent to the single, assessed representative device.
  2. Defining Common Characteristics: Specifying the group-level clinical, performance, and risk profiles.
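One way to operationalize the equivalence demonstration is a simple attribute comparison against the representative device. This is an illustrative sketch with hypothetical attributes, not a complete equivalence methodology; any reported gap would need a documented justification in the Tier 2 file.

```python
# Hypothetical equivalence attributes for the representative device.
REPRESENTATIVE = {
    "intended_use": "AI-assisted diagnostic image review",
    "core_algorithms": "Platform v3.2",
    "risk_class": "IIb",
    "safety_profile": "shared platform controls",
}

def equivalence_gaps(candidate: dict) -> list[str]:
    """Attributes where the candidate deviates from the representative device.

    Every gap must be justified in the equivalence demonstration matrix."""
    return [k for k, v in REPRESENTATIVE.items() if candidate.get(k) != v]

premium = dict(REPRESENTATIVE)                  # Premium Multi-Modality Workstation
entry = dict(REPRESENTATIVE, risk_class="IIa")  # Entry-level CT Analysis Workstation
print(equivalence_gaps(premium))  # []
print(equivalence_gaps(entry))    # ['risk_class']
```

An empty gap list supports a straightforward equivalence claim; a non-empty one flags exactly which characteristics the manufacturer must argue do not affect safety or performance.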

Tier 2 Summary: Required Documents (Adapted from Platform)

1. Group Clinical Evidence Report (CER/IVDR Performance Evaluation Report)

  • Intended Use Statement: Consolidated definition covering the range of all devices in the group.
  • Clinical Performance Data: Summary of shared clinical validation for the common AI algorithms and features in the context of the specific clinical domain (e.g., all radiology devices).
  • Equivalence Matrix: Table proving that all non-representative devices in the group share the same clinical principle, intended use, technology, and risk profile as the assessed representative device.

2. Group Technical File / Design Dossier

  • Shared Design Characteristics: Documentation confirming the common architecture layer (AI, security, data governance) is implemented identically across all group members.
  • Core Software Architecture: High-level design document detailing how the shared platform services (Image Analysis Engine, NLP, etc.) are called by the specific group’s applications.

3. Group Risk Management File (RM File)

  • Consolidated Hazard Analysis: Risk assessment specific to the clinical environment (e.g., Point-of-Care vs. Cloud) and technology employed by the group.
  • Shared Risk Controls: Documentation of controls inherited directly from the Tier 1 Platform RM File (e.g., bias monitoring, data security protocols) and applied uniformly across the group.

The documentation can therefore be structured as follows: 

Group 1: Radiology Workstations

├─ Group definition and inclusion criteria

├─ Common intended use statement

├─ Shared GSPR compliance matrix

├─ Common hardware architecture specifications

├─ Group-level clinical evaluation strategy

├─ Representative device selection justification

└─ Equivalence demonstration for non-assessed devices

Group 2: Ultrasound Devices

├─ [Same structure as Group 1]

Group 3: Cloud Services

├─ [Same structure as Group 1]

Visual Link to Tiers

[TIER 1: PLATFORM]

      │

     ▼

TIER 2: GROUP 1

(Radiology Workstations)

      │

     ▼

[TIER 3: DEVICE]

TIER 3: INDIVIDUAL DEVICE DOCUMENTATION (Final Technical File)

Scope and Purpose: Applies to one specific commercial product (e.g., « Entry-level CT Analysis Workstation » or « Handheld Cardiac Ultrasound »).

The purpose of Tier 3 is to consolidate the documentation from Tiers 1 and 2, adding only the final, unique details specific to the product’s function, interfaces, and specific hardware/software build.


Required Documents (Device-Specific Additions)

This tier primarily consists of references to Tiers 1 and 2, with the following documents providing the unique product details:

1. Unique Product Technical File (The Aggregator)

  • Device-Specific Intended Use: The final, narrow intended use statement and classification (e.g., Class IIa, Class C-IVDR) specific to this product’s market submission.
  • Bill of Materials (BOM): Complete list of all commercial and off-the-shelf software and hardware components (e.g., specific GPU models, OS versions, third-party libraries).
  • Final Software Validation Report: Results of verification and validation testing performed specifically on the integrated, final product build before release.
  • User Documentation: Instructions for Use (IFU), labeling, and training materials specific to the device.

2. Localized Risk & Security Documentation

  • Final Risk Management Report: A report that references the Tier 2 Group RM File and documents any residual risks unique to the device’s specific configuration or clinical deployment environment (e.g., risks related to running the Mobile Radiology Review App on a consumer tablet).
  • Hardening Guide / Installation Manual: Security procedures specific to installing and maintaining the device, including configuration, network setup, and user access controls.

3. Clinical Data (If Unique)

  • Device-Specific Post-Market Surveillance (PMS) Plan: A plan detailing the collection of unique feedback relevant only to this product’s performance and safety profile.
  • Limited Clinical Data: Any targeted clinical performance data or usability studies unique to this device’s specific interface or deployment environment, not covered by the Group’s general evidence.

The Full Architectural Flow

The complete structure demonstrates how all documentation ultimately flows into the individual device’s technical file, ensuring consistency while minimizing duplication.

[TIER 1: PLATFORM (Universal)]

(Shared AI, QMS, Security)

           ▼

[TIER 2: GROUP (Equivalence Proof)]

(Group-Specific CER, RM, Equivalence Matrix)

            ▼

[TIER 3: DEVICE (Final Technical File)]

To illustrate this, the documentation is structured as follows: 

Device A: Advanced MRI Workstation (Class IIb – MDR)

├─ Device technical file (references Tier 1 & 2) 

│  ├─ Section 1: Device description (unique hardware) 

│  ├─ Section 2: AI integration (reference platform)

│  ├─ Section 3: Risk management (device-specific)

│  ├─ Section 4: Verification/validation (unique tests)

│  ├─ Section 5: Clinical evaluation (device context)

│  └─ Section 6: Instructions for Use 

├─ Device-specific risk assessment 

│  ├─ Use errors (unique to this UI/workflow) 

│  ├─ Environmental hazards (MRI suite context)

│  └─ References platform algorithm hazards

└─ Device labeling and IFU

  ├─ Device-specific warnings 

  ├─ AI limitations in MRI context 

  └─ References platform AI transparency docs

Device B: Histopathology Platform (Class C – IVDR) 

├─ Device technical file (references Tier 1 & 2) 

│  ├─ Section 1: Device description (unique hardware) 

│  ├─ Section 2: AI integration (reference platform) 

│  ├─ Section 3: Risk management (device-specific)

│  ├─ Section 4: Verification/validation (unique tests)

│  ├─ Section 5: Performance evaluation (device context) 

│  └─ Section 6: Instructions for Use

├─ Device-specific risk assessment

│  ├─ Use errors (unique to this UI/workflow) 

│  ├─ Environmental hazards (lab context) 

│  └─ References platform algorithm hazards

└─ Device labeling and IFU 

  ├─ Device-specific warnings

  ├─ AI limitations in pathology context

  └─ References platform AI transparency docs
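The reference chains shown in the two device trees above can be modelled as data. The sketch below (hypothetical document IDs, simplified to a single reference list per file) shows how a Tier 3 device file transitively pulls in its group and platform evidence:

```python
from dataclasses import dataclass, field

@dataclass
class DocNode:
    """A documentation node in the three-tier architecture."""
    doc_id: str
    tier: int                      # 1 = platform, 2 = group, 3 = device
    title: str
    references: list = field(default_factory=list)  # upstream doc_ids

def resolve_evidence(node, registry):
    """Collect the full evidence chain for a device file by following
    references up through the group and platform tiers (depth-first)."""
    chain = [node.doc_id]
    for ref in node.references:
        chain.extend(resolve_evidence(registry[ref], registry))
    return chain

# Hypothetical registry mirroring the structure above
registry = {d.doc_id: d for d in [
    DocNode("PTF-AI-DIAG-v3.0", 1, "Platform Technical File"),
    DocNode("GDG-RAD-WS-001", 2, "Radiology Workstation Group",
            references=["PTF-AI-DIAG-v3.0"]),
    DocNode("DTF-RW-2", 3, "ProRad Device Technical File",
            references=["GDG-RAD-WS-001"]),
]}

# The device file transitively pulls in group and platform evidence
print(resolve_evidence(registry["DTF-RW-2"], registry))
```

The point of the sketch is that Tier 3 stores only a pointer, not a copy: updating the platform file updates the resolved evidence for every device that references it.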

4.2 The Integration Principle: « Create Once, Use Everywhere »

The strategic advantage of the Modular Architecture (Tiers 1-3) is rooted in evidence reuse. By centralizing documentation at the Platform level (Tier 1), we replace redundant work with efficient, targeted integration testing.

4.2.1. Efficiency Gain: Algorithm Validation Example

This comparison highlights the significant reduction in effort and time achieved by shifting validation from a device-level activity to a platform-level activity.

Aspect | ❌ Traditional (Device-Centric) Approach | ✅ Modular (Platform-Centric) Approach
Core Activity | Validate the same AI algorithm for every single device it’s used in | Validate the AI algorithm once at the platform level (Tier 1)
Effort: Device A | 200 hours | Integration testing only: 30 hours
Effort: Device B | 200 hours | Integration testing only: 30 hours
Effort: Device C | 200 hours | Integration testing only: 30 hours
Total Validation Time | 600 hours for the identical algorithm | 340 hours (250 h platform + 90 h integration)
Time Reduction | N/A | 43% reduction in total validation effort

Quality Improvement: 

– Platform validation is more comprehensive (larger budget) 

– Consistency across devices guaranteed 

– Single source of truth for algorithm performance 

– Change control simplified (update platform, assess device impact)

Key Takeaway

Platform documentation (Tier 1) transforms repetition into reference. Instead of duplicating high-effort tasks (like initial algorithm validation) for every device, you only need to prove that the individual device successfully integrated the pre-validated platform component.
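The effort comparison above is simple arithmetic; the sketch below reproduces it so the assumptions (200 h per-device validation, 250 h platform validation, 30 h integration testing per device — illustrative figures from this example) are explicit:

```python
# Illustrative effort figures from the comparison above (hours)
DEVICES = ["Device A", "Device B", "Device C"]

# Traditional: full validation repeated for every device
traditional_total = 200 * len(DEVICES)            # 600 h

# Modular: one platform validation + per-device integration testing
platform_validation = 250
integration_per_device = 30
modular_total = platform_validation + integration_per_device * len(DEVICES)  # 340 h

reduction = (traditional_total - modular_total) / traditional_total
print(f"Traditional: {traditional_total} h, Modular: {modular_total} h, "
      f"reduction: {reduction:.0%}")
```

Note the scaling behaviour: each additional device costs 200 h in the traditional model but only 30 h in the modular one, so the gap widens as the portfolio grows.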

4.3 The Modular Evidence Crosswalk

The strategic tool that makes modular architecture auditable:

Evidence Document | Created At | MDR Requirement | AI Act Requirement | GDPR Requirement | EHDS Requirement
Platform Algorithm Technical File | Platform (Tier 1) | Annex II Sect. 1.1 (software description) | Article 11 (AI system description) | Article 25 (data protection by design) | Article 30 (software specifications)
Training Data Quality Report | Platform (Tier 1) | Annex XIV Part A (clinical data) | Article 10 (data governance) | Article 5(1)(d) (data accuracy) | Article 77 (data quality label)
Algorithm Validation Study | Platform (Tier 1) | Annex II Sect. 6 (V&V) | Article 15 (accuracy, robustness) | N/A | Article 78 (performance metrics)
Platform Risk Management File | Platform (Tier 1) | Annex II Sect. 5 (risk analysis) | Article 9 (AI risk management) | Article 35 (DPIA) | Article 40 (security measures)
Generic Device Group Definition | Group (Tier 2) | MDCG 2019-13 (sampling) | Article 11(2) (tech doc) | N/A | N/A
Device Integration Specification | Device (Tier 3) | Annex II Sect. 1.1 (device description) | Article 11 (integration details) | Article 25 (security measures) | Article 27 (interoperability)
Instructions for Use | Device (Tier 3) | Annex I Ch. III (user info) | Article 13 (transparency) | Article 12 (clear information) | Article 30 (user guidance)
Post-Market Surveillance Plan | Platform + Device | Article 83 (PMS) | Article 61 (post-market monitoring) | Article 32 (ongoing security) | Article 41 (data monitoring)

Audit Scenario (Modular Response PASSES):

When asked for AI Act Article 10 compliance for Device A, the modular response is: « Article 10 is addressed at the platform level. Here’s our Platform Training Data Quality Report [DOC-PLATFORM-010], which applies to all devices. Device A’s technical file Section 2.3 references this platform document and confirms the algorithm implementation is consistent with the validated version. Additionally, this same evidence satisfies MDR Annex XIV requirements for clinical data quality, GDPR Article 5 accuracy requirements, and EHDS Article 77 data quality labeling. »

  • The auditor sees consistent evidence across the entire portfolio: conformity confirmed.
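The crosswalk can be held as a machine-readable index so that an auditor's question ("show me your AI Act Article 10 evidence") resolves to a single document. A minimal sketch, using illustrative document IDs (DOC-PLATFORM-010 is taken from the audit scenario above; DOC-PLATFORM-004 is hypothetical):

```python
# Simplified slice of the evidence crosswalk; document IDs are illustrative
CROSSWALK = {
    "DOC-PLATFORM-010": {          # Training Data Quality Report
        "tier": "Platform",
        "satisfies": {"AI Act": "Article 10", "MDR": "Annex XIV Part A",
                      "GDPR": "Article 5(1)(d)", "EHDS": "Article 77"},
    },
    "DOC-PLATFORM-004": {          # Platform Risk Management File
        "tier": "Platform",
        "satisfies": {"AI Act": "Article 9", "MDR": "Annex II Sect. 5",
                      "GDPR": "Article 35", "EHDS": "Article 40"},
    },
}

def find_evidence(regulation, article):
    """Answer an auditor's question: which documents satisfy this clause?"""
    return [doc for doc, meta in CROSSWALK.items()
            if meta["satisfies"].get(regulation) == article]

print(find_evidence("AI Act", "Article 10"))
```

Because one document row carries mappings to all four regulations, the same lookup also shows the auditor that the evidence simultaneously satisfies the MDR, GDPR, and EHDS clauses listed in the crosswalk.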

Part 5: Platform Architecture for Device Families

5.1 The Shared AI Platform Concept

Modern medical device families are built on “shared platforms” where core AI capabilities serve multiple physical products. For example:

The AI DIAGNOSTIC PLATFORM v3.0 serves multiple products by providing reusable core capabilities:

  • Core Capabilities (Reusable Components):
    • Image Analysis Engine v3.2 (Nodule detection, Segmentation, Classification)
    • Quantitative Measurement Suite v2.1 (Volume calculation, Functional parameters, Growth rate analysis)
    • Decision Support Engine v1.8 (Risk stratification, Treatment recommendations, Urgency triage)
  • Platform Service Scope: Serves 6 imaging modalities, 15 device models, 3 generic device groups, and multiple clinical specialties.

5.2 Platform-Level Documentation Strategy

The PLATFORM TECHNICAL DOCUMENTATION (PTF-AI-DIAG-v3.0) is the comprehensive Tier 1 file:

  • Section 1: Platform Overview & Scope: Master list of devices, version control, relationship to generic device groups.
  • Section 2: AI System Description (AI Act Article 11): Algorithm architecture, platform-level intended use, key design choices (e.g., Accuracy vs. speed optimization).
  • Section 3: Data Governance (AI Act Article 10): Training data (composition, quality, representativeness, provenance), Bias Assessment and Mitigation, Privacy Protection (GDPR), EHDS Compliance (standards).
  • Section 4: Algorithm Validation (Platform-Level): Validation study design, Performance results (metrics, subgroup analysis), Robustness testing, Limitations.
  • Section 5: Platform Risk Management: Algorithm-level hazards (H-P-001: False negative, H-P-003: Algorithmic bias), Cybersecurity risks, Privacy risks, Benefit-risk analysis.
  • Section 6: Transparency & Human Oversight (AI Act Articles 13-14): Explainability approach, Human oversight mechanisms (Human-in-the-loop, Override), Transparency documentation standards.
  • Section 7: Platform Change Control: Version management, Significant change evaluation criteria (MDCG 2020-3), Update propagation to devices.
  • Section 8: Post-Market Surveillance (Platform-Level): Performance monitoring plan (AI Act Article 15), Vigilance procedures (MDR Article 87), Platform-wide incident reporting.

For illustration,

Section 1: Platform Overview & Scope

├─ 1.1 Platform description and architecture

├─ 1.2 Devices that implement this platform (master list)

├─ 1.3 Version control and change history

└─ 1.4 Relationship to generic device groups

Section 2: AI System Description (AI Act Article 11)

├─ 2.1 Algorithm architecture and design rationale

│  ├─ Neural network structures

│  ├─ Training methodology

│  ├─ Computational requirements

│  └─ Input/output specifications

├─ 2.2 Intended use statement (platform-level)

│  ├─ Medical purpose

│  ├─ Target pathologies

│  ├─ Intended user population

│  └─ Intended patient population

└─ 2.3 Key design choices and tradeoffs

   ├─ Accuracy vs. speed optimization

   ├─ Sensitivity vs. specificity balance

   └─ Explainability vs. performance

Section 3: Data Governance (AI Act Article 10)

├─ 3.1 Training Data

│  ├─ Dataset composition (N=50,000 cases)

│  ├─ Data sources and collection methodology

│  ├─ Data quality assessment

│  ├─ Representativeness analysis

│  │  ├─ Demographics (age, sex, ethnicity)

│  │  ├─ Disease severity spectrum

│  │  ├─ Image quality range

│  │  └─ Scanner manufacturer diversity

│  └─ Data provenance documentation

├─ 3.2 Bias Assessment and Mitigation

│  ├─ Subgroup performance analysis

│  ├─ Identified biases and root causes

│  ├─ Mitigation strategies implemented

│  └─ Ongoing bias monitoring plan

├─ 3.3 Privacy Protection (GDPR compliance)

│  ├─ Data anonymization procedures

│  ├─ Legal basis for data processing

│  ├─ Data minimization approach

│  └─ Data retention policies

└─ 3.4 EHDS Compliance

   ├─ Data formatting standards (DICOM, HL7 FHIR)

   ├─ Interoperability specifications

   └─ Secondary use considerations

Section 4: Algorithm Validation (Platform-Level)

├─ 4.1 Validation study design

│  ├─ Independent test dataset (N=10,000)

│  ├─ Reference standard definition

│  ├─ Statistical analysis plan

│  └─ Success criteria

├─ 4.2 Performance results

│  ├─ Primary metrics (sensitivity, specificity, AUC)

│  ├─ Subgroup analysis

│  ├─ Confidence intervals

│  └─ Clinical significance assessment

├─ 4.3 Robustness testing

│  ├─ Out-of-distribution detection

│  ├─ Adversarial testing

│  ├─ Image quality degradation scenarios

│  └─ Scanner variability testing

└─ 4.4 Limitations and contraindications

   ├─ Known failure modes

   ├─ Inappropriate use cases

   └─ Patient exclusion criteria

Section 5: Platform Risk Management

├─ 5.1 Algorithm-level hazards (ISO 14971 + AI Act Article 9)

│  ├─ H-P-001: False negative (missed pathology)

│  │  ├─ Severity: Critical

│  │  ├─ Probability: 8% (from validation)

│  │  ├─ Mitigations: Uncertainty quantification, human oversight

│  │  └─ Residual risk: Acceptable with controls

│  │

│  ├─ H-P-002: False positive (unnecessary intervention)

│  ├─ H-P-003: Algorithmic bias (performance disparity)

│  ├─ H-P-004: Data drift (performance degradation over time)

│  ├─ H-P-005: Adversarial input (security vulnerability)

│  └─ [Additional platform-level hazards]

├─ 5.2 Cybersecurity risks (ISO 27001 alignment)

├─ 5.3 Privacy risks (GDPR compliance)

└─ 5.4 Benefit-risk analysis (platform-level)

Section 6: Transparency & Human Oversight (AI Act Articles 13-14)

├─ 6.1 Explainability approach

│  ├─ Output interpretation guidance

│  ├─ Confidence scoring methodology

│  ├─ Limitations communication

│  └─ Uncertainty visualization

├─ 6.2 Human oversight mechanisms (platform design)

│  ├─ Human-in-the-loop requirements

│  ├─ Override capabilities

│  ├─ Review workflow integration

│  └─ Escalation procedures

└─ 6.3 Transparency documentation standards

   ├─ Template for device-specific IFU sections

   ├─ User training requirements

   └─ Clinical decision support context

Section 7: Platform Change Control

├─ 7.1 Version management procedures

├─ 7.2 Significant change evaluation criteria (MDCG 2020-3)

├─ 7.3 Update propagation to devices

├─ 7.4 Revalidation triggers and procedures

└─ 7.5 Notified body notification process
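The screening step in 7.2 can be sketched as a checklist evaluation. The trigger list below is illustrative only — the actual significance decision must follow the MDCG 2020-3 flowcharts and the manufacturer's own change procedure:

```python
# Illustrative screening questions only; a real decision must follow
# MDCG 2020-3 and the manufacturer's documented change procedure.
SIGNIFICANT_TRIGGERS = {
    "changes_intended_use",
    "new_algorithm_added",
    "alters_clinical_performance_claims",
    "new_risk_or_changed_risk_controls",
}

def screen_change(change_flags):
    """Return which triggers fire; any hit escalates the change to a
    significance review and potential notified body notification."""
    hits = sorted(SIGNIFICANT_TRIGGERS & set(change_flags))
    return {"significant": bool(hits), "triggers": hits}

# A bug fix with no behavioural change passes the screen
print(screen_change({"bug_fix_only"}))
# Adding a new platform algorithm fires a trigger
print(screen_change({"new_algorithm_added"}))
```

Encoding the screen keeps the platform-level change log consistent across all devices that inherit the update.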

Section 8: Post-Market Surveillance (Platform-Level)

├─ 8.1 Performance monitoring plan (AI Act Article 15)

│  ├─ Real-world performance metrics

│  ├─ Bias monitoring in deployment

│  ├─ Data drift detection

│  └─ User feedback analysis

├─ 8.2 Vigilance procedures (MDR Article 87)

├─ 8.3 Platform-wide incident reporting

└─ 8.4 Continuous improvement process

5.3 Generic Device Group Documentation

The GENERIC DEVICE GROUP DEFINITION (e.g., GDG-RAD-WS-001) serves as the Tier 2 documentation for sampling:

  • 1. Group Scope & Rationale: Defines common intended use, common technology (AI Platform v3.0, core algorithms), common design, and common function.
  • 2. Devices Included in This Group: Lists all devices (e.g., RW-1 BasicRad, RW-2 ProRad, RW-6 ERRad) with their Class (IIa, IIb).
  • 3. Representative Device Selection: Justifies the choice of the representative device (e.g., RW-2 ProRad) based on highest risk, implementation of all algorithms, and common use case.
  • 4. Equivalence Demonstration: For each non-assessed device (e.g., RW-1 BasicRad), formally compares it to the representative device to prove equivalence or lower risk.
  • 5. Group-Level GSPR Compliance Matrix: Documents the GSPR compliance evidence common to all devices in the group.
  • 6. Group-Level Clinical Evaluation Strategy: Defines how platform evidence, representative study, and literature review cover all devices in the group.

For illustration,

GENERIC DEVICE GROUP DEFINITION

Group: Radiology Workstations with AI Analysis

Group ID: GDG-RAD-WS-001

1.0 Group Scope & Rationale

├─ Common Intended Use:

│  « AI-enhanced diagnostic image analysis for radiological 

│   interpretation by qualified healthcare professionals »

├─ Common Technology:

│  ├─ All devices implement AI Platform v3.0

│  ├─ All use same core algorithms (Image Analysis Engine)

│  ├─ All connect to cloud processing infrastructure

│  └─ All implement EHDS interoperability standards

├─ Common Design:

│  ├─ Workstation architecture (display + processing)

│  ├─ DICOM viewer integration

│  ├─ Similar user interface patterns

│  └─ Common cybersecurity architecture

└─ Common Function:

   ├─ Image acquisition/import

   ├─ AI-assisted analysis

   ├─ Human review and override

   └─ Report generation

2.0 Devices Included in This Group

ID | Model Name | Class | Primary Modality
RW-1 | BasicRad | IIa | CT, X-ray
RW-2 | ProRad | IIb | CT, MRI
RW-3 | EliteRad | IIb | CT, MRI, PET
RW-4 | MobileRad | IIa | X-ray, portable
RW-5 | CloudRad | IIa | Multi-modality
RW-6 | ERRad | IIb | CT, stat analysis

3.0 Representative Device Selection

Selected: RW-2 « ProRad » (Class IIb, CT/MRI Analysis)

Justification:

├─ Implements ALL platform algorithms used in the group

├─ Mid-range hardware complexity (not minimum, not maximum)

├─ Most common clinical use case (hospital radiology dept)

├─ Typical user population (board-certified radiologists)

├─ Adequate clinical evidence available

└─ Represents highest risk classification in group

4.0 Equivalence Demonstration

For each non-assessed device, demonstrate:

RW-1 (BasicRad) vs RW-2 (ProRad):

├─ Platform algorithms: IDENTICAL (same versions from platform)

├─ Hardware: LESS CAPABLE (subset of ProRad functionality)

├─ Clinical context: SIMILAR (hospital radiology, similar users)

├─ Risk profile: LOWER (fewer modalities, simpler analysis)

└─ Conclusion: ProRad assessment covers BasicRad

RW-3 (EliteRad) vs RW-2 (ProRad):

├─ Platform algorithms: IDENTICAL + ADDITIONAL (adds PET)

├─ Hardware: MORE CAPABLE (faster processing)

├─ Clinical context: SIMILAR (hospital, same users)

├─ Risk profile: SAME (IIb classification, similar hazards)

├─ Delta documentation: PET-specific integration validation

└─ Conclusion: ProRad + delta assessment adequate

[Similar analysis for RW-4, RW-5, RW-6]

5.0 Group-Level GSPR Compliance Matrix

[Common compliance for all devices in group]

6.0 Group-Level Clinical Evaluation Strategy

├─ Platform algorithm evidence (applies to all)

├─ Representative device clinical study

├─ Literature review for device variants

└─ Equivalence rationale for non-assessed devices
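The 4.0 comparisons above can be expressed as a first-pass screening rule. This is only a heuristic illustration — actual equivalence must be demonstrated through the full technical and clinical matrix — but it captures the logic: same platform version, algorithm subset, same clinical context, no higher risk class:

```python
from dataclasses import dataclass

@dataclass
class Device:
    model: str
    device_class: str          # MDR classification, e.g. "IIa", "IIb"
    platform_version: str
    algorithms: frozenset      # platform algorithms implemented
    clinical_context: str

RISK_ORDER = {"IIa": 1, "IIb": 2}   # simplified ordering for this sketch

def covered_by_representative(candidate, representative):
    """Heuristic equivalence screen mirroring the 4.0 comparisons:
    same platform, algorithm subset, same context, no higher risk class.
    A real assessment requires the full technical/clinical matrix."""
    return (candidate.platform_version == representative.platform_version
            and candidate.algorithms <= representative.algorithms
            and candidate.clinical_context == representative.clinical_context
            and RISK_ORDER[candidate.device_class]
                <= RISK_ORDER[representative.device_class])

prorad = Device("ProRad", "IIb", "v3.0",
                frozenset({"ImageAnalysis", "Measurement", "DecisionSupport"}),
                "hospital radiology")
basicrad = Device("BasicRad", "IIa", "v3.0",
                  frozenset({"ImageAnalysis"}), "hospital radiology")

print(covered_by_representative(basicrad, prorad))   # BasicRad is covered
```

A device that adds an algorithm beyond the representative's set (like EliteRad's PET module) fails the subset check, correctly flagging the need for delta documentation rather than plain equivalence.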

5.4 Device-Specific Documentation (Minimal Supplement)

The DEVICE TECHNICAL FILE (e.g., ProRad Advanced MRI/CT Workstation, Model RW-2) is the minimal Tier 3 supplement:

  • Section 1: Device Description: Unique hardware specifications, device-specific intended use.
  • Section 2: Platform Integration: References PTF-AI-DIAG-v3.0 and lists implemented algorithms.
  • Section 3: Device-Specific Risk Assessment: References Platform Risk Management File and documents unique hazards (Use errors, Environmental factors, Hardware failures).
  • Section 4: Verification & Validation: References platform validation and documents device-specific testing (integration, UI, human factors).
  • Section 5: Clinical Evaluation (Device Context): References platform clinical evidence and documents device-specific clinical context.
  • Section 6: Instructions for Use: Device operation, AI functionality guidance, device-specific warnings.
  • Key Principle: Clear references ensure traceability and minimize duplication.

Here is a proposed detailed structure of the device technical file 

DEVICE TECHNICAL FILE

Device: ProRad Advanced MRI/CT Workstation

Model: RW-2

Classification: Class IIb

Generic Device Group: GDG-RAD-WS-001

Section 1: Device Description

├─ Hardware specifications (unique to this device)

├─ Software configuration (references platform)

├─ Intended use (device-specific context)

└─ Indications for use

Section 2: Platform Integration

├─ References: Platform Technical File PTF-AI-DIAG-v3.0

├─ Algorithms implemented:

│  ├─ Image Analysis Engine v3.2 ✓

│  ├─ Measurement Suite v2.1 ✓

│  └─ Decision Support v1.8 ✓

├─ Platform integration testing results

└─ Configuration parameters

Section 3: Device-Specific Risk Assessment

├─ References: Platform Risk Management File (algorithm hazards)

├─ Device-specific hazards:

│  ├─ Use errors (unique to this UI design)

│  ├─ Environmental factors (MRI suite EMI)

│  └─ Hardware failures (display, network)

└─ Combined risk analysis

Section 4: Verification & Validation

├─ Platform algorithm validation (by reference)

├─ Device integration verification

├─ Device-specific testing (hardware, UI, workflow)

└─ Human factors validation

Section 5: Clinical Evaluation (Device Context)

├─ Platform clinical evidence (by reference)

├─ Device-specific clinical study or literature

└─ Clinical context and user population

Section 6: Instructions for Use

├─ Device operation instructions

├─ AI functionality guidance (references platform transparency docs)

├─ Device-specific warnings and precautions

└─ Training requirements
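Section 2's platform references can be checked mechanically: every algorithm a device file claims to implement must exist in the referenced platform release. A minimal sketch with a hypothetical release manifest:

```python
# Hypothetical platform release manifest (single source of truth)
PLATFORM_MANIFEST = {
    "PTF-AI-DIAG-v3.0": {"Image Analysis Engine v3.2",
                         "Measurement Suite v2.1",
                         "Decision Support v1.8"},
}

def check_section2(platform_ref, implemented):
    """Return algorithms a device file claims that the referenced
    platform release does not contain (a traceability audit finding)."""
    available = PLATFORM_MANIFEST.get(platform_ref, set())
    return sorted(set(implemented) - available)

rw2_claims = ["Image Analysis Engine v3.2", "Measurement Suite v2.1",
              "Decision Support v1.8"]
print(check_section2("PTF-AI-DIAG-v3.0", rw2_claims))   # [] -> consistent
```

Running this check at document release time prevents the classic sampling-audit failure where a device file cites an algorithm version the platform never shipped.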

Part 6: Implementation Roadmap (MDR Focus)

6.1 The Transformation Journey

The transformation follows six phases:

Phase | Duration | Focus & Milestones | Key Deliverables
1: Assessment & Planning | Months 1-3 | Map current portfolio to generic groups; gap analysis against integrated architecture; establish integrated governance | Current state report; target architecture design; resource allocation
2: Foundation Building | Months 4-8 | Rewrite Quality Policy (all 4 regulations); integrate core procedures (risk, design control, data governance); implement ISO 42001 and ISO 27001 modules | Integrated Quality Policy; updated QMS Manual; ISO modules
3: Platform Documentation | Months 9-14 | Create master Platform Technical File (PTF); consolidate evidence; establish single AI Act + MDR documentation | Platform Technical File (complete); Platform Risk Management File; cross-regulation evidence crosswalk
4: Device Family Restructuring | Months 15-20 | Define all generic device groups; create minimal device-specific supplements (referencing PTF); create master traceability matrix | Generic device group definitions; device technical file supplements; master traceability matrix
5: Notified Body Engagement | Months 21-26 | NB consultation (architecture/sampling); submit PTF for platform assessment; submit representative devices for assessment (MDCG 2019-13 sampling) | Platform approval; representative device approvals; CE certificates for device families
6: Operational Excellence | Ongoing | Maintain platform version control; scale to new devices efficiently (leveraging platform); monitor regulatory landscape | Continuous improvement; accelerated time to market

In detail, the documentation architecture could look like this: 

Phase 1: Assessment & Planning

Milestone 1.1: Current State Analysis

├─ Inventory existing documentation

│  ├─ How many separate technical files?

│  ├─ How many devices share AI algorithms?

│  ├─ What platform components exist (undocumented)?

│  └─ Where are inconsistencies and gaps?

├─ Map current device portfolio to generic groups

│  ├─ Define grouping criteria

│  ├─ Identify representative devices

│  └─ Assess equivalence relationships

└─ Gap analysis against integrated architecture

   ├─ Missing platform documentation

   ├─ Incomplete AI Act compliance elements

   ├─ EHDS interoperability gaps

   └─ GDPR/privacy documentation needs

Milestone 1.2: Architecture Design

├─ Define platform boundaries

│  ├─ Which algorithms belong to platform?

│  ├─ What data governance is shared?

│  ├─ What risk management is common?

│  └─ What clinical evidence is reusable?

├─ Design three-tier structure

│  ├─ Platform documentation outline

│  ├─ Group documentation templates

│  └─ Device supplement templates

└─ Create evidence crosswalk

   ├─ Map each document to all applicable regulations

   ├─ Identify reuse opportunities

   └─ Define referencing conventions

Milestone 1.3: Governance & Roles

├─ Establish integrated governance structure

│  ├─ AI Governance Officer (ISO 42001 lead)

│  ├─ Data Protection Officer (GDPR lead)

│  ├─ EHDS Compliance Coordinator

│  └─ Cross-functional review board

├─ Define responsibilities

│  ├─ Who owns platform documentation?

│  ├─ Who manages generic group definitions?

│  ├─ Who maintains device-specific supplements?

│  └─ Who coordinates notified body interactions?

└─ Training plan

   ├─ Modular architecture concepts

   ├─ Documentation standards

   └─ Evidence referencing procedures

Deliverables:

✓ Current state assessment report

✓ Target architecture design document

✓ Implementation project plan

✓ Resource allocation and budget

Phase 2: Foundation Building 

Milestone 2.1: Integrated QMS Foundation

├─ Rewrite Quality Policy

│  ├─ Address all four regulations in single policy

│  ├─ Commit to platform-based approach

│  └─ Define integration principles

├─ Update QMS Manual

│  ├─ Document modular architecture

│  ├─ Define three-tier structure

│  ├─ Establish module ownership

│  └─ Create management review agenda covering all regulations

└─ Integrate core procedures

   ├─ Risk management (ISO 14971 + AI Act Art. 9)

   ├─ Design control (ISO 13485 + IEC 62304)

   ├─ Data governance (AI Act Art. 10 + GDPR)

   └─ Change control (MDCG 2020-3 + platform updates)

Milestone 2.2: ISO Standards Implementation

├─ ISO 42001 (AI Management System) module

│  ├─ AI governance framework

│  ├─ AI risk assessment procedures

│  ├─ Data management standards

│  └─ Transparency and human oversight requirements

├─ ISO 27001 (Information Security) module

│  ├─ Security governance structure

│  ├─ Risk assessment methodology

│  ├─ Technical and organizational measures

│  └─ Incident response procedures

└─ Integration with ISO 13485

   ├─ Common management review

   ├─ Unified audit program

   ├─ Integrated corrective action system

   └─ Single document control system

Deliverables:

✓ Integrated Quality Policy

✓ Updated QMS Manual

✓ Core integrated procedures

✓ ISO 42001 module documentation

✓ ISO 27001 module documentation

Phase 3: Platform Documentation

Milestone 3.1: Platform Technical File

├─ Create master platform documentation

│  ├─ AI system architecture and design

│  ├─ Algorithm library technical specifications

│  ├─ Training data governance documentation

│  ├─ Platform validation evidence

│  └─ Platform risk management file

├─ Consolidate existing evidence

│  ├─ Collect validation studies from device files

│  ├─ Synthesize into platform validation report

│  ├─ Identify and fill gaps

│  └─ Create single source of truth

└─ Establish version control

   ├─ Platform version numbering scheme

   ├─ Change history documentation

   └─ Device-to-platform mapping table
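The device-to-platform mapping table makes update propagation a lookup rather than a document hunt. A minimal sketch with hypothetical entries:

```python
# Hypothetical device-to-platform mapping table (Milestone 3.1)
DEVICE_PLATFORM_MAP = {
    "RW-1 BasicRad": "PTF-AI-DIAG-v3.0",
    "RW-2 ProRad": "PTF-AI-DIAG-v3.0",
    "US-1 HandheldEcho": "PTF-AI-DIAG-v2.4",
}

def impacted_devices(updated_platform):
    """List devices needing an impact assessment when a platform
    version is updated -- one lookup instead of N file searches."""
    return sorted(d for d, p in DEVICE_PLATFORM_MAP.items()
                  if p == updated_platform)

print(impacted_devices("PTF-AI-DIAG-v3.0"))
```

When a platform release changes, the change-control procedure iterates over exactly this list, which is what keeps the "update platform, assess device impact" workflow bounded.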

Milestone 3.2: Cross-Regulation Integration

├─ Single AI Act + MDR technical documentation

│  ├─ Integrated structure (not separate files)

│  ├─ Cross-referenced sections

│  └─ Evidence mapping tables

├─ GDPR compliance integration

│  ├─ Platform-level DPIA

│  ├─ Privacy by design documentation

│  └─ Legal basis assessments

└─ EHDS interoperability evidence

   ├─ Standards compliance documentation

   ├─ Interoperability testing results

   └─ Data exchange specifications

Deliverables:

✓ Platform Technical File (complete)

✓ Platform Risk Management File

✓ Master Training Data Documentation

✓ Platform Validation Report

✓ Cross-regulation evidence crosswalk

Phase 4: Device Family Restructuring 

Milestone 4.1: Generic Device Group Definitions

For each group:

├─ Define inclusion criteria

├─ List all devices in group

├─ Select representative device with justification

├─ Create equivalence demonstration matrix

├─ Document common GSPR compliance

└─ Define group-level clinical strategy

Milestone 4.2: Device-Specific Supplements

For each device:

├─ Create minimal device technical file

│  ├─ References platform documentation

│  ├─ Documents unique hardware/integration

│  ├─ Addresses device-specific risks

│  └─ Provides device-specific clinical context

├─ Update Instructions for Use

│  ├─ Device operation guidance

│  ├─ References platform AI transparency docs

│  ├─ Device-specific warnings

│  └─ Training requirements

└─ Link to group documentation

   ├─ Confirm group membership

   ├─ Document any group deviations

   └─ Establish equivalence to representative

Milestone 4.3: Evidence Consolidation

├─ Eliminate duplicate documentation

├─ Establish referencing standards

├─ Create master traceability matrix

└─ Archive legacy documentation

Deliverables:

✓ Generic device group definitions (all groups)

✓ Device technical file supplements (all devices)

✓ Updated Instructions for Use (all devices)

✓ Master traceability matrix

✓ Archived legacy documentation
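The master traceability matrix can be audited automatically for completeness. A minimal sketch (hypothetical entries) checking that every device belongs to a valid group and that every group's representative is one of its own members:

```python
# Hypothetical master traceability matrix rows: device -> group assignment
MATRIX = {
    "RW-1": {"group": "GDG-RAD-WS-001"},
    "RW-2": {"group": "GDG-RAD-WS-001"},
    "US-1": {"group": None},            # orphan: not yet assigned
}
GROUPS = {"GDG-RAD-WS-001": {"representative": "RW-2"}}

def audit_matrix(matrix, groups):
    """Completeness checks: every device has a valid group, and every
    group's representative device is itself a member of that group."""
    findings = []
    for dev, row in matrix.items():
        if row["group"] not in groups:
            findings.append(f"{dev}: no valid group assignment")
    for gid, g in groups.items():
        members = {d for d, r in matrix.items() if r["group"] == gid}
        if g["representative"] not in members:
            findings.append(f"{gid}: representative not in group")
    return findings

print(audit_matrix(MATRIX, GROUPS))   # flags the orphaned device
```

Running this as part of document control in Phase 4 catches orphaned devices and invalid representative selections before the notified body does.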

Phase 5: Notified Body Engagement

Milestone 5.1: Pre-submission Strategy

├─ Schedule notified body consultation

├─ Prepare architecture presentation

│  ├─ Explain platform-based approach

│  ├─ Show three-tier documentation structure

│  ├─ Demonstrate compliance with MDCG 2025-6

│  └─ Propose sampling strategy

└─ Prepare demonstration materials

   ├─ Sample platform-to-device traceability

   ├─ Evidence reuse examples

   └─ Change control process illustration

Milestone 5.2: Platform Assessment

├─ Submit platform technical documentation for review

├─ NB assesses platform as foundation for device family

├─ Address any findings or questions

└─ Achieve platform approval

Milestone 5.3: Representative Device Assessment

├─ Submit representative devices for each group

├─ NB assesses integration of platform into devices

├─ Demonstrate equivalence of non-assessed devices

└─ Achieve device approvals

Milestone 5.4: Ongoing Surveillance

├─ Annual surveillance audits

│  ├─ Platform-level audit (comprehensive)

│  ├─ Group-level sampling (select devices)

│  └─ Change control review

└─ Change notification procedures

   ├─ Platform update evaluation

   ├─ Device impact assessment

   └─ Notified body communication

Deliverables:

✓ Notified body strategy document

✓ Platform approval

✓ Representative device approvals

✓ CE certificates for device families

✓ Surveillance audit schedule

Phase 6: Operational Excellence (Ongoing)

Continuous Improvement Activities:

├─ Monitor regulatory landscape

│  ├─ MDCG guidance updates

│  ├─ AI Act implementing regulations

│  ├─ EHDS technical standards evolution

│  └─ Harmonized standards publication

├─ Maintain platform documentation

│  ├─ Regular review and updates

│  ├─ Version control discipline

│  ├─ Change control rigor

│  └─ Continuous validation

├─ Scale to new devices

│  ├─ Assess fit to existing groups

│  ├─ Leverage platform documentation

│  ├─ Minimize device-specific work

│  └─ Accelerate time to market

└─ Share best practices

   ├─ Internal training programs

   ├─ Cross-functional collaboration

   └─ Industry participation

6.2 Resource Requirements

Core Implementation Team: Program Manager, Regulatory Affairs Lead, Quality/RA Documentation Specialists (3 FTE), AI Governance Lead, Data Protection Officer, Information Security Manager, EHDS Compliance Specialist.

Part 7: The Business Case – Why This Matters

1. Strategic Value Proposition

| Manufacturer Type | Challenge | Value | Impact |
|---|---|---|---|
| Small–Medium (1–10 devices) | Limited resources, stretched teams, can't afford duplication | Reduces documentation burden by 40–60% | Survival – maintain compliance without unsustainable headcount |
| Large (10–50+ devices) | Portfolio complexity, change management nightmare, scaling issues | Enables portfolio growth without linear cost increase | Competitive advantage – faster innovation cycles |
| Startups (pre-market) | Building from scratch, establishing first QMS, limited expertise | Integrated architecture avoids technical debt | Right-first-time market entry |

2. Risk Mitigation

  • Audit Failure Risk: The modular approach reduces audit-failure risk compared with the traditional siloed approach, because notified-body sampling encounters consistent, centrally maintained documentation rather than diverging per-device files.
  • Change Control Nightmare Prevention: Updating a shared algorithm across eight devices is far faster and cheaper with the modular approach, since the change is assessed once at the platform level rather than eight times in parallel.

3. Competitive Advantage

  • Time to Market for New Devices: New devices reuse approved platform documentation, shortening preparation for conformity assessment.
  • Scalability: The modular approach supports portfolio growth in significantly shorter timeframes.

4. Organizational Benefits

  • Knowledge Management: Knowledge is centralized and reusable, developing cross-device expertise.
  • Quality Culture: Compliance is built into the innovation process, so the reusable platform accelerates rather than constrains innovation.

Part 8: Critical Success Factors

8.1 Critical Success Factors: Organizational Alignment

| Critical Factor | Requirement Detail | Expected Outcome |
|---|---|---|
| Executive Sponsorship | C-level commitment to transformation; resource allocation authority; strategic priority setting; change management leadership | Secures the necessary resource allocation authority and drives strategic priority setting for the project. |
| Cross-Functional Buy-In | R&D, Quality, Regulatory, Clinical, Legal alignment; shared understanding of value proposition; willingness to change existing processes; collaborative problem-solving culture | Fosters a shared understanding of the value proposition and a collaborative problem-solving culture. |
| Long-Term Perspective | Accept 12–18 month investment phase; focus on 3–5 year ROI; resist pressure to revert to old approaches; measure success holistically | Ensures commitment to realizing the 3–5 year ROI and measuring success holistically, rather than focusing on short-term costs. |

8.2 Critical Capabilities for the Integrated Compliance Team

These capabilities are vital for transforming the compliance process from a bureaucratic task into a strategic, value-driven function.

| Critical Capability | Description | Strategic Impact |
|---|---|---|
| Documentation Quality | Clear, concise, traceable writing; consistent terminology across regulations; robust version control; professional presentation | Enables efficient notified body review and rapid device ↔ platform traceability. |
| Systems Thinking | Understanding of platform architecture concepts; ability to design for reuse; recognition of interdependencies; modular design principles | Essential for designing the modular QMS and maximizing the "Create Once, Use Everywhere" principle. |
| Regulatory Intelligence | Deep understanding of all four regulations; ability to identify overlaps and synergies; proactive monitoring of guidance updates; strategic interpretation of requirements | Ensures accurate mapping of overlaps and synergies so that a single piece of evidence satisfies multiple requirements. |

8.3 Operational Discipline: Maintaining the Modular Architecture

Operational discipline is the final, ongoing phase that ensures the continued effectiveness and integrity of the integrated three-tier documentation structure.

| Operational Discipline Area | Key Process Requirements | Outcome and Integrity Check |
|---|---|---|
| Change Control Rigor | Every platform change assessed for device impact; automated impact assessment where possible; clear decision criteria (MDCG 2020-3); documented rationale for all decisions | Maintains consistency across the entire device portfolio, critical for the notified body sampling strategy. |
| Document Control Excellence | Master document hierarchy maintained; reference integrity preserved; version synchronization across tiers; archive and retrieval procedures | Guarantees traceability from individual device supplements back to the core platform evidence. |
| Continuous Improvement | Regular architecture reviews; lessons learned documentation; efficiency metric tracking; innovation in compliance approaches | Drives efficiency and keeps the compliance system robust and adaptable to new regulations (e.g., updates to the AI Act). |
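To make the "automated impact assessment" idea concrete, here is a minimal sketch in Python. The module names, device names, and the significance criteria are hypothetical illustrations: the set of "significant" change types is a simplified stand-in for the MDCG 2020-3 decision logic, not the official flowchart.

```python
from dataclasses import dataclass, field

# Hypothetical mapping: which devices are built on which platform modules.
PLATFORM_USAGE = {
    "segmentation-engine": ["DeviceA", "DeviceB", "DeviceC"],
    "reporting-module": ["DeviceA", "DeviceD"],
}

# Simplified stand-in for MDCG 2020-3-style criteria (illustrative only).
SIGNIFICANT_CHANGE_TYPES = {"new_intended_use", "algorithm_retrained", "new_risk_introduced"}


@dataclass
class ChangeAssessment:
    module: str
    change_type: str
    impacted_devices: list = field(default_factory=list)
    significant: bool = False


def assess_platform_change(module: str, change_type: str) -> ChangeAssessment:
    """Flag every device that shares the changed module and record whether
    the change may qualify as significant (and so need NB notification)."""
    return ChangeAssessment(
        module=module,
        change_type=change_type,
        impacted_devices=sorted(PLATFORM_USAGE.get(module, [])),
        significant=change_type in SIGNIFICANT_CHANGE_TYPES,
    )


result = assess_platform_change("segmentation-engine", "algorithm_retrained")
print(result.impacted_devices)  # every device sharing the module is flagged
print(result.significant)      # True: retraining trips the (illustrative) criteria
```

The point of the sketch is the discipline it encodes: one platform change fans out deterministically to every affected device, and the significance decision is recorded alongside the impact list as the documented rationale.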

8.4 Stakeholder Management: Sustaining the Ecosystem

Effective management of key relationships is vital for navigating the complex regulatory landscape and ensuring the architecture remains relevant and accepted.

| Stakeholder Group | Key Relationship Requirements | Strategic Outcome |
|---|---|---|
| Notified Body Partnership | Early engagement and transparency; regular communication; proactive problem-solving; mutual understanding of approach | Builds trust and a mutual understanding of the approach, streamlining the assessment process. |
| Internal Alignment | Regular cross-functional meetings; shared metrics and goals; conflict resolution mechanisms; celebration of milestones | Ensures shared metrics and goals across R&D, Quality, Regulatory, and Legal, preventing internal fragmentation. |
| Industry Collaboration | Participation in trade associations; sharing of best practices (non-competitive); contribution to standards development; thought leadership | Supports external benchmarking and ensures the architecture evolves with industry best practices and emerging standards (e.g., in AI and EHDS). |

Conclusion: The Imperative for Transformation

The Bottom Line

The convergence of EU MDR/IVDR, the AI Act, GDPR, and EHDS creates a regulatory environment that cannot be navigated successfully through traditional, siloed approaches. When combined with the reality of device families and platform-based products, the complexity becomes unmanageable without systematic integration.

The modular compliance architecture is not optional—it is essential for:

  1. Regulatory Survival
    • Meeting the integrated requirements of MDCG 2025-6
    • Demonstrating platform consistency for MDCG 2019-13 sampling
    • Avoiding audit failures due to inconsistent documentation
    • Maintaining certificates across device families
  2. Economic Viability
    • Reducing documentation costs 
    • Accelerating time-to-market 
    • Enabling portfolio scaling without linear cost increases
    • Creating competitive advantage through efficiency
  3. Innovation Enablement
    • Removing compliance as a bottleneck to innovation
    • Enabling rapid deployment of platform improvements across portfolio
    • Supporting agile development methodologies
    • Fostering a culture of systematic excellence
  4. Strategic Positioning
    • Building organizational capabilities for the AI era
    • Establishing industry leadership in compliance approaches
    • Creating barriers to entry for competitors
    • Positioning for future regulatory evolution

The Path Forward

For manufacturers at any stage:

Starting Fresh: Build integrated architecture from day one

  • Avoid technical debt
  • Establish best practices early
  • Scale efficiently from the beginning

Transforming Existing Systems: Commit to 18-24 month transformation

  • Accept initial investment
  • Follow disciplined implementation roadmap
  • Measure and communicate value

Scaling Operations: Leverage modular architecture for growth

  • Add devices efficiently
  • Maintain consistency automatically
  • Innovate without compliance barriers

The Future State

Organizations that successfully implement modular compliance architectures will:

✓ Navigate regulatory complexity with confidence
✓ Launch new AI-enabled devices in months, not years
✓ Maintain consistent quality across growing portfolios
✓ Respond to regulatory changes efficiently
✓ Attract talent seeking systematic excellence
✓ Lead their markets through innovation and compliance mastery

The regulatory landscape for AI-enabled medical devices is complex and demanding. The solution is not to work harder within broken systems, but to transform those systems into integrated, modular architectures that turn regulatory requirements from obstacles into strategic advantages.

The question is not whether to adopt modular architecture, but how quickly you can transform your organization to compete in the new regulatory reality.

Appendix I: Quick Reference Resources

Key Regulatory Documents:

  • EU MDR 2017/745
  • EU AI Act 2024/1689
  • MDCG 2025-6: Interplay between MDR/IVDR and AI Act
  • MDCG 2019-13: Sampling rules for conformity assessment
  • MDCG 2020-3: Significant change determination

Standards Framework:

  • ISO 13485:2016 (Medical device QMS)
  • ISO/IEC 42001:2023 (AI management system)
  • ISO/IEC 27001:2022 (Information security)
  • ISO 14971:2019 (Risk management)
  • IEC 62304:2006+A1:2015 (Medical device software lifecycle)
  • IEC 62366-1:2015 (Usability engineering)

EHDS Technical Standards:

  • EEHRxF (European EHR Exchange Format)
  • HL7 FHIR (Fast Healthcare Interoperability Resources)
  • SNOMED CT (Clinical terminology)
  • LOINC (Laboratory observations)
  • ICD (Disease classification)

Appendix II: Clause-by-Clause Regulatory/Evidence Mapping Table

| Compliance Evidence / Section | MDR/IVDR | AI Act | GDPR | EHDS | ISO / Technical Standard |
|---|---|---|---|---|---|
| Platform Algorithm Technical File | Annex II, Sect. 1.1 (software); Annex II, Sect. 6 (V&V) | Art. 11 (system description); Art. 15 (accuracy, robustness) | Art. 25 (data protection by design) | Art. 30 (software specs); Art. 77 (data quality label) | ISO 13485:4.2; IEC 62304 |
| Training Data Quality Report | Annex XIV, Part A (clinical data) | Art. 10 (data governance); Art. 51d (data accuracy) | Art. 5 (data accuracy); Art. 25 | Art. 77 (data quality label) | ISO 42001:8.2; ISO 14971:4–7 |
| Algorithm Validation Study | Annex II, Sect. 6 (V&V) | Art. 15 (accuracy/robustness); Art. 78 (performance metrics) | N/A | Art. 78 (performance metrics) | ISO 13485:7; ISO 14971:6 |
| Risk Management File | Annex II.5 (risk analysis); PMS (Art. 83, 87, Annex III) | Art. 9 (AI risk mgmt) | Art. 35 (DPIA); Art. 40 (security) | Art. 40 (security measures); Art. 41 (data monitoring) | ISO 14971; ISO 27001; ISO 42001:8.5 |
| Data Governance System | Annex XV (GSPR data/proc. validation) | Art. 10 (data governance) | Art. 5–9; Art. 25 (DPbD); Art. 35 (DPIA) | Art. 77 (data quality); technical annex | ISO 42001:8.2; ISO 27001:9.2 |
| Device Integration Specification | Annex II.1.1 (integration details) | Art. 11 (system integration) | Art. 25 (security) | Art. 27 (interoperability) | EEHRxF; HL7 FHIR; SNOMED CT |
| Instructions for Use (IFU) | Annex I, Ch. III (user info) | Art. 13 (transparency); Art. 12 (clarity of info) | Art. 30 (user guidance) | Art. 12, 30 (user guidance) | IEC 62366; ISO 13485:4.2 |
| Platform Change Control | Annex IX (change mgmt.); PMS Art. 87 | MDCG 2020-3; Art. 15, 23 | Art. 32 (ongoing security) | Art. 41 (data monitoring/updating) | ISO 13485:7.2.3; ISO 14971:7; ISO 42001:11.3 |
| Post-Market Surveillance Plan | Art. 83, 87 | Art. 61 (post-market monitoring); Art. 15 | Art. 32 (ongoing security) | Art. 41 (data monitoring) | ISO 13485:8.2; ISO 14971:8 |
| Master Quality Management System (QMS) | Art. 10(9); Annex IX–XI; Annex XI.B | Art. 9, 10 (AI mgmt.); Art. 12 (monitoring) | Art. 25, 35 | Security/quality in EHDS; Art. 30, 40 | ISO 13485 (whole); ISO 42001:8, 11.3 |
| Data Protection / Privacy Documents | Throughout | Art. 10 (data minimization) | Art. 5–9; Art. 25 (DPbD); Art. 35 (DPIA) | Art. 30 (data protection); Art. 77 | ISO 27001; ISO 42001:8.6 |
| Security / Interoperability Evidence | PMS; Annex II, III | Art. 15 (cybersecurity) | Art. 5(f), 32, 33 | EEHRxF; FHIR; EHDS Art. 40 | ISO 27001; HL7 FHIR; EEHRxF |
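A mapping table like the one above is most useful when it is also maintained as machine-readable data, so that a single query answers "which evidence items claim coverage of this article?" during an audit or gap analysis. The following Python sketch uses an illustrative two-row subset of the table; the dictionary keys and article lists simply transcribe those rows.

```python
# Illustrative subset of the clause mapping table, keyed by evidence item.
EVIDENCE_MAP = {
    "Risk Management File": {
        "MDR/IVDR": ["Annex II.5", "Art. 83", "Art. 87"],
        "AI Act": ["Art. 9"],
        "GDPR": ["Art. 35"],
        "EHDS": ["Art. 40", "Art. 41"],
    },
    "Platform Change Control": {
        "MDR/IVDR": ["Annex IX", "Art. 87"],
        "AI Act": ["Art. 15", "Art. 23"],
        "GDPR": ["Art. 32"],
        "EHDS": ["Art. 41"],
    },
}


def evidence_for(regulation: str, article: str) -> list[str]:
    """Return every evidence item that claims coverage of the given article."""
    return sorted(
        name
        for name, refs in EVIDENCE_MAP.items()
        if article in refs.get(regulation, [])
    )


print(evidence_for("EHDS", "Art. 41"))  # both items cover EHDS data monitoring
print(evidence_for("AI Act", "Art. 9"))
```

Kept under version control alongside the documents themselves, such a structure also makes the reverse check cheap: any article that returns an empty list is an unmapped requirement and a candidate gap.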
