PROCESS VALIDATION

Test Method Validation for Visual Inspection of CDI Components in Incoming Quality Assurance

Project Metrics

Test Samples: 42
Operator Agreement: 98.46%
PIP Features Covered: 118
01

Problem Definition

A cardiovascular device manufacturer required validated evidence that Incoming Quality Assurance (IQA) operators could reliably distinguish acceptable from unacceptable CDI (Cardiovascular Device Incorporated) components during incoming inspection. The challenge was to validate visual inspection methods across 118 Part Inspection Plan (PIP) features grouped into 15 distinct inspection categories, including adhesive placement, color verification, wire inspection, barcode labels, label printing, presence/absence checks, and surface finish. Critical concerns included ensuring operator repeatability and reproducibility across varying component types, preventing acceptance of defective incoming material, and maintaining compliance with medical device quality requirements.

02

System Constraints

ISO 13485 compliance for medical device quality management
FDA QSR requirements for incoming inspection
74 Part Inspection Plans requiring coverage
118 PIP features across 15 inspection categories
Attribute data (Pass/Fail) requiring statistical validation
Manufacturing Review Detection (MRD) risk ranking system (1-3 scale)
Process Validation Assessment Matrix (PVAM) requirements
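The MRD constraint above implies a worst-case selection step: within each inspection category, the PIP feature with the highest MRD rank drives sample selection. A minimal sketch in Python, with hypothetical feature records (the PIP numbers, category labels, and ranks are illustrative, not taken from the actual plans):

```python
# Hypothetical PIP feature records: each carries its inspection category
# and an MRD risk rank on the document's 1-3 scale (3 = highest risk).
features = [
    {"pip": "PIP-001", "category": "adhesive placement", "mrd": 2},
    {"pip": "PIP-007", "category": "adhesive placement", "mrd": 3},
    {"pip": "PIP-019", "category": "color verification", "mrd": 1},
    {"pip": "PIP-033", "category": "wire inspection",    "mrd": 3},
]

def worst_case_per_category(features):
    """Pick the highest-MRD feature in each category as its worst case."""
    worst = {}
    for f in features:
        current = worst.get(f["category"])
        if current is None or f["mrd"] > current["mrd"]:
            worst[f["category"]] = f
    return worst
```

Selecting at least one PIP per category then reduces to taking the values of this mapping.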
03

Engineering Decisions

Categorized 118 PIP features into 15 homogeneous inspection categories

Created 42 test samples (29 photos + 13 physical parts): 25 'True-Pass', 17 'True-Fail'

Implemented MRD ranking system to identify worst-case inspection scenarios

Selected minimum one PIP per category ensuring representative coverage

Used randomized presentation order to minimize learning effects and bias

Conducted testing over multiple days to simulate real-world inspection variability

Replaced specific samples between Rev A and Rev B to address operator recognition bias
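The randomized-presentation decision above can be sketched as a reproducible shuffle per operator and trial; the sample labels and seeding scheme here are hypothetical, chosen only to illustrate the idea:

```python
import random

# Hypothetical blinded sample identifiers: 25 True-Pass and 17 True-Fail
# (42 total, matching the sample set described above).
SAMPLES = [f"TP-{i:02d}" for i in range(1, 26)] + [f"TF-{i:02d}" for i in range(1, 18)]

def presentation_order(operator: str, trial: int) -> list:
    """Return a distinct but reproducible random order per operator/trial."""
    rng = random.Random(f"{operator}-{trial}")  # seed scheme is illustrative
    order = SAMPLES.copy()
    rng.shuffle(order)
    return order
```

Each operator/trial combination sees the same 42 samples in a different order, which limits learning effects while keeping the run auditable.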

04

Validation Strategy

  • Rev A execution unsuccessful: both associates accepted a 'True-Fail' sample as Pass, requiring a protocol revision
  • Rev B execution successful: all 'True-Pass' units correctly identified, except one sample misjudged by Associate 1 on Trial 1
  • All 'True Fail' units (17 samples × 2 operators × 2 trials = 68 evaluations) correctly identified as Fail
  • Inter-associate consistency AC1: 0.9683 (p < 0.001)
  • Intra-associate consistency: Associate 1 AC1 = 0.9692, Associate 2 AC1 = 1.00 (both p < 0.001)
  • Overall consistency AC1: 0.9846 (p < 0.001), exceeding the acceptance threshold of 0.7
  • Stability demonstrated across multi-day testing interval
  • System suitability confirmed through Attribute Gage R&R
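The AC1 figures above are Gwet's first-order agreement coefficient. For two raters and binary Pass/Fail data it can be computed as below; this is a generic sketch of the statistic, not the study's actual worksheet:

```python
def gwet_ac1(rater_a, rater_b):
    """Gwet's AC1 for two raters over binary Pass/Fail attribute data."""
    n = len(rater_a)
    # Observed agreement: fraction of samples both raters judged the same.
    pa = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement for two categories: pe = 2*pi*(1 - pi), where pi is
    # the mean proportion of 'Pass' calls across both raters.
    pi = (rater_a.count("Pass") + rater_b.count("Pass")) / (2 * n)
    pe = 2 * pi * (1 - pi)
    return (pa - pe) / (1 - pe)
```

With perfect agreement on a mixed Pass/Fail set, AC1 is exactly 1. Unlike Cohen's kappa, AC1 stays stable when category prevalence is heavily skewed, which is one reason it is often preferred for attribute agreement studies like this one.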
05

Risk & Mitigation

Initial validation (Rev A) failed due to operator error on defect recognition; Deviation 856985-02 was documented and the protocol revised to address the failure mode
Sample selection bias mitigated by replacing parts between revisions
MRD ranking system identified the highest-risk inspection points requiring focus
Zero false accepts in the final validation (Rev B): no defective parts were passed
One false reject was documented but met the statistical acceptance criteria
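The false-accept / false-reject distinction above can be checked mechanically against the known truth labels of the test set. A minimal sketch (the sample data in the test is hypothetical):

```python
def confusion_counts(truth, judged):
    """Count false accepts (True-Fail judged Pass) and
    false rejects (True-Pass judged Fail) for one trial."""
    false_accepts = sum(t == "Fail" and j == "Pass" for t, j in zip(truth, judged))
    false_rejects = sum(t == "Pass" and j == "Fail" for t, j in zip(truth, judged))
    return false_accepts, false_rejects
```

In an incoming-inspection context the two error types carry very different risk: a false reject costs a re-inspection, while a false accept lets a defective component into production, which is why the acceptance criteria tolerate the former but not the latter.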
06

Final Outcome

Successfully validated visual inspection methods for incoming CDI components across all 15 inspection categories with near-perfect operator agreement (AC1 = 0.9846). Demonstrated that trained IQA operators can reliably detect component defects including adhesive placement errors, color variations, wire defects, barcode/label issues, and surface finish problems. The validation covered 74 Part Inspection Plans and 118 PIP features, establishing a robust incoming inspection process with documented evidence of operator capability and process reliability for regulatory compliance. The iterative validation approach (Rev A failure leading to Rev B success) demonstrated continuous improvement and rigorous quality standards.
