Why Medtech Fails Clinicians: Lessons from Aviation’s Human Factors Engineering


Dennis Lenard

Mar 2026

Medical devices often fail due to poor interface design, causing alarm fatigue and errors. Aviation’s proven human factors engineering offers a roadmap to safer, more effective medtech interfaces.

Every year, hundreds of patients die from preventable medical device interface failures. The tools to stop this have existed in aviation for four decades. Here is why medtech hasn't used them, and what it would look like if it did.

This article draws on Creative Navy's project work in medtech UX, spanning practice management software, surgical equipment, ventilators, blood pumps, infusion systems, and patient monitoring devices, including Class II and Class III regulated products. Our work in this sector covers clinical environments including the ICU and operating theatre, designing for surgeons, nurses, and biomedical engineers. Dennis Lenard, who leads this work at Creative Navy, is the author of User Interface Design For Medical Devices And Software, a practitioner reference on the subject. Our approach integrates IEC 62366 usability engineering requirements and FDA Human Factors guidance as structural inputs to the design process, not post-hoc compliance activities.

Alarm Fatigue: The Stark Numbers

  • 566 alarm-related patient deaths recorded in the FDA's MAUDE database between 2005 and 2010
  • 85–99% of hospital alarms require no clinical intervention, yet they fire anyway
  • 80 deaths from alarm-related events in the Joint Commission's Sentinel Event database in just 42 months of voluntary reporting
  • 56,000 adverse event reports for infusion pumps between 2005 and 2009, resulting in 710 deaths
  • 47% reduction in surgical death rates from implementing a structured checklist protocol (Haynes et al., NEJM, 2009)
  • 12-fold reduction in commercial aviation fatal accidents per million flights between 1970 and 2019
  • 60% of infusions are associated with errors in clinical practice, according to peer-reviewed research

2 AM in the ICU: A Real Shift

A nurse is managing five patients. Four are connected to devices: a ventilator, two infusion pumps, a cardiac monitor, a pulse oximeter. Three different manufacturers. Each device has its own alarm logic, its own interface language, its own threshold defaults.

Over the past hour, fourteen alarms have sounded. Thirteen required no action.

The Joint Commission estimates that between 85% and 99% of hospital alarms are non-actionable. They demand attention without requiring it. Over a twelve-hour shift, that pattern corrodes the very vigilance the alarms were designed to trigger.

The fourteenth alarm fires. She silences it.

This time, it mattered.

In 2012, a patient at Waikato Hospital in New Zealand died because a cardiac monitor alarm had been turned down to an inaudible level. The coroner's inquiry found that alarm fatigue among nursing staff had made volume reduction a routine coping behaviour. The device performed exactly as its engineers specified. The patient still died.

Was this a human failure or a design failure?

The answer is design failure. Another industry proved this conclusively, more than four decades ago, and built a documented, replicable methodology to prevent it.

Alarm Fatigue: Scale & Human Cost

Between January 2005 and June 2010, the FDA's MAUDE database recorded 566 alarm-related patient deaths. The Joint Commission's Sentinel Event database recorded 98 alarm-related adverse events between January 2009 and June 2012. 80 resulted in death. 13 in permanent loss of function.

From 2005 to 2009, the FDA received approximately 56,000 adverse event reports related to infusion pump use. 710 deaths. 87 recalls, 14 classified as Class I.

The FDA lists "inadequate user interface design (human factors problems)" as a device defect in its own right, on the same footing as mechanical and software failure modes.

The UCSF Medical Center infusion pump case: a physician accidentally ordered a 38-fold overdose because the ordering screen's default dosing unit was easy to miss. An alert fired. It was cleared with a click. The patient suffered a seizure and respiratory arrest.
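
Cases like this are why "smart pump" drug libraries distinguish soft limits (override with deliberate re-confirmation) from hard limits (no bedside override at all). The sketch below is illustrative only: the drug name, limit values, and API are assumptions, not any vendor's implementation.

```python
# Hypothetical sketch of a hard/soft dose-limit check. All names and
# limit values are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class DoseLimits:
    soft_max: float   # mg/kg: warn, allow override only with re-confirmation
    hard_max: float   # mg/kg: block entirely; no bedside override

# Illustrative drug library with per-kilogram limits
DRUG_LIBRARY = {
    "antibiotic_x": DoseLimits(soft_max=20.0, hard_max=40.0),
}

def check_dose(drug: str, dose_mg: float, weight_kg: float) -> str:
    """Return 'ok', 'soft_limit', or 'hard_limit' for an ordered dose."""
    limits = DRUG_LIBRARY[drug]
    per_kg = dose_mg / weight_kg
    if per_kg > limits.hard_max:
        return "hard_limit"   # order blocked: cannot be cleared with a click
    if per_kg > limits.soft_max:
        return "soft_limit"   # proceeds only after explicit re-entry of the dose
    return "ok"
```

The design point is not the arithmetic but the escalation: a 38-fold overdose should land in the category that no single click can clear.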

In our work across medical device and clinical software projects, organisations invest heavily in hardware and compliance, but the interface arrives late, under-resourced, treated as cosmetic rather than safety-critical.

Aviation Human Factors: What Worked

Commercial aviation fatal accidents per million flights fell from 6.35 in 1970 to 0.51 in 2019, a 12-fold reduction.

The Crash That Created Crew Resource Management

United Airlines Flight 173 (1978) ran out of fuel over Portland while the crew troubleshot a landing gear problem; junior crew members knew fuel was critical but did not challenge the captain effectively. The NTSB investigation led to mandatory Cockpit Resource Management (CRM): a designed communication architecture with verification redundancy.

Studies at the time found that 70–80% of aviation accidents involved human error. Aviation chose to treat these as system design problems, not personal failures.

Aviation Principles for Med Device UX

Principle 1: Standardisation as Cognitive Load Reduction

Cockpit standardisation reduces cognitive load under stress: a pilot moving between aircraft finds controls and indicators where training says they will be. Medtech has no equivalent; interface layout varies from device to device, so each machine demands its own learned mental model.

Principle 2: Alert Hierarchies That Are Experienced, Not Just Documented

Aviation alerts are perceptually distinct: a master warning looks, sounds, and feels different from an advisory. IEC 60601-1-8 attempted the same for medical alarms, but its 2020 amendment acknowledged that the original alarm sounds were too similar to discriminate in real clinical noise.
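
The structure the standard is reaching for can be sketched as a small priority model: each priority maps to a perceptually distinct signal, not a louder version of the same sound. The tone parameters and mapping function below are illustrative assumptions in the spirit of IEC 60601-1-8, not values copied from the standard.

```python
# Minimal sketch of an IEC 60601-1-8-style alarm priority model.
# Signal parameters are illustrative assumptions, not the standard's values.

from enum import Enum

class Priority(Enum):
    HIGH = 3     # immediate response required
    MEDIUM = 2   # prompt response required
    LOW = 1      # awareness required

# Each priority gets a distinct audible pattern and visual treatment.
SIGNAL_PROFILE = {
    Priority.HIGH:   {"pulses_per_burst": 10, "repeat_s": 2.5, "visual": "red, flashing"},
    Priority.MEDIUM: {"pulses_per_burst": 3,  "repeat_s": 7.5, "visual": "yellow, flashing"},
    Priority.LOW:    {"pulses_per_burst": 2,  "repeat_s": None, "visual": "yellow, steady"},
}

def classify(onset_of_harm: str) -> Priority:
    """Map how quickly harm could occur to an alarm priority:
    immediate -> high, prompt -> medium, delayed -> low."""
    return {"immediate": Priority.HIGH,
            "prompt": Priority.MEDIUM,
            "delayed": Priority.LOW}[onset_of_harm]
```

The 2020 amendment's lesson applies to the SIGNAL_PROFILE table: the profiles must be discriminable in a noisy ward, not merely different on paper.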

Principle 3: Confirmation Workflows That Verify, Not Just Acknowledge

Aviation checklists verify state, not memory: each item is read, checked against the instrument, and confirmed aloud. The WHO Surgical Safety Checklist reduced surgical deaths by 47% where teams genuinely engaged with it, not where it was ticked as theatre (Haynes et al., NEJM, 2009).
We examine how this plays out in depth, and why 98% compliance can still produce zero safety benefit, in Why the WHO Surgical Safety Checklist Fails at Scale.
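
The difference between acknowledging and verifying can be made concrete. In this hedged sketch (all names are illustrative assumptions), the acknowledge path clears an alarm with one click regardless of state, while the verify path passes only when the user's read-back matches the device's live setting:

```python
# "Verify, not acknowledge": a confirmation step that checks the user's
# read-back against actual device state. Names are illustrative assumptions.

def acknowledge(_alarm: str) -> bool:
    # Anti-pattern: one click clears anything, whether or not anyone looked.
    return True

def verify(device_state: dict, field: str, user_readback: str) -> bool:
    """Pass only if the read-back matches the live device state."""
    return str(device_state[field]) == user_readback.strip()

pump_state = {"rate_ml_h": 125}

# The acknowledge path "succeeds" unconditionally:
assert acknowledge("rate alarm") is True

# The verify path forces a real check against what the pump is set to:
assert verify(pump_state, "rate_ml_h", "125") is True
assert verify(pump_state, "rate_ml_h", "12.5") is False
```

A checklist item built this way cannot be ticked as theatre: the step fails unless the state it claims to confirm actually exists.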

Principle 4: Usability Testing That Replicates Stress, Not Competence

Aviation simulators deliberately test crews under stress: failures, weather, workload. Medtech usability labs, governed by IEC 62366 and ANSI/AAMI HE75, typically test rested users in quiet rooms, missing the concurrent alarms, interruptions, and time pressure of real clinical work.

Limits of the Aviation Analogy

An ICU nurse manages multiple heterogeneous devices with no unified display, a situation closer to air traffic control than to a single cockpit. And commercial incentives work against the integration standards that would change this.

10 Safety-Critical Device UX Principles

  • Design for the stressed user, not the trained one.
  • Treat standardisation as a safety investment.
  • Build confirmation workflows that verify, not acknowledge.
  • Design your alert hierarchy so urgency is felt before it is understood.
  • Build stress into your usability testing.
  • Design for multi-user operation and explicit handover.
  • Physical affordances are stress-environment features.
  • Make device state unambiguous at a glance.
  • Treat interface changes as safety events.
  • Invest in understanding clinical reality before designing for it.

FAQ: Alarm Fatigue & Human Factors

What is alarm fatigue in hospitals?
Desensitisation from 85–99% non-actionable alarms, linked to delayed response and patient deaths.
What is IEC 62366?
Usability engineering standard for medical devices, requiring formative and summative testing with representative users and use scenarios. If you're working through confirmation workflow design specifically, this article goes deeper on what makes a verification step genuinely protective under IEC 62366.
What is IEC 60601-1-8?
Alarm priority standard (high/medium/low). Amended 2020 after sounds proved too similar in noisy wards.
What does the FDA classify as a human factors failure?
"Inadequate user interface design" is listed as a device defect, separate from operator error.
Why do usability studies often fail to predict real-world errors?
Lab conditions lack concurrent alarms, gloves, interruptions, time pressure, the real stressors.

Creative Navy is a specialist UX/UI design agency working with medical device manufacturers on safety-critical interfaces. Visit our site or get in touch.

