In 2011, after sustaining serious injuries during a hotel stay, I was taken to a hospital in severe pain. Despite visible signs of trauma, the emergency department withheld pain medication until X-rays confirmed multiple fractures, causing an unnecessary and prolonged delay in relief.
Over a decade later, studies from 2020 to 2024 show these delays persist, with patients frequently waiting for pain management until diagnostic confirmation. This reflects a systemic bias toward certainty over compassion, despite clinical guidelines supporting early relief based on professional judgment. Contributing factors include communication breakdowns, uneven training, and fear of masking symptoms.
Current emergency workflows fail to integrate pain management with diagnostics, leading to avoidable suffering. Addressing this challenge requires a technological solution that streamlines assessment, accelerates diagnostics, and enables timely, evidence-based pain care.
Goal
Design an AI-assisted human–computer interaction (HCI) solution that empowers emergency department staff to make timely, evidence-based pain management decisions. The system will integrate clinical assessments, patient input, and automated protocols while aligning with existing workflows, communication systems, and clinical roles. By enhancing both the user interface and the broader care delivery process, the solution aims to reduce patient suffering without compromising diagnostic safety.
Assess the Users
In the discovery phase, the goal is to understand the needs, behaviors, and challenges faced by emergency department staff, particularly triage nurses and physicians, when managing patients in acute pain. This includes observing real-time workflows, conducting interviews, and reviewing how clinical documentation and decision-making unfold in practice. Areas of focus include how pain is currently assessed, what delays treatment, how information is communicated, and where existing systems fall short. For example, staff often wait for diagnostic imaging before providing pain relief, even when clinical signs strongly indicate injury. This project aims to uncover both functional needs, such as quicker data entry and improved decision support, and emotional needs, such as reducing stress and reinforcing clinical confidence. The insights gathered will serve as the foundation for designing a user-centered solution that fits the high-pressure environment of emergency care.
AI Application / Algorithm Selection
For this HCI solution, which supports timely pain management decisions by classifying patients as candidates for early analgesia versus those needing diagnostic confirmation, I evaluated machine learning algorithms based on real-time performance, interpretability, and fit for emergency care.
Considered options
• Support Vector Machine (SVM): Accurate but computationally complex and less interpretable.
• Artificial Neural Networks (ANN): Powerful, but the “black-box” nature limits clinical trust.
• Logistic Regression: Simple and interpretable, but may miss nonlinear patterns.
• Random Forest: Improves accuracy but reduces transparency.
• Decision Tree: Fast, interpretable, and aligns well with clinical reasoning.
Selected algorithm
I selected the decision tree algorithm for its balance of speed, accuracy, and transparency. Its logic mirrors clinical workflows (e.g., “IF pain > 7 AND mechanism suggests fracture AND no contraindications, THEN consider early analgesia”), making it intuitive for emergency staff.
Decision trees also support easy protocol updates without full model retraining. While the decision tree will serve as the primary classifier, future versions may incorporate ensemble methods or simple neural networks to improve performance on complex cases, with the decision tree providing the final, interpretable output.
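The rule quoted above can be sketched as a minimal, hand-coded decision tree. This is an illustrative sketch only: the field names (`pain_score`, `fracture_mechanism`, `contraindications`) and the three output labels are hypothetical, and a production system would learn or validate such thresholds against local protocols rather than hard-code them.

```python
from dataclasses import dataclass

@dataclass
class TriageInput:
    pain_score: int           # 0-10 numeric rating scale (hypothetical field)
    fracture_mechanism: bool  # mechanism of injury suggests fracture
    contraindications: bool   # e.g., allergy or altered consciousness

def classify(t: TriageInput) -> str:
    """Root-to-leaf walk mirroring the clinical IF/THEN rule above."""
    if t.pain_score > 7:
        if t.fracture_mechanism and not t.contraindications:
            return "early_analgesia"
        return "clinician_review"   # severe pain but rule not fully met
    return "await_diagnostics"      # lower pain: wait for confirmation
```

Because each branch is an explicit condition, a protocol update is a one-line edit to the tree rather than a model retraining cycle, which is the maintainability property argued for above.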
Interface Design
Interaction Design (IxD) centers on designing engaging interfaces with carefully considered behaviors. At its core is an understanding of how users and technology communicate. This insight allows us to anticipate user interactions, address potential issues early, and explore innovative approaches to interaction.
Questions and answers for the interface design process
• What commands can a user give for interaction with the interface?
Users can interact via touch (e.g., tapping buttons, sliders), voice commands with noise cancellation (e.g., “record pain score seven,” “confirm medication”), and keyboard/mouse input. Gesture controls support hands-free operation, which is especially useful when wearing gloves.
• What about the appearance (color, shape, size, etc.) gives the user a clue about how it may function?
Buttons follow medical interface color standards (e.g., red for critical alerts, amber for caution, green for confirmation), are consistently shaped and sized for fast recognition, and use universal medical icons. Tooltips and high-contrast modes ensure clarity across lighting conditions.
• What information do you provide to let a user know what will happen before they perform an action?
Hover states, microtext descriptions, and confirmation modals preview outcomes (e.g., “Submitting will notify the attending physician and trigger a medication protocol”). Voice feedback confirms critical actions (e.g., “Confirming morphine 2 mg IV”) before execution.
• Are there constraints put in place to help prevent errors?
Yes. The interface enforces real-time field validation with clinical range checking, context-specific filtering of options, clearly marked required fields, and two-factor confirmation for high-risk medications. Smart defaults based on patient age and weight reduce entry errors.
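The two-factor confirmation constraint can be sketched as a simple guard. This is a hypothetical illustration: the drug list and the confirmation-count rule are assumptions for the sketch, not a clinical policy.

```python
# Hypothetical high-risk list; a real system would source this
# from the institution's formulary and safety policies.
HIGH_RISK = {"morphine", "fentanyl", "ketamine"}

def order_allowed(drug: str, confirmations: int) -> bool:
    """High-risk medications require two independent confirmations;
    all other orders require one."""
    required = 2 if drug.lower() in HIGH_RISK else 1
    return confirmations >= required
```

Encoding the constraint in the interface layer means an unsafe order cannot be submitted at all, rather than being caught after the fact.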
• Do error messages provide a way for the user to correct the problem or explain why the error occurred?
Error messages use plain language, are context-aware, and include suggestions (e.g., “Heart rate 180 exceeds normal range (60–100 bpm). Verify reading or confirm if accurate”). Clinical reasoning is provided for constraint violations (e.g., allergy-related contraindications).
Wireframe
The wireframe lays out the application's interface components and shows how the interface mediates between the user and the underlying application logic.
Usability Heuristics of PainPoint V1.0
Below are five of Jakob Nielsen's ten usability heuristics, which were used to evaluate the interface for PainPoint V1.0.
1. Visibility of System Status
The design maintains transparency with real-time feedback, confidence scores, and visual action confirmations, keeping clinicians informed of system status and AI decisions.
2. Match Between System and Real World
The interface uses standard clinical language, triage categories, and familiar pain scales. Its decision tree logic mirrors clinical reasoning, making AI recommendations intuitive for ED staff.
3. User Control and Freedom
Clinicians retain full control through options such as “Emergency Override” and alternate pathways (e.g., Proceed with Analgesia, Wait for Imaging), enabling immediate flexibility and quick reversibility without unnecessary steps.
4. Error Prevention
Real-time validation, clinical range checks, and two-factor confirmation for high-risk actions reduce error risk. Smart defaults and context-based filtering prevent input mistakes before they occur.
5. Aesthetic and Minimalist Design
The interface uses collapsible panels to present 5–7 key items at a time, with progressive disclosure. Critical information is prioritized, while secondary details remain accessible but visually unobtrusive.