Designing an AI-Powered Remote Patient Monitoring System

Project type
University Group Project (Hogeschool van Amsterdam)  / May 2023


London Air Ambulance (NHS)

Collaboration with

Blizzard Institute
Royal Mary Hospital
team members

Sjoerd Simmerman, Pin Chun Lin, Farzaneh Salehi, Yasmine Khalil

Value delivered

A first-of-its-kind AI-powered remote patient monitoring system designed for the Royal Mary Hospital trauma bay unit to remotely monitor code-red trauma patients en route to the hospital.


We designed a remote patient-monitoring screen for the London Air Ambulance and Royal Mary Hospital trauma bay team.

The London Air Ambulance (NHS) needed a tool to facilitate communication between the helicopter paramedic team and the trauma bay team, allowing severely ill patients to be monitored remotely en route to the hospital and their lives saved. As it stands, paramedic-to-trauma-bay communication is mostly verbal; there is no standardized or digitized way to stay in contact.

The brief 

The first 60 minutes after a traumatic injury is the golden window that can determine a patient's life or death.

When a London Air Ambulance is dispatched to a trauma scene, treatment starts as soon as the injuries are identified and diagnosed. The helicopter emergency medical service (HEMS) team do their best to control and stabilize the patient on-scene and en route until they arrive at the hospital trauma bay, where the patient can receive more complex treatments and surgeries.

However, the hospital trauma bay team often know very little about the situation, events, and interventions that occur during and after the trauma. Upon arrival, the paramedics give the trauma bay team a 30-second verbal handover of the essential information. Soon after, the trauma bay team take over to re-examine and evaluate the patient once more, creating a redundant process.


Initial research

Modes, mediums, and styles of communication between the medical teams in emergency procedures were noted in a flow chart. 

Research insights

Voice-powered surveys gave us insight into the many problems faced by medical personnel in the trauma bay.

We received insights and ideas that later influenced our design, such as the “REBOA Blood Control Timer”, a device installed in patients to control severe blood loss.

We also gathered primary research on the medical team’s roles and tasks within the trauma bay.

Strategy + pillars

Based on the insights, we decided on a visual communication platform for the trauma bay team: a chronological information display screen with a defined set of features.


Designing a system that co-exists with the current process, complementing rather than replacing it.


Displaying data with the medical industry’s current visual standards and conventions.


Designing with an understanding of cognitive load and glance-based legibility.

Ideation & wireframing

The Time Belt Method: A linear timeline with a past, present and future narrative where events and interventions are noted.

The “time-belt” idea was a concept I came up with during paper prototyping, when we explored noting down events and interventions performed by the paramedic team. I iterated on the concept by exploring different placements of panels, interventions, and data visualizations. The idea was inspired by a linear story told start-to-finish.

Failed concept

The Accordion Concept aimed to display historical and live trending patient data, but was too complex to be understood at a glance.

Medical personnel informed us that historical data is important for tracking the patient across time. However, live trending data matters more, as it shows the patient’s actual state at the current moment.

The Accordion Concept aimed to display both at once by zooming in on the last 7-9 seconds documented. I named it the “accordion concept” because it aimed to expand the perception of time, then shrink it again.

When tested against the other concepts, it proved unsuccessful: it was too unfamiliar and too complex to follow in an urgent use-case scenario, and was not properly perceived.

Slowing down the last 7-9 seconds


Font size, colors, and text spacing have a significant effect on the accuracy of a visual search and on reaction time during each search period.

When designing the information display, at-a-glance legibility was a key consideration, as the surgeons, doctors, and nurses would use the screen as a fast means of retrieving information. Their attention is limited, so the cognitive load had to be minimal. High contrast was used to display quantitative physiology data such as HR and SpO2 readings.

A dark background paired with high-contrast colored numbers has been shown to improve legibility. Each physiology indicator was assigned a color based on conventions observed in existing patient monitoring systems. A proper balance of font and size was achieved with a bold Helvetica weight commonly used in modern patient monitoring systems.

New concept

The Calendar Concept creates two time axes: one for paramedic-intervention events and another for physiology waveforms.

Pin Chun, my teammate, came up with the idea of separating patient interventions from physiological data, instead of noting everything on a single time axis. This way, we have a list of past interventions, a current-time marker, and the future predicted by the AI algorithm.

It proved to be the best decision when we tested it alongside the Accordion concept.

Data visualization

We designed a panel with a live-streamed visualization of patient data on a 5-second time scale, along with event-entry methods.

The patient’s ECG, oxygen saturation, and non-invasive blood pressure are live-streamed through the ZOLL® X Series® monitor/defibrillator used by the HEMS team on-scene and en route to the hospital. The monitor allows data entry in the form of interventions, which update the screen’s intervention panel with respective timestamps.

The LAA dispatcher and a senior hospital nurse can access the screen’s backend system to update manual data entries if needed, such as respiratory rate, CO2 saturation, and the Glasgow Coma Scale. The data graph’s visualization was designed according to conventional patient monitoring systems.

User perception

The role of human and clinical judgment in our AI feature was heavily criticized.

User testing revealed that our product was perceived positively by medical personnel. Most claimed to understand what it does and what it aims to solve. However, the AI prediction concept was under-developed: many did not understand the need for it, let alone how it worked.

AI Iconography
XAI research

AI in healthcare is heavily biased towards Eurocentric machine learning data, which can be catastrophic in this field.

Explainable AI aims to change this.

Artificial intelligence today

Explainable Artificial Intelligence (XAI)

Explainable AI aims to create a model that explains the processing rationale behind an AI’s prediction or decision.

It aims to integrate human judgement within the machine learning model in order to remove biases or prejudices, which could be catastrophic in the realm of healthcare.

XAI concept

XAI symptom tags were added to explain the algorithm’s rationale behind its predictions, but were not successful.

I included a panel listing symptoms as tags to explain the AI predictions. The symptoms aim to provide a rationale for why the AI made the particular prediction it did, and to build trust with the trauma bay team. People trust what they understand.

However, after testing this concept, feedback indicated that the symptoms were too small to read and redundant; they did not make sense to read in a fast, urgent scenario.

Final design & results

The final screen was designed with an AI-prediction risk analysis chart and a diversified body map. 

Instead of the symptom tags, I opted for a chart that visualizes the risk analysis behind the AI predictions. This way, the medical team can see the risk predictions in proximity to the events and physiology data.

This initial prototype has since been funded and is being further designed, developed, and trialed across four major hospitals in London, with publication scheduled for late 2026/early 2027.

This project has been featured on the Amsterdam University of Applied Sciences website: