Effective, ethical digital health tools start, and end, with a human-in-the-loop

New digital tools that improve disease diagnosis and treatment are just some of the exciting opportunities technology can bring to healthcare.

But clinical care can be complicated, emotionally fraught and messy. And health data isn’t perfect. To be effective and ethical, digital healthcare tools must take all of this into account.

“To build AI that is useful for health care, it must be explainable. It must complement existing decision-making processes. And it must be able to function in varied and often imperfect clinical settings,” says Wray Buntine, Professor of Data Science and AI in the Faculty of Information Technology (IT) at Monash University.

“This is why it’s vital we keep a human-in-the-loop in all aspects of our digital health work.”

Human-in-the-loop is an overarching philosophy at the Faculty of IT. It ensures real people are involved in technology design and implementation, as well as data preparation, visualisation, interpretation and application.

Factoring in human frailty

Imagine a busy emergency department, with ill patients and distressed family members. Clinicians and nurses are making life and death decisions, with only enough time to record the bare minimum of patient details. They’re navigating numerous patient management systems and databases running on different types of hardware and software.

“You can’t just throw a bit of AI in the middle of all this and expect it to work,” Professor Buntine says.

“Digital health tools have to work in the context of human behaviours, and factor in key aspects of medical care such as accountability and legal responsibility.”

Reflecting human frailty and focusing on ‘explainability’ are key aspects of a project Professor Buntine is running with Monash Addiction Studies expert Professor Dan Lubman, along with the Eastern Health Foundation and addiction treatment, research and education centre Turning Point.

Funded by Google, the work applies AI to develop a national suicide monitoring system.

“For deaths that may be suicides, there’s often a lag between an incident taking place and its classification for public health records,” says Professor Buntine.

“Legally, this makes sense. But this delay acts as a block on creating effective community programs for suicide prevention.”

To understand suicide better, Professor Buntine and colleagues are developing AI that uses automated natural language processing to classify cases. They’re using written records taken by ambulance staff as the starting point.

“The difficulty is that ambulance staff only have time to write hasty notes on what takes place in an emergency,” Professor Buntine says.

In their raw format, these records are not suitable to use as the AI’s starting point. And so the project employs qualified experts to interpret the notes and generate data suitable to train the AI. Building this human-in-the-loop approach into the early stages of the project ensures the data retains real human meaning while remaining workable from the technology perspective.

The AI can reach an acceptable degree of accuracy in many of the easier cases. This frees the experts to spend more time on the harder cases, and allows a larger volume of data to be fully processed, more kinds of mental health cases to be evaluated, and possibly other kinds of emergency records to be addressed as well.
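The division of labour described above — the model handles the easy cases, humans handle the rest — is commonly implemented with a confidence threshold. The sketch below is a hypothetical illustration of that pattern, not the Turning Point system itself; the classifier, threshold and field names are all illustrative assumptions.

```python
# Minimal sketch of human-in-the-loop triage: a classifier labels the
# "easy" records automatically, and anything below a confidence
# threshold is routed to a human expert for review. All names and
# values here are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Record:
    note: str                 # free-text ambulance note
    label: str = ""           # classification, once assigned
    needs_review: bool = False

def triage(records, classify, threshold=0.9):
    """classify(note) -> (label, confidence). Low-confidence cases
    go to human experts instead of being trusted to the model."""
    auto, review = [], []
    for r in records:
        label, conf = classify(r.note)
        if conf >= threshold:
            r.label = label
            auto.append(r)
        else:
            r.needs_review = True
            review.append(r)
    return auto, review

# Toy stand-in classifier: keyword match with a fixed confidence.
def toy_classify(note):
    if "self-harm" in note:
        return "possible self-harm", 0.95
    return "unclear", 0.5

auto, review = triage(
    [Record("patient disclosed self-harm ideation"),
     Record("hasty note, details unclear")],
    toy_classify,
)
print(len(auto), len(review))  # → 1 1
```

In a real system the expert-reviewed cases would also be fed back as new training data, which is what lets the model gradually take on a larger share of the easier cases.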

Revealing the human aspects of technology

Dr Sarah Goodwin is also a staunch practitioner of the human-in-the-loop approach in technology development. She’s a lecturer in the Immersive Analytics research group in the Faculty of IT at Monash.

“Mainly, my work is about visualising data effectively, and helping people understand their complex data,” says Dr Goodwin.

“People can feel a bit afraid of technology in health, and so we try to find ways to reveal the human aspects of algorithms.”

Dr Goodwin says she offers the most benefit to projects when she is involved from the outset.

“I like to get in early and get a sense of what sort of data will be generated, know who the end users will be, think about any potential issues or problems with the data that might arise,” says Dr Goodwin.

Dr Goodwin aims to ensure transparency in highly technical projects by including analysts and expert users as humans-in-the-loop when projects are in the planning phase.

“Using AI and machine learning and modelling is great, but only to the extent that you actually understand what it’s doing,” she says.

“We run focus groups with different user groups – for example researchers, the general public and health practitioners – and try to understand what they want to get out of a data visualisation,” she says.

“Then we can tailor data collection so we can serve that purpose.”

Making a useful tool for humans

While a human-in-the-loop is vital for technology design, it’s also crucial in shaping the output of a digital health tool.

Supported by Ambulance Victoria and Safer Care Victoria, Dr Goodwin is working with the Faculty of IT’s Interim Dean Professor Ann Nicholson and Associate Professor Burak Turhan to develop AI technology that quickly flags a likely cardiac arrest to a triple zero call operator. This allows the call taker to both rapidly dispatch an ambulance and provide appropriate CPR guidance over the phone.

“In the situation of a real emergency phone call, we don’t need to present the complexity of all the data to the person taking the call,” says Dr Goodwin.

“We just need to create some sort of clear signal as to whether a patient has a high likelihood of being a cardiac arrest case.”

But the model must still have enough transparency to keep a human-in-the-loop.

“Even with the technology up and running, the triple zero call taker must still be able to use their own skills to make the final decision,” Dr Goodwin says.

“In turn, including this capability will only improve the algorithm in the long run.”
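The "clear signal" idea Dr Goodwin describes can be sketched as collapsing a model's probability into a simple flag while leaving the final decision with the call taker. The sketch below is a hypothetical illustration of that design, assuming names and a threshold not taken from the actual system.

```python
# Hypothetical sketch: the model's cardiac-arrest probability is
# collapsed into one clear flag for the call taker, but the human
# always makes the final call. The 0.8 threshold is an illustrative
# assumption, not a value from the real system.
def arrest_flag(probability, threshold=0.8):
    """Turn a raw model probability into a simple, clear signal."""
    if probability >= threshold:
        return "HIGH LIKELIHOOD OF CARDIAC ARREST"
    return "no cardiac-arrest flag"

def call_taker_decision(probability, override=None):
    """The call taker can always override the model's suggestion,
    keeping a human-in-the-loop on the final dispatch decision."""
    if override is not None:
        return override  # the call taker's own judgement wins
    return arrest_flag(probability) == "HIGH LIKELIHOOD OF CARDIAC ARREST"
```

Logging cases where the call taker overrides the flag is one way such a system could "improve the algorithm in the long run", as the article notes: disagreements become labelled training examples.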

With other colleagues at Monash, Dr Goodwin is also working with a large group of researchers looking at the impact of COVID-19 on diverse aspects of life in Melbourne.

“We’re looking at things like mobility of people, shopping habits and energy consumption,” says Dr Goodwin.

“We want to be able to visualise the data, find patterns in data and help people understand that COVID-19 is not just about health. It’s having an effect on so many other things.”

There’s no doubt that digital capabilities are transforming our understanding, practice and delivery of health. Through a firm commitment to keeping a human-in-the-loop throughout technology development, researchers at Monash IT are working to ensure digital health tools are useful, acceptable and ethical as judged by their fellow human beings.