I talk about the magic mix of AI and medicine with xMatters’ Abbas Haider Ali.

CTO at San Francisco-based xMatters, Abbas Haider Ali has helped the company create the vision for adopting connected communication strategies across business scenarios. The company is at the forefront of reshaping how we communicate – especially when connecting data to people and systems including monitoring, testing, development and operations, and chat. xMatters has healthcare customers who use the communication platform for IT and DevOps, plus a subset that uses it for clinical cases.

‘There’s a great need to make information available to people wherever they happen to be,’ Abbas begins. ‘We provide a software platform that’s used in lots of different industries, so our foray into the chat world is not so much an end unto itself, it’s just to make the information in our application available anywhere.’

Increased accessibility

Abbas describes the use of chatbots as ‘just another way to make your product’s capabilities available where your users are. It’s also generally easy to use because you don’t have to install any specialised applications – you just go through a major chat platform.’

He goes on to describe how one xMatters client, Intermountain Healthcare, is using xMatters technology to remove some of the burden from over-stretched medical staff and speed up useful intervention. For example, say a patient arrives at the hospital displaying the symptoms of a stroke. ‘Before xMatters, the hospital’s approach was to ask a nurse to pick up the phone and start calling through a long list of neurologists – and then wait to be called back. It’s a manual process that takes away from patient care.

‘Now they use our product and simply click a button to find a neurologist and connect them in. The system knows who’s on call and the best way to contact them. It’s relentless in tracking people down, including their backup. The “click” is ended when the doctor turns up or, in this case, gives a virtual assessment through telehealth technology.’
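The escalation behaviour Abbas describes – notify the on-call clinician, and if they don’t respond, relentlessly work down through their backups – can be sketched in a few lines. This is a toy illustration only; all names and the `notify` callback are hypothetical and do not reflect xMatters’ actual API.

```python
# Toy sketch of an on-call escalation loop: walk the roster in priority
# order (primary first, then backups) until someone acknowledges.
# Names and data are invented for illustration.
from dataclasses import dataclass
from typing import Callable, List, Optional


@dataclass
class Clinician:
    name: str
    contact: str  # e.g. a phone or pager address


def find_available(
    roster: List[Clinician],
    notify: Callable[[Clinician], bool],
) -> Optional[Clinician]:
    """Notify each clinician in order; return the first who acknowledges."""
    for person in roster:
        if notify(person):  # True means the notification was acknowledged
            return person
    return None  # nobody reachable; a real system would keep retrying


# Example: the primary doesn't answer, so the backup is pulled in.
roster = [Clinician("Dr. Primary", "555-0100"),
          Clinician("Dr. Backup", "555-0101")]
responder = find_available(roster, notify=lambda c: c.name == "Dr. Backup")
print(responder.name)  # Dr. Backup
```

A production system would layer on retries, multiple contact methods per person, and timeouts before escalating, but the priority-ordered loop is the core of the pattern.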

But it can go further than that. ‘In the case of a chat application, you’d be asking a question rather than just clicking a button,’ Abbas notes. ‘It’s in its early phases, but we see it as an interface replacement for complex technology systems.’

Such time-sensitive functions as finding the right doctor at the right time are also time-consuming. It’s a problem many have tried to solve, but organisations all over the world suffer from a lack of interoperability and from gaps in frontline staff’s technological knowledge.

‘Every new application that you have to learn is cognitive load – we’re just one application of the dozens that a typical clinical staff member may have to use,’ Abbas explains. ‘xMatters can connect those applications by integrating them into a toolchain now, including chat applications to make communication faster and easier. But if companies can access all their applications through a chat interface, it would certainly make things a lot more efficient.’

Healthcare applications are ‘horrifically bad in terms of user experience,’ Abbas believes. ‘So if you can make them accessible through a chat interface it’s so much easier for a new staff member to come in and learn what’s going on. Why should a nurse have to learn a whole roster of application interfaces? They should be able to access the system and let AI help them.’

The robot-patient relationship

Therapeutic and patient-facing chatbots are another huge growth area, as evidenced by the work at both Babylon Health and Your.MD. And they’re nothing new – ELIZA, a program created to demonstrate the superficiality of communication between man and machine, but which actually demonstrated the human need to anthropomorphise computer interaction, was developed over 50 years ago.

‘The natural language processing element is crucial,’ Abbas explains. ‘If you can train the chatbot so that you lose the text interface and use a voice interface instead, you can speed things up even more. That’s possible today but I don’t know of any hospitals that are doing it right now. I don’t believe there’s a technology barrier, it’s more a matter of adoption and providing solutions.’

The race between Google’s DeepMind project and the IBM Watson programme to crack the full health potential of AI is making things even more exciting. ‘What you’re effectively doing is building a virtual doctor who theoretically has more experience and more capability than any single physician or clinical staff member ever could have,’ Abbas says. ‘If I’m working on a computer problem and I have an issue, I have a group of my peers that I trust, and I can reach out to say “what do you think of this problem?” I have experts on tap. So the next stage is a specialist chatbot that a doctor could contact and consult – not with a person but a system that aggregates information across lots of areas.


‘We’ve already seen that these AI systems can exceed the diagnostic capabilities of most of our physicians. For instance, I recall reading an article about a study by Houston Methodist on detecting breast cancer from mammogram imagery – the computer is already 30 times better at it than most doctors, and 99% accurate. The whole goal is to use AI so things aren’t missed and are detected earlier.’

A collective knowledge

‘We learn from our experience, but AI learns from the experience of all the doctors available – it’s surpassing anything we could hope to do. A doctor can see thousands of images in their career but the AI, in the first few days of its existence, might see millions.’

Sounds like a no-brainer, right? ‘It’s expensive to create these sorts of technologies but the benefit is unbelievable. When you consider the cost and the time it takes to train a medical expert… this knowledge isn’t locked into one person’s head. If you’ve built it, it’s available to everyone in the world. That means you could take an MRI machine to a small village in China or a remote part of Africa and take the imagery and have the world’s best diagnostic capabilities do the analysing. That would be impossible in today’s world.

‘Healthcare is notoriously slow at adopting changes but, to be fair, we are talking about people’s lives… We’ve built these systems that are pretty good but now we’re testing them – they need to be validated, but I think we’ll see these kinds of innovations start to take place in five years’ time. But it will be 10 to 20 years before it becomes truly mainstream and day-to-day.’

Let me Google that symptom


‘Any time something goes wrong and you feel ill, you go to Google and the inevitable answer is that you’re going to die,’ Abbas laughs. ‘It’s untrained so you end up with the worst possible answer – even if you’ve only got a hangnail, you’re going to die. We all do it – and by the time you go and see the doctor you’re asking insane questions.’ Not only is it frightening but it’s a waste of time. Abbas has something much better in mind.

‘Why can’t we ask basic questions of a system that’s the equivalent of a GP? You can train it to look for certain patterns and key information that would give much better answers or, at the very minimum, it could say “you probably should go to a doctor.” Or if you’ve got the symptoms of a cold, you could be told to stay home, take it easy, have some soup and not go to the doctor’s office and make other people ill.

‘However, I see this as being more challenging, as you would run into the question of medical liability. In remote parts of the world where people are bypassing landlines and going straight to mobile, it would be great for them to be able to text to ask if it’s worth making a huge journey to see a doctor. That would democratise access to basic healthcare.’
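The pattern-matching triage Abbas imagines – map reported symptoms to advice, defaulting to ‘see a doctor’ when unsure – can be sketched as a simple rule table. The rules and wording here are invented purely for illustration; a real triage system would be clinically validated, not keyword matching.

```python
# Toy rule-based triage sketch: each rule is a set of required symptoms
# plus the advice to give when all of them are reported. The rules below
# are invented examples, not clinical guidance.
TRIAGE_RULES = [
    ({"chest pain", "shortness of breath"}, "Seek urgent care now."),
    ({"fever", "stiff neck"}, "You probably should go to a doctor."),
    ({"runny nose", "sore throat"}, "Stay home, rest and have some soup."),
]


def triage(symptoms):
    """Return advice for the first rule whose symptoms are all present."""
    reported = {s.lower() for s in symptoms}
    for required, advice in TRIAGE_RULES:
        if required <= reported:  # all required symptoms were reported
            return advice
    return "No match - if symptoms persist, see a doctor."


print(triage(["runny nose", "sore throat"]))  # Stay home, rest and have some soup.
```

The deliberately cautious fallback mirrors the ‘at the very minimum, go to a doctor’ behaviour described above; erring towards referral is what keeps a system like this on the right side of the liability question Abbas raises.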

A symbiotic future

‘When we replace specialists with software, we have a much more scalable health system. In 10 years’ time I would be equal parts surprised and disappointed if, when you went to the doctor, you were not utilising some form of AI to help with diagnosis. If my doctor could submit blood tests or scans to an AI system while talking to me and then discuss the suggestions with me, I’d feel a lot more comfortable that we would naturally all get a better quality of care.’

‘Technology always comes in waves and surprises us by how quickly it can do things. As humans we’re terrible at recognising exponential patterns, so they can sneak up on us very quickly. Think of the simple things we take for granted now – we all walk around with a very powerful computer in our pockets, with lots of diagnostic capabilities. Combine that with smart software systems and we could be literally five years away from a healthcare revolution that we can’t see in front of our own faces,’ he suggests.

‘When all these things come together, we can only imagine what we might be able to do with it and I don’t think it’s as far away as we might think. You can be very optimistic about all this – I try to temper it as much as I can but I’m very excited about the possibilities.’
