The 5Q — Seeing is Believing: Looking at the Next Wave of Healthcare Data Visualization with Stephen Marshall

Neural HALO signaling from MUSE monitor.

In the 5Q, Tincture sits down with leaders to discuss their day-to-day work and share their perspectives on healthcare, medicine, and progress.

Stephen Marshall is a technologist and entrepreneur at the forefront of digital media. He’s also a published author and award-winning film director. As CEO of ORA Systems, he is bringing a new kind of real-time, multi-dimensional data visualization to patient care.

1. Can you share an overview of what you’re working on these days? At the highest level, what problem are you trying to solve?

At the highest level, as a company, we’re developing a new visual language to aggregate and communicate multiple big data flows. Our vision is that these volumes of complex data will signal through dimensional objects (think clouds or flowers that are essentially produced by complex data inputs).

In the medical context, we’re developing software that can help visually transmit a patient’s medical narrative — as an intuitive and meaningful experience for physicians, nurses, and the patients themselves — in both momentary (meaning dynamic) and cumulative modules.

So we’re addressing the problem that current modes of structuring and displaying patient data, whether in your doctor’s office or in the delivery room, are not optimized for current technologies, let alone for the sheer volume of actionable data, or for the comprehension and engagement of downstream practitioners and the patients themselves. More specifically, there is a huge opportunity to develop objects that can intuitively and dynamically signal the health of a person.

The first of these objects, developed by our team, is called the HALO, a patented visualization that can signal up to 10 distinct data streams (you can play with the SDK here). We’re working with the H.I.P. Lab at the Mayo Clinic on a few initial applications that will bring it to market for the medical sector.
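As a rough sketch of the idea, assuming a hypothetical API (the channel and stream names below are illustrative, not the actual ORA SDK), each normalized data stream drives one visual channel of the object:

```python
# Hypothetical sketch, not the actual ORA SDK: map up to 10 normalized
# data streams onto the visual channels of a ring-like object.
# Channel names are illustrative assumptions.

CHANNELS = ["radius", "thickness", "hue", "saturation", "brightness",
            "pulse_rate", "wobble", "glow", "segments", "opacity"]

def halo_frame(streams: dict[str, float]) -> dict[str, float]:
    """Map named data streams (each pre-normalized to 0..1) to visual channels."""
    if len(streams) > len(CHANNELS):
        raise ValueError("a single HALO signals at most 10 streams")
    frame = {}
    for channel, (_name, value) in zip(CHANNELS, sorted(streams.items())):
        frame[channel] = max(0.0, min(1.0, value))  # clamp to displayable range
    return frame

# Example: three wearable-derived streams drive the first three channels.
print(halo_frame({"heart_rate": 0.62, "hrv": 0.48, "steps": 0.81}))
```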



2. Could you tell us a little bit about your pilot with the Mayo Clinic? In a nutshell, what will you need to demonstrate in order to make this technology available to more patients and doctors?

The new generation of heart rate (HR) wearables has pushed the fitness tech sector closer to a medical-grade enterprise. With the higher-end devices, users are acquiring a much higher-fidelity picture of overall health than standard steps-based analytics provide. But for the average person, without personalized physician interpretation and directives, the HR data is almost meaningless.

With Dr. Bruce Johnson’s H.I.P. lab at Mayo, we’re deploying the HALO to signal data from wearables as a way of educating and engaging users in their overall heart health. Because it’s one thing to watch your steps climb toward the 10K milestone as a metric of performance; it’s much harder to track a recommended HR level and duration in order to meet personalized optimal health thresholds.

So we’ve developed the Health HALO as a digital system that allows users to track their daily progress toward optimal heart health by integrating Mayo’s proprietary algorithms into the HALO, which means each HALO is customized to the user’s own Mayo-prescribed fitness thresholds. The user’s cumulative performance is then tabulated for classification in a four-tier national ranking system of Bronze, Silver, Gold, and Elite, also programmed by Mayo Clinic.
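To illustrate the shape of that tiering (Mayo’s actual algorithms and cut-offs are proprietary, so every number below is an invented placeholder), a minimal sketch:

```python
# Toy illustration only: Mayo's actual algorithms and cut-offs are
# proprietary, so the scores and thresholds below are invented.

TIERS = [(90.0, "Elite"), (75.0, "Gold"), (50.0, "Silver"), (0.0, "Bronze")]

def daily_score(minutes_in_zone: float, target_minutes: float) -> float:
    """Progress toward the user's prescribed HR-zone duration, as a percentage."""
    return min(100.0, 100.0 * minutes_in_zone / target_minutes)

def classify(cumulative_score: float) -> str:
    """Place a cumulative performance score into the four-tier ranking."""
    for cutoff, tier in TIERS:
        if cumulative_score >= cutoff:
            return tier
    return "Bronze"

# Example: 34 of a prescribed 40 minutes in zone -> score 85.0 -> "Gold".
print(classify(daily_score(minutes_in_zone=34, target_minutes=40)))
```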

Pilot testing for Apple Watch users began in early August, and the key here is to gauge user engagement, in terms of the readability and intuitiveness of the system, as well as the changes it brings about in users’ daily fitness performance. It’s a small group, but so far the feedback has been very positive.


Our next step is to integrate a spectrum of wearable devices into the HALO software and release the application both for consumers and as an add-on for wellness platforms. Once we’ve proven user engagement, we’ll begin to introduce a dashboard for physicians — the beginning of the next-generation EHR — so they can monitor patient stats and, more critically, program HALOs with thresholds determined by stress testing and other factors.

As an aside, we also see a nexus here between patients, physicians, and insurance companies, where we engage payors who might use this technology to reward responsible behavior.



3. Please describe a few of the use cases for real-time patient data visualizations. In general, are these scenarios better suited for short-term, acute episodes?

Real-time visualizations are definitely better suited for short-term episodes. As stand-alones, they just aren’t designed to architect cumulative data. The exception here, of course, is the Health HALO, which builds and signals over the course of a 24-hour day. But in the context of overall heart health and fitness, a day is itself a short-term episode.

We’re developing for several use cases, some of which are more sensitive than others. But here are two:

We are currently working on a stress test application with the Johnson lab that creates a HALO from six algorithms (including a patient’s VO2, heart rate recovery, and fitness score across three stress tests). The result is a HALO that gives physicians, nurses, and patients an immediate and intuitive-to-read visualization of the patient’s heart health.

The user’s HALO can be normatively compared to optimal HALOs and to the user’s previous HALOs. We’ve also coded group HALOs so enterprises and insurers can see how a population is distributed across the spectrum.
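Structurally, a HALO of this kind can be thought of as a vector of normalized metrics, which is what makes both the normative comparison and the group view straightforward. A hypothetical sketch (the metric names are ours, not the lab’s actual algorithm outputs):

```python
# Hypothetical sketch: a stress-test HALO as a vector of normalized metrics.
# Metric names are illustrative, not the lab's actual algorithm outputs.

METRICS = ["vo2", "hr_recovery", "fitness_1", "fitness_2", "fitness_3", "composite"]

def deviation(user: dict[str, float], optimal: dict[str, float]) -> float:
    """Mean absolute gap between a user's HALO and an optimal HALO."""
    return sum(abs(user[m] - optimal[m]) for m in METRICS) / len(METRICS)

def group_halo(users: list[dict[str, float]]) -> dict[str, float]:
    """Per-metric average across a population, viewable as one group HALO."""
    return {m: sum(u[m] for u in users) / len(users) for m in METRICS}

# Example: a user uniformly at 0.7 against an optimal of 1.0 deviates by ~0.3.
optimal = {m: 1.0 for m in METRICS}
user = {m: 0.7 for m in METRICS}
print(deviation(user, optimal))  # ~0.3; larger = farther from optimal
```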

The other real-time visualization we’re working on is called the Fetal HALO. Current displays used in delivery rooms for fetal tracing data are complex and very difficult for most nurses (and some doctors) to aggregate into a coherent picture of a labor, from initial contractions to birth. The mothers certainly have no idea what the machines are signaling (which, depending on who you talk to, can be considered a good thing).

And rewinding the data to look at different points during the labor is also difficult.

ORA’s Peter Crnokrak presenting the Fetal HALO in his Visualized Keynote address

We’ve been working with a leader in the obstetrics field to push fetal data into the HALO. Not only have we matched the color coding used in fetal charting and integrated the key data flows into an easy-to-read object that a mom-to-be can understand, but the HALO can also mimic contractions in a way that shows an up-to-the-moment picture through transition and delivery.

And the HALO can rewind, like a movie of the entire birth, for anyone on the birth team to see. We’re really excited about this one.
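Conceptually, the rewind works like a time-stamped frame buffer that can be scrubbed back to any moment. A minimal, hypothetical sketch (class and method names are ours for illustration, not ORA’s API):

```python
from collections import deque

# Hypothetical sketch of the rewind idea: keep time-stamped HALO frames
# during labor so the object can be scrubbed back to any moment.

class HaloTimeline:
    def __init__(self, max_frames: int = 100_000):
        self.frames: deque = deque(maxlen=max_frames)  # (seconds, frame) pairs

    def record(self, t_seconds: float, frame: dict) -> None:
        self.frames.append((t_seconds, frame))

    def frame_at(self, t_seconds: float) -> dict:
        """Return the most recent frame at or before the requested time."""
        best = self.frames[0][1]
        for t, frame in self.frames:
            if t > t_seconds:
                break
            best = frame
        return best

timeline = HaloTimeline()
timeline.record(0.0, {"contraction_intensity": 0.1})
timeline.record(60.0, {"contraction_intensity": 0.8})
print(timeline.frame_at(30.0))  # -> the frame recorded at t=0.0
```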



4. Can you talk about the origins of the HALO visualization itself? It’s not hard to imagine people getting a plant, or a digital pet, or other gamified visuals. Where do you see this layer of the platform evolving?

The HALO is part of a larger 3-dimensional, 4-component system called the Protostar, which is still in development.

Without getting too deep into what that is and how it works — it’s safe to say that it’s a self-populating coalescence structure that aggregates and distributes data from the life of a person or entity. Think of it as a navigable, explorable 3D ‘browser’ that, if we’re right about its application to medicine, could become a next-generation EHR.

Anyway, the Protostar is obviously a pretty large-scale development initiative. And with the pressure to move our start-up to a viable revenue model, we decided to extract and develop one component of that system as a stand-alone product. That was the piece that signals moment-to-moment health: the HALO.

We actually see a future where social platforms will become object-based, where the words, pictures, and sounds that comprise a person’s profile become signaling aspects of objects that form a next-level identity. Beyond the constrictions of DNA, but also algorithmically precise and individualized. Like plants, and even worlds.

But that’s a crazy BHAG, and it’s good to have as a guiding vision. In the near term, we can imagine a navigable platform of objects that represent an ecosystem of health organizations, their practitioners, and their patients. And a way to start seeing our communities, our geographic regions, and our civilization as a whole in a way that tells us an immediate story about our collective health.

This begins by working out a visual language that can integrate and grow through a progression of data-architected objects. And that necessarily starts with biological signals and the multitude of applications that such a project implies.



5. The healthcare system has shown little interest in improving data visualization for patient engagement. For example, most patient portal lab results are unformatted raw data that are never explained to patients. Practically speaking (culturally, politically, financially), what will it take to leapfrog forward into the modern digital era? Will this sort of thing ever become a standard of care?

I think so, because we’ve already seen advancements in this regard.

My co-founder Peter Crnokrak says that whenever we significantly decrease the time between a person and their ability to receive, comprehend, and act on data, we mark an evolutionary thrust.

I don’t know what the initial reaction to the development of MRI was, but it would be hard to imagine the forces that would have stood against it, except time and money. And a lack of imagination. Can you imagine anyone saying, ‘We have no need for a highly versatile imaging technique that can give us pictures of our anatomy that are better than X-rays’?

That is a prime example of improving data visualization for patient engagement. I know because when my father, who was a scratch golfer, started missing two-foot putts, it was imaging that made his diagnosis of neurological cancer possible. But like the X-ray before it, these are practitioner-side innovations.

The developed world is in the throes of a paradigm shift in medicine. For lots of reasons that I’m sure every one of your readers is deeply familiar with, people are going to have to start learning to take care of themselves. They are going to have to understand this amazing technology called the human body and how it works. And how it lives and why it dies. And to the extent that they take up that challenge, there will be parallel innovations that give them the tools and processes to do that.

Seeing the body and its components in visualizations that not only show their present and historical health but also guide the user to optimize them will be a demand that technologists and caregivers will meet.

What’s more, we are now moving into a stage — call it population management — in which large medical institutions want to track patients remotely. And there is still a role for humans in that endeavor. By this I mean, we still need people to watch over and care for people. In the future this will mean pattern recognition of individuals and their health signals, as opposed to an entire health monitoring system being controlled by machine-learned thresholds. (Talk about a dystopian nightmare.)

And in that scenario, which is already coming fast upon us, would you rather have nurses staring at screens of multiple complex data flows, or at objects that can instinctively be read, isolated, and acted upon?

I’m going with the latter.