Using a Data Science Value Framework to Harness the Benefits of IoMT

The Internet of Medical Things (IoMT) poses unique technological challenges on two fronts: engineering and data science. 

All use cases of advanced data analytics face challenges around both the mathematics itself and access to data. Further, there are nontechnical challenges to overcome to ensure the effort and expense actually deliver value.

Much has already been written about data scientists spending only around 10% of their time doing actual data science, with the rest spent on some form of data wrangling. However, even beyond data issues, the entire effort needs to be managed within a framework that provides direction to the project.

We call this the Data Science Value Framework.


The Data Science Value Framework

  1. Define the desired outcome.
  2. Define how a behaviour or a process needs to change to deliver the desired outcome.
  3. Understand what new information needs to be presented, and how it needs to be presented, to change behaviours and deliver the outcome.
  4. Consider what analytical process is required to deliver the new information.
  5. Identify what data is required to enable the analysis.
  6. Choose the technology required to host all of the above.

This framework defines the direction of the project and minimises the need for serendipitous discovery through experimentation. It also addresses the challenge of right-sizing the project and maintaining ROI throughout.

Just like its more general parent, IoT, IoMT is a challenge on two fronts.

First and foremost is the challenge of collecting data and transmitting it to a data platform in the required format and with the required recency. This is primarily a challenge of hardware and data engineering.

However, once the data has been made accessible in a centralised platform, it has to be processed to deliver value. At this stage, many new challenges present themselves, and the actual data science and analytics are possibly the least of them.

The Data Scientist Dilemma

Data scientists have a wonderful arsenal of tools at their disposal and access to modern cloud-based technologies that allow instant scale-up of processing power.

However, all of those tools and computation-hungry processes are not free. So a data scientist needs to choose the path carefully, because there are limited resources (time and money) available for endless experimentation.

At DMI, this is a challenge that we help all of our clients to understand and overcome. 

We have developed a six-stage framework that ties a data science program to an economically rational model starting with the definition of the desired outcome.

It’s surprising how often this seemingly obvious step is missed in technology or data science-driven projects. Even in a business context, there tends to be an assumption that advanced engineering like IoMT coupled with data science will automatically deliver value. But in reality, the path to value cannot be taken for granted and needs to be managed. That management begins with understanding what the desired outcome is.

1. DEFINE THE DESIRED OUTCOME

In a business context, this is usually a P&L impact. However, in IoMT we are usually talking about patient outcomes, which could mean:

  • Ensuring patients adhere to care plans 
  • Hospitals improving their patient tracking accuracy
  • Reducing bed stays 

The outcomes are endless when it comes to how IoMT can transform healthcare data, from traditional modes of in-person visits or point-in-time data recording to constant remote monitoring of data and proactive responses to potential data outliers.

It is critical, however, that the desired outcome is defined at the outset. Only then can you accurately discuss how you want to influence behaviour in order to achieve the outcome.

2. WHAT AND WHOSE BEHAVIOUR NEEDS TO CHANGE?

Whether you want to improve sales, reduce costs or improve patient outcomes (as defined in step 1), you need to understand that someone’s behaviour must change in order to deliver that outcome.

The second step in our IoMT framework defines that change and considers how it will be achieved. The entire data industry tends to operate on an assumption of “build it and they will come”: that simply delivering new and better information will mean people act upon it.

However, we know that humans are creatures of habit, and behavioural change is more complex than the simple delivery of new information. This has been particularly well documented in the health sciences, from smoking cessation to adherence to treatment regimens for deadly diseases.

Empowering patients through creative freedom and regular feedback not only provides guidance and insight into their progress, but also lets them adjust their approach accordingly. For example, if a digital healthcare app pings you that you’re falling behind on your weight-loss goals, you can explore different options by browsing its list of diets and meal plans.

DMI works with the Fogg model of behavioural change, where we consider the intersection of: 

  • Motivation
  • Ability
  • Prompt

If this is not considered in your project, then you may successfully deliver IoMT data to a platform and conduct data analysis upon it, but if that information is ignored, your project will fail!
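The Fogg model can be illustrated with a toy calculation: a prompt only converts into action when motivation and ability together clear an action threshold. The scores and threshold below are hypothetical, purely for illustration of the idea, not a clinical scoring method.

```python
# Toy sketch of the Fogg behaviour model (B = MAP): a prompt converts
# into action only when motivation x ability clears an action threshold.
# The scores and the threshold here are hypothetical.

def prompt_succeeds(motivation: float, ability: float, threshold: float = 0.5) -> bool:
    """Return True if a prompt is likely to trigger the behaviour."""
    return motivation * ability >= threshold

# A motivated patient facing an easy task acts on the prompt...
print(prompt_succeeds(motivation=0.9, ability=0.8))  # True
# ...but if the task is hard, the same prompt is likely ignored.
print(prompt_succeeds(motivation=0.9, ability=0.2))  # False
```

The practical takeaway matches the framework: raising ability (making the desired action easier) can matter as much as raising motivation, and a prompt alone achieves nothing without both.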

3. WHAT NEW INFORMATION IS REQUIRED TO CHANGE BEHAVIOUR?

For the purposes of this article we are going to stick with the data challenge, and so while motivation, ability and the delivery mechanism of new information are critical (as described above), we also have to ask what new information is to be delivered.

We need to know what the new information is that will change the user behaviour (if it is delivered correctly), so that our project will realise the desired outcome. 

This step is also critical to ‘right size’ your project. We do not need a complex (and expensive) deep learning model looking at everything. This part of the framework helps define the complexity of the analysis required. 

4. WHAT ANALYSIS IS REQUIRED TO DELIVER THE NEW INFORMATION?

From descriptive analytics to predictive and prescriptive, there are layers of analysis.

In terms of methodology, we can range from presentation of historical data and rules-based engines, through machine learning regression analysis, to deep learning models, incorporating the full arsenal of data science methodology.

Only with the answers to the above steps in hand can data scientists realistically start to appropriately design the project. 
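Those layers can be sketched on a hypothetical series of daily heart-rate readings: a descriptive summary, a rules-based alert, and a simple least-squares trend standing in for the predictive layer. The readings and the alert threshold are invented for illustration.

```python
# Hedged sketch of the layers of analysis on hypothetical daily
# heart-rate averages (bpm). Values and thresholds are invented.
from statistics import mean

readings = [72, 74, 73, 78, 81, 85, 88]

# Descriptive: summarise what has happened.
avg = mean(readings)

# Rules-based: a simple threshold engine on the latest reading.
alert = readings[-1] > 85

# Predictive: least-squares slope to extrapolate tomorrow's reading.
n = len(readings)
xs = range(n)
slope = (n * sum(x * y for x, y in zip(xs, readings)) - sum(xs) * sum(readings)) / (
    n * sum(x * x for x in xs) - sum(xs) ** 2
)
intercept = mean(readings) - slope * mean(xs)
forecast = slope * n + intercept

print(round(avg, 1), alert, round(forecast, 1))  # 78.7 True 89.9
```

Each layer answers a different question (what happened, should we act now, what happens next), and only the last steps warrant heavier machinery, which is exactly the right-sizing point above.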

5. WHAT DATA IS REQUIRED TO ENABLE THE ANALYSIS?

The data challenge is on two fronts: the immediacy of data being pulled from devices, and the integration of third-party data for the analysis.

5.1. Streaming big data can get very expensive, so if it is not going to be processed in real time, there is little point. Depending on the desired outcome in step 1, it is possible that streaming is utterly essential in IoMT in order to issue alerts that may change a behaviour and shape the desired outcome. However, it is likely that not all data needs to be received this way. Even data that is required at this velocity may not need to be streamed in its entirety, if transmission of the delta (the change in the data) is all that is required.

This last point requires consideration at the device design and engineering stage. 
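A minimal way to realise delta transmission at the device is a change filter: a reading is sent only when it differs from the last transmitted value by more than a tolerance. This is a sketch under assumed values; the sensor readings and the tolerance are hypothetical.

```python
# Minimal sketch of delta transmission: send a reading only when it
# differs from the last transmitted value by more than a tolerance.
# Sensor values (body temperature, C) and tolerance are hypothetical.

def delta_filter(readings, tolerance=0.5):
    """Yield only readings that differ enough from the last sent value."""
    last_sent = None
    for value in readings:
        if last_sent is None or abs(value - last_sent) > tolerance:
            yield value
            last_sent = value

sensor = [36.6, 36.6, 36.7, 37.3, 37.3, 38.1, 38.1]
sent = list(delta_filter(sensor))
print(sent)  # [36.6, 37.3, 38.1] - only meaningful changes go over the network
```

Seven raw readings collapse to three transmissions, which is the economic point: the platform still sees every clinically meaningful change while bandwidth and streaming cost shrink.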

5.2. No human is immune to their environment, and therefore third-party data sets should be considered in any IoMT program, including weather (temperature, air pressure, wind velocity) and socio-economic data for the region in which the IoMT device operates (subject to data privacy laws, and with the analytical models tuned so that getting this data wrong does not massively sway the recommendation).

6. WHAT TECHNOLOGY IS REQUIRED TO HOST ALL OF THE ABOVE?

As addressed in step five, before taking a ‘stream everything’ approach it is practical to understand the role that each data set plays in the overall program, ensuring that the project doesn’t incur unnecessary cloud expense and engineering.

Each data set should be considered as its own use case, understanding the value it brings to the project and the impact of it being streamed versus uploaded periodically in a batch process (and indeed the period of those batch processes).

A Framework to Leverage IoMT Data

Leveraging IoMT data requires a well-defined and measured approach.

Here at DMI, our innovative digital solutions are able to seamlessly connect devices and software to deliver a complete patient data ecosystem. We enable devices, develop and integrate applications, and connect the back-end databases needed to support a true Internet of Medical Things: An infrastructure of health systems and services that improves patient outcomes while reducing the burden and costs of healthcare.

Contact us to learn more. 

