Everything begins with Data but AI needs to be local

AI in healthcare is software + services!

sa
Aug 22, 2022

Approximately 30% of the world's data comes from healthcare.

US healthcare generates roughly 1.2 billion clinical documents a year.

This volume is growing by 48% every year.

80% of this data is unstructured

There is a firehose of data coming in via 'omics, streaming sources, and social determinants of health (SDoH).

Then there is user-generated data (wearables, surveys, …).

Now add claims data, provider data, and credentialing records to the mix.

Getting to a clean, complete, and validated data lake is a multi-step challenge at every level (a rough sketch of one validation pass follows this list):

- For the health system

- For the network

- For the patient

- For the cohort
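
To make the "multi-step" part concrete, here is a minimal sketch of what a single validation pass could look like before a record lands in the lake. It is purely illustrative: the field names, the hard-coded directory lookup, and the specific checks are assumptions, not a real schema or pipeline.

    # Illustrative only: field names, the directory stand-in, and the checks are assumptions.
    from datetime import date

    def validate_patient_record(record: dict) -> list:
        """Return a list of data-quality issues found for one patient record."""
        issues = []

        # Completeness (health-system level): core identifiers must be present.
        for field in ("patient_id", "provider_npi", "encounter_date"):
            if not record.get(field):
                issues.append("missing required field: " + field)

        # Validity (network level): provider must resolve against a provider directory.
        known_providers = {"1234567890", "9876543210"}  # stand-in for a real directory lookup
        if record.get("provider_npi") and record["provider_npi"] not in known_providers:
            issues.append("provider_npi not found in provider directory")

        # Consistency (patient level): encounter date cannot be in the future.
        if record.get("encounter_date"):
            if date.fromisoformat(record["encounter_date"]) > date.today():
                issues.append("encounter_date is in the future")

        return issues

    record = {"patient_id": "P001", "provider_npi": "1234567890", "encounter_date": "2022-08-01"}
    print(validate_patient_record(record))  # [] means clean enough to load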

This is the task at hand. If we fix this, we can begin fixing the interoperability and cost-curve challenges:

  • Eliminating error-ridden provider directories

  • Reducing redundant testing

  • Automating prior authorizations and reducing denials/delays

  • Providing faster access to care (e.g., a 34% reduction in time to begin care in specific cases such as MS)

  • Surfacing richer insights at the point of care

  • Building better AI algorithms

  • Enabling faster automation (since systems can talk to each other more easily)

Given the amazing potential that smart use of data can deliver in health and care, what's stopping us?

AI is local. 

Actually, valuable, differentiated AI is local. 

This emerges from a poorly recognized fact: key areas of healthcare, such as clinical care, payments, operations, and staffing, are very local and regional.

  1. What mix of care elements results in great outcomes for hypertension (HTN) in your market?

  2. How does OR scheduling happen in the hospitals of that region, driven by patient behavior, staff dynamics, case mixes, weather, and other factors?

  3. What are the payor terms and network models in play in the region?

These factors, and many more like them, make the difference between a model that supercharges performance and one that becomes distracting noise.

The things that make a difference in that local market need to be factored into decision support for the models to be valuable and differentiated.
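
As a toy illustration of why "local" matters, the sketch below trains a generic seed classifier on pooled data and then continues training it on a small local dataset where outcomes depend on a different mix of factors. All of the data is synthetic, and the scikit-learn setup is only one assumed way to do this; it sketches the pattern of adapting a shared model to local signal, not a real clinical model.

    # Synthetic illustration: a generic "seed" model vs. the same model adapted to local data.
    import numpy as np
    from sklearn.linear_model import SGDClassifier

    rng = np.random.default_rng(0)

    # Pooled, "national" data the seed model is trained on.
    X_national = rng.normal(size=(5000, 6))
    y_national = (X_national[:, 0] + X_national[:, 1] > 0).astype(int)

    # Local data, where outcomes hinge on a different mix of factors
    # (stand-ins for staffing patterns, payor terms, patient behavior in one market).
    X_local = rng.normal(size=(600, 6))
    y_local = (X_local[:, 2] - 0.5 * X_local[:, 3] > 0).astype(int)
    X_tr, y_tr = X_local[:500], y_local[:500]
    X_te, y_te = X_local[500:], y_local[500:]

    model = SGDClassifier(loss="log_loss", random_state=0)
    model.partial_fit(X_national, y_national, classes=np.array([0, 1]))
    print("seed model on local test data:", round(model.score(X_te, y_te), 2))

    # Continue training the same model on local data so local signal dominates.
    for _ in range(20):
        model.partial_fit(X_tr, y_tr)
    print("locally adapted model:       ", round(model.score(X_te, y_te), 2))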

Pre-trained models are fine and dandy for scaled areas such as procurement but not for the areas we discussed.

For hospitals and other frontline care firms (payors, etc.), these AI models are the lifeblood. The uplift underway in the sector to integrate data sources, make them talk to each other, and surface useful datasets is non-trivial at the health-system level. All of that effort is supposed to lead to thoughtful, relevant development of ML models that enable care delivery, operations, and financing for that health system or payor. This is mission-critical stuff and should not be bought out of the box.

There is already emerging proof in the metric that matters most: gross margins for AI startups in health. An a16z analysis shows that many AI companies have:

1. Lower gross margins due to heavy cloud infrastructure usage and ongoing human support;

2. Scaling challenges due to the thorny problem of edge cases; and in healthcare, the edge is BIG (my addition);

3. Weaker defensive moats due to the commoditization of AI models and challenges with data network effects.

Anecdotally, we have seen a surprisingly consistent pattern in the financial data of AI companies, with gross margins often in the 50-60% range – well below the 60-80%+ benchmark for comparable SaaS businesses.

So what does this mean? To win, AI in healthcare should be a software + services model, not pure SaaS.

There are, and will be, areas where pre-trained models fit well and should be used. But for the things that matter, a better approach is to own the models. These are the moats, the secret sauce, that will make health systems and other health firms hum.

Now combine this with a clear trend among health systems toward a small group of trusted partners who have the expertise and stewardship to aggregate point solutions, work with health firms, and help build this critical capability in-house.

That’s where I think the future will lie (a toy sketch of this split follows the list):

  • Owned data lakehouses and engineering capabilities (a mix of in-house and trusted service-provider partners)

  • Pre-trained models deployed with minimal tuning/fitting where relevant and valuable

  • Owned models, developed from a seed model or built from the ground up, together with the trusted partner ecosystem.
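
To make the split concrete, here is a hypothetical "model strategy" map a health system might maintain, pairing each use case with where its model comes from. Every use case and label is made up for illustration.

    # Hypothetical strategy map; all use cases and labels are illustrative assumptions.
    MODEL_STRATEGY = {
        # Scaled, commodity areas: pre-trained models, minimal tuning.
        "procurement_forecasting": {"source": "pre-trained", "tuning": "minimal"},
        "document_classification": {"source": "pre-trained", "tuning": "minimal"},

        # Local, differentiating areas: owned models built on the owned lakehouse.
        "or_scheduling":        {"source": "owned", "built_with": "trusted partner"},
        "denial_prediction":    {"source": "owned", "built_with": "in-house team"},
        "htn_care_pathways":    {"source": "owned", "built_with": "seed model + in-house"},
    }

    for use_case, plan in MODEL_STRATEGY.items():
        print(use_case, "->", plan["source"])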

We have to keep in mind the very real operational and financial challenges of CloudOps/MLOps as in-house AI efforts increase.
