The promise of an AI-powered future in healthcare
I spent the morning watching thought-leadership videos about the promise of AI in drug discovery. The videos were about a year old, mostly produced by partnerships between pharmaceutical and tech companies, and they showed how expanding trials and data sets from hundreds to hundreds of thousands of people, courtesy of the reach and scale of AI, would perfect the models and open up a whole new world in which researchers could truly flex their mental muscles and discoveries would become imminent.
Three things. First, it's clear to me that what AI is today is really an expanded data set with a machine that can search it rapidly and find patterns at a scale and speed people can't match. At bottom, it's a very fast Ctrl-F matching process.
Second, by expanding the data set from a few hundred to a few hundred thousand, all they've done is go from an r value of 0.9 to 0.91 or 0.95. Better, yes, but when you're talking about discovery, the incremental gain is a fallacy. You're nowhere near the stage of putting medicine into patients, so an extra two or three percent of accuracy is meaningless at best and an inhibitor at worst, because it just produces more precise noise.
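To put those numbers in rough perspective, here's a quick back-of-the-envelope sketch (assuming the r scores cited in the videos are Pearson correlations, which they never actually specify):

```python
# Back-of-the-envelope: how much "better" the bigger data set gets you,
# assuming the cited r scores are Pearson correlations.
for r in (0.90, 0.91, 0.95):
    print(f"r = {r:.2f}  ->  variance explained (r^2) = {r**2:.1%}")
```

However you slice it, the model is just describing the same population-level pattern a little more precisely; it isn't discovering anything new.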
Third, and most important: it's clear to me that what's needed to advance AI-powered discovery isn't more processing or more data. These software-driven data models are built simply because "we have the data," not because they're structured in a way that actually helps science move forward.
We need to inject humanistic, liberal-arts thinking that appreciates data for what it is within the power of machine-driven statistical models. Thinking that has ideas. That can ask questions and shape models around the ideas that really drive discovery, focus on what will matter, bring together very unlike things, step past rabbit holes, and get to the point. Or at least call bullshit on self-aggrandizing, data-for-data's-sake insights.
When we have a hammer and get good at hammering, everything in our labs or markets starts to look like a nail. Instead of even more AI, we need to build teams, in laboratories just as in the halls of marketing, that integrate hard-core data science with humanistic thinking. That is the only way to teach AI-powered anything to be human, to help it care, to think about the right routes, and to actually generate something new that matters, not just do massive copy-paste jobs at scale.
I'm not at all anti-tech or anti-data; you need to understand both to get to this point and to be ready to really uncork the bottle and deliver something meaningful, not just the most precise level of average.