The Real Limit of AI: It’s Not the Models—It’s Your Data, Culture, and Expectations
In today’s corporate world, it’s almost a given that #AI will be mentioned in every investor call. Not discussing your AI strategy might even make you seem like you’re falling behind. But here’s the reality: many organizations are enamored with shiny new tools, searching for problems to apply them to. As the saying goes, “If all you have is a hammer 🔨 , everything looks like a nail.”
This is especially evident in the pharmaceutical industry, where the key metrics are staggering—and not in a good way. Projects fail 95% of the time, it takes around 12 years to launch a drug, and costs can range from $2.5 billion (industry average) to $6.5 billion for top pharma companies. Adjusted for inflation, these figures have increased by 80x over the past 50 years. Technological advancements were supposed to streamline processes, but instead, they’ve often made problems worse.
How so?
Technical and scientific developments have exponentially increased the quantity of #data in an industry already drowning in it. Yet, the industry hasn’t paused to think,
“If we’re producing more data, perhaps we should find better ways to handle it.”
Boldly put, most pharma companies don’t have a data culture. And without a strong data foundation, implementing AI is like having a co-pilot flying a plane ✈️ through thick fog without radar—how’s that for a ride?
Then there’s the cultural aspect. Scientists are skeptical 🧐 by nature—that’s how they’re trained to think. New solutions are often met with resistance, sometimes to the point where the focus falls solely on the shortcomings of new AI tools, overlooking their potential. Few scientists view AI as a “new experiment that needs many iterations before it actually works.”
So, what’s the way forward?
Understand What AI Is Good at Today
Find the Right Problems
Invest in Your Data
Drive Cultural Change
In conclusion, the limit of AI isn’t the models themselves—it’s the quality of your data (quantity and diversity are important too … but without quality, they’re a non-starter!), the readiness of your organization, and the alignment of your expectations. Fixing data problems isn’t glamorous, but it’s essential. By focusing on these areas, we can truly unlock the transformative potential of AI.
What are your thoughts on building a data culture and aligning expectations for AI adoption? I’d love to hear your experiences!
Life Sciences Leader Seeking Strategy & Advisory/BOD Opportunities in AI, Robotics, cloud lab & Digital ventures in the space. Expertise in Driving Growth & Innovation. Interested in driving 2.0 for Contract Research.
Thibault - I 100% agree with you. I’ve tried many things to motivate organizations to understand how the underlying data impacts how quickly they can deliver on their digital ambitions. I’d be curious to hear from others on how our industry can address this. It will hold us back from making step improvements with these technologies.
Partnership Director RWE/RWD | Clinical trials | CRO | Digital Healthcare
🤖 🧠 Thibault GEOUI 🧬 💊 I totally agree with you! AI models are getting better and better, BUT it turns out the data they’re being fed is unstructured, full of gaps, and either lacking standardization or burdened with too many competing standards. At CliniNote I’m on the side of “help to create quality data first” :)))
Research scientist in AI for Chemistry
Thanks for highlighting this. While this has been known for 10–20 years in the scientific community, it remains neglected by most of the people who want to use “AI” but don’t know what it really is 🙄☺️