The Real Limit of AI: It’s Not the Models—It’s Your Data, Culture, and Expectations
“De l’autre côté du miroir” - By Joseph Géoui


In today’s corporate world, it’s almost a given that #AI will be mentioned in every investor call. Not discussing your AI strategy might even make you seem like you’re falling behind. But here’s the reality: many organizations are enamored with shiny new tools, searching for problems to apply them to. As the saying goes, “If all you have is a hammer 🔨, everything looks like a nail.”

This is especially evident in the pharmaceutical industry, where the key metrics are staggering, and not in a good way. Roughly 95% of drug development projects fail, it takes around 12 years to bring a new drug to market, and costs range from $2.5 billion (the industry average) to $6.5 billion for the largest pharma companies. Adjusted for inflation, development costs have increased roughly 80-fold over the past 50 years. Technological advancements were supposed to streamline these processes; instead, they’ve often made the problems worse.

How so?

Technical and scientific developments have exponentially increased the quantity of #data in an industry already drowning in it. Yet, the industry hasn’t paused to think,

“If we’re producing more data, perhaps we should find better ways to handle it.”

Boldly put, most pharma companies don’t have a data culture. And without a strong data foundation, implementing AI is like asking a co-pilot to fly a plane ✈️ through thick fog without radar. How’s that for a ride?

Then there’s the cultural aspect. Scientists are skeptical 🧐 by nature; that’s how they’re trained to think. New solutions are often met with that skepticism, sometimes to the point where the focus lands solely on the shortcomings of new AI tools while their potential is overlooked. AI adoption isn’t treated as a new experiment that needs many iterations before it actually works.

So, what’s the way forward?

Understand What AI Is Good at Today

  • AI excels at repetitive tasks involving large quantities of data: analyzing, comparing, combining, and summarizing faster and more accurately than any human (see the sketch after this list).
  • However, it’s not yet ready to predict complex outcomes like the toxicity of new compounds in animals and their translation to humans. While it’s important to invest in these areas for the future, don’t expect them to be production-ready now.
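
To make the first point concrete, here is a minimal sketch in Python of the kind of batch summarization an LLM already handles well: reading a folder of plain-text study reports and producing short summaries. It assumes the OpenAI Python client (v1+) with an API key in the environment; the folder name, model name, and prompt are placeholders, not recommendations.

```python
# Hedged sketch: batch-summarize plain-text reports with an LLM.
# Assumes: `pip install openai`, OPENAI_API_KEY set, and a hypothetical reports/ folder.
from pathlib import Path

from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment


def summarize(text: str) -> str:
    """Ask the model for a short, structured summary of one report."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system", "content": "Summarize the following study report in 5 bullet points."},
            {"role": "user", "content": text},
        ],
    )
    return response.choices[0].message.content


for report in Path("reports").glob("*.txt"):
    print(f"--- {report.name} ---")
    print(summarize(report.read_text()))
```

The specific API doesn’t matter; what matters is that the task is high-volume, repetitive, and well specified, which is exactly where today’s models shine.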

Find the Right Problems

  • Identify tasks where AI can add immediate value, such as automating repetitive processes (as mentioned above ☝️). Trust me, your organization is full of them (who has never copy/pasted from one Excel document to another? 🤯). A sketch of that kind of automation follows this list.
  • Today’s AI, especially Large Language Models (#LLMs), is particularly effective at these tasks.
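
As a concrete illustration of the Excel chore above, here is a minimal sketch assuming pandas (with openpyxl) is installed. The folder, file names, and column names are hypothetical; the point is that the copy/paste loop disappears.

```python
# Hedged sketch: consolidate columns from several source workbooks into one sheet.
# Assumes: `pip install pandas openpyxl` and a hypothetical site_reports/ folder of .xlsx files.
import glob

import pandas as pd

frames = []
for path in glob.glob("site_reports/*.xlsx"):
    df = pd.read_excel(path)          # read one source workbook
    df["source_file"] = path          # keep track of where each row came from
    frames.append(df[["study_id", "site", "enrolled", "source_file"]])  # hypothetical columns

consolidated = pd.concat(frames, ignore_index=True)
consolidated.to_excel("consolidated_enrollment.xlsx", index=False)
```

An LLM can draft this kind of glue script in seconds, which is often the quickest way to demonstrate immediate, visible value.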

Invest in Your Data

  • Adopt #FAIR data principles: make your data Findable, Accessible, Interoperable, and Reusable (a minimal illustration follows this list).
  • If you’re not familiar with FAIR, bring in external expertise to help get your data in shape. Without good data, even the best AI models can’t deliver meaningful results.
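
For readers new to FAIR, here is one minimal, illustrative way to move a dataset in that direction: publish a machine-readable metadata sidecar next to it. The field names and values below are hypothetical and not tied to a specific standard; a real implementation would typically follow an established schema such as DataCite or schema.org/Dataset.

```python
# Hedged sketch: write a metadata sidecar that touches all four FAIR dimensions.
# All identifiers, URLs, and values below are placeholders.
import json

metadata = {
    "identifier": "doi:10.xxxx/placeholder",             # Findable: persistent identifier
    "title": "Assay results, compound series A",
    "access_url": "https://data.example.org/assay-a",    # Accessible: clear route to the data
    "format": "text/csv",                                 # Interoperable: open, standard format
    "vocabulary": "shared ontology terms (e.g. ChEBI)",   # Interoperable: common vocabulary
    "license": "CC-BY-4.0",                               # Reusable: explicit usage terms
    "provenance": "exported from the LIMS, placeholder date",  # Reusable: origin of the data
}

with open("assay_a.metadata.json", "w") as f:
    json.dump(metadata, f, indent=2)
```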

Drive Cultural Change

  • Start small and demonstrate value through pilot projects.
  • Create internal champions and evangelists who can advocate for broader adoption.
  • Embrace the iterative nature of implementing new technologies.

In conclusion, the limit of AI isn’t the models themselves. It’s the quality of your data (quantity and diversity matter too … but without quality they’re a non-starter!), the readiness of your organization, and the alignment of your expectations. Fixing data problems isn’t glamorous, but it’s essential. By focusing on these areas, we can truly unlock the transformative potential of AI.

What are your thoughts on building a data culture and aligning expectations for AI adoption? I’d love to hear your experiences!

Holly Lynch

Life Sciences Leader Seeking Strategy & Advisory/BOD Opportunities in AI, Robotics, cloud lab & Digital ventures in the space. Expertise in Driving Growth & Innovation. Interested in driving 2.0 for Contract Research.

3mo

Thibault - I 100% agree with you. I’ve tried many things to motivate organizations to understand how the underlying data impacts their ability to move quickly on their digital ambitions. I’d be curious to hear from others on how our industry can address this. It will hold us back from making step improvements with these technologies.

Bartlomiej Przyborowski

Partnership Director RWE/RWD | Clinical trials | CRO | Digital Healthcare

3mo

🤖 🧠 Thibault GEOUI 🧬 💊 I totally agree with you! AI models are getting better and better, BUT it turns out the data they’re being fed is unstructured, full of gaps, and either lacks standardization or has too many competing standards. At CliniNote I’m on the side of „help to create quality data first” :)))

Frédéric Célerse, PhD

Research scientist in AI for Chemistry

3mo

Thanks for highlighting this. While this has been known in the scientific community for 10 to 20 years, it remains neglected by most of the people who want to use « AI » but don’t really know what it is 🙄☺️
