Accelerating modernization

In today’s rapidly evolving digital landscape, modernization has become a critical imperative for organizations across industries. Faced with growing demands for agility, resilience, and innovation, businesses can no longer afford to rely on outdated systems or approaches and must embrace transformation to thrive amidst constant change.

For organizations looking to turn modernization from a concept into reality, it’s not just about upgrading technology — it’s about transforming core capabilities and future-proofing the entire business. In this edition of Tech to Know, we’ve curated essential insights and resources to help leaders confidently navigate this complex journey. 

[Chart] CEOs’ top IT priorities in 2024 (Source: Foundry / CIO.com)

In our latest Perspectives edition, Ashok Subramanian, Shodhan Sheth and Tom Coggrave dive deep into the nuanced challenges and opportunities that modernization presents in today’s digital-first economy, exploring strategies to help companies not only maximize ROI on their transformation investments but also lay the foundation for long-term innovation.

Here are some of the key takeaways:

➡️ Evolving architectures for agility: Understand why traditional architectures often fall short in today’s fast-paced digital landscape, and learn how to adopt more flexible, adaptive solutions that support continuous delivery and faster time-to-market.

➡️ Data-driven decision-making: Explore the pivotal role data plays in driving modernization, empowering organizations to make informed decisions and rapidly respond to change.

➡️ Cultural transformation: Modernization isn’t just about technology — it’s about people. Our experts share insights on fostering a culture that embraces change, promotes cross-functional collaboration, and strengthens digital resilience. Read more.

For more practical insights like these, delivered straight to your inbox, don’t forget to subscribe to Perspectives. Stay up to date with the latest trends, strategies, events, and expert advice that will empower you to navigate organizational change with confidence.


Leveraging AI for legacy modernization

Making the most of AI in the modernization process

Outdated systems can often stall progress, but advancements in AI are opening new pathways for modernization. In this insightful episode of Pragmatism in Practice, Erik Doernenburg, CTO Europe, delves into how AI-driven tools, especially large language models (LLMs), are transforming the legacy modernization process. By harnessing the power of LLMs, businesses can streamline software engineering, improve productivity, and enhance user experience.

Through practical examples, Erik highlights how LLMs excel at analyzing and interpreting code—enabling software teams to quickly understand complex, outdated codebases and even infer the original intent behind legacy code. This capacity for "explanation and intent capture" allows developers to more efficiently map a modernization path, ultimately speeding up transformation efforts and reducing reliance on hard-to-find expertise.
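To make the idea concrete, here is a minimal sketch of what "explanation and intent capture" can look like in practice: handing an LLM a fragment of legacy code and asking it for both a behavioral summary and the business intent. This assumes the OpenAI Python client and uses a made-up COBOL fragment; a real pipeline would add chunking, retrieval over the wider codebase, and human review.

```python
# Minimal sketch: asking an LLM to explain legacy code and capture its intent.
# Assumes the OpenAI Python client (pip install openai) and an OPENAI_API_KEY
# in the environment. The COBOL fragment is an illustrative example, not from
# any real system discussed in the podcast.
from openai import OpenAI

client = OpenAI()

LEGACY_SNIPPET = """
       IF CUST-BAL > CREDIT-LIMIT
           MOVE 'H' TO ACCT-STATUS
           PERFORM 9000-WRITE-AUDIT-REC
       END-IF.
"""

response = client.chat.completions.create(
    model="gpt-4o",  # any capable model works; this choice is illustrative
    messages=[
        {
            "role": "system",
            "content": (
                "You are helping modernize a legacy COBOL system. "
                "Explain what the code does, then state its likely "
                "business intent in one sentence."
            ),
        },
        {"role": "user", "content": LEGACY_SNIPPET},
    ],
)

print(response.choices[0].message.content)
```

The same prompt pattern can be run across an entire codebase to build a searchable catalog of explanations, which is where the productivity gains Erik describes start to compound.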

🎧 Listen now ⤵️


Legacy modernization – A transformation opportunity

From left to right: Omar Bashir, Principal Consultant; Shodhan Sheth, Global Head of Enterprise Modernization; and Luke Vinogradov, Head of Digital Transformation, UK

Modernizing legacy systems can be daunting, yet it also presents a unique opportunity to transform your enterprise. Thoughtworks’ E-Book, Legacy modernization: A transformation opportunity, offers a deep dive into strategies to evolve legacy infrastructures and embrace new technologies that enable agility and resilience. With a focus on value-driven modernization, this guide equips leaders with frameworks to assess, prioritize, and implement modernization initiatives, ensuring that legacy assets become enablers, rather than obstacles, to growth.

Download the e-book, authored by Omar Bashir, Shodhan Sheth and Luke Vinogradov.


Join us at Amazon Web Services (AWS) re:Invent for an exclusive roundtable on AI-powered modernization.

🗓️ On December 3rd, from 12:00 pm to 2:00 pm PST, industry experts will explore how generative AI is transforming modernization efforts, enhancing decision-making, and predicting potential challenges. Gain practical insights from successful case studies and learn how AI can support skill development, drive cultural change, and maximize ROI in your modernization initiatives.

Spaces are limited — don’t miss this opportunity to engage with industry experts and innovators. Register now to secure your spot!


Hope you enjoyed this edition of Tech to Know. Read all past editions and subscribe for more.

Alex Benjamin

Founder @MoMapping | Generative Artificial Intelligence | AI Agents Engineering | Python | Devops | Cloud | Mainframe Modernization Leader

1mo

Great podcast! I think the greatest problem with legacy code is capturing all the interrelationships in the legacy environment. So there must be comprehensive knowledge not only of the code, but of the environment too. For example, in a mainframe context, the LLM must be aware of the scheduling tool (TWS/JobTrac/ChangeMan), the job sequence order and its interdependencies. Additionally, it must be aware of the JCL and its resources: for example, for each dataset, its name and number, so it can be mapped inside the program, especially if you are modernizing Natural programs. Using reengineering tools is a smart choice, but it can become a headache when working without integration of contexts or across different levels of abstraction. To solve this problem, a unified parsing strategy could be an ideal solution, feeding the RAG/GraphRAG pipeline with all relevant context, so the LLM would be able to generate more accurate responses.
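(As a rough illustration of the unified-context idea described above: the sketch below merges hypothetical parser outputs — job schedule, JCL dataset mappings, and program source — into one document per program before indexing it for a RAG pipeline. All names, data shapes, and the helper method are assumptions for the example, not output of any real reengineering tool.)

```python
# Illustrative sketch only: merging hypothetical parser outputs into a single
# context document per program, ready for embedding into a RAG/GraphRAG index.
# The structures below are invented for the example.
from dataclasses import dataclass, field

@dataclass
class ProgramContext:
    name: str
    source: str                                        # e.g. Natural or COBOL source
    jcl_datasets: dict = field(default_factory=dict)   # DD name -> dataset name
    schedule: list = field(default_factory=list)       # upstream jobs (e.g. from TWS)

    def to_document(self) -> str:
        """Flatten code plus environment into one chunk for indexing."""
        deps = ", ".join(self.schedule) or "none"
        dds = "\n".join(f"  {dd}: {dsn}" for dd, dsn in self.jcl_datasets.items())
        return (
            f"PROGRAM {self.name}\n"
            f"Upstream jobs: {deps}\n"
            f"JCL datasets:\n{dds}\n"
            f"Source:\n{self.source}"
        )

ctx = ProgramContext(
    name="PAYRUN01",
    source="DEFINE DATA LOCAL ... END-DEFINE  (truncated example)",
    jcl_datasets={"INFILE": "PROD.PAYROLL.INPUT(+0)"},
    schedule=["DAILY-EXTRACT", "GL-POST"],
)

# In a real pipeline this document would be embedded and stored in a vector
# (or graph) index, so retrieval gives the LLM code *and* environment together.
print(ctx.to_document())
```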

Giulia Solinas, Ph.D.

Data Scientist | Working on ML and LLMs | Mum

1mo

It's a very insightful podcast. The triangulation between reverse-engineering tools, the RAG pipeline, and the call to the LLM is an excellent approach. The first step, with the reverse-engineering tools parsing old legacy code, can really nail down the "context" of the code with its dependencies, and this can be a game-changing insight for the LLM. Could you share some insights on how you use smaller models, as hinted at in the last part of the podcast?
