Snowflake’s New Cloning Optimization Explained

Thank you for reading my latest article, Snowflake’s New Cloning Optimization Explained.

Here at LinkedIn I regularly write about modern data platforms and technology trends. To read my future articles, simply join my network here or click 'Follow'. Also feel free to connect with me on YouTube.


If you’re familiar with Snowflake, you probably already appreciate zero-copy cloning. It’s that magical feature allowing you to create fast, independent copies of databases, schemas, and tables without physically duplicating data. This capability has been indispensable for developers needing test or development environments without the cost or time required for full data replication.

But as organizations scale, the sheer volume of metadata involved can transform fast cloning into a bottleneck. What used to take seconds can start taking minutes—or worse. That’s where Snowflake’s latest optimization steps in, making a significant impact on cloning performance for even the largest databases.

A Quick Recap on Cloning

Zero-copy cloning works by duplicating metadata, not the data itself. Both original and cloned objects reference the same underlying data files, allowing for independent modifications moving forward. This unique approach minimizes storage usage and boosts speed, crucial for CI/CD pipelines and rapid development cycles.
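
To make that concrete, here is a minimal sketch in Python using the snowflake-connector-python package. The connection parameters and the PROD_DB/DEV_DB names are placeholders I’ve made up for illustration, not anything specific to the new optimization.

```python
# Minimal sketch: creating a zero-copy clone of a database via
# snowflake-connector-python. Connection details and the PROD_DB/DEV_DB
# names below are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account_identifier",
    user="your_user",
    password="your_password",
    role="SYSADMIN",
    warehouse="DEV_WH",
)

try:
    cur = conn.cursor()
    # Only metadata is copied; both databases keep referencing the same
    # underlying storage until either side is modified.
    cur.execute("CREATE OR REPLACE DATABASE DEV_DB CLONE PROD_DB")
    print(cur.fetchone()[0])  # status message from Snowflake
finally:
    conn.close()
```

The same pattern works at schema and table level (CREATE SCHEMA ... CLONE ..., CREATE TABLE ... CLONE ...), which is exactly what makes it so handy for spinning up test environments.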

Video: How to use Cloning in Snowflake for beginners | Create test environments FAST! | Snowflake Tutorial

The New Optimization

The challenge comes when metadata grows from managing a few tables to thousands. Snowflake addressed this with an enhanced cloning process that parallelizes metadata copying by allocating more resources during the operation. The result? A dramatic reduction in execution time, even for metadata-heavy databases.

Performance Gains:

  • Small databases: Median execution time improved by 12%.
  • Medium databases: Saw a 41% faster cloning.
  • Large databases: Experienced an 82% reduction, with some users seeing a 7x faster clone time.

For instance, a healthcare provider reduced their schema cloning process from over 35 minutes to just five. This optimization was seamlessly rolled out across all Snowflake regions, providing users with better performance without any configuration changes.
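
If you want to see what the change means for your own account, one way (a sketch, not an official method) is to compare clone durations over time using the SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY view. The 30-day window and the simple ILIKE filter are arbitrary choices for illustration, and `conn` is the connection object from the earlier snippet.

```python
# Sketch: list recent clone statements and how long they took, using the
# SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY view (TOTAL_ELAPSED_TIME is in ms).
CLONE_TIMING_SQL = """
    SELECT start_time,
           total_elapsed_time / 1000 AS elapsed_seconds,
           query_text
    FROM snowflake.account_usage.query_history
    WHERE query_text ILIKE '%CLONE%'
      AND start_time >= DATEADD('day', -30, CURRENT_TIMESTAMP())
    ORDER BY start_time DESC
"""

cur = conn.cursor()
for start_time, elapsed_seconds, query_text in cur.execute(CLONE_TIMING_SQL):
    print(f"{start_time}  {elapsed_seconds:>8.1f}s  {query_text[:60]}")
```

Bear in mind that ACCOUNT_USAGE views lag real time and require the appropriate privileges, so treat this as a rough before-and-after comparison rather than a precise benchmark.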

Why This Matters

This isn’t just a technical tweak; it’s a quality-of-life improvement for data teams. Faster cloning ensures that developers can spin up test environments or snapshot data without disrupting their workflow. It helps maintain the agility that Snowflake promises, even as usage scales.
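
As a rough illustration of both use cases, the sketch below clones a schema for a disposable test environment and takes a point-in-time table snapshot with Time Travel. The schema, table, and database names are made up, and `conn` is again the connection from the first snippet.

```python
cur = conn.cursor()

# Spin up a disposable test environment from a production schema.
cur.execute("CREATE OR REPLACE SCHEMA PROD_DB.CI_RUN_42 CLONE PROD_DB.PUBLIC")

# Snapshot a table as it looked one hour ago using Time Travel.
cur.execute(
    "CREATE OR REPLACE TABLE ANALYTICS.SNAPSHOTS.ORDERS_1H_AGO "
    "CLONE ANALYTICS.PUBLIC.ORDERS AT (OFFSET => -3600)"
)

# Tear the test environment down once the pipeline has finished with it.
cur.execute("DROP SCHEMA IF EXISTS PROD_DB.CI_RUN_42")
```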

Final Thoughts

Snowflake has addressed what could have become a pain point as data operations scale, which helps maintain its commitment to a reliable, high-performance and easy-to-use cloud data platform. If you’re working with large datasets or running complex workflows, take advantage of these improvements for more efficient and cost-effective development cycles.

To stay up to date with the latest business and tech trends in data and analytics, make sure to subscribe to my newsletter, follow me on LinkedIn and YouTube, and, if you’re interested in taking a deeper dive into Snowflake, check out my books ‘Mastering Snowflake Solutions’ and ‘SnowPro Core Certification Study Guide’.


About Adam Morton

Adam Morton is an experienced data leader and author in the field of data and analytics with a passion for delivering tangible business value. Over the past two decades, Adam has accumulated a wealth of valuable, real-world experience designing and implementing enterprise-wide data strategies and advanced data and analytics solutions, as well as building high-performing data teams across the UK, Europe, and Australia.

Adam’s continued commitment to the data and analytics community has seen him formally recognised as an international leader in his field when he was awarded a Global Talent Visa by the Australian Government in 2019.

Today, Adam is dedicated to helping his clients to overcome challenges with data while extracting the most value from their data and analytics implementations. You can find out more information by visiting his website here.

He has also developed a signature training program that includes an intensive online curriculum, weekly live consulting Q&A calls with Adam, and an exclusive mastermind of supportive data and analytics professionals helping you to become an expert in Snowflake. If you’re interested in finding out more, check out the latest Mastering Snowflake details.
