Navigating Snowflake's Time Travel

Introduction

This excerpt, taken from my top-rated SnowPro Core Study Guide, explores Snowflake’s Time Travel feature, which lets you access historical data and easily recover dropped objects, both critical tools for managing data over time.

Time travel allows you to query data at a point in time, within the data-retention window, using SQL. This gives you the ability to view the exact state the data was in at a specific point in time. 

As with many features Snowflake provides, there’s nothing you need to do in the background to maintain these copies of data. You just need to define how long to keep the version of the data, which we’ll cover in the next section on data retention. By using some of the SQL extensions provided for time travel operations, you can move through versions of tables, schemas, and databases at various points in time. 

You can also use cloning and time travel in tandem, which allows you to create a clone of an object by specifying an earlier point in time. You can imagine how useful this would be when, for example, you need to compare a table of data from before and after a load process.
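As a sketch of that before-and-after comparison, a zero-copy clone can be combined with an AT clause to capture the pre-load state (the table name and one-hour offset here are illustrative):

```sql
-- Clone ORDERS as it looked one hour ago (offset is in seconds)
CREATE TABLE ORDERS_BEFORE_LOAD CLONE ORDERS
  AT (OFFSET => -3600);

-- Compare the clone with the current table to see what the load changed
SELECT * FROM ORDERS
MINUS
SELECT * FROM ORDERS_BEFORE_LOAD;
```

Because the clone shares the underlying micro-partitions, this comparison doesn’t duplicate storage until either table changes.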

Data-retention periods

To cater to the flexibility time travel offers, Snowflake maintains versions of data as they change over time. Think of this as an audit trail or ledger of transactions applied to the data. Each change creates a new version of that data in the background, and Snowflake keeps these versions for the length of the configured data-retention period.

In the Standard edition of Snowflake, the retention period is, by default, one day or 24 hours. As you can see in Table 9-1, it is possible to set time travel to 0, which is equivalent to disabling it, meaning historical data is no longer available to be queried. 

You must make the right choice from the outset. Extending the data-retention period from a lower to a higher number—for example, 0 to 1—doesn’t mean you’ll have access to that data immediately. In this instance, you’ll have to wait for a day to pass until you have a full day’s worth of data to access via time travel.

Table 9-1: Retention periods

Note: Make sure you understand the different default behaviors between the different editions as well as the retention periods for the types of tables.

To change the data-retention period, you can use the ACCOUNTADMIN role to set the value for the DATA_RETENTION_TIME_IN_DAYS parameter. Interestingly, you may use this parameter when creating a database, schema, or table to override the global default. This means if you have a small amount of business-critical data in your database (and you’re running the Enterprise edition or above), you could decide to set the retention period to 90 days for the objects holding that data while leaving all other objects at the default of one day. 
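As an illustration, the parameter can be set at the account level and then overridden for a specific database (SALES_DB is a hypothetical name; the 90-day setting requires Enterprise edition or above):

```sql
-- Set the account-wide default (requires the ACCOUNTADMIN role)
USE ROLE ACCOUNTADMIN;
ALTER ACCOUNT SET DATA_RETENTION_TIME_IN_DAYS = 1;

-- Override for a database holding business-critical data
CREATE DATABASE SALES_DB DATA_RETENTION_TIME_IN_DAYS = 90;
```

The same parameter works on CREATE SCHEMA and CREATE TABLE, with the most specific setting taking precedence.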

Note: It is important to remember that increasing the data-retention period will also increase storage costs. 

The following code snippet shows how to set the data-retention period at the point when you create a table and then amend it later:

CREATE TABLE ORDERS (ORDER_ID INT, PRODUCT_ID INT, ORDER_QUANTITY INT)
DATA_RETENTION_TIME_IN_DAYS = 60;

ALTER TABLE ORDERS SET DATA_RETENTION_TIME_IN_DAYS = 30;

Querying historical data

To query previous versions of objects, you can use the AT or BEFORE clauses. The specified point can be a timestamp, time offset (from the current point in time), or a previously executed statement, as the following examples demonstrate.

Querying table data at a specific point in time using a timestamp:

SELECT * 
FROM ORDERS AT(TIMESTAMP => '2022-07-07 15:22:11.00 -0700'::timestamp_tz);        

Querying table data 30 minutes ago using a time offset:

SELECT * 
FROM ORDERS AT(OFFSET => -60*30);        

Querying table data up to, but not including, any changes made by the specified statement:

SELECT * 
FROM ORDERS BEFORE(STATEMENT => '019db306-3200-7542-0000-00005db6d821');        

Dropping and undropping historical data

When an object is dropped and time travel is enabled, the data is not removed from the account. Instead, a version of the object is retained for the duration of the data-retention period.

This means if someone mistakenly drops an object, you can restore it by using the UNDROP command. There’s no need to concern yourself with locating and restoring backups in this situation!

You can list any dropped objects using the SHOW command along with the HISTORY keyword.

This code lists the history of all tables where the name is like ‘%TABLE_NAME’:

SHOW TABLES HISTORY LIKE '%TABLE_NAME';         

This code shows the history of schemas within the Sales database:

SHOW SCHEMAS HISTORY IN SALES_DB;        

This code shows the history of all databases in the Snowflake account:

SHOW DATABASES HISTORY;        

The result set includes all dropped objects; you’ll see multiple records if an object has been dropped more than once. The DROPPED_ON column shows the date and time at which each object was dropped.

Once you have found the object that you’d like to restore, you can run a statement like this one:

UNDROP DATABASE SALES_DB;        

This command will restore the Sales_DB database, including all the database objects and data, exactly as it was when it was dropped.
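One caveat worth knowing: UNDROP fails if an object with the same name already exists. A common workaround is to rename the current object out of the way first (the names here are illustrative):

```sql
-- A replacement SALES_DB already exists, so rename it before restoring
ALTER DATABASE SALES_DB RENAME TO SALES_DB_NEW;
UNDROP DATABASE SALES_DB;
```

After the restore you can compare the two databases and decide which version to keep.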

Fail-Safe

After the data-retention period associated with time travel ends, data cannot be viewed within your account. But that’s not the end of the story. There is one final resting point for your data, which you cannot see or access directly. For a further, non-configurable 7 days after data leaves the time travel retention window, data from permanent objects ends up in something called Fail-safe. Think of this as a last-chance saloon where your data rests for a few days before finally biting the dust.

Note: Transient and temporary tables do not have a fail-safe period associated with them. Make a note of this for the exam.

If you cannot see the data in Fail-safe, then who can? Well, only Snowflake employees can access Fail-safe, and recovering data from this area may take several hours. Snowflake states that recovery is provided on a best-effort basis, meaning you should not rely on it as part of a disaster recovery strategy.

To stay up to date with the latest business and tech trends in data and analytics, make sure to subscribe to my newsletter, follow me on LinkedIn and YouTube, and, if you’re interested in taking a deeper dive into Snowflake, check out my books ‘Mastering Snowflake Solutions’ and ‘SnowPro Core Certification Study Guide’.


About Adam Morton

Adam Morton is an experienced data leader and author in the field of data and analytics with a passion for delivering tangible business value. Over the past two decades Adam has accumulated a wealth of valuable, real-world experiences designing and implementing enterprise-wide data strategies, advanced data and analytics solutions as well as building high-performing data teams across the UK, Europe, and Australia. 

Adam’s continued commitment to the data and analytics community has seen him formally recognised as an international leader in his field when he was awarded a Global Talent Visa by the Australian Government in 2019.

Today, Adam is dedicated to helping his clients to overcome challenges with data while extracting the most value from their data and analytics implementations. You can find out more information by visiting his website here.

He has also developed a signature training program that includes an intensive online curriculum, weekly live consulting Q&A calls with Adam, and an exclusive mastermind of supportive data and analytics professionals helping you to become an expert in Snowflake. If you’re interested in finding out more, check out the latest Mastering Snowflake details.
