Set up Gmail logs in BigQuery

Supported editions for this feature: Enterprise Standard and Enterprise Plus; Education Standard and Education Plus. Compare your edition

Gmail logs store records for each stage of a message in the Gmail delivery process. To analyze how messages flow through the delivery process, assign Gmail logs to a dataset in a BigQuery project. After the Gmail logs are assigned, you can review reports.

Note: Gmail logs created before you set up Gmail logs in BigQuery can't be exported to BigQuery.

To streamline your BigQuery tasks and let you manage all your BigQuery data in one place, we’re merging Gmail logs in BigQuery with Workspace logs and reports in BigQuery, starting in February 2023. You’ll manage BigQuery logs and reports for all your Workspace services in the same place in your Google Admin console. We recommend that you migrate your Gmail BigQuery views, queries, and scripts to Workspace logs and reports in BigQuery as soon as possible. For detailed steps, go to Migrate Gmail logs in BigQuery to Workspace logs in BigQuery.

Assign Gmail logs to a BigQuery dataset

  1. Sign in to your Google Admin console.

    Sign in using your administrator account (does not end in @gmail.com).

  2. From the Admin console home page, go to Apps > Google Workspace > Gmail > Setup > Email Logs in BigQuery.

  3. Click Enable.
  4. Enter a description that will appear within the setting’s summary.
  5. Select the BigQuery project you want to use for Gmail logs. Select a project that you have write access to.
  6. Enter a dataset name where Gmail logs are stored, or use the default name gmail_logs_dataset.
  7. (Optional) Click Restrict the dataset to a specific geographic location, then select a location (for example, United States).
  8. Click Save.

  9. After saving settings, go back to your BigQuery project. A dataset with this information is now in the project:
    • The standard roles: project owners, project editors, and project viewers
    • Four service accounts that are designated dataset editors:
      • gmail-for-work-logs-writer1@system.gserviceaccount.com: Writes the logs.
      • gmail-for-work-logs-writer2@system.gserviceaccount.com: Writes the logs.
      • gmail-for-work-logs-recoverer@system.gserviceaccount.com: Automatically restores the template table if it's accidentally removed.
      • gmail-for-work-logs-schema-updater@system.gserviceaccount.com: Updates the schema in the future.

      Note: Do not remove these service accounts or change their roles. These are required accounts.
  10. To verify these service accounts are added, point to the new dataset and click the Down arrow next to the dataset name.
  11. Click Share dataset. Daily email logs are now exported to BigQuery.

Changes can take up to 24 hours but typically happen more quickly. Learn more
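
Once the first daily tables appear, you can confirm the export with a quick query instead of browsing the console. A minimal sketch, assuming a project named my-project and the default dataset name gmail_logs_dataset (substitute your own names):

  -- List the daily log tables created so far and when they were created.
  SELECT
    table_name,
    creation_time
  FROM
    `my-project.gmail_logs_dataset.INFORMATION_SCHEMA.TABLES`
  WHERE
    table_name LIKE 'daily_%'
  ORDER BY
    table_name DESC;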

daily_ table

After you turn on email logs in BigQuery, a new table named daily_ is added to the dataset. This table is a template that provides the schema for the daily tables. After you create the daily_ template, daily tables are automatically created in your dataset. The logs are then available for use. 

What you should know about the daily_ table:

  • It's always empty and never expires. 
  • Don't remove, modify, rename, or add data to this table.
  • It's a date-partitioned table. Actual data is written to a table named daily_YYYYMMDD, based on the GMT time when an event occurs.
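
Because data is written to per-day tables, queries typically target a date range with a wildcard table and the _TABLE_SUFFIX pseudo column in standard SQL. A sketch, again using the placeholder my-project and gmail_logs_dataset names:

  -- Count log rows per day across a range of daily tables.
  -- daily_* matches the date-sharded tables; _TABLE_SUFFIX holds the YYYYMMDD part.
  SELECT
    _TABLE_SUFFIX AS log_date,
    COUNT(*) AS row_count
  FROM
    `my-project.gmail_logs_dataset.daily_*`
  WHERE
    _TABLE_SUFFIX BETWEEN '20230101' AND '20230107'
  GROUP BY
    log_date
  ORDER BY
    log_date;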

Gmail log queries

Example queries

Try some example queries for Gmail logs in BigQuery. The examples are common use cases for Gmail logs.
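
As a rough illustration of the kind of query those examples cover, the sketch below counts log rows per sender display name for a single day. It uses message_info.source.from_header_displayname, a field listed in the schema for Gmail logs in BigQuery; the project, dataset, and date are placeholders, and you should check the schema reference for the exact field structure:

  -- Log rows per sender display name on one day (placeholder names and date).
  SELECT
    message_info.source.from_header_displayname AS sender,
    COUNT(*) AS row_count
  FROM
    `my-project.gmail_logs_dataset.daily_20230101`
  GROUP BY
    sender
  ORDER BY
    row_count DESC
  LIMIT 20;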

Custom queries

Compose your own, custom queries using the schema for Gmail logs in BigQuery.

SQL dialects for queries

BigQuery supports two SQL dialects for queries: legacy SQL and standard SQL.
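
The two dialects reference the daily tables differently. A minimal sketch with placeholder project and dataset names (my-project, gmail_logs_dataset); run each statement as its own query:

  -- Standard SQL: backtick-quoted reference with dots throughout.
  #standardSQL
  SELECT COUNT(*) AS row_count
  FROM `my-project.gmail_logs_dataset.daily_20230101`;

  -- Legacy SQL: square brackets with a colon between project and dataset.
  #legacySQL
  SELECT COUNT(*) AS row_count
  FROM [my-project:gmail_logs_dataset.daily_20230101];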

Data might be truncated for some fields

BigQuery has a maximum row size of 1 MB. For this reason, some fields are truncated so that the log stays below 1 MB minus 1 KB and can be inserted successfully into BigQuery. The 1 KB is intentionally left as a buffer.

The following fields might be truncated if the log is too long, or the number of triggered rules (triggered_rule_info) in the log is too large:

  • message_info.subject
  • message_info.source.from_header_displayname
  • message_info.triggered_rule_info.string_match.match_expression
  • message_info.triggered_rule_info.string_match.matched_string
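
If you suspect truncation, you can look at the byte length of these fields. A sketch, using the same placeholder names as above and an arbitrary threshold (the exact cutoff depends on the overall row size, so there is no single documented value):

  -- Surface rows whose subject is unusually large and may have been shortened.
  -- The 8000-byte threshold is only an illustration, not a documented limit.
  SELECT
    BYTE_LENGTH(message_info.subject) AS subject_bytes,
    message_info.subject
  FROM
    `my-project.gmail_logs_dataset.daily_20230101`
  WHERE
    BYTE_LENGTH(message_info.subject) > 8000
  ORDER BY
    subject_bytes DESC;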

For more information, see Schema for Gmail logs in BigQuery.

Related information

Streaming inserts

Sandbox expiration

The expiration time for these BigQuery sandbox objects is 60 days:

  • Tables
  • Partitions in partitioned tables
  • Views

You can change the default table expiration time for tables.
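
One way to do that is with a DDL statement on the dataset, which applies to newly created tables. A sketch, assuming the placeholder project name my-project and the default dataset name; 400 days is only an example value:

  -- Change the dataset's default table expiration for tables created from now on.
  ALTER SCHEMA `my-project.gmail_logs_dataset`
  SET OPTIONS (
    default_table_expiration_days = 400
  );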

If a table expires or is removed, it can be restored within 2 days.
