Understanding Lytics / Integrations

Google BigQuery

Lytics collects your data from a wide variety of data sources (you can find an up-to-date list of supported connections in the Integrations section of Lytics) and ensures that this data is available for easy export. We can automatically take data from many sources and make it available in Tableau for reporting, so your email, web, Facebook, Twitter, and mobile data is pre-loaded and available in Tableau on the world's most powerful query engine, Google BigQuery. Learn more about the integration between Tableau and BigQuery.

This article covers:

  • Set up a Google BigQuery account
  • Authenticate with Google BigQuery
  • BigQuery User Export
  • BigQuery Event Export
  • Export Audience Changes to BigQuery
  • Importing Events
  • How to Give 3rd Parties Access to Export Data in BigQuery
  • Exporting data from BigQuery

Set up a Google BigQuery account

To set up a Google BigQuery account, follow the quickstart guide from Google. You may also want to check out Google's documentation on BigQuery.

Authenticate with Google BigQuery

There are three different options for authenticating with Google BigQuery:

  • Lytics Managed
  • User Account
  • Service Account

See Google's documentation on authentication for details on the differences between user account and service account.

Lytics managed

Lytics provides a managed authorization for Google BigQuery; no setup is required. Select Use Lytics Managed instead when selecting an authorization.

User account

The user account authorization uses a user account to access Google BigQuery.

  1. Log into your Lytics account.
  2. Click Data > Integrations and select Google from the integrations list.
  3. Select one of the workflows you want to use.
  4. From the Add new authorization dropdown, select Big Query OAuth.
  5. From the user selection window, select the Google account you want to use from the list of accounts.
  6. Click Allow.
  7. In the Description textbox, enter a meaningful description for this authorization.
  8. Click Authorize.

Service account

The Service Account authorization uses a service account to access your Google BigQuery data.

  1. You will need a service account credential file to use this authorization type. If you do not have one, follow the instructions in Google's documentation to create one.

    • For imports, the service account must have at least BigQuery Data Viewer permissions.
    • For exports, the service account must be able to create tables and write to them, so it needs at least BigQuery Data Editor permissions.
  2. Log into your Lytics account.
  3. Click Data > Integrations and select Google from the integrations list.
  4. Select the workflows you want to use.
  5. From the Add new authorization dropdown, select Big Query Service Account JWT.
  6. In the Cert JSON textbox, paste the contents of the service account's credential file.
  7. In the Description textbox, enter a meaningful description for this authorization.
  8. Click Authorize.
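Before pasting the credential into the Cert JSON textbox, it can help to confirm the file is a valid service-account key. The sketch below is a hypothetical sanity check, not part of Lytics; the required keys reflect the standard fields Google Cloud includes in a service-account key file, and all values shown are placeholders.

```python
import json

# Keys present in a standard Google Cloud service-account key file.
REQUIRED_KEYS = {"type", "project_id", "private_key_id", "private_key",
                 "client_email", "client_id", "token_uri"}

def looks_like_service_account_key(raw_json: str) -> bool:
    """Rough sanity check before pasting a credential into Lytics."""
    try:
        data = json.loads(raw_json)
    except json.JSONDecodeError:
        return False
    return data.get("type") == "service_account" and REQUIRED_KEYS <= data.keys()

# Example with placeholder values only (a real key is issued by Google Cloud):
sample = json.dumps({
    "type": "service_account",
    "project_id": "my-project",
    "private_key_id": "abc123",
    "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
    "client_email": "exporter@my-project.iam.gserviceaccount.com",
    "client_id": "1234567890",
    "token_uri": "https://oauth2.googleapis.com/token",
})
```

If the check fails, re-download the key from the Google Cloud console rather than editing the JSON by hand.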

BigQuery User Export

Summary
Frequency: One-time only / Daily at midnight
Exports to: BigQuery table(s)
Name: user_ + audience slug
Identifiers: none
Mapped Fields: yes
Segment Exports: User selected
Type: add & remove

The Google BigQuery user export allows user profiles (fields and audience memberships) to be exported to BigQuery. There are a few things to know when running a BigQuery user export:

  • The user export scans the complete Lytics audience and exports all entries into a BigQuery table.
  • The export will run at most once a day, but large audiences may take more than a day to complete.
  • For large audiences (bigger than a million users), users will first be written into a dated user table user_ + audience slug + _ + date as the table builds in BigQuery. As more batches of users are exported, they will be appended to the dated table. Once the complete audience has been sent to BigQuery, the table will be renamed with the date removed.
  • If continuous export is selected, the work will start running again on the next day at the time specified. A new temporary dated audience table will be created in BigQuery alongside the completed non-dated table.
  • Fields named in BigQuery will not be an exact match of field names in Lytics. Names will be formatted to fit BigQuery's field naming schema.
  • If two field names are transformed to BigQuery format and conflict, one will get an integer appended to the end of the field name, e.g. if both "first_name" and "First Name" exist in Lytics and are exported, they'll be written to "first_name" and "first_name1" in BigQuery.
  • If non-scalar fields are exported, each field will be written to a separate table, with a unique identifier id field that can be used to join the main user table to the fields tables.
  • This id is not an identifier that exists in Lytics; rather, it is a hash of all the user's BY fields, calculated for the express purpose of providing a way to join the non-scalar field tables to the main user table.
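The join between the main user table and the non-scalar field tables can be pictured as follows. This is a hypothetical sketch: the actual hash function Lytics uses is not documented here, so SHA-256 over sorted identifier pairs is an assumption made purely for illustration.

```python
import hashlib

def join_id(by_fields: dict) -> str:
    """Hypothetical stand-in for the id computed from a user's BY
    (identifier) fields; the real hashing scheme may differ."""
    canonical = "|".join(f"{k}={by_fields[k]}" for k in sorted(by_fields))
    return hashlib.sha256(canonical.encode()).hexdigest()

user = {"email": "jane@example.com", "user_id": "42"}   # BY fields
scalar_row = {"id": join_id(user), "email": "jane@example.com"}

# A non-scalar field (e.g. a list of interests) flattens to one row per
# value in its own table, sharing the same id for joins back to the main
# user table:
interests_rows = [{"id": join_id(user), "value": v} for v in ["sports", "news"]]
```

In BigQuery you would then join the interests table to the user table on the shared id column.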

Setting Up the BigQuery User Export

  1. Verify that you have a Google BigQuery account.
  2. Log into your Lytics account.
  3. Click Data > Integrations and select Google from the integrations list.
  4. Select Export Users.
  5. If prompted, set up a BigQuery authorization.
  6. From the Project dropdown, select the BigQuery project to export to.
  7. From the Dataset dropdown, select the BigQuery dataset to export to.
  8. From the Audience dropdown, select the Lytics audience to export to BigQuery.
  9. In the Maximum textbox, enter the maximum number of users to export. If left blank all users will be exported.
  10. Select the Audience Membership checkbox to export the audience membership for each user in a separate table.
  11. From the Export Non-Scalar Fields multi-select, select the non-scalar fields to export. Non-scalar fields are nested fields such as arrays and maps. These will be created in separate tables as our integration tries to flatten field values as rows to fit BigQuery SQL tables.
  12. From the Export Scalar Fields multi-select, select the scalar fields to export. Scalar fields, such as integers and strings, hold a single value.
  13. Click Show advanced options to view additional options.
  14. Select the Keep Updated checkbox to export users on a daily basis. If not selected the export will run once.
  15. Select Keep Old to create a new table each time the export runs. Table names will include the date the table was created (ex: user_all_20180516). If not selected the table will be rewritten each time the export runs.
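The table-naming behavior described above (a dated temporary table while a large export builds, renamed once complete) can be sketched as plain string assembly. The slug "all" and the date format follow the user_all_20180516 example in step 15; treat this as an illustration of the convention, not Lytics code.

```python
from datetime import date

def dated_table_name(audience_slug: str, run_date: date) -> str:
    """Temporary table name while a large export is still building."""
    return f"user_{audience_slug}_{run_date:%Y%m%d}"

def final_table_name(audience_slug: str) -> str:
    """Name the table takes once the full audience has landed."""
    return f"user_{audience_slug}"

print(dated_table_name("all", date(2018, 5, 16)))  # user_all_20180516
print(final_table_name("all"))                     # user_all
```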

BigQuery Event Export

Summary
Frequency: One-time only / Every 6 hours
Exports to: BigQuery table(s)
Name: events_ + stream
Identifiers: none
Mapped Fields: no
Segment Exports: User selected
Type: add, no removal

The Lytics to BigQuery event export allows you to export event data from any Lytics data stream to Google BigQuery. NOTE: Unlike user fields, events are not represented within the Lytics dashboard. Events are the raw data received from integrations as seen in your data streams. There are a few things to know when running an event export:

  • The event export scans the complete data stream from the first collected event to the current event, unless otherwise specified by the export configuration.
  • The export will run continuously throughout the day.
  • Field names in BigQuery will not be an exact match of field names in Lytics. Names will be formatted to fit BigQuery's field naming schema.
  • If two field names are transformed to BigQuery format and conflict, one will get an integer appended to the end of the field name, e.g. if both "first_name" and "First Name" exist in Lytics and are exported, they'll be written to "first_name" and "first_name1" in BigQuery.
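BigQuery column names must start with a letter or underscore and may contain only letters, digits, and underscores. The sketch below approximates the kind of transformation and conflict handling described above; the exact rules Lytics applies are not specified here, so this is an illustration only.

```python
import re

def to_bigquery_name(field: str) -> str:
    """Approximate a BigQuery-legal column name: replace illegal
    characters with underscores and lowercase the result."""
    name = re.sub(r"[^A-Za-z0-9_]", "_", field).lower()
    if not re.match(r"[A-Za-z_]", name):
        name = "_" + name  # column names may not start with a digit
    return name

def dedupe(fields):
    """Append an integer when two transformed names collide."""
    seen, out = {}, []
    for f in fields:
        name = to_bigquery_name(f)
        if name in seen:
            seen[name] += 1
            name = f"{name}{seen[name]}"
        else:
            seen[name] = 0
        out.append(name)
    return out

print(dedupe(["first_name", "First Name"]))  # ['first_name', 'first_name1']
```

This reproduces the "first_name" / "first_name1" behavior from the example above.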

Setting Up the Event Export

  1. Log into your Lytics account.
  2. Click Data > Integrations and select Google from the integrations list.
  3. Select Export Events.
  4. Select the authorization you want to use.
  5. From the Streams multi-select, select the data streams to export.
  6. In the Maximum textbox, enter the maximum number of events to export. If left blank, all events will be exported.
  7. In the Start Date textbox, enter a date in yyyy-mm-dd format (ex: 2015-08-13). Only events from this date onwards will be exported.
  8. In the End Date textbox, enter a date in yyyy-mm-dd format (ex: 2015-08-14). Only events before, but NOT including this date will be exported.
  9. Click Show advanced options to view additional options.
  10. Select the Flatten Streams? checkbox to send all events into one table. If this is not selected, each stream will be exported to its own table.
  11. Select the Keep Updated checkbox to export events every six hours. If not selected the export will run once.
  12. Select the Start Export From Now Onwards checkbox to only export new events (events that occur after the export starts). If not selected, historical events will also be exported.
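The date-range semantics in steps 7 and 8 (start date inclusive, end date exclusive) can be sketched as a simple predicate. The function and field names here are illustrative only.

```python
from datetime import date, datetime

def in_export_range(event_time: datetime, start: date, end: date) -> bool:
    """Start date is inclusive; end date is exclusive, per the export options."""
    return start <= event_time.date() < end

start, end = date(2015, 8, 13), date(2015, 8, 14)
print(in_export_range(datetime(2015, 8, 13, 23, 59), start, end))  # True
print(in_export_range(datetime(2015, 8, 14, 0, 0), start, end))    # False
```

With these example dates, only events occurring on 2015-08-13 are exported.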

Note: In order to comply with BigQuery limits, the maximum number of unique fieldnames for each data stream is 10,000.

Export Audience Changes to BigQuery

Summary
Frequency: One-time only / Daily at midnight
Exports to: BigQuery table(s)
Name: lytics_seg_changes
Identifiers: User selected
Mapped Fields: no
Segment Exports: User selected
Type: add, no removal

The Google BigQuery audience changes export allows audience change events to be exported to Google BigQuery.

  1. Verify that you have a Google BigQuery account.
  2. Log into your Lytics account.
  3. Click Data > Integrations and select Google from the integrations list.
  4. Select Export Users.
  5. If prompted, set up an authorization.
  6. From the Project dropdown, select the BigQuery project to export to.
  7. From the Dataset dropdown, select the BigQuery dataset to export to.
  8. From the Audiences multi-select, select the Lytics audiences you want to export to Google BigQuery.
  9. From the ID Field dropdown, select the ID field in Lytics that will be mapped to an identifying field in BigQuery.
  10. Click Start Export.

Importing Events

Summary
Frequency: One-time only / Every 6 hours
Streams: User selectable
User fields: User defined
Provider fields: None
User activity: User defined
Campaign content: User defined
Click URL param: User defined

To set up an import from BigQuery to Lytics:

  1. Log into your Lytics account.
  2. Click Data > Integrations and select Google from the integrations list.
  3. Select Import Events.
  4. Select the authorization you want to use for this import.
  5. In the Stream textbox enter the stream where imported data will go.
  6. From the BigQuery Project dropdown, select which BigQuery project to use. Make sure you have the right privileges to import from your project of interest.
  7. From the BigQuery Dataset dropdown, select the group of tables where your table is.
  8. From the BigQuery Table dropdown, select the table to be imported.
  9. From the Timestamp Field dropdown, select the name of the field that contains the event timestamp. On continuous imports, the most recent time in this field will be saved. On the next import, only rows with a timestamp greater than that value will be imported.
  10. Select the Keep Updated checkbox to run this import every six hours. If this option is selected, the Timestamp Field must be provided. If not selected, the import will run once.
  11. Click Start Import to begin the import.
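The watermark behavior from step 9 (save the most recent timestamp, then import only newer rows on the next run) can be sketched as follows. This is a simplified illustration of the described behavior, not Lytics code; the row and field names are invented.

```python
from datetime import datetime

def incremental_batch(rows, ts_field, watermark):
    """Select rows strictly newer than the saved watermark; return them
    along with the new watermark to save for the next run."""
    fresh = [r for r in rows if r[ts_field] > watermark]
    new_watermark = max((r[ts_field] for r in fresh), default=watermark)
    return fresh, new_watermark

rows = [
    {"event_ts": datetime(2024, 1, 1, 6), "action": "view"},
    {"event_ts": datetime(2024, 1, 1, 12), "action": "click"},
]
batch, wm = incremental_batch(rows, "event_ts", datetime(2024, 1, 1, 8))
# Only the second row is newer than the watermark, so only it is imported,
# and the saved watermark advances to its timestamp.
```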

The time necessary to import a table depends on table size. It can take from 30 minutes to several hours to complete the import. It's important to be mindful that big imports can slow down your entire Lytics account as data is processed and audience memberships are updated.

How to Give 3rd Parties Access to Export Data in BigQuery

When exporting data from Lytics to BigQuery, you can specify email addresses that you would like to give access to. However, if you later decide that you would like to give access to another person, you can do this from within BigQuery. See Google's documentation regarding access controls for datasets.

Exporting data from BigQuery

Please see the Google BigQuery documentation on Exporting Table Data for options and instructions for exporting data from BigQuery.