Google BigQuery: Export Events
Export event data from any Lytics data stream to Google BigQuery. NOTE: Unlike user fields, events are not represented within the Lytics dashboard. Events are the raw data received from integrations as seen in your data streams.
- Implementation Type: Server-side.
- Implementation Technique: REST API.
- Frequency: Batch.
- Resulting data: Raw event data is stored in the resulting table.
This integration utilizes the Google BigQuery APIs to send event data. Once the export is started, the workflow will:
- Check if dataset exists, if not create it in Google BigQuery.
- Check if a table exists, if not create one.
- Generate a CSV for each table to be created or updated in BigQuery by:
  - Scanning through events on the data stream.
  - Generating CSV row(s) for each scanned event.
  - Writing the CSV rows to Google Cloud Storage.
- Upload the CSV(s) to BigQuery.
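The CSV-generation step above can be sketched in Python. This is an illustration only, not the Lytics implementation; the function and field names are assumptions, and the real workflow writes to Google Cloud Storage rather than returning a string.

```python
import csv
import io

def events_to_csv(events, field_names):
    """Render raw stream events as CSV rows ready for upload.

    `events` is a list of dicts (one per raw event) and `field_names`
    is the ordered column list for the target table; both are
    illustrative. Fields missing from an event are written empty.
    """
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=field_names)
    writer.writeheader()
    for event in events:
        # Keep only the table's columns; absent fields become "".
        writer.writerow({name: event.get(name, "") for name in field_names})
    return buf.getvalue()

events = [
    {"email": "a@example.com", "action": "click"},
    {"email": "b@example.com", "action": "open", "campaign": "fall"},
]
print(events_to_csv(events, ["email", "action", "campaign"]))
```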
There are a few things to know when running an event export:
- The event export scans the complete data stream from the first collected event to the current event, unless otherwise specified by the export configuration.
- The export will run continuously throughout the day.
- Field names in BigQuery will not be an exact match of field names in Lytics. Names will be formatted to fit BigQuery's field naming schema.
- If two field names conflict after being transformed to the BigQuery format, one will get an integer appended to the end of the field name. For example, if both `First Name` and `first_name` exist in Lytics and are exported, they will be written to `first_name` and `first_name_1`.
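The renaming and conflict-suffix behavior can be sketched as follows. The exact transform Lytics applies is not documented here, so this sketch only assumes BigQuery's documented column rules (letters, digits, and underscores, not starting with a digit) plus the integer-suffix rule described above.

```python
import re

def to_bigquery_name(name, taken):
    """Approximate the Lytics-to-BigQuery field rename (illustrative).

    Non-alphanumeric characters become underscores, names are
    lowercased, and a leading digit is prefixed with an underscore.
    On collision, an incrementing integer suffix is appended.
    """
    cleaned = re.sub(r"[^A-Za-z0-9_]", "_", name).lower()
    if cleaned and cleaned[0].isdigit():
        cleaned = "_" + cleaned
    candidate, n = cleaned, 0
    while candidate in taken:
        n += 1
        candidate = f"{cleaned}_{n}"
    taken.add(candidate)
    return candidate

taken = set()
print(to_bigquery_name("First Name", taken))   # first_name
print(to_bigquery_name("first_name", taken))   # first_name_1
```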
- Low usage fields (fields populated on <0.1% of total volume for the exported stream) will not be included within the event export.
The fields included will depend on the raw events in your Lytics data. All fields in the selected stream will be included in the exported table. To see a list of fields in a stream, select the stream name from Data > Data Streams in the Lytics platform.
Follow these steps to set up and configure an export of events from Lytics to Google BigQuery.
- Navigate to Google in the Integrations section of Lytics.
- Select Workflows from the menu on the left.
- Select Export Events from the list of workflows.
- Select the authorization you would like to use.
- From the BigQuery Project input, select the Google BigQuery Project you want to export data to.
- From the BigQuery Dataset input, select the Google BigQuery Dataset you want to export data to.
- From the Data Streams to Export input, select data streams to export. A stream is a single source/type of data. You may choose more than one. If none are selected, all streams will be exported.
- In the Maximum numeric input, enter the maximum number of events to export. If left blank, all events will be exported.
- In the Start Date text input, enter the date of the oldest event you want to export. Events from this date onward will be exported. Use the `yyyy-mm-dd` format.
- In the End Date text input, enter the most recent date you want to export. Events before, but NOT including, this date will be exported. Use the `yyyy-mm-dd` format.
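The start and end bounds define a half-open window: the start date is included, the end date is not, and a missing bound leaves that side open (so with neither set, the whole stream is scanned). A small sketch of that logic, with an illustrative function name:

```python
from datetime import date

def in_export_window(event_date, start=None, end=None):
    """Return True if the event falls in [start, end).

    Start is inclusive, end is exclusive, and a bound of None means
    that side of the window is unbounded.
    """
    if start is not None and event_date < start:
        return False
    if end is not None and event_date >= end:
        return False
    return True

window = dict(start=date(2023, 5, 1), end=date(2023, 6, 1))
print(in_export_window(date(2023, 5, 1), **window))   # True (start is included)
print(in_export_window(date(2023, 6, 1), **window))   # False (end is excluded)
```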
- Click on the Show Advanced Options tab to expand the advanced configuration.
- Select the Partitioned Table checkbox to use partitioned tables that can help lower query costs by allowing you to target a specific date range of data instead of the whole data set. Tables will be partitioned by the event date.
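BigQuery addresses a single day of a date-partitioned table with a `$YYYYMMDD` partition decorator, which is what lets queries (and loads) target one day's data instead of the whole table. A sketch of building that decorator from an event date (the table name is illustrative):

```python
from datetime import date

def partition_decorator(table, event_date):
    """Build a BigQuery partition decorator for a daily partition,
    e.g. my_table$20230501 for the 2023-05-01 partition."""
    return f"{table}${event_date.strftime('%Y%m%d')}"

print(partition_decorator("lytics_events", date(2023, 5, 1)))  # lytics_events$20230501
```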
- Select the Flatten Streams checkbox to export all streams to one BigQuery table.
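Flattening merges events from every selected stream into rows for a single table. One common way to keep the rows distinguishable is to tag each with its source stream; the `_stream` column name below is an assumption for illustration, not necessarily what Lytics writes:

```python
def flatten_streams(streams):
    """Merge events from several streams into rows for one table,
    tagging each row with its source stream name (illustrative)."""
    rows = []
    for stream_name, events in streams.items():
        for event in events:
            rows.append({"_stream": stream_name, **event})
    return rows

rows = flatten_streams({
    "web": [{"url": "/home"}],
    "email": [{"campaign": "fall"}],
})
print(len(rows))  # 2
```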
- Select the Keep Updated checkbox to continuously run this export.
- Select the Start Export From Now Onwards checkbox to only export events collected from now onwards. This will override any start or end date configuration made.
- Click Start Export.