Azure: Import CSV from Blob Storage

Importing custom data stored in Azure Blob Storage into Lytics allows you to leverage the insights provided by Lytics data science to drive your marketing efforts.

Integration Details

This integration uses the Azure Storage service API to read the CSV file stored in an Azure container. Each run of the workflow will proceed as follows (a code sketch illustrating these steps follows the list):

  1. Finds the file selected during configuration using the Azure Blob Storage service.
  2. If found, reads the contents of the file.
  3. If configured to diff the files, compares the file contents to the data imported from the previous run.
  4. Filters fields based on what was selected during configuration.
  5. Sends event fields to the configured stream.
  6. Schedules the next run of the import if it is a scheduled batch.
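
For reference, the sketch below approximates steps 1, 2, 4, and 5 using the Azure SDK for Python (azure-storage-blob). The connection string, container, and blob names are placeholders, and the print call stands in for sending each event to the configured stream; the actual workflow runs inside Lytics and requires no code on your side.

    import csv
    import io

    from azure.storage.blob import BlobServiceClient

    # Placeholder connection details -- substitute your own values.
    CONNECTION_STRING = "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=..."
    CONTAINER_NAME = "my-container"
    BLOB_NAME = "users.csv"

    service = BlobServiceClient.from_connection_string(CONNECTION_STRING)
    blob = service.get_blob_client(container=CONTAINER_NAME, blob=BLOB_NAME)

    # Steps 1-2: find the configured blob and read its contents.
    raw = blob.download_blob().readall().decode("utf-8")

    # Steps 4-5: parse the header and rows; each row becomes an event
    # for the configured stream (printed here for illustration).
    for row in csv.DictReader(io.StringIO(raw)):
        print(row)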

Fields

Fields imported via CSV through the Azure Storage service will require custom data mapping. For assistance mapping your custom data to Lytics user fields, please reach out to Lytics support. A conceptual sketch of such a mapping follows.
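
As a purely conceptual illustration (not the mapping mechanism Lytics uses), the sketch below renames CSV columns to hypothetical Lytics user field names; every column and field name here is an assumption.

    # Hypothetical CSV-column-to-user-field mapping, for illustration only.
    FIELD_MAP = {
        "email_address": "email",
        "fname": "first_name",
        "lname": "last_name",
    }

    def map_row(row: dict) -> dict:
        # Keep only mapped columns and rename them to the target field names.
        return {FIELD_MAP[col]: value for col, value in row.items() if col in FIELD_MAP}

    print(map_row({"email_address": "jane@example.com", "fname": "Jane", "extra": "ignored"}))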

Configuration

Follow these steps to set up and configure a CSV Import from Azure Storage in the Lytics platform.

  1. From Data > Integrations, navigate to Microsoft Azure.
  2. Select Workflows.
  3. Select Import CSV from Blob Storage.
  4. Select the authorization you created during the authorization step.
  5. From the Stream box, enter or select the data stream you want to import the file(s) into.
  6. From the Container drop-down, select the Azure container with the file you want to import.
  7. In the File drop-down, select the file to import. Listing files may take up to a couple of minutes after the container is chosen.
  8. (Optional) In the Custom Delimiter box, enter the delimiter used in the file. For tab-delimited files, enter "t". Only the first character is used (for example, if "abcd" is entered, only "a" is used as the delimiter); see the sketch following these steps.
  9. (Optional) From the Timestamp Field drop-down, select the name of the column in the CSV that contains the timestamp of an event. If no field is specified, the event will be timestamped with the time of the import.
  10. (Optional) Use the Fields input to select the fields to import. Leave empty to import all fields. If no field names appear, the Custom Delimiter may need to be adjusted. Also ensure the CSV file has an appropriate header row.
  11. (Optional) Select the Keep Updated checkbox to run the import on a regular basis.
  12. (Optional) Select the Diff checkbox on repeating imports to compare file contents with the previous file's contents and import only rows that have changed. This is useful when full data dumps are provided; see the sketch following these steps.
  13. Click on the Show Advanced Options tab to view additional Configuration options.
  14. (Optional) From the Time of Day drop-down, select the time of day at which imports after the first will be scheduled. This only applies to the daily, weekly, and monthly import frequencies. If no option is selected, the import will be scheduled based on the completion time of the last import.
  15. (Optional) From the Timezone drop-down, select the time zone for the Time of Day.
  16. (Optional) From the File Upload Frequency drop-down, select the frequency to run the import.
  17. Click the Start Import button to begin the import.
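
The sketch below imitates two of the options above: the Custom Delimiter rule (only the first character counts, with "t" standing for a tab) and a simple line-level version of the Diff comparison. The function names and the comparison logic are illustrative assumptions, not the exact logic Lytics runs.

    import csv

    def effective_delimiter(value: str) -> str:
        # "t" stands for a tab; otherwise only the first character counts,
        # so entering "abcd" results in "a" being used as the delimiter.
        if not value:
            return ","
        return "\t" if value == "t" else value[0]

    def changed_rows(current_csv: str, previous_csv: str, delimiter: str = ","):
        # Approximate the Diff option with a line-level comparison: yield only
        # rows of the current file that were absent from the previous file.
        previous_lines = set(previous_csv.splitlines()[1:])  # skip the header
        current_lines = current_csv.splitlines()
        header = next(csv.reader([current_lines[0]], delimiter=delimiter))
        for line in current_lines[1:]:
            if line not in previous_lines:
                yield dict(zip(header, next(csv.reader([line], delimiter=delimiter))))

For example, effective_delimiter("abcd") returns "a", and changed_rows(todays_csv, yesterdays_csv) yields only the rows that were added or modified since the previous dump.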

NOTE: For continuous imports, files should be named in the following format: timestamp.csv. The workflow determines the sequence of files based on the timestamp. If no next file is received, the continuous import will stop and a new import will need to be configured.
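
The note above does not specify the exact timestamp format, so the sketch below assumes a Unix-epoch file name (for example, 1700000000.csv) as one possible convention; the connection string and container name are placeholders.

    from datetime import datetime, timezone

    from azure.storage.blob import BlobServiceClient

    # Placeholder connection details -- substitute your own values.
    CONNECTION_STRING = "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=..."
    CONTAINER_NAME = "my-container"

    def upload_next_batch(csv_bytes: bytes) -> str:
        # Name each drop with a timestamp so the workflow can infer the
        # sequence of files, e.g. "1700000000.csv".
        blob_name = f"{int(datetime.now(timezone.utc).timestamp())}.csv"
        service = BlobServiceClient.from_connection_string(CONNECTION_STRING)
        service.get_container_client(CONTAINER_NAME).upload_blob(name=blob_name, data=csv_bytes)
        return blob_name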