Documentation / Product / Integrations / Amazon Web Services / AWS S3

AWS S3: JSON Import

Many applications let you write JSON files to Amazon S3. With this integration, you can easily import that custom data into Lytics. Once imported, you can leverage powerful insights on this custom data, provided by Lytics data science, to drive your marketing efforts.

Integration Details

This integration uses the Amazon S3 API to read the selected JSON file. Each run of the workflow proceeds as follows:

  1. Query for a list of objects in the bucket selected during the configuration step.
  2. Find the selected file by matching on its name or prefix.
  3. If found, fetch the file.
  4. If configured to diff the files, compare the file to the data imported on the previous run.
  5. Filter fields based on what was selected during configuration.
  6. Send event fields to the configured stream.
  7. Schedule the next run of the import if it is a scheduled batch.
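The matching, diffing, and filtering steps above can be sketched in plain Python. This is only an illustration of the logic, not the workflow's actual implementation: the bucket listing is simulated with a static list, and all helper names and sample data are hypothetical.

```python
# Minimal sketch of one run of the import workflow.
# The S3 object listing is simulated; in production it would come from the S3 API.

def find_file(object_keys, prefix):
    """Step 2: find the selected file by matching on its name or prefix."""
    matches = [key for key in object_keys if key.startswith(prefix)]
    return matches[0] if matches else None

def diff_events(events, previous_events):
    """Step 4: keep only events not present in the previous run's import."""
    seen = {tuple(sorted(e.items())) for e in previous_events}
    return [e for e in events if tuple(sorted(e.items())) not in seen]

def filter_fields(events, selected_fields):
    """Step 5: keep only the fields selected during configuration."""
    return [{k: v for k, v in e.items() if k in selected_fields} for e in events]

# Simulated bucket listing (step 1) and data from the previous run.
keys = ["exports/users_20240101.json", "exports/orders_20240101.json"]
previous = [{"email": "a@example.com", "plan": "free"}]
current = [
    {"email": "a@example.com", "plan": "free"},
    {"email": "b@example.com", "plan": "pro"},
]

key = find_file(keys, "exports/users_")
new_events = filter_fields(diff_events(current, previous), {"email", "plan"})
# Only the record not seen in the previous run remains to be sent to the stream.
```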

Fields

Fields imported via JSON through S3 will require custom data mapping. For assistance mapping your custom data to Lytics user fields, please reach out to Lytics support.
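To illustrate why custom mapping is needed, here is a hypothetical JSON record and a mapping from its source keys to Lytics-style user field names. Both the source keys and the target field names are examples only; actual mappings are set up with Lytics support.

```python
import json

# A hypothetical record from an imported JSON file.
raw = '{"email_address": "user@example.com", "signup_ts": "2024-01-01T00:00:00Z", "tier": "gold"}'
event = json.loads(raw)

# Example mapping from source keys to user field names (illustrative only).
mapping = {"email_address": "email", "signup_ts": "signup_date", "tier": "membership_tier"}
user_fields = {mapping[k]: v for k, v in event.items() if k in mapping}
```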

Configuration

Follow these steps to set up and configure the S3 JSON import workflow in the Lytics platform.

  1. From Data > Integrations, select the Amazon Web Services tile.
  2. Click on New workflow.
  3. Select Import JSON.
  4. Select the authorization you created during the authorization step.
  5. (Required) From the Stream box, enter or select the data stream you want to import the file(s) into.
  6. (Required) From the Bucket drop-down list, select the bucket to import from. If there is an error fetching buckets, your credentials may not have permission to list buckets; use the Bucket Name (Alt) box instead.
  7. (Optional) In the Bucket Name (Alt) box, enter the bucket name to read the file(s) from.
  8. (Required) From the File drop-down list, select the file to import. Listing files may take up to a couple of minutes after the bucket is chosen.
  9. (Optional) From the Timestamp Field drop-down list, select the name of the column in the JSON that contains the timestamp of an event. If no field is specified, the event will be timestamped with the time of the import.
  10. (Optional) Select the Keep Updated checkbox to run the import on a regular basis.
  11. Additional Configuration options are available by clicking on the Show Advanced Options tab.
  12. (Optional) From the Time of Day drop-down list, select the time of day at which imports after the first should be scheduled. This only applies to the daily, weekly, and monthly import frequencies. If no option is selected, each import will be scheduled based on the completion time of the last import.
  13. (Optional) From the Timezone drop-down list, select the time zone for the Time of Day.
  14. (Optional) From the File Upload Frequency drop-down list, select the frequency to run the import.
  15. Click the Start Import button when you're ready to begin the import.

NOTE: For continuous imports, files should be named in the following format: prefix_timestamp.json. The workflow determines the sequence of files based on the timestamp. If no next file is received, the continuous import will stop and a new import will need to be configured.
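For example, a set of files following the prefix_timestamp.json convention can be ordered by the timestamp embedded in each name. This is a sketch only; the parsing helper and file names are illustrative, not the workflow's actual implementation:

```python
import re

def file_timestamp(key, prefix):
    """Extract the numeric timestamp from a prefix_timestamp.json file name."""
    match = re.fullmatch(re.escape(prefix) + r"_(\d+)\.json", key)
    return int(match.group(1)) if match else None

# Hypothetical file names written by an upstream application.
files = ["events_20240103.json", "events_20240101.json", "events_20240102.json"]

# The workflow processes files in timestamp order; the next file to import
# is the earliest timestamp not yet processed.
ordered = sorted(files, key=lambda k: file_timestamp(k, "events"))
```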