- Unloading data from a data warehouse into an S3 bucket.
- Reporting events in very large volumes.
Event format
This S3 integration requires that events match the event API schema. The files can be in `.jsonl`, `.json`, or `.csv` format.
For fields that you want to include as part of the `dimensions` object, add the `dimensions.` prefix, for example `dimensions.storage_bytes`.
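To illustrate the prefix convention, here is a hypothetical helper (not part of Stigg's tooling) that flattens a nested `dimensions` object into `dimensions.`-prefixed keys when preparing events for a flat file format; the field names in the sample event are assumed placeholders:

```python
def flatten_event(event: dict) -> dict:
    """Flatten an event dict, prefixing keys nested under 'dimensions'
    with 'dimensions.' as the S3 integration expects."""
    flat = {}
    for key, value in event.items():
        if key == "dimensions" and isinstance(value, dict):
            for dim_key, dim_value in value.items():
                flat[f"dimensions.{dim_key}"] = dim_value
        else:
            flat[key] = value
    return flat

# Placeholder event; verify field names against the event API schema.
event = {
    "customerId": "customer-demo-01",
    "eventName": "storage_updated",
    "idempotencyKey": "key-0001",
    "dimensions": {"storage_bytes": 512},
}
print(flatten_event(event))
```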
JSONL Example
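A minimal sketch of a JSONL file, one event per line. The field names (`customerId`, `eventName`, `idempotencyKey`) and values are illustrative placeholders; verify them against the event API schema:

```json
{"customerId": "customer-demo-01", "eventName": "storage_updated", "idempotencyKey": "key-0001", "dimensions.storage_bytes": 512}
{"customerId": "customer-demo-02", "eventName": "storage_updated", "idempotencyKey": "key-0002", "dimensions.storage_bytes": 1024}
```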
CSV Example
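The same placeholder events as a CSV file, with the `dimensions.` prefix applied in the column header:

```csv
customerId,eventName,idempotencyKey,dimensions.storage_bytes
customer-demo-01,storage_updated,key-0001,512
customer-demo-02,storage_updated,key-0002,1024
```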
Integration setup
Create 2 separate S3 buckets:
- S3 bucket for raw events - this bucket will contain the raw events that Stigg automatically pulls events from.
- S3 dead-letter bucket - in case of an ingestion failure due to a parsing or validation error, Stigg will write the failed events back to this bucket.
Policy for raw events S3 bucket:
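A sketch of a bucket policy granting Stigg read access to the raw events bucket. The principal account ID (`123456789012`) and bucket name are placeholders; use the actual principal ARN provided by Stigg:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowStiggRead",
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::123456789012:root" },
      "Action": ["s3:GetObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::your-raw-events-bucket",
        "arn:aws:s3:::your-raw-events-bucket/*"
      ]
    }
  ]
}
```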
Policy for dead-letter S3 bucket:
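A sketch of the corresponding policy for the dead-letter bucket, granting Stigg write access so failed events can be written back. Again, the principal ARN and bucket name are placeholders:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowStiggWrite",
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::123456789012:root" },
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::your-dead-letter-bucket/*"
    }
  ]
}
```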
An event notification of the `s3:ObjectCreated:*` type must be configured on the raw events bucket via the AWS console, targeting an SQS queue provisioned by Stigg. This SQS queue must be in the same region as your bucket, so Stigg will provide you with the provisioned SQS ARN once the bucket region is known.
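As a sketch, the notification configuration for the bucket would look like the following (for example, as the JSON passed to `aws s3api put-bucket-notification-configuration`); the queue ARN shown is a placeholder for the ARN Stigg provides:

```json
{
  "QueueConfigurations": [
    {
      "QueueArn": "arn:aws:sqs:us-east-1:123456789012:stigg-ingestion-queue",
      "Events": ["s3:ObjectCreated:*"]
    }
  ]
}
```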