Integrate Your Exports

Once you have successfully created an export process, the exported content will become available via your selected connector. The data available in this location may update depending on the delivery frequency you have selected (static or repeating). Each export uses a unique connector path (prefix) that can be used to integrate the exported data into your own environment.

Desktop Download

If your export uses the Desktop connector:

  • Navigate to the Export Definition page.

  • Click Download to initiate a browser-based download of the exported data.

S3 Bucket

If your export uses an S3 Bucket connector:

  • A unique sub-folder (named using a random GUID) is created within the connector's configured S3 path.

  • The data snapshot from the moment of export creation is written to this sub-folder.

  • You can access the data using any authenticated AWS S3 access method.
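For example, a minimal Python sketch using the boto3 library might download everything written under the export's sub-folder. The bucket name and GUID prefix below are placeholders; substitute the values from your connector and export definition.

  import boto3

  s3 = boto3.client("s3")

  bucket = "my-connector-bucket"        # placeholder: the connector's configured bucket
  prefix = "example-export-guid/"       # placeholder: the export's GUID sub-folder

  # List and download every object written under the export's sub-folder.
  paginator = s3.get_paginator("list_objects_v2")
  for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
      for obj in page.get("Contents", []):
          filename = obj["Key"].split("/")[-1]
          if filename:                  # skip "directory" placeholder keys
              s3.download_file(bucket, obj["Key"], filename)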

Google Cloud Storage (GCS)

If your export uses a GCS connector:

  • A unique sub-folder (GUID-named) is created in the specified GCS location.

  • The data snapshot is written to this sub-folder at the point of export creation.

  • You can access the content via any authenticated GCP Cloud Storage method.
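As with S3, any authenticated client will work. For example, a minimal Python sketch using the google-cloud-storage library, assuming Application Default Credentials are configured; the bucket and GUID names are placeholders.

  from google.cloud import storage

  client = storage.Client()

  # Download every blob under the export's GUID-named sub-folder.
  for blob in client.list_blobs("my-connector-bucket", prefix="example-export-guid/"):
      filename = blob.name.split("/")[-1]
      if filename:                      # skip folder placeholder objects
          blob.download_to_filename(filename)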

Azure Blob Storage

If your export uses an Azure Blob Storage connector:

  • A GUID-named sub-folder is created within the configured Azure location.

  • The data is written to this location when the export is created.

  • You can access the data via any authenticated Azure Blob Storage method.
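For example, a minimal Python sketch using the azure-storage-blob SDK, assuming you have a connection string for the storage account; the container and GUID names are placeholders.

  from azure.storage.blob import ContainerClient

  container = ContainerClient.from_connection_string(
      conn_str="<your-connection-string>",      # placeholder
      container_name="my-connector-container",  # placeholder
  )

  # Download every blob under the export's GUID-named sub-folder.
  for blob in container.list_blobs(name_starts_with="example-export-guid/"):
      filename = blob.name.split("/")[-1]
      if filename:
          with open(filename, "wb") as f:
              f.write(container.download_blob(blob.name).readall())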

SFTP

If your export uses an SFTP connector:

  • A unique sub-folder (GUID-named) is created within your user-specific zone on the platform-hosted SFTP server.

  • The exported data is delivered to this sub-folder at the time of export creation.

To access this folder:

  1. Retrieve the SSH private key from the connector details (only the connector owner can download this key).

  2. Use the key with an SFTP client to connect and download files.
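If you prefer a scripted connection over a GUI client, a minimal Python sketch using the paramiko library might look like the following. The host, username, key path, and folder name are placeholders; substitute the details from your connector. (If your key is not RSA, use the matching paramiko key class, e.g. Ed25519Key.)

  import paramiko

  # Load the SSH private key downloaded from the connector details.
  key = paramiko.RSAKey.from_private_key_file("connector_key.pem")  # placeholder path

  transport = paramiko.Transport(("sftp.example.com", 22))  # placeholder host
  transport.connect(username="your-username", pkey=key)     # placeholder username
  sftp = paramiko.SFTPClient.from_transport(transport)

  # Download every file from the export's GUID-named sub-folder.
  remote_dir = "example-export-guid"                         # placeholder
  for name in sftp.listdir(remote_dir):
      sftp.get(f"{remote_dir}/{name}", name)

  sftp.close()
  transport.close()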

For Mac (recommended: CyberDuck):

  1. Download CyberDuck: https://cyberduck.io/download

  2. Launch CyberDuck and select Open Connection.

  3. Set the connection type to SFTP.

  4. Enter:

    • Server

    • Username

    • SSH Private Key (downloaded from the connector)

  5. Click Connect.

For Windows (recommended: WinSCP):

  1. Download WinSCP: https://winscp.net/download/WinSCP-5.15.9-Setup.exe

  2. Use the SSH key and connection details to log in and access files.

For security purposes, only the connector owner can access the SSH key via the platform UI. If you are using an SFTP connector that you do not own, contact the owner to obtain the SSH key.

Delivery Frequency

When creating an export from updating assets, you can select either a one-off (static) or repeating (automated) export process.

Static Export

  • A Static Export provides a single snapshot of the data at the time the export is created.

  • For connector-based exports, the data is saved in an unscheduled sub-folder under the export's GUID-based root folder.

Repeating Export

Repeating exports provide an ongoing data feed that automatically pushes content to the connector location each time the product or asset updates. Each distinct export delivery comprises:

  • Delivery of the data to a newly generated GUID sub-folder within the top-level home location of the export. This results in a folder structure such as the following, where the initial path identifier is cloud-specific (AWS S3 is used in this example): s3://<connector_bucket>/<export_GUID>/<export_delivery_GUID>/ (see the illustrative layout after this list).

  • Delivery of a Transfer Notification File (TNF) to the /tnfs/ sub-folder created beneath the top-level home location of the export. TNFs are named using a timestamp convention to track the delivery history, and each contains the export_delivery_GUID value of the corresponding data transfer in its DataFolder JSON field to facilitate path resolution.
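For illustration, a repeating export's home location might look like the following after two deliveries (all GUIDs are placeholders; TNF file names follow the platform's timestamp convention):

  s3://<connector_bucket>/<export_GUID>/
  ├── <export_delivery_GUID_1>/    (data files from the first delivery)
  ├── <export_delivery_GUID_2>/    (data files from a later delivery)
  └── tnfs/
      ├── <timestamp_1>.tnf
      └── <timestamp_2>.tnf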

Cloud Storage event triggers can be set up against the /tnfs/ folder to programmatically detect deliveries and trigger ingestion into downstream processes where exported data feeds operational workflows, as sketched below.
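For example, a minimal Python sketch of an AWS Lambda-style handler, assuming an S3 event notification has been configured on the /tnfs/ prefix. Only the DataFolder field is documented above; the ingest_export_delivery helper is hypothetical and stands in for your own ingestion logic.

  import json
  from urllib.parse import unquote_plus

  import boto3

  s3 = boto3.client("s3")

  def ingest_export_delivery(bucket, prefix):
      # Hypothetical downstream hook: replace with your own ingestion logic.
      print(f"New delivery ready at s3://{bucket}/{prefix}")

  def handle_tnf_event(event, context):
      for record in event["Records"]:
          bucket = record["s3"]["bucket"]["name"]
          # Object keys arrive URL-encoded in S3 event notifications.
          tnf_key = unquote_plus(record["s3"]["object"]["key"])

          # Read the TNF and resolve the delivery sub-folder from its DataFolder field.
          body = s3.get_object(Bucket=bucket, Key=tnf_key)["Body"].read()
          delivery_guid = json.loads(body)["DataFolder"]

          export_root = tnf_key.split("/tnfs/")[0]
          ingest_export_delivery(bucket, f"{export_root}/{delivery_guid}/")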

Transfer Notification File (TNF)

A Transfer Notification File (TNF) is a JSON text file used to signal that files have been transferred to or from a storage location. TNFs are core to supporting updating assets: they are used to trigger update jobs that ingest data from a specified location.

The TNF's JSON structure contains the name of the newly created, unique subdirectory that holds the updated data asset within the base object storage location. Here is a template TNF file.

TNFtemplate.tnf
