
Cloud Providers

To confirm whether your platform is based on GCP or AWS, please contact Support.

Your Region

GCP

The native cloud provider for this platform is Google Cloud Platform (GCP), based in the us-central1 region with a failover facility in us-west1.

Check the status of your region here: https://status.cloud.google.com/

AWS

The native cloud provider for this platform is Amazon Web Services (AWS), based in the XYZ region.

AWS Regional Options

  • ap-southeast-1

  • ap-southeast-2

  • ap-northeast-1

  • us-east-1

  • us-west-2

  • eu-west-1

Check the status of your region here: https://health.aws.amazon.com/health/status

Connector Limits

This section describes the typical file size limits and transfer speeds, by data source, for importing data to the platform and exporting it.

These limits are guidelines only; actual performance varies with factors such as internet connection, file type, region, and your native cloud provider.

The limits when transferring data from or to a desktop are:

  • Desktop upload = 7 GB

  • Desktop download = 8 GB

If your data exceeds these limits, it is recommended that you use a cloud endpoint instead.
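As a quick pre-flight check before a desktop transfer, you can compare the file size against these limits. Below is a minimal Python sketch assuming the 7 GB upload limit quoted above; the file path is hypothetical:

```python
import os

# Desktop upload limit quoted above (7 GB); adjust if your platform differs.
DESKTOP_UPLOAD_LIMIT_BYTES = 7 * 1024**3

def fits_desktop_upload(path: str) -> bool:
    """Return True if the file is within the desktop upload limit."""
    return os.path.getsize(path) <= DESKTOP_UPLOAD_LIMIT_BYTES

# Hypothetical file path for illustration.
if not fits_desktop_upload("exports/data.csv"):
    print("File exceeds the 7 GB desktop upload limit; use a cloud endpoint instead.")
```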

AWS

If the native cloud provider for the platform is AWS, these are the transfer limits.


  • S3: no specific file size limit; transfer speed up to 2 GB/s

  • GCS: no specific file size limit, but 50-100 GB is recommended due to transfer time; transfer speed up to 6 MB/s

  • Azure Blob Storage: no specific file size limit, but 50-100 GB is recommended due to transfer time; transfer speed up to 6 MB/s

  • Snowflake: no specific file size limit, tested up to 400 GB

  • BigQuery: no specific file size limit, tested up to 200 GB
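As an illustration of the S3 route, the sketch below uses boto3; the bucket name, object key, and file path are hypothetical. boto3's transfer manager switches to parallel multipart uploads above the configured threshold, which helps large files approach the quoted speeds:

```python
import boto3
from boto3.s3.transfer import TransferConfig

s3 = boto3.client("s3")

# Parallel multipart uploads help large files approach the quoted speeds.
config = TransferConfig(
    multipart_threshold=64 * 1024**2,  # switch to multipart above 64 MB
    max_concurrency=8,                 # number of parts uploaded in parallel
)

# Bucket and key are hypothetical; use the endpoint your platform provides.
s3.upload_file(
    "exports/data.parquet",
    "my-platform-bucket",
    "imports/data.parquet",
    Config=config,
)
```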

GCP

If the native cloud provider for the platform is GCP, these are the transfer limits.


  • Azure Blob Storage: up to 7-10 GB for a single file (for multi-file products, the limit applies to each individual file); transfer speed up to 6 MB/s

  • GCS: up to 200 GB; transfer speed up to 400 MB/s

  • S3: up to 7-10 GB for a single file (for multi-file products, the limit applies to each individual file); transfer speed up to 6 MB/s

  • Snowflake: no specific file size limit, tested up to 400 GB

  • BigQuery: no specific file size limit, tested up to 200 GB
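As an illustration of the GCS route (the fastest connector on a GCP-native platform, per the table above), here is a minimal sketch using the google-cloud-storage client; the bucket name, object name, and file path are hypothetical:

```python
from google.cloud import storage

client = storage.Client()  # uses Application Default Credentials

# Bucket and object names are hypothetical; use your platform's endpoint.
bucket = client.bucket("my-platform-bucket")
blob = bucket.blob("imports/data.parquet")

# A larger chunk size reduces per-request overhead on big files.
blob.chunk_size = 64 * 1024**2  # 64 MB; must be a multiple of 256 KB

blob.upload_from_filename("exports/data.parquet")
```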

Space Types

The platform is deployed as standard with the default Space Types outlined below. These Space Types are based on managed Hadoop cluster services provided by your native cloud provider.

  • Batch Analytics: 1 master node, 3 worker nodes

  • Intensive Analytics: 1 master node, 6 worker nodes

It is not currently possible to import third-party software into a Space for additional analysis (for example, Tableau, Excel, or additional Python libraries).

AWS

The following components and configurations are used to support the default Space Types.


  • Cloud Service: Elastic MapReduce (EMR)

  • Cloud Service Image (most recently utilised): 6.11

  • Master Node Configuration: m4.xlarge (16 GB memory, 4 CPUs)

  • Worker Node Configuration: m4.xlarge (16 GB memory, 4 CPUs)
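For context, the sketch below shows how a cluster of this shape could be created via boto3. It illustrates the configuration only, not the platform's own provisioning mechanism; the cluster name, region, and IAM roles are assumptions:

```python
import boto3

emr = boto3.client("emr", region_name="us-east-1")  # region is an assumption

# Mirrors the Batch Analytics Space Type: 1 master + 3 workers, m4.xlarge, EMR 6.11.
response = emr.run_job_flow(
    Name="batch-analytics",  # hypothetical cluster name
    ReleaseLabel="emr-6.11.0",
    Applications=[{"Name": "Hadoop"}],
    Instances={
        "InstanceGroups": [
            {"InstanceRole": "MASTER", "InstanceType": "m4.xlarge", "InstanceCount": 1},
            {"InstanceRole": "CORE", "InstanceType": "m4.xlarge", "InstanceCount": 3},
        ],
        "KeepJobFlowAliveWhenNoSteps": True,
    },
    JobFlowRole="EMR_EC2_DefaultRole",  # assumed default EMR roles
    ServiceRole="EMR_DefaultRole",
)
print(response["JobFlowId"])
```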

GCP

The following components and configurations are used to support the default Space Types.


  • Cloud Service: Dataproc

  • Cloud Service Image (most recently utilised): 2.2

  • Master Node Configuration: n1-standard-8 (30 GB memory, 8 CPUs)

  • Worker Node Configuration: n1-standard-8 (30 GB memory, 8 CPUs)
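Similarly, for context, a cluster of this shape could be created via the google-cloud-dataproc client. Again, this is an illustration only; the project ID, region, and cluster name are assumptions:

```python
from google.cloud import dataproc_v1

region = "us-central1"  # the platform's primary GCP region, per above
client = dataproc_v1.ClusterControllerClient(
    client_options={"api_endpoint": f"{region}-dataproc.googleapis.com:443"}
)

# Mirrors the Batch Analytics Space Type: 1 master + 3 workers, n1-standard-8, image 2.2.
cluster = {
    "project_id": "my-project",         # hypothetical project ID
    "cluster_name": "batch-analytics",  # hypothetical cluster name
    "config": {
        "master_config": {"num_instances": 1, "machine_type_uri": "n1-standard-8"},
        "worker_config": {"num_instances": 3, "machine_type_uri": "n1-standard-8"},
        "software_config": {"image_version": "2.2"},
    },
}

operation = client.create_cluster(
    project_id="my-project", region=region, cluster=cluster
)
operation.result()  # blocks until the cluster is ready
```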

 
