Connect to Azure Blob Storage

info

This is a beta feature available to select customers. Imply must enable the feature for you. Contact your Polaris support representative to find out more.

To ingest data from Azure Blob Storage into Imply Polaris, create an Azure Storage connection and use it as the source of an ingestion job. Create a unique connection for each Azure Blob Storage container from which you want to ingest data.

The Azure Blob Storage connection also supports access to Azure Data Lake Storage Gen2.

This topic provides reference information to create a connection to Azure Blob Storage.

tip

For an end-to-end guide to Azure Blob Storage ingestion in Polaris, see Guide for Azure Blob Storage ingestion.

Create a connection

Create an Azure Blob Storage connection as follows:

  1. Click Sources from the left navigation menu.
  2. Click Create source and select Azure Storage.
  3. Enter the connection information.
  4. Click Test connection to confirm that the connection is successful.
  5. Click Create connection to create the connection.

The following screenshot shows an example connection created in the UI. For more information, see Create a connection.

Azure Blob Storage connection UI

Connection information

Follow the steps in Create a connection to create the connection. The connection requires the following information from Azure:

  • Storage account name: Name of the storage account in Azure Storage. Ensure that the networking settings for the storage account allow public network access from all networks, or from Imply specifically (IP address 172.171.87.178). For more details, see the Azure documentation on configuring network access on a storage account.

  • Container name: Name of the container in the storage account.

  • Prefix (optional): Specify a prefix if you want to limit access to designated files in the container. The connection will be limited to the set of files matching this prefix.

    For example, if the container contains the blobs file1, file2, folder/file3, and folder/file4, then the prefix folder makes only folder/file3 and folder/file4 available through the connection.

  • Authorization: An access key or shared access signature (SAS) token that grants access to the storage account.

    • To authenticate using a storage account access key, supply the access key in the dialog when you create the connection.
    • To authenticate using a shared access signature (SAS) token (recommended), generate a SAS for the storage account, then provide the full SAS token, including the leading ? delimiter. The SAS must meet the following criteria:
      • The SAS is scoped to the storage account. A SAS created at the container or blob level doesn't produce a valid connection.
      • The SAS allows access to the container and object resource types, with read and list permissions.
      • The SAS is immediately usable. Note that clock skew may make a SAS temporarily invalid when you set its start time to the current time. If the current time is past the SAS end time, generate a new SAS token and update the connection.
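The prefix matching described under Prefix (optional) above is a plain string-prefix filter on blob names. The following sketch uses the hypothetical blob names from the example to show which blobs a connection with prefix folder would expose:

```python
# Blobs in the example container and the configured connection prefix.
blobs = ["file1", "file2", "folder/file3", "folder/file4"]
prefix = "folder"

# The connection exposes only blobs whose names start with the prefix.
accessible = [name for name in blobs if name.startswith(prefix)]
print(accessible)  # ['folder/file3', 'folder/file4']
```

Note that the prefix matches the full blob name from the start, so folder also matches a blob named folder-archive/file5 if one exists; include a trailing / in the prefix to restrict matching to a virtual directory.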

Ingest data by API

A guide for ingesting data from Azure Blob Storage by API will be available at a later date. For now, you can adapt the sample request in the S3 ingestion guide to create a batch ingestion job from Azure Blob Storage, with the following changes:

  • Change source.type from s3 to azure.
  • Set connectionName to the name of an Azure Blob Storage connection.
  • If your source uses uris, change the s3:// scheme to azureStorage:// for each item in uris.

Learn more

See the following topics for more information: