API overview

The Imply Polaris API provides programmatic access to tables, files, and ingestion jobs in Polaris.

Polaris supports API key and OAuth authentication methods. For more information on authenticating REST API requests, see Authentication overview.
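As a minimal sketch of API key authentication, the helper below builds an Authorization header to attach to each REST request. The header scheme (HTTP Basic with the key as the username and an empty password), the key value, and the base URL shape are assumptions for illustration; confirm the exact format in the Authenticate with API keys guide.

```python
import base64

# Hypothetical values for illustration -- substitute your own.
API_KEY = "example-api-key"
BASE_URL = "https://example-org.api.imply.io"  # assumed URL shape


def auth_header(api_key: str) -> dict:
    """Build an Authorization header for a Polaris API key.

    Assumes HTTP Basic auth with the key as the username and an empty
    password; check the Authenticate with API keys guide for the
    exact scheme your organization uses.
    """
    token = base64.b64encode(f"{api_key}:".encode()).decode()
    return {"Authorization": f"Basic {token}"}
```

Attach the returned header to every request, for example via a `requests.Session` whose `headers` you update once at startup.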

The following diagram shows a high-level workflow for the Polaris API. Follow the links in the list below to the developer guide for each step.

API workflow diagram

  1. Create a table: Create a table using the Tables API.

  2. Define a schema: Validate and set a table's schema.

  3. Perform a batch ingestion:
    a. Upload files to a staging area.
    b. Start a batch ingestion job from the uploaded files.

  4. Push event data into a table:
    a. Create a connection to push data from a source data stream.
    b. Start the streaming ingestion job.
    c. Push events to the table.

  5. Ingest data from an event provider:
    a. Create a connection to an event provider.
    b. Start the streaming ingestion job.

  6. Query data: Submit SQL queries using the Query API.

See the following topics for integrating external applications with Polaris:

  • Link to BI tools: Access Polaris data from external business intelligence tools.
  • Connect over JDBC: Connect a JDBC driver to Polaris.

For more information on the APIs, see API reference.

Copyright © 2023 Imply Data, Inc