Apache Druid and Imply extensions enabled in Polaris

The following Imply Polaris functionality is provided by Apache Druid extensions built into Polaris. See the Imply Extensions page for more information on the individual extensions. A brief query sketch using several of the aggregator extensions follows the list.

  • Ingesting data from files stored in S3 (Druid S3 extensions).
  • Approximate histogram aggregator and fixed buckets histogram aggregator (Druid histogram).
  • Apache DataSketches aggregators (Druid DataSketches).
  • Constructing bloom filters from query results (Druid bloom filters).
  • Caching lookups so they don't need to be fetched at query time (Druid globally cached lookups).
  • Exactly-once Apache Kafka ingestion for indexing (Druid Kafka indexing service).
  • Exactly-once Kinesis ingestion for indexing (Druid Kinesis indexing service).
  • Microsoft Azure deep storage (Druid Azure extensions).
  • Parsing and ingesting the ORC data format (Druid ORC extensions).
  • Parsing and ingesting the Parquet data format (Druid Parquet extensions).
  • Parsing and ingesting the Protobuf data format (Druid Protobuf extensions).
  • Parsing and ingesting the Avro data format (Druid Avro extensions).
  • Statistics-related aggregators, including variance and standard deviation (Druid stats).
  • Kafka-based namespace lookup (Kafka namespace extraction).
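
With the DataSketches and stats extensions enabled, approximate and statistical aggregators are available directly in Druid SQL. The following query is an illustrative sketch only; the table and column names (events, user_id, latency_ms, country) are placeholder assumptions, not part of Polaris.

  -- Approximate distinct counts, quantiles, and standard deviation per country.
  -- Table and column names below are hypothetical.
  SELECT
    country,
    APPROX_COUNT_DISTINCT_DS_HLL(user_id) AS unique_users,  -- DataSketches HLL sketch
    APPROX_QUANTILE_DS(latency_ms, 0.95) AS p95_latency,    -- DataSketches quantiles sketch
    STDDEV(latency_ms) AS latency_stddev                    -- Druid stats extension
  FROM events
  GROUP BY country

Because the first two functions are backed by sketches, their results are approximate, but they stay fast and memory-bounded on large datasets.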

The following Polaris functionality is provided by Imply extensions built into Polaris:

  • Sending metrics to Imply's Clarity service or to a Kafka broker (Clarity emitter).
  • Automatic scaler for the ingestion service.
  • Implementations of various interfaces to provide configuration properties with secret values.