Known limitations

This topic describes known limitations of Imply Polaris.

Deleting tables

  • After deleting a table, data might still be available for a few seconds or minutes until the back-end drops all the data.

  • If you manage tables using only the APIs, you must call https://api.imply.io/v1/tables/<TABLE_ID>?detail=detailed to refresh the table status after deletion.
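
For example, here is a minimal sketch of that call using Python's requests library, assuming a bearer-style token as described in Authenticate API requests; the token and table ID values are placeholders:

    import requests

    API_TOKEN = "<YOUR_API_TOKEN>"  # placeholder; see Authenticate API requests
    TABLE_ID = "<TABLE_ID>"         # placeholder; see Get table ID

    # Request detailed table information to refresh the table status
    # after a delete, per the limitation above.
    resp = requests.get(
        f"https://api.imply.io/v1/tables/{TABLE_ID}",
        params={"detail": "detailed"},
        headers={"Authorization": f"Bearer {API_TOKEN}"},
    )
    resp.raise_for_status()
    print(resp.json())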

Parsing and ingestion

  • Polaris accepts any event payload and checks payload syntax only when it processes events for ingestion. Acceptance of a pushed streaming event payload therefore does not indicate that the event was successfully added to the table. See Event payload requirements for the requirements for incoming events.

  • Fields that fail to parse are stored as nulls in the table row. A parsing failure in a single column does not cause the whole event to be rejected unless that column is the __time column.

  • Double-check the time column that the UI automatically selects before you proceed to ingest data. For best results, always set the __time column explicitly.

  • Ingesting an empty file, or a file in which every row fails to parse, does not prevent an ingestion job from succeeding.

  • In some cases, Polaris may return a Succeeded status even though there is an ingestion error. For batch ingestion, click the job in the Ingestion Jobs list to check for any errors. For streaming ingestion, click ... More Options for the Push API Endpoint and select View details to see streaming errors.

  • Polaris does not currently support transform expressions.

  • Polaris does not currently support processing nested data: nested fields cannot be ingested. We recommend that you flatten the data object you are uploading, as in the first sketch after this list. For supported source data formats, see Supported data and file formats.

  • The maximum supported file size for file upload is 2 GB. If you have a file larger than 2 GB, split it into multiple files that are each smaller than 2 GB, as in the second sketch after this list.

  • If you drop all data from a table that previously had ongoing streaming ingestion, you can't push new data to that table without some intervention from Imply. Contact Polaris Support.

  • If your source data has both a __time column and a timestamp column, Polaris currently maps the timestamp column from the source data to the __time column in the Polaris table. To avoid this, rename the timestamp column in your source data.
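
To work around the nested-data limitation above, you can flatten records before uploading them. The following is a minimal Python sketch that joins nested JSON keys into top-level column names; the underscore separator and file names are illustrative assumptions, not Polaris conventions:

    import json

    def flatten(obj, prefix="", sep="_"):
        # Recursively flatten a nested dict into a single-level dict,
        # joining nested keys with sep, e.g. {"a": {"b": 1}} -> {"a_b": 1}.
        flat = {}
        for key, value in obj.items():
            name = f"{prefix}{sep}{key}" if prefix else key
            if isinstance(value, dict):
                flat.update(flatten(value, prefix=name, sep=sep))
            else:
                flat[name] = value
        return flat

    # Flatten newline-delimited JSON before uploading it to Polaris.
    # "events.json" and "events-flat.json" are illustrative file names.
    with open("events.json") as src, open("events-flat.json", "w") as dst:
        for line in src:
            dst.write(json.dumps(flatten(json.loads(line))) + "\n")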
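
To stay under the 2 GB upload limit, you can split a large newline-delimited file on line boundaries so that each part remains valid on its own. The sketch below uses an illustrative 1.9 GB budget per part; the file names are assumptions, and for CSV sources you should also write the header row into each part:

    # Split a large newline-delimited file into parts under ~1.9 GB each,
    # keeping whole lines so every part stays parseable.
    MAX_BYTES = int(1.9 * 1024**3)  # stay safely under the 2 GB limit

    part, written = 0, 0
    dst = open(f"events-part{part:03d}.json", "w")
    with open("events.json") as src:
        for line in src:
            size = len(line.encode())
            if written + size > MAX_BYTES and written > 0:
                dst.close()
                part += 1
                written = 0
                dst = open(f"events-part{part:03d}.json", "w")
            dst.write(line)
            written += size
    dst.close()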

Streaming downloads

  • Polaris sorts the time field in downloaded data in chronological order even if you sort the data in reverse chronological order.

  • Rows whose measures have a null value, for example when no events exist, show up as ‘null’ when they should be suppressed.

  • The ‘Include metadata’ option sometimes results in a failed download.

  • Downloading visualization types other than Table throws an error.

System-defined limits

  • A project within Polaris can support up to 20 data tables at a time.

  • Each table supports a maximum of 200 columns.

  • The maximum number of push streaming requests for all users in an organization is 83,334 calls per minute. A client-side pacing sketch follows this list.

  • The maximum upload rate for push streaming is 100 MB/s.

  • The maximum size for all files uploaded for an organization is 10 TB.

  • Downloading source data files is not supported. Once you upload a file for batch ingestion, you cannot re-download it from the file staging area in Polaris.
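
To avoid hitting the per-organization push streaming request limit, a client can pace its own calls. The following is a minimal sketch of client-side throttling; the 1,000 calls-per-minute budget and the send function are illustrative assumptions, not Polaris settings:

    import time

    # Illustrative per-client budget, well under the organization-wide
    # limit of 83,334 calls per minute; tune for how many clients you run.
    CALLS_PER_MINUTE = 1000
    MIN_INTERVAL = 60.0 / CALLS_PER_MINUTE

    def paced(send, batches):
        # Call send() on each batch, sleeping as needed so that calls
        # are spaced at least MIN_INTERVAL seconds apart.
        last = 0.0
        for batch in batches:
            wait = MIN_INTERVAL - (time.monotonic() - last)
            if wait > 0:
                time.sleep(wait)
            last = time.monotonic()
            send(batch)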

Time delay

  • It might take a few seconds for a file to become available for ingestion after you upload it.
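
If a job starts before the file is available, one option is to poll the Files API until the uploaded file appears before launching ingestion. This is a hedged sketch, assuming a bearer-style token and that GET https://api.imply.io/v1/files lists staged files in a values array; check the Files API reference for the exact response shape:

    import time
    import requests

    API_TOKEN = "<YOUR_API_TOKEN>"  # placeholder; see Authenticate API requests

    def wait_for_file(name, timeout=60, interval=5):
        # Poll the Files API until the uploaded file shows up or we time out.
        # The response shape (a "values" list of objects with a "name" field)
        # is an assumption; see the Files API reference for the exact schema.
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            resp = requests.get(
                "https://api.imply.io/v1/files",
                headers={"Authorization": f"Bearer {API_TOKEN}"},
            )
            resp.raise_for_status()
            if any(f.get("name") == name for f in resp.json().get("values", [])):
                return True
            time.sleep(interval)
        return False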