Integrate with Elastic stack

This topic provides a sample configuration for reporting Polaris performance metrics to Elasticsearch.

This topic assumes you use Elasticsearch to store and search your data, Kibana to visualize and manage it, and Metricbeat to collect metrics.

Prerequisites

  • An Elasticsearch instance. Refer to the Elasticsearch documentation for installation instructions.
  • A Kibana instance configured to run against the same major version release as Elasticsearch. Refer to the Kibana documentation for installation instructions.
  • A Metricbeat instance with the Openmetrics module enabled. Refer to the Metricbeat documentation for installation instructions.
  • A Polaris API key with the AccessMetrics permission. See Authenticate with API keys to obtain an API key and assign service account permissions. For more information on permissions, visit Permissions reference.
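The Metrics API uses HTTP Basic authentication, with the literal username apikey and your API key as the password (the same username and password fields that appear in the Metricbeat configuration below). Before wiring up Metricbeat, you can sanity-check a key with a short script along these lines; ORGANIZATION_NAME and POLARIS_API_KEY are placeholders to replace with your own values:

```python
import base64
import urllib.request

def basic_auth_header(api_key: str) -> str:
    """Build the header Metricbeat sends: Basic base64('apikey:<key>')."""
    token = base64.b64encode(f"apikey:{api_key}".encode()).decode()
    return f"Basic {token}"

def fetch_metrics(org: str, api_key: str) -> str:
    """Fetch the raw metrics export for a Polaris organization."""
    url = f"https://{org}.api.imply.io/v1/metrics/export"
    req = urllib.request.Request(
        url, headers={"Authorization": basic_auth_header(api_key)}
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode()

# Usage (requires network access and real credentials):
# print(fetch_metrics("ORGANIZATION_NAME", "POLARIS_API_KEY")[:500])
```

A 401 response here means the key is wrong or lacks the AccessMetrics permission, which is worth ruling out before debugging Metricbeat itself.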

Instructions

The following is an example configuration for the Metricbeat Openmetrics module to collect metrics about your Polaris environment. For more advanced use cases, refer to the Openmetrics module documentation.

In the Metricbeat modules.d directory, edit the openmetrics.yml file to point the module at the Polaris Metrics API endpoint:

# Module: openmetrics

- module: openmetrics
  metricsets: ['collector']
  # Polaris updates metrics every minute, so this period is set to 60 seconds.
  period: 60s
  # Replace ORGANIZATION_NAME with the name of your organization.
  hosts: ["https://ORGANIZATION_NAME.api.imply.io"]

  # The name of the service the data is collected from. 
  # This can be changed.
  service.name: imply

  metrics_path: /v1/metrics/export
  username: apikey
  # Replace POLARIS_API_KEY with your API key. 
  # Ensure that the API key has the AccessMetrics permission.
  password: POLARIS_API_KEY

  metrics_filters:
    include: []
    exclude: []
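In this configuration, the empty metrics_filters lists keep every metric. When you do populate them, include acts as an allowlist and exclude then removes matches from whatever include kept. A rough Python sketch of that behavior, assuming the glob-style patterns Metricbeat filters generally use (the metric names here are made up for illustration):

```python
import fnmatch

def keep_metric(name: str, include: list, exclude: list) -> bool:
    """An empty include list keeps everything; exclude always wins."""
    if include and not any(fnmatch.fnmatch(name, p) for p in include):
        return False
    return not any(fnmatch.fnmatch(name, p) for p in exclude)

names = ["query_count", "query_time_ms", "ingest_rows"]  # illustrative only
kept = [n for n in names if keep_metric(n, include=[], exclude=["ingest_*"])]
# kept -> ["query_count", "query_time_ms"]
```

Starting with both lists empty, as above, is a reasonable default: ingest everything first, then add exclude patterns once you know which metrics you don't need.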

Test your integration in Docker

To test this integration locally, you can use a single-node Elasticsearch cluster in a Docker container.

  1. Follow the steps in Run Kibana on Docker for development to start Kibana and connect it to your Elasticsearch container.

  2. Access Kibana through the web application on port 5601.

  3. On the home page, click Add integrations.

  4. Select OpenMetrics Metrics integration.

  5. Follow the steps displayed in the UI to install Metricbeat.

  6. Modify metricbeat.yml to set the connection information:

    output.elasticsearch:
    
      # Array of hosts to connect to. 
      # Replace ES_URL with your Elasticsearch host. For example: "localhost:9200".
      hosts: ["ES_URL"]
    
      # Protocol - either `http` (default) or `https`.
      # Replace ES_PASSWORD with your elastic user password.
      protocol: "https"
      username: "elastic"
      password: "ES_PASSWORD"
    
      # If using Elasticsearch's default certificate.
      # Replace ES_CERT_FINGERPRINT with your HTTP CA certificate SHA-256 fingerprint.
      ssl.ca_trusted_fingerprint: "ES_CERT_FINGERPRINT"
       
    setup.kibana:
      # Replace KIBANA_URL with your Kibana host. For example: "localhost:5601". 
      host: "KIBANA_URL"
    
  7. Follow the steps displayed in the UI to enable and configure the Openmetrics module.

  8. Modify the settings in the modules.d/openmetrics.yml file to connect to the Polaris Metrics API endpoint.

  9. Start Metricbeat:

    ./metricbeat setup
    ./metricbeat -e
    
  10. In the UI, go to Dashboard > Create dashboard. Click Create visualization. Follow Kibana’s Dashboard and visualizations documentation to visualize your Polaris performance metrics. Refer to the Metrics reference documentation for a list of available metrics.
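For the ssl.ca_trusted_fingerprint value in step 6, Elasticsearch expects the SHA-256 fingerprint of its HTTP CA certificate as plain hex with the colons removed. One way to compute it is with openssl, sketched here on a throwaway self-signed certificate; in practice, point the second command at the http_ca.crt file generated by your Elasticsearch container instead:

```shell
# Stand-in for Elasticsearch's http_ca.crt: generate a throwaway cert.
# In practice, skip this step and use the real http_ca.crt.
openssl req -x509 -newkey rsa:2048 -keyout /tmp/demo.key -out /tmp/demo_ca.crt \
  -days 1 -nodes -subj "/CN=demo" 2>/dev/null

# Print the SHA-256 fingerprint and strip the colons to get the
# hex string that ssl.ca_trusted_fingerprint expects.
fingerprint=$(openssl x509 -fingerprint -sha256 -noout -in /tmp/demo_ca.crt \
  | cut -d'=' -f2 | tr -d ':')
echo "$fingerprint"
```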

Learn more

See the following topics for more information:

  • Monitor performance metrics
  • Metrics reference