
Optic Data Lake

Supported Versions

Please note that using a supported version is mandatory.

Product: Optic Data Lake

Supported Deployment Types: All

Supported Versions: 2022.05, 2022.11, 2023.03, 23.4

Environmental Prerequisites

Before continuing, confirm the prerequisites of the corresponding integration template, as some templates may not require all of the environmental prerequisites.

How do we import the retention profiles in Vertica?

Follow the steps below for the domain manager you are integrating.

AppDynamics

How do we import the AppDynamics retention profiles in Vertica?
  1. Import the retention profiles in Vertica by following the OPTIC DL documentation for the following service (an example request is sketched after these steps):

    • Product Name: OPTIC Data Lake

    • Product Version: your OPTIC Data Lake version

    • Service Name: OPTICDL Open Data Ingestion Administration API

    • Service Version: V3

  2. Download the Optic Data Lake “Datasets” and “Retention Profiles” from https://download.zigiwave.com/zigiops/opticdl/opticdl_datasets_ret_profiles_v1_2024-01-16.zip.

  3. Locate the "Retention-Config", which lets you configure retention for the data, and follow the instructions:

    • Unzip the opticdl_datasets_ret_profiles_v1_2024-01-16.zip and locate the AppDynamics_ret_prof folder containing the "Retention Profiles":

      • appdynamics_retention_raw.json

      • appdynamics_retention_hourly.json

      • appdynamics_retention_daily.json
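The imports above go through the OPTICDL Open Data Ingestion Administration API. As a rough illustration only, the Python sketch below posts the three AppDynamics retention-profile files to that API; the endpoint path, bearer-token authentication, and payload shape are assumptions, so take the exact values from the OPTIC DL documentation referenced in step 1.

    # Illustration only: upload the AppDynamics retention-profile files to the
    # OPTIC DL Open Data Ingestion Administration API (v3). The endpoint path,
    # bearer-token auth, and payload shape are assumptions -- take the real
    # values from the OPTIC DL documentation referenced in step 1.
    import json
    import requests

    OPTICDL_URL = "https://opticdl.example.com"                       # your OPTIC DL instance
    API_TOKEN = "<token obtained as described in the OPTIC DL docs>"  # assumed auth
    RETENTION_ENDPOINT = f"{OPTICDL_URL}/data-ingestion/admin/v3/retention-profiles"  # assumed path

    def import_retention_profile(path: str) -> None:
        """POST a single retention-profile JSON file."""
        with open(path, encoding="utf-8") as f:
            profile = json.load(f)
        resp = requests.post(
            RETENTION_ENDPOINT,
            json=profile,
            headers={"Authorization": f"Bearer {API_TOKEN}"},
            timeout=30,
        )
        resp.raise_for_status()
        print(f"Imported {path} (HTTP {resp.status_code})")

    for name in (
        "appdynamics_retention_raw.json",
        "appdynamics_retention_hourly.json",
        "appdynamics_retention_daily.json",
    ):
        import_retention_profile(f"AppDynamics_ret_prof/{name}")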

How do we import the AppDynamics task flows and datasets in Vertica?
  1. Import the datasets and task flows you plan to use in Vertica by following the OPTIC DL documentation (an example import script is sketched after this list):

    • Import the AppDynamics_application_set dataset:

      • appdynamics_application_task_flow.json

      • opsb_ext_appdynamics_application_raw.json

      • opsb_ext_appdynamics_application_1h.json

      • opsb_ext_appdynamics_application_1d.json

    • Import the AppDynamics_node_set dataset:

      • appdynamics_node_task_flow.json

      • opsb_ext_appdynamics_node_raw.json

      • opsb_ext_appdynamics_node_1h.json

      • opsb_ext_appdynamics_node_1d.json
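The dataset and task-flow definitions are imported the same way as the retention profiles. The sketch below, again with assumed endpoint paths, uploads the three dataset definitions for the AppDynamics application set and then its task flow; check the OPTIC DL documentation for the actual endpoints and the required import order.

    # Illustration only: import one AppDynamics dataset set and its task flow.
    # The /datasets and /task-flows endpoint paths are assumptions -- use the
    # paths documented for the OPTICDL Open Data Ingestion Administration API.
    import json
    import requests

    OPTICDL_URL = "https://opticdl.example.com"
    HEADERS = {"Authorization": "Bearer <token obtained per the OPTIC DL docs>"}

    def post_json(endpoint: str, path: str) -> None:
        """POST one definition file to the given (assumed) admin endpoint."""
        with open(path, encoding="utf-8") as f:
            payload = json.load(f)
        resp = requests.post(f"{OPTICDL_URL}{endpoint}", json=payload,
                             headers=HEADERS, timeout=30)
        resp.raise_for_status()
        print(f"{path} -> HTTP {resp.status_code}")

    # Dataset definitions first (raw, hourly, daily), then the task flow
    # that ties them together.
    for dataset_file in (
        "opsb_ext_appdynamics_application_raw.json",
        "opsb_ext_appdynamics_application_1h.json",
        "opsb_ext_appdynamics_application_1d.json",
    ):
        post_json("/data-ingestion/admin/v3/datasets", dataset_file)   # assumed path

    post_json("/data-ingestion/admin/v3/task-flows",                   # assumed path
              "appdynamics_application_task_flow.json")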


Datadog

How do we import the Datadog retention profiles in Vertica?
  1. Import the retention profiles in Vertica by following the OPTIC DL documentation for the following service:

    • Product Name: OPTIC Data Lake

    • Product Version: your OPTIC Data Lake version

    • Service Name: OPTICDL Open Data Ingestion Administration API

    • Service Version: V3

  2. Download the Optic Data Lake “Datasets” and “Retention Profiles” from https://download.zigiwave.com/zigiops/opticdl/opticdl_datasets_ret_profiles_v1_2024-01-16.zip.

  3. Locate the "Retention-Config", which lets you configure retention for the data, and follow the instructions:

    • Unzip the opticdl_datasets_ret_profiles_v1_2024-01-16.zip and locate the DataDog_ret_profiles folder containing the "Retention Profiles":

      • datadog_retention_raw.json

      • datadog_retention_hourly.json

      • datadog_retention_daily.json

How do we import the Datadog task flows and datasets in Vertica?
  1. Import the datasets and task flows you plan to use in Vertica by following the OPTIC DL documentation.

    • Import the DataDog_host_set dataset:

      • datadog_host_task_flow.json

      • opsb_ext_datadog_host_raw.json

      • opsb_ext_datadog_host_1h.json

      • opsb_ext_datadog_host_1d.json


Dynatrace

How do we import the Dynatrace retention profiles in Vertica?
  1. Import the retention profiles in Vertica by following the OPTIC DL documentation for the following service:

    • Product Name: OPTIC Data Lake

    • Product Version: your OPTIC Data Lake version

    • Service Name: OPTICDL Open Data Ingestion Administration API

    • Service Version: V3

  2. Download the Optic Data Lake “Datasets” and “Retention Profiles” from https://download.zigiwave.com/zigiops/opticdl/opticdl_datasets_ret_profiles_v1_2024-01-16.zip.

  3. Locate the "Retention-Config", which lets you configure retention for the data, and follow the instructions:

    • Unzip the opticdl_datasets_ret_profiles_v1_2024-01-16.zip and locate the Dynatrace_ret_profiles folder containing the "Retention Profiles":

      • dynatrace_retention_raw.json

      • dynatrace_retention_hourly.json

      • dynatrace_retention_daily.json

How do we import the Dynatrace task flows and datasets in Vertica?
  1. Import the datasets and task flows you plan to use in Vertica by following the OPTIC DL documentation.

    • Import the Dynatrace_app_browser_set dataset:

      • dynatrace_app_browser_task_flow.json

      • opsb_ext_dynatrace_app_browser_raw.json

      • opsb_ext_dynatrace_app_browser_1h.json

      • opsb_ext_dynatrace_app_browser_1d.json

    • Import the Dynatrace_app_user_type_set dataset:

      • dynatrace_app_user_type_task_flow.json

      • opsb_ext_dynatrace_app_user_type_raw.json

      • opsb_ext_dynatrace_app_user_type_1h.json

      • opsb_ext_dynatrace_app_user_type_1d.json

    • Import the Dynatrace_disk_set dataset:

      • dynatrace_disk_task_flow.json

      • opsb_ext_dynatrace_disk_raw.json

      • opsb_ext_dynatrace_disk_1h.json

      • opsb_ext_dynatrace_disk_1d.json

    • Import the Dynatrace_host_set dataset:

      • dynatrace_host_task_flow.json

      • opsb_ext_dynatrace_host_raw.json

      • opsb_ext_dynatrace_host_1h.json

      • opsb_ext_dynatrace_host_1d.json

    • Import the Dynatrace_network_set dataset:

      • dynatrace_netif_task_flow.json

      • opsb_ext_dynatrace_netif_raw.json

      • opsb_ext_dynatrace_netif_1h.json

      • opsb_ext_dynatrace_netif_1d.json

    • Import the Dynatrace_service_set dataset:

      • dynatrace_service_task_flow.json

      • opsb_ext_dynatrace_service_raw.json

      • opsb_ext_dynatrace_service_1h.json

      • opsb_ext_dynatrace_service_1d.json


New Relic

How do we import the New Relic retention profiles in Vertica?
  1. Import the retention profiles in Vertica by following the OPTIC DL documentation for the following service:

    • Product Name: OPTIC Data Lake

    • Product Version: your OPTIC Data Lake version

    • Service Name: OPTICDL Open Data Ingestion Administration API

    • Service Version: V3

  2. Download the Optic Data Lake “Datasets” and “Retention Profiles” from https://download.zigiwave.com/zigiops/opticdl/opticdl_datasets_ret_profiles_v1_2024-01-16.zip.

  3. Locate the "Retention-Config", which lets you configure retention for the data, and follow the instructions:

    • Unzip the opticdl_datasets_ret_profiles_v1_2024-01-16.zip and locate the NewRelic_ret_prof folder containing the "Retention Profiles":

      • newrelic_retention_raw.json

      • newrelic_retention_hourly.json

      • newrelic_retention_daily.json

How do we import the New Relic task flows and datasets in Vertica?
  1. Import the datasets and task flows you plan to use in Vertica by following the OPTIC DL documentation.

    • Import the NewRelic_app_set dataset:

      • newrelic_app_task_flow.json

      • opsb_ext_newrelic_app_raw.json

      • opsb_ext_newrelic_app_1h.json

      • opsb_ext_newrelic_app_1d.json

    • Import the NewRelic_host_set dataset:

      • newrelic_app_host_task_flow.json

      • opsb_ext_newrelic_app_host_raw.json

      • opsb_ext_newrelic_app_host_1h.json

      • opsb_ext_newrelic_app_host_1d.json


SolarWinds

How do we import the SolarWinds retention profiles in Vertica?
  1. Import the retention profiles in Vertica by following the OPTIC DL documentation for the following service:

    • Product Name: OPTIC Data Lake

    • Product Version: your OPTIC Data Lake version

    • Service Name: OPTICDL Open Data Ingestion Administration API

    • Service Version: V3

  2. Download the Optic Data Lake “Datasets” and “Retention Profiles” from https://download.zigiwave.com/zigiops/opticdl/opticdl_datasets_ret_profiles_v1_2024-01-16.zip.

  3. Locate the "Retention-Config", which lets you configure retention for the data, and follow the instructions:

    • Unzip the opticdl_datasets_ret_profiles_v1_2024-01-16.zip and locate the Solarwinds_ret_profiles folder containing the "Retention Profiles":

      • solarwinds_retention_raw.json

      • solarwinds_retention_hourly.json

      • solarwinds_retention_daily.json

How do we import the SolarWinds task flows and datasets in Vertica?
  1. Import the datasets and task flows you plan to use in Vertica by following the OPTIC DL documentation.

    • Import the Solarwinds_cluster_set dataset:

      • solarwinds_cluster_task_flow.json

      • opsb_ext_solarwinds_cluster_raw.json

      • opsb_ext_solarwinds_cluster_1h.json

      • opsb_ext_solarwinds_cluster_1d.json

    • Import the Solarwinds_esxi_set dataset:

      • solarwinds_esxi_task_flow.json

      • opsb_ext_solarwinds_esxi_raw.json

      • opsb_ext_solarwinds_esxi_1h.json

      • opsb_ext_solarwinds_esxi_1d.json

    • Import the Solarwinds_host_set dataset:

      • solarwinds_host_task_flow.json

      • opsb_ext_solarwinds_host_raw.json

      • opsb_ext_solarwinds_host_1h.json

      • opsb_ext_solarwinds_host_1d.json

    • Import the Solarwinds_vm_set dataset:

      • solarwinds_vm_task_flow.json

      • opsb_ext_solarwinds_vm_raw.json

      • opsb_ext_solarwinds_vm_1h.json

      • opsb_ext_solarwinds_vm_1d.json

Connected System Configuration

Follow the steps below to add your instance as a connected system.

  1. Log in to your ZigiOps instance.

  2. Navigate to Connected Systems → Add New System → Optic DL and configure the following parameters: 

    • URL → This is the URL of the Optic DL instance. For example, https://opticdl.example.com.

    • Username → This is the username that the Integration Hub will use to authenticate and obtain a token from the Micro Focus OPTIC DL API.

    • Password → This is the password that the Integration Hub will use to authenticate and obtain a token from the Micro Focus OPTIC DL API.

    • Tenant Name → This is the name of the Micro Focus OPTIC DL tenant.

    • Proxy Settings → Enables the use of a proxy server.

  3. Review the settings and, if they are correct, click the Save button to add the system. A minimal sketch of how these parameters are typically used to authenticate is shown below.
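For illustration, the Python sketch below shows how the URL, Username, Password, and Tenant Name parameters map onto a token request against the OPTIC DL API. ZigiOps performs this exchange itself, and the token endpoint and request body here are assumptions rather than the documented API, so treat it purely as a reading aid.

    # Illustration only: how the connected-system parameters are typically used.
    # ZigiOps obtains the token itself; the /auth/token path and request body
    # below are assumptions, not the documented OPTIC DL endpoint.
    import requests

    url = "https://opticdl.example.com"   # URL parameter
    username = "integration_user"         # Username parameter
    password = "changeit"                 # Password parameter
    tenant = "provider"                   # Tenant Name parameter

    resp = requests.post(
        f"{url}/auth/token",              # hypothetical token endpoint
        json={"username": username, "password": password, "tenant": tenant},
        timeout=30,
    )
    resp.raise_for_status()
    token = resp.json().get("token")

    # Subsequent API calls would then carry the token, for example:
    # requests.get(f"{url}/...", headers={"Authorization": f"Bearer {token}"})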

Related Templates
