Prepare Elasticsearch Stack via Python

Indices, templates, ingest pipelines, index lifecycle policies and dashboard objects can easily be created with the provided Python script, which reduces much of the manual setup work.

Python environment and wheel package

The script is delivered as a wheel package that needs to be installed in a local Python 3 environment. When choosing where to run the script, make sure the target Elasticsearch instance is reachable via HTTP. A detailed description of how to set up the wheel package can be found in resources/analytics/python/monitor_setup/README.md.
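
The following sketch shows how the installation could look on a Windows host; the virtual environment name and the wheel file name are examples only and depend on the delivered version.

Example installation of the wheel package into a virtual environment
python -m venv monitor-setup-env
monitor-setup-env\Scripts\activate
pip install monitor_setup-<version>-py3-none-any.whl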

Configuration of parameters

After the Python and Elasticsearch environments are set up, the script needs to be configured with the necessary parameters.

This can be done via a JSON file with the following structure.

Alternatively, the values for common, elasticsearch, kibana and proxy can also be set as system environment variables. The variable names follow a schema derived from the path within the JSON structure, e.g. ELASTICSEARCH_URL for elasticsearch.url; an example is shown after the JSON configuration below.

Configuration for setting up Elastic and service.monitor
{
  "common": {
    "local_imports_dir": "<PATH TO resources/analytics/elasticsearch>",
    "overwrite_objects": false
  },
  "elasticsearch": {
    "url": "http://elastic-host.example.com:9200",
    "username": "elastic",
    "password": "<elastic_pwd>"
  },
  "kibana": {
    "url": "http://kibana-host.example.com:5601"
  },
  "proxy": {
    "url": "http://proxy-host.example.com",
    "use_forwarding_for_https": false
  },
  "spaces": {
    "default": {
      "id": "ct-monitor-test",
      "name": "service.monitor",
      "description": "con terra service.monitor - monitoring, operations, analytics",
      "color": "#ffffff",
      "initials": "CT",
      "imageUrl": "data:image/png;base64,iVBORw0KG....."
    },
    "FME": {
      "id": "ct-monitor",
      "name": "service.monitor for FME",
      "description": "con terra service.monitor for FME",
      "color": "#ffffff",
      "initials": "CT",
      "imageUrl": "data:image/png;base64,iVBORw0KG....."
    }
  }
}
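
As an alternative to the JSON file, the same connection parameters could, for example, be set as environment variables on the Windows command line as shown below (on Linux, export is used instead of set). The variable names are the ones listed in the parameter overview further down.

Setting connection parameters as environment variables (Windows command line)
set ELASTICSEARCH_URL=http://elastic-host.example.com:9200
set ELASTICSEARCH_USER=elastic
set ELASTICSEARCH_PASSWORD=<elastic_pwd>
set KIBANA_URL=http://kibana-host.example.com:5601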

common.local_imports_dir (environment variable: COMMON_LOCAL_IMPORTS_DIR)
Path to the configuration files.

common.overwrite_objects (environment variable: COMMON_OVERWRITE_OBJECTS)
If this flag is set to true, all objects in Elasticsearch and Kibana are overwritten. Otherwise, only new objects are added.

elasticsearch.url (environment variable: ELASTICSEARCH_URL)
URL of the Elasticsearch instance.

elasticsearch.user (environment variable: ELASTICSEARCH_USER)
Name of the user.

elasticsearch.password (environment variable: ELASTICSEARCH_PASSWORD)
Password of the user.

kibana.url (environment variable: KIBANA_URL)
URL of the Kibana instance.

proxy.url, optional (environment variable: PROXY_URL)
URL of a proxy server, if one is required to access Elasticsearch. Once a proxy URL is specified, all traffic is routed through the proxy.

proxy.use_forwarding_for_https, optional (environment variable: PROXY_USE_FORWARDING_FOR_HTTPS)
Whether requests should be forwarded to the HTTPS proxy or a TLS tunnel should be established using the HTTP CONNECT method. In standard scenarios this option does not need to be changed. Forwarding can be used, for example, if the proxy does not support the HTTP CONNECT method.

The script reads the object data provided with this installation and sends it to Elasticsearch and Kibana. The configuration parameter common.local_imports_dir specifies the path to this objects directory.

Executing the script

The script is executed on the command line of your Python environment. If only FME Flow related objects are to be installed, the additional parameter -t is used.

Command line call of the script
python -m monitor_setup -c C:\data\config.json -a full
Command line call of the script for FME only
python -m monitor_setup -c C:\data\config.json -a full -t FME

Watch the command line output to verify that the operation succeeded.
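
In addition to the console output, the created objects can be spot-checked directly against Elasticsearch, for example by listing the installed index templates and ingest pipelines with curl. The calls below are only a sketch; adjust host, credentials and, depending on the Elasticsearch version, the endpoints to your environment.

Optional verification of the created objects
curl -u elastic:<elastic_pwd> http://elastic-host.example.com:9200/_index_template
curl -u elastic:<elastic_pwd> http://elastic-host.example.com:9200/_ingest/pipeline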