Overview & Smoke Test

This overview should help you understand how the data sources relate to each other and which installation steps are required, and it gives hints on how to verify that data is flowing successfully.

Overview

Recommended order for setting up the data sources

  1. Configure the Elastic index (template), lifecycle policy, and alias

  2. Import the Kibana dashboards, queries, and index patterns

  3. Configure the Logstash pipelines

  4. Perform further activities outside of Elasticsearch
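The first two steps can be sketched as API calls against Elasticsearch and Kibana, here using the ct-monitoring data source as an example. The URLs, credentials, and JSON file names are assumptions for illustration; the actual request bodies come from the delivered dev-console .txt files. The commands are printed first so they can be reviewed before execution.

```shell
#!/bin/sh
# Sketch of the setup order as Elasticsearch / Kibana API calls for the
# ct-monitoring data source. URLs and file names are assumptions -- the
# request bodies come from the delivered dev-console .txt files.
ES="${ES:-https://elastic.example.com:9200}"        # assumed Elasticsearch URL
KIBANA="${KIBANA:-https://kibana.example.com:5601}" # assumed Kibana URL

print_setup_commands() {
  # 1. Lifecycle policy and index template (which also defines the alias)
  echo "curl -X PUT '$ES/_ilm/policy/ct-monitoring' -H 'Content-Type: application/json' -d @ct-monitoring-ilm.json"
  echo "curl -X PUT '$ES/_index_template/ct-monitoring' -H 'Content-Type: application/json' -d @ct-monitoring-template.json"
  # 2. Kibana saved objects (dashboards, queries, index patterns)
  echo "curl -X POST '$KIBANA/api/saved_objects/_import?overwrite=true' -H 'kbn-xsrf: true' -F file=@ct-monitoring/export.ndjson"
  # 3./4. Logstash pipelines and product-specific activities are
  #       configured outside of these APIs.
}

print_setup_commands   # review the output, then: print_setup_commands | sh
```

Add authentication (e.g. `curl -u user:password`) as required by your cluster before actually executing the printed commands.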

Assignment of the delivered files to the data sources

Data sources & files

  ct-analytics
    dev-console: ct-analytics.txt
    kibana:      ct-analytics/export.ndjson
    logstash:    ct-analytics
    more:        map.apps bundle upload, installation of monitor-analytics-webapp

  ct-arcgis
    dev-console: ct-arcgis-logfile.txt
    kibana:      ct-arcgis/export.ndjson
    logstash:    ct-arcgis-logfile
    more:        configuration of Filebeat on ArcGIS hosts

  ct-fme
    dev-console: ct-fme-*.txt
    kibana:      ct-fme/export.ndjson
    logstash:    ct-fme-*
    more:        set the FME environment parameters used by Logstash

  ct-log
    dev-console: ct-log.txt
    kibana:      ct-log/export.ndjson
    logstash:    ct-log
    more:        activate Log4j logging via GELF in con terra products

  ct-monitoring
    dev-console: ct-monitoring.txt
    kibana:      ct-monitoring/export.ndjson
    logstash:    ct-monitoring
    more:        activate transfer of monitoring events from monitor-webapp
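Since every data source ships a Kibana export.ndjson, all saved objects can be imported in one loop via Kibana's saved objects import API. The Kibana URL below is an assumption, and authentication must be added as needed; the commands are printed for review before execution.

```shell
#!/bin/sh
# Import the delivered export.ndjson of every data source into Kibana.
# The Kibana URL is an assumption; add -u user:password as needed.
KIBANA="${KIBANA:-https://kibana.example.com:5601}"

print_import_commands() {
  for src in ct-analytics ct-arcgis ct-fme ct-log ct-monitoring; do
    echo "curl -X POST '$KIBANA/api/saved_objects/_import?overwrite=true'" \
         "-H 'kbn-xsrf: true' -F 'file=@$src/export.ndjson'"
  done
}

print_import_commands   # review the output, then: print_import_commands | sh
```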

Smoke Test

  1. Check the log file of the Logstash process.

    • Are there any error messages related to the connection to Elasticsearch?

    • Have all defined Logstash pipelines been started? (e.g. Pipeline started successfully {:pipeline_id=>"ct-monitoring"})

  2. Check the views in Kibana.

    • Is data already available in the Discover view for the desired data source? (Note: check the selected time range of the query.)

    • Do the dashboards already show data?
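The first smoke-test step can be partly automated with a small shell check on the Logstash log file. The log path and pipeline names in the example call are assumptions; adjust them to your installation.

```shell
#!/bin/sh
# Smoke test on a Logstash log file: are there possible Elasticsearch
# connection problems, and has every expected pipeline been started?
check_logstash_log() {
  log="$1"; shift
  # Any warnings or errors mentioning Elasticsearch hint at connection issues.
  if grep -i 'elasticsearch' "$log" | grep -qiE 'error|warn'; then
    echo "WARN possible Elasticsearch connection problems, check $log"
  fi
  # Look for the startup message of each expected pipeline.
  for p in "$@"; do
    if grep -qF "Pipeline started successfully {:pipeline_id=>\"$p\"" "$log"; then
      echo "OK   pipeline $p started"
    else
      echo "FAIL pipeline $p not started"
    fi
  done
}

# Example call (assumed default log location and pipeline names):
# check_logstash_log /var/log/logstash/logstash-plain.log ct-log ct-monitoring
```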