Monitoring events
Monitoring executions performed by service.monitor Monitoring can be stored in the local database. Alternatively, they can be sent directly to the Analytics Logstash pipeline via the GELF protocol.
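For orientation, a GELF event is a plain JSON document; the standard fields are `version`, `host`, `short_message` and `timestamp`, while additional fields carry an underscore prefix. The field values and the underscore-prefixed field names below are hypothetical illustrations, not the exact fields emitted by service.monitor:

```json
{
  "version": "1.1",
  "host": "monitor-host-01",
  "short_message": "Monitoring job finished",
  "timestamp": 1700000000.0,
  "level": 6,
  "_job_name": "geoservice-check",
  "_status": "OK"
}
```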
Procedure
- Execute the statements from dev-console/ct-monitoring.txt in the Kibana Dev Console.
- Import Kibana dashboards, queries and index patterns from file kibana/ct-monitoring/export.ndjson.
- Set up the ingest pipelines from ingest/ct-monitor-monitoring-*.txt (see below).
- Configure the /monitor web application.
Publish Elastic Ingest Pipeline
The ingest pipelines read the target host and ArcGIS Server-specific information from the requested monitoring URL to provide convenient filtering options. Furthermore, some time/date information is extracted.
The contents of
- ingest/ct-monitor-temporal.txt
- ingest/ct-monitor-monitoring-arcgis-serviceinfo.txt
- ingest/ct-monitor-monitoring-host.txt
- ingest/ct-monitor-monitoring.txt
have to be sent via the Kibana 'Dev Tools' > 'Console'.
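To illustrate the kind of request those files contain: an ingest pipeline is published from the Dev Tools console with a PUT request. The pipeline name and processors below are a hypothetical sketch of extracting host and date information, not the shipped pipeline definitions:

```
PUT _ingest/pipeline/ct-monitor-example
{
  "description": "Sketch: extract host and date fields (hypothetical)",
  "processors": [
    {
      "grok": {
        "field": "url",
        "patterns": ["https?://%{HOSTNAME:monitor.host}%{GREEDYDATA}"]
      }
    },
    {
      "date": {
        "field": "timestamp",
        "formats": ["UNIX_MS"],
        "target_field": "@timestamp"
      }
    }
  ]
}
```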
Configuration of the /monitor web application
Add the following changes to the application.properties file of service.monitor Monitoring:
#### Storing monitored events in elasticsearch via a logstash pipeline ####
# This is very useful if you want to leverage kibana widgets and dashboards
event.storage.elastic.enabled=true
event.storage.elastic.logstash.host=localhost
event.storage.elastic.logstash.port=12203
The configuration above assumes that the Logstash process runs on the same machine as the Tomcat instance hosting the /monitor application.
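As a sketch of the receiving side (the actual Analytics pipeline configuration may differ), a minimal Logstash configuration that listens for GELF on the port configured above and forwards events to Elasticsearch could look like:

```
input {
  # GELF input plugin; listens on UDP by default
  gelf {
    port => 12203
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
```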