ArcGIS Enterprise log data

Overview

(Figure: ArcGIS Enterprise log data collection)

Procedure

  1. Configure Filebeat on the ArcGIS Enterprise host to poll the log files on a regular basis (see below)

  2. Verify that the ArcGIS log level is set correctly (see below)

Filebeat configuration - Notes

Filebeat must be installed on each ArcGIS host from which log data is to be collected. Filebeat 7.x is currently supported; no problems have been found with Filebeat 8.x so far.

Verify ArcGIS Log Level is set correctly

The evaluations and analyses in the service.monitor dashboards require a log level of Fine on the ArcGIS Server. This value can be set via the ArcGIS Server Manager under Logs > Settings.
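For automated setups, the log level can also be changed through the ArcGIS Server Administrator API (logs/settings/edit endpoint). The following is a minimal sketch in Python; the host name and the token are placeholders, and token acquisition is left out:

```python
# Sketch: build a request for the ArcGIS Server Administrator API
# logs/settings/edit endpoint to set the log level to FINE.
# The admin URL and token below are placeholders for your environment.
from urllib.parse import urlencode


def build_log_settings_request(admin_url: str, token: str, log_level: str = "FINE"):
    """Return (url, form_body) for the logs/settings/edit endpoint."""
    url = f"{admin_url}/logs/settings/edit"
    body = urlencode({
        "logLevel": log_level,  # the dashboards require at least FINE
        "f": "json",
        "token": token,
    })
    return url, body


# The actual POST would then be sent with urllib.request or a similar client:
url, body = build_log_settings_request("https://myserver:6443/arcgis/admin", "TOKEN")
```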

Configuration of Filebeat with Elasticsearch as output

ArcGIS log files can be shipped by Filebeat to Logstash or to an Elasticsearch ingest pipeline. To dispense with a Logstash installation, an Elasticsearch index with an ingest pipeline can also be used directly with the following configuration.

###################### Filebeat Configuration Example #########################
env: 'production'

arcgis.base.path: 'c:\arcgisserver\logs'
#arcgis.base.path: 'c:\arcgisportal\logs'
#arcgis.base.path: 'c:\arcgisdatastore\logs'
#arcgis.base.path: '/var/log/arcgis'

filebeat.inputs:
  # ============================== ArcGIS inputs ===============================
  - type: filestream
    id: "arcgis_logfiles-server"
    # Change to "true" to enable this input configuration.
    enabled: true
    # Paths that should be crawled and fetched for ArcGIS Enterprise logs. Glob based paths.
    # Adapt these paths/patterns according to your environment
    paths:
      - ${arcgis.base.path}\*\server\*.log
      - ${arcgis.base.path}\*\services\*\*.log
      - ${arcgis.base.path}\*\services\*\*\*.log
      - ${arcgis.base.path}\*\services\System\*\*.log
    fields_under_root: true
    fields:
      labels:
        env: ${env}
        source: 'arcgis-server'
    ### Multiline options
    # Note: This only needs to be changed if the ArcGIS Server log file structure changes
    parsers:
      - multiline:
          type: "pattern"
          pattern: '^<Msg([^>]*?)>(.*)'
          negate: true
          match: "after"
          skip_newline: false

  - type: filestream
    id: "arcgis_logfiles-portal"
    # Change to "true" to enable this input configuration.
    enabled: true
    # Paths that should be crawled and fetched for ArcGIS Enterprise logs. Glob based paths.
    # Change these paths/patterns according to your environment
    paths:
      - ${arcgis.base.path}\*\portal\*.log
    fields_under_root: true
    fields:
      labels:
        env: ${env}
        source: 'arcgis-portal'
    ### Multiline options
    # Note: This only needs to be changed if the ArcGIS Server log file structure changes
    parsers:
      - multiline:
          type: "pattern"
          pattern: '^<Msg([^>]*?)>(.*)'
          negate: true
          match: "after"
          skip_newline: false

  - type: filestream
    id: "arcgis_logfiles-datastore"
    # Change to "true" to enable this input configuration.
    enabled: true
    # Paths that should be crawled and fetched for ArcGIS Enterprise logs. Glob based paths.
    # Change these paths/patterns according to your environment
    paths:
      - ${arcgis.base.path}\*\server\*.log
    fields_under_root: true
    fields:
      labels:
        env: ${env}
        source: 'arcgis-datastore'
    ### Multiline options
    # Note: This only needs to be changed if the ArcGIS Server log file structure changes
    parsers:
      - multiline:
          type: "pattern"
          pattern: '^<Msg([^>]*?)>(.*)'
          negate: true
          match: "after"
          skip_newline: false

# ============================== Filebeat modules ==============================

filebeat.config.modules:
  # Glob pattern for configuration loading
  path: ${path.config}/modules.d/*.yml
  # Set to true to enable config reloading
  reload.enabled: false
  # Period on which files under path should be checked for changes
  reload.period: 10s

# ======================= Elasticsearch template setting =======================

# We handle ILM and templates in Elasticsearch
setup.ilm.enabled: false
setup.template.enabled: false

# ================================== Outputs ===================================

# --------------------------- Elasticsearch output -----------------------------
output.elasticsearch:
  # Index can be defined as index pattern, when ILM is activated in Elasticsearch.
  index: "ct-arcgis-logfile"
  # The name of the Ingest pipeline processing the Filebeat input.
  pipeline: "ct-monitor-arcgis-logfile"
  # Elasticsearch host and port
  hosts: ["https://localhost:9200"]
  # Elasticsearch username
  username: ""
  # Elasticsearch password
  password: ""
  ssl:
    enabled: true
    # Elasticsearch SSL fingerprint
    ca_trusted_fingerprint: ""

# ================================= Processors =================================
# The following section needs no adaptation
processors:
  - add_host_metadata:
      when.not.contains.tags: forwarded
  - add_cloud_metadata: ~
  - add_docker_metadata: ~
  - add_kubernetes_metadata: ~
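
The multiline pattern used in the inputs above treats every line that starts with <Msg …> as the beginning of a new log event; with negate: true and match: after, all other lines are appended to the preceding event. This behavior can be checked against sample lines, e.g. with Python's re module (the sample lines below are illustrative, not real log output):

```python
import re

# Same pattern as in the Filebeat multiline parsers above.
pattern = re.compile(r'^<Msg([^>]*?)>(.*)')

lines = [
    '<Msg time="2024-01-01T00:00:00" type="SEVERE">Service failed',  # new event
    'java.lang.Exception: stack trace line',                         # continuation
    '<Msg time="2024-01-01T00:00:05" type="FINE">Request handled',   # new event
]

# Lines that match start a new event; non-matching lines are appended
# to the previous event ("negate: true" + "match: after").
starts_event = [bool(pattern.match(line)) for line in lines]
print(starts_event)  # [True, False, True]
```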

Configuration of Filebeat with Elasticsearch as output, also supporting FME Flow

If FME Flow log files are to be read in with Filebeat in addition to the ArcGIS log files, the filebeat.yml in the filebeat/arcgis-fme-logfile folder can be used for this purpose; it combines the two pipelines.
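The ca_trusted_fingerprint used in the Elasticsearch output above is the hex-encoded SHA-256 hash of the CA certificate in DER form (Elasticsearch prints it on first startup). If only the PEM file is at hand, the fingerprint can be computed with the Python standard library; this is a sketch, and the certificate path is a placeholder:

```python
# Sketch: compute the SHA-256 fingerprint of a PEM-encoded CA certificate,
# as expected by Filebeat's ssl.ca_trusted_fingerprint setting.
import hashlib
import ssl


def ca_fingerprint_from_pem(pem_text: str) -> str:
    """Hex-encoded SHA-256 fingerprint of the certificate's DER form."""
    der = ssl.PEM_cert_to_DER_cert(pem_text)
    return hashlib.sha256(der).hexdigest()


# Usage (the path is a placeholder for your environment):
# with open("/etc/elasticsearch/certs/http_ca.crt") as f:
#     print(ca_fingerprint_from_pem(f.read()))
```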

Configuration of Filebeat with Logstash as output (deprecated)

(Figure: ArcGIS Enterprise log data collection with Logstash)

The Filebeat configuration is then based on the template filebeat/arcgis-logfile/filebeat.yml.

###################### Filebeat Configuration Example (Logstash) #########################
output.logstash:
  hosts: ["logstash.host:5604"]

Set the value of fields.labels.source to one of arcgis-server, arcgis-portal, or arcgis-datastore to enable better filtering in Kibana. The same applies to fields.labels.env, which distinguishes between different stages.