
Use Case

Setting up a distributed monitoring configuration between a branch location and the central location of our example company.

Example Setup

The instructions are based on the example setup as described below.

Satellite SONARPLEX

This is a branch location in Cologne which is managed remotely.

Location: Cologne
IP Address: 172.16.0.100

Central SONARPLEX

This is the network operation center where all alerts from the remote locations are collected.

Location: London
IP Address: 172.16.0.254
azeti Agent Port: 4192
azeti Agent Password: How-ToDM

The satellite device should send its status every 30 minutes and only HARD events should be sent.

The general procedure is as follows.

  1. Configure the Agent on the Central SONARPLEX
  2. Set the location identifier on the Satellite SONARPLEX
  3. Enable status and event delivery on the satellite
  4. Done.

Step-by-step guide

Central SONARPLEX configuration

First set the azeti Agent credentials.

  1. Open the administrative web interface (http://172.16.0.254:81) and choose Configuration :: Network
  2. Choose Agent Configuration
  3. Set Port to use: 4192
  4. Set Agent Password: How-ToDM
  5. Click the save button to save the configuration
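
Before moving on to the satellite, it can be useful to confirm that the agent port is actually reachable from the branch network. The following minimal Python sketch (our own example, not part of the SONARPLEX software; host and port are taken from the example setup above) simply tries to open a TCP connection to the configured agent port:

import socket

# Values from the example setup above
CENTRAL_HOST = "172.16.0.254"   # Central SONARPLEX (London)
AGENT_PORT = 4192               # azeti Agent port configured above

def agent_port_reachable(host, port, timeout=5.0):
    """Return True if a TCP connection to host:port can be established."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError as exc:
        print("Connection to %s:%s failed: %s" % (host, port, exc))
        return False

if __name__ == "__main__":
    if agent_port_reachable(CENTRAL_HOST, AGENT_PORT):
        print("azeti Agent port is reachable.")
    else:
        print("Check firewalls and the agent configuration on the Central SONARPLEX.")

Run the check from the satellite's network segment; if the connection fails there, status delivery will fail as well.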

Satellite SONARPLEX configuration

Now open the administrative web interface of the Satellite SONARPLEX and set the location identifier. 

  1. Choose Configuration :: My Properties and scroll down to the bottom
  2. Open the resource LOCATION
  3. Set Resource Value: Cologne
  4. Click the save button to save the configuration
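
The value set here later shows up as a prefix on all objects that the satellite reports to the central device (see the verification section below). As a rough illustration only (our own sketch; the exact naming is determined by the SONARPLEX software), the prefixing works roughly like this:

# LOCATION resource value set in the step above
LOCATION = "Cologne"

def prefixed_object_name(service_name):
    """Illustration: satellite objects appear on the NOC prefixed with the location identifier."""
    return "%s_%s" % (LOCATION, service_name)

print(prefixed_object_name("Software-Status"))   # e.g. Cologne_Software-Status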

Now enable the status and event delivery on the Satellite SONARPLEX.

  1. Choose Configuration :: Status Delivery Configuration for Distributed Monitoring and configure the settings as shown below.

     Deliver Status at regular base: enable
     Delivery Interval in Minutes (Heartbeat): 30
     Retry Frequency in Minutes: 5
     Retry Attempts: 3
     Deliver following Events immediately (Service Alerts, Host Alerts): enable
     Suppress SOFT State alerts: enable
     Deliver Performance Data: enable
     Do not store performance data locally: disable
     Destination Host (NOC): 172.16.0.254
     Destination Agent Port: 4192
     Destination Agent Password: How-ToDM
     Default service output upon transfer problems: Missing check result from satellite.
     Default service state upon transfer problems: UNKNOWN
     Retain state on outdated services: enable
     Debug Level: WARN

See the Status Delivery Configuration for Distributed Monitoring article for a detailed explanation of the particular settings.
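
To make the heartbeat and retry values concrete: with the configuration above, the satellite sends a full status every 30 minutes, and a failed delivery is retried every 5 minutes for at most 3 attempts. The following small Python sketch (our own illustration, not part of the product) prints such a schedule:

from datetime import datetime, timedelta

HEARTBEAT_MINUTES = 30   # Delivery Interval in Minutes (Heartbeat)
RETRY_MINUTES = 5        # Retry Frequency in Minutes
RETRY_ATTEMPTS = 3       # Retry Attempts

def print_schedule(start, failed_delivery_at=None):
    """Print the next regular heartbeats and, for a failed delivery, its retry times."""
    print("Regular heartbeats:")
    for i in range(1, 4):
        print("  ", start + timedelta(minutes=i * HEARTBEAT_MINUTES))
    if failed_delivery_at is not None:
        print("Retries after a failed delivery at", failed_delivery_at)
        for attempt in range(1, RETRY_ATTEMPTS + 1):
            print("  attempt", attempt, "at",
                  failed_delivery_at + timedelta(minutes=attempt * RETRY_MINUTES))

if __name__ == "__main__":
    now = datetime(2014, 2, 4, 17, 0)
    print_schedule(now, failed_delivery_at=now + timedelta(minutes=HEARTBEAT_MINUTES))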

Verifying the Setup

All satellite objects will appear on the destination device after a couple of minutes. The new objects are prefixed with the location identifier (see above). All incoming status information is submitted passively; see Reports :: Event Log in the user web interface. You should see messages similar to the excerpt shown below.

 

[2014-02-04 17:11:06] PASSIVE SERVICE CHECK: dm-test_-azeti-A-;dm-test_Software-Status;0;OK - FM-Usage: 0.22% - LOAD: 0.33
[2014-02-04 17:11:06] PASSIVE SERVICE CHECK: dm-test_-azeti-A-;dm-test_Monitoring;0;OK - 13 processes running, last status update 2 seconds ago
[2014-02-04 17:11:06] PASSIVE SERVICE CHECK: dm-test_-azeti-A-;dm-test_Watchdog-Status;0;OK - No problems within last recent hour
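
If you prefer to check programmatically that passive results from the satellite are arriving, you can count the PASSIVE SERVICE CHECK entries per host. The following Python sketch (our own example; the log line format is taken from the excerpt above, while the file path is an assumption and depends on your installation) illustrates the idea:

import re
from collections import Counter

# Path is an assumption for this example; adjust it to your installation.
EVENT_LOG = "event.log"

# Matches lines like:
# [2014-02-04 17:11:06] PASSIVE SERVICE CHECK: dm-test_-azeti-A-;dm-test_Software-Status;0;OK - ...
PASSIVE_CHECK = re.compile(
    r"^\[(?P<timestamp>[^\]]+)\] PASSIVE SERVICE CHECK: (?P<host>[^;]+);(?P<service>[^;]+);"
)

def count_passive_checks(path):
    """Count passive service check results per reporting host."""
    counts = Counter()
    with open(path, encoding="utf-8", errors="replace") as log:
        for line in log:
            match = PASSIVE_CHECK.match(line)
            if match:
                counts[match.group("host")] += 1
    return counts

if __name__ == "__main__":
    for host, count in count_passive_checks(EVENT_LOG).items():
        print("%s: %d passive results" % (host, count))

Since satellite objects carry the location prefix, a quick glance at the counters shows whether the branch location is delivering.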

 

Right after the delivery has been enabled, you can follow the internal processing of the distributed monitoring in the following log files:

  • Distributed Monitoring (NOC Processor) [process_satellite.log]
  • Distributed Monitoring (Satellite Processor) [send_status.log]
  • Distributed Monitoring Event Log (NOC) [event.log]
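
If a satellite does not show up as expected, scanning these log files for warnings and errors is usually the quickest way to locate the problem. A possible helper could look like the sketch below (our own example; the directory where the log files live depends on your installation and is only an assumption here):

import os

# File names are taken from the list above; the directory is an assumption.
LOG_DIR = "/var/log"   # adjust to your SONARPLEX installation
LOG_FILES = ["process_satellite.log", "send_status.log", "event.log"]
KEYWORDS = ("WARN", "ERROR")

def report_problems(log_dir, log_files, keywords):
    """Print every log line that contains one of the given keywords."""
    for name in log_files:
        path = os.path.join(log_dir, name)
        if not os.path.exists(path):
            print("%s: not found in %s (adjust LOG_DIR)" % (name, log_dir))
            continue
        with open(path, encoding="utf-8", errors="replace") as log:
            for line in log:
                if any(keyword in line for keyword in keywords):
                    print("%s: %s" % (name, line.rstrip()))

if __name__ == "__main__":
    report_problems(LOG_DIR, LOG_FILES, KEYWORDS)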