We’ll start with the Data Integrator adapter as the structure. Remember, this was one of the examples in Part 1 that listed the server, then the different management silos that I want to use as breakouts for different performance/availability indicators.
Below is a sample of the setup within the Data Integration editor along with what the running adapter looks like.
Data Integrator and the setup
Example of running adapter
The next step is to create a view under the Service Model; let’s call it Server View. Right-click on Server View and select Service Configuration, Create. This launches Service Configuration Manager, the tool that sets up automated view building and correlation rules for elements. A couple of details before we get started: highlight (New Definition), and on the right you have a Reconciliation Policy option. Delete Before Execute is a common one to use when first setting up a Service Configuration job; it ensures the view is built from scratch on each run, which makes it easier to troubleshoot. When you build more advanced jobs with multiple definitions, it becomes more important to turn this off and on depending on what you are troubleshooting. Merge is more of an add/remove: if an element in the final view is not updated, it gets removed. None only ever adds and never removes. Again, use Delete Before Execute, sometimes referred to as DBE.
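To make the three policies concrete, here is a small conceptual sketch (this is not product code; the function and element names are assumptions) of how each policy reconciles the elements already in the view with the elements a job run generates:

```python
# Conceptual sketch, NOT Service Configuration Manager code: how the
# three Reconciliation Policies treat elements already in the view
# ("existing") versus elements produced by the current run ("generated").

def reconcile(existing, generated, policy):
    if policy == "Delete Before Execute":
        # The view is cleared first, then rebuilt: only this run survives.
        return set(generated)
    if policy == "Merge":
        # Incremental add/remove: anything not refreshed by this run is
        # removed, so the end state matches the generated set as well,
        # but without wiping the view first.
        return (existing | set(generated)) - (existing - set(generated))
    if policy == "None":
        # Add forever, never remove.
        return existing | set(generated)
    raise ValueError(f"unknown policy: {policy}")

view = {"srv-01", "srv-02", "old-element"}   # current view contents
run = {"srv-01", "srv-02", "srv-03"}         # elements this run produces
```

Note that Delete Before Execute and Merge end up with the same set here; the practical difference is that DBE rebuilds from an empty view, which is why it is the easier policy to troubleshoot with.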
Next, highlight Structure and point it to the Server folder under the running Data Integrator adapter under Elements. At this point, selecting Save and Generate should make Server View build out a mirror of the adapter's server structure.
Now it is time to add the first management data source. For this example, we have a management tool that breaks out the server into a few different types of monitoring.
Select the Monitor Categories as the source. We then narrow what we are looking at by setting a match rule on the source to only match elements of class “net_internet”. The class will be different in your implementation; I selected this one because it was a networking type of class and I liked the icon. When editing the match rule, make sure you remove the .* match expression, then add the appropriate class name.
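The effect of swapping out the default expression can be sketched with ordinary regular expressions (the class list below is made up, apart from “net_internet” from this example):

```python
import re

# Illustrative element classes as a source adapter might report them;
# all names except "net_internet" are assumptions for the sketch.
classes = ["net_internet", "net_lan", "server", "net_internet"]

# The default match expression ".*" matches every class...
assert all(re.fullmatch(r".*", c) for c in classes)

# ...so replace it with the specific class name to restrict the source.
matched = [c for c in classes if re.fullmatch(r"net_internet", c)]
```

Leaving the .* in place would pull every element under Monitor Categories into the match, which is why removing it is the important step.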
The next thing we need to do is update the default join rule; for this exercise we are going to do a property match. The default join is based on the names of the elements, but we want to do something a little different: match below the server, specifically on the “Network” folder, and only when the server names match. We accomplish this with the following settings on the source join rule.
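The join logic can be sketched as follows (again a conceptual illustration, not Service Configuration Manager code; the server names and data layout are hypothetical):

```python
# Conceptual sketch: the join homes a source element under the "Network"
# folder that sits below the structure-side server of the same name.
# Server names and folder lists here are assumptions.

structure = {
    # server name -> category folders built out under that server
    "srv-01": ["Network", "Help Desk", "Change Management"],
    "srv-02": ["Network", "Help Desk", "Change Management"],
}

def join_target(source_server, category="Network"):
    """Return the (server, folder) pair a source element is homed under,
    or None when no server name matches."""
    if category in structure.get(source_server, []):
        return (source_server, category)
    return None
```

A source element reported for srv-01 lands under srv-01's Network folder; an element whose server name matches nothing in the structure is simply not joined.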
When the job is run again, the Network folder below the server should now have an alarm associated with it and change color, as shown below.
For this example, the assumption is that other adapters are configured for the other data sources, such as Help Desk and Change Management. The next step is to add the additional sources and follow the same steps, except replacing “Network” in the property match with the corresponding folder. Once completed, each data source should be appropriately homed under the corresponding server in the specific category/folder.
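The repetition boils down to one definition per source with only the target folder changed, something like this hypothetical mapping (the “Monitoring” source name is an assumption; Help Desk and Change Management come from the example):

```python
# Illustrative sketch: each additional source definition reuses the same
# property-match join; only the category folder is swapped out.
category_for_source = {
    "Monitoring": "Network",
    "Help Desk": "Help Desk",
    "Change Management": "Change Management",
}

def folder_for(source_name):
    # One definition per data source; the join rules differ only in folder.
    return category_for_source[source_name]
```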
There are several other options available, such as generating additional structure from the matched source element, automatically adding algorithms, and so on. In Part 3, we will cover algorithms a little and how to apply them within the Service Configuration job.
Disclaimer: As with everything else at NetIQ Cool Solutions, this content is definitely not supported by NetIQ, so Customer Support will not be able to help you if it has any adverse effect on your environment. It just worked for at least one person, and perhaps it will be useful for you too. Be sure to test in a non-production environment.