8.1 Configuring Data Sources

To manage Data Sources, view jobs, or modify ETL filters, click the Configuration tab in the left panel of the Admin Console. For more information about configuring data, review the following sections:

8.1.1 Managing Data Sources

A Data Source refers to a single AppManager repository, the related Data Mart database, and the connections to these servers and databases. Before you generate an AppManager report, you must identify the following:

  • Each AppManager repository from which you want to collect data. A given AppManager repository can be registered as a Data Source only once at any time.

  • The SQL Server for each AppManager repository.

  • The SQL Server or SQL Server failover cluster for each related Data Mart database.

  • A name for each Data Mart database.

  • Connection information for these servers and databases.

For more information about how to manage Data Sources, review the following sections:

Planning a New Data Source

When you add a Data Source on a SQL Server, a Data Mart is created on that server. Before you add a new Data Source, you must determine the following information about the SQL Server on which you want to create the Data Mart:

  • Is the SQL Server in a cluster or a non-cluster environment?

  • Does the SQL Server use named instances or the default instance?

This information determines how you must modify the MsDtsSrvr.ini.xml file so that the Data Source is created successfully. The MsDtsSrvr.ini.xml file is located in the DTS\Binn folder of the SQL Server installation folder. For example:

  • On a SQL Server 2005 computer: The MsDtsSrvr.ini.xml file is located in the SQL_Server_installation_folder\90\DTS\Binn folder.

  • On a SQL Server 2008 R2 computer: The MsDtsSrvr.ini.xml file is located in the SQL_Server_installation_folder\100\DTS\Binn folder.

NOTE: Ensure that the collation settings for each Data Mart SQL Server match the collation settings of the corresponding AppManager repository SQL Server.
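For reference, an unedited MsDtsSrvr.ini.xml file typically looks similar to the following sketch (the exact contents can vary by SQL Server version and installation). The <Folder xsi:type="SqlServerFolder"> element is the one you modify in the following procedures:

  <?xml version="1.0" encoding="utf-8"?>
  <DtsServiceConfiguration xmlns:xsd="http://www.w3.org/2001/XMLSchema"
                           xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
    <StopExecutingPackagesOnShutdown>true</StopExecutingPackagesOnShutdown>
    <TopLevelFolders>
      <Folder xsi:type="FileSystemFolder">
        <Name>File System</Name>
        <StorePath>..\Packages</StorePath>
      </Folder>
      <Folder xsi:type="SqlServerFolder">
        <Name>MSDB</Name>
        <ServerName>.</ServerName>
      </Folder>
    </TopLevelFolders>
  </DtsServiceConfiguration>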

Before creating a Data Mart, review the following sections:

Creating Data Marts in a Non-Cluster Environment

If the Data Mart SQL Server does not use the default instance of SQL Server, perform the following steps to create a single Data Mart or multiple Data Marts:

  1. On the SQL Server, locate the MsDtsSrvr.ini.xml file and open it in an editor.

  2. (Conditional) To create a single Data Mart on an instance of the Data Mart SQL Server, specify the servername\instance within the <ServerName>...</ServerName> tags as follows:

    <ServerName>servername\instance</ServerName>

    where instance refers to the server instance name on which the Data Mart is created.

  3. (Conditional) To create multiple Data Marts on different instances of the Data Mart SQL Server, copy the following lines into the <TopLevelFolders> element of the MsDtsSrvr.ini.xml file once for each Data Mart you want to create, and specify the corresponding values for the Folder_name and servername\instance variables (see the example after this procedure):

    <Folder xsi:type="SqlServerFolder">
      <Name>Folder_name</Name>
      <ServerName>servername\instance</ServerName>
    </Folder>

    where Folder_name refers to the SSIS packages folder and instance refers to the instance on which you are going to create the Data Mart.

  4. Restart the SQL Server Integration Services.
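For example, a minimal sketch of the <TopLevelFolders> section configured for two Data Marts on the hypothetical named instances SQLSRV01\DM1 and SQLSRV01\DM2 (the server, instance, and folder names are placeholders, not required values) might look like this:

  <TopLevelFolders>
    <Folder xsi:type="SqlServerFolder">
      <Name>DataMart1</Name>
      <ServerName>SQLSRV01\DM1</ServerName>
    </Folder>
    <Folder xsi:type="SqlServerFolder">
      <Name>DataMart2</Name>
      <ServerName>SQLSRV01\DM2</ServerName>
    </Folder>
  </TopLevelFolders>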

Creating Data Marts in a Cluster Environment

Perform the following steps to create a single Data Mart or multiple Data Marts:

  1. Install the SQL Server Integration Services.

  2. On the SQL Server, locate the MsDtsSrvr.ini.xml file and open it in an editor.

  3. (Conditional) To create a single Data Mart on an instance of the Data Mart SQL Server:

    1. (Conditional) If the Data Mart Virtual SQL Server is a named instance, then specify the servername\instance within the <ServerName>...</ServerName> tags as follows:

      <ServerName>virtual_sql_server\instance</ServerName>

      where instance refers to the server instance name on which the Data Mart is created.

    2. (Conditional) If the Data Mart Virtual SQL Server is a default instance, then specify the instance name within the <ServerName>...</ServerName> tags as follows:

      <ServerName>virtual_sql_server</ServerName>

  4. (Conditional) To create multiple Data Marts on different instances of the Data Mart SQL Server:

    1. Copy the following lines into the MsDtsSrvr.ini.xml file and specify the corresponding values for the Folder_name and servername\instance variables for every Data Mart you want to create (see the example after this procedure):

      <Folder xsi:type="SqlServerFolder">
        <Name>Folder_name</Name>
        <ServerName>servername\instance</ServerName>
      </Folder>

      where Folder_name refers to the SSIS packages folder and instance refers to the instance on which you are going to create the Data Mart.

    2. (Conditional) If the Data Mart Virtual SQL Server is a named instance, then specify the servername\instance within the <ServerName>...</ServerName> tags as follows:

      <ServerName>virtual_sql_server\instance</ServerName>

      where instance refers to the server instance name on which the Data Mart is created.

    3. (Conditional) If the Data Mart Virtual SQL Server is a default instance, then specify the instance name within the <ServerName>...</ServerName> tags as follows:

      <ServerName>virtual_sql_server</ServerName>

  5. Restart the SQL Server Integration Services.
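Similarly, a sketch of the <TopLevelFolders> section for a cluster environment with two Data Marts on a hypothetical Data Mart Virtual SQL Server named VSQL01, one on a named instance and one on the default instance (all names are placeholders), might look like this:

  <TopLevelFolders>
    <Folder xsi:type="SqlServerFolder">
      <Name>DataMart1</Name>
      <ServerName>VSQL01\DM1</ServerName>
    </Folder>
    <Folder xsi:type="SqlServerFolder">
      <Name>DataMart2</Name>
      <ServerName>VSQL01</ServerName>
    </Folder>
  </TopLevelFolders>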

Adding a Data Source

You can add a Data Source for an AppManager repository (application database) that uses the default or a named instance.

The time taken to create a Data Source varies depending on factors such as system resources, network connections, and the amount of data being processed. The process of creating a Data Source involves the following:

  1. Creating the related Data Mart database, the ETL job, and the related cube in the Data Warehouse.

  2. Running the ETL job.

  3. Processing the cube.

While the data is being processed, the status of the Data Source in the Data Source View is displayed as Pending. After the data processing is complete, the status of the Data Source changes to Success.

NOTE: You cannot run reports against a Data Source until the associated cube is created, the ETL and OLAP Processing jobs have completed, and the data is available in the cube. Errors are displayed when you run a report in the following scenarios:

  • Before the cube is created: An error message is displayed stating that the cube does not exist or has not been processed.

  • Before data is available in the cube: An error message is displayed stating that no data was found for the report.

To add a Data Source:

  1. Launch the Admin Console.

  2. In the left navigation pane, click Configuration > Data Source View and click Add in the top navigation pane.

  3. In the Application Data dialog, specify the following:

    • Application: Displays the application name, AppManager, from which you want to import data.

    • Application data server: Specify the name of the SQL Server used for the application database, which is the AppManager repository.

    • Database name: Specify the name of the application database.

    • Authentication: Select the authentication method for the Data Mart SQL Server Agent to connect to the application database:

      • Use Windows authentication: Select this option to enable the Data Mart SQL Server Agent to connect to the AppManager repository SQL Server using Windows authentication. In this case, the SQL Server Agent service account is used.

      • Use SQL Server authentication: Select this option to enable the Data Mart SQL Server Agent to connect to the AppManager repository SQL Server using a SQL account and specify the Login Name and Password.

    • Initial load date: Select the month and year from which you want to start extracting data from the application database. For example, if you select January 2013, all data from January 2013 onwards is extracted from the AppManager repository.

  4. Click Next and provide the following information:

    • Data Mart SQL Server: Specify the name of the SQL Server that you want to use for the Data Mart database. Depending on whether the specified SQL Server is in a cluster or non-cluster environment and whether it uses named instances or the default instance, you must modify the MsDtsSrvr.ini.xml file accordingly. For more information, see Planning a New Data Source.

      Also, ensure that SQL Server Integration Services is installed and running on this Data Mart SQL Server.

    • Database name: Specify a new name for the Data Mart database or retain the default specified name.

    • Authentication: Select the authentication method for the Data Warehouse SQL Server Agent to connect to the Data Mart SQL Server:

      • Use Windows authentication: Select this option to have the Data Warehouse SQL Server Agent connect to the Data Mart SQL Server using Windows authentication. In this case, the SQL Server Agent service account is used.

      • Use SQL Server authentication: Select this option to have the Data Warehouse SQL Server Agent connect to the Data Mart SQL Server using a SQL account and specify the Login Name and Password.

    • Default database file location: Select this option to accept the default file location or deselect this option to specify the Data Mart data file location in the format: drive:\path.

      You must ensure that the Data Mart data file path already exists. Do not include the name of the data file in this file path. If you specify a file path that does not exist, adding the Data Source fails because the Data Mart cannot be created.

    • Database initial size: Specify the initial size, in MB, for the Data Mart data file. The default size is 10 MB. You can modify this value only if you have not selected the Default database file location option.

    • Default database log file location: Select this option to accept the default log file location or deselect this option to specify the Data Mart log file path in the format: drive:\path.

      You must ensure that the Data Mart log file path already exists. Do not include the name of the log file in this file path. If you specify a file path that does not exist, adding the Data Source fails because the Data Mart cannot be created.

    • Log initial size: Specify the initial size, in MB, for the log file. The default size is 1 MB. You can modify this value only if you have not selected the Default database log file location option.

    • Daily aggregate after: Select the age, in months, at which data should be aggregated daily in the Data Mart. Daily aggregation provides less granularity than the actual raw data, but requires less storage space. In daily aggregation, data generated during the same 24-hour time period is compressed to two rows of data in the Data Mart: one for the local time zone and another for the UTC time zone.

    • Hourly aggregate after: Select the age, in months, at which data should be aggregated hourly in the Data Mart. Hourly aggregation provides less granularity than the actual raw data, but requires less storage space (although more than daily aggregation). In hourly aggregation, data generated during the same one-hour time period is compressed to one row of data in the Data Mart. For example, a Knowledge Script that collects one data point every five minutes generates 288 raw rows per day; hourly aggregation reduces these to 24 rows per day, and daily aggregation to two rows per day.

  5. Click Next and provide the following information:

    • Enter a name to identify this Data Source or accept the default name: Specify a name for the Data Source or retain the default specified name. This is the name Analysis Center uses to identify the Data Source.

    • Delete data after: Specify the number of months for which you want to store AppManager data in Analysis Center. Data older than what you specify is deleted from Analysis Center.

    • Schedule: Select the schedule to transfer data from the application database. For example, if you select Every 59 minutes, then data is transferred every 59 minutes from the AppManager repository.

  6. Click Next and configure the following options:

    • Select one of the following pre-check modes:

      • No check: Select this option if you do not want to verify the information you specified in the previous wizard pages.

      • Basic: Select this option to verify the SQL Server connections and also verify whether the Data Mart can be created.

      To manually run a pre-check on a Data Source, click the Data Source and then click Run Pre-check in the top navigation pane.

    • Activate Data Source automatically once all pre-check steps passed: Select this option to automatically activate the Data Source after all Basic pre-check steps pass. If you do not select this option, the status of the Data Source that you add is displayed as Pending in the Data Source View. To manually activate the Data Source, right-click the Data Source and click Activate.

Understanding the Data Source View

To display the Data Source view, click Configuration > Data Source View in the left navigation pane of the Admin Console.

The Data Source View displays the following information. In the user interface, you can drag a column header above the view to group the information by that column:

  • Status: Displays the status of the Data Source. For example, the status is Success if the Data Source is available to provide data for reports.

  • Name: Displays the name of the Data Source.

  • Enabled: Indicates whether the Data Source is enabled.

  • Message: Displays a message relevant to the Data Source. For example, Pending Data Source means the Data Source is still being processed.

  • DM Server: Displays the SQL Server name for the Data Mart database.

  • DM Database: Displays the name of the Data Mart database.

  • DS Server: Displays the SQL Server name for the AppManager repository.

  • DS Database: Displays the name of the AppManager repository database.

  • Updated Time: Displays the last updated time for the Data Source operation. For example:

    • The time at which the last attempt to create a Data Source either succeeded or failed.

    • The last time the properties of a Data Source were changed.

    • The last time the ETL job ran.

  • Oldest Data: Displays the date of the oldest data copied from the AppManager repository.

  • Newest Data: Displays the date of the most recent data copied from the AppManager repository.

  • Days Collected: Displays the number of days of data copied from the AppManager repository.

Changing Data Source Properties

To change Data Source properties, such as enabling or disabling the Data Source, editing the schedule for transferring data from the application database, or changing the number of months of data to store in the Data Mart:

  1. Launch the Admin Console.

  2. In the left navigation pane, click Configuration > Data Source View.

  3. In the Data Sources list, do one of the following:

    • Right-click the Data Source for which you want to change the properties and click Properties.

    • Click the Data Source for which you want to change the properties and click Properties in the top navigation pane.

  4. In the Data Source Properties dialog, click the General tab to set the following properties:

    • Name: Displays the name of the Data Source. You cannot edit this option.

    • Set as an enabled Data Source: Displays whether the Data Source is enabled. If the option is selected, the Data Source is enabled. Toggle the selection to enable or disable the Data Source.

    • Application: Displays the name of the application, AppManager, from which data is copied. You cannot edit this option.

    • Initial load date: Displays the date from which data was initially loaded from the application database. You cannot edit this option.

    • Schedule: Displays the specified schedule to transfer data from the application database. To edit the schedule, select Change to and then select a schedule.

    • Delete data after: Displays the number of months of data, including the current month, to store in the Data Mart.

  5. Click the Application Data tab to set the following properties:

    • Application data server: Specify the name of the SQL Server used for the AppManager repository.

    • Database name: Specify the name of the AppManager repository.

    • Authentication: Select the authentication method for the Data Mart SQL Server Agent to connect to the AppManager repository SQL Server:

      • Use Windows authentication: Select this option to have the Data Mart SQL Server Agent connect to the AppManager repository SQL Server using Windows authentication. In this case, the SQL Server Agent service account is used.

      • Use SQL Server authentication: Select this option to have the Data Mart SQL Server Agent connect to the AppManager repository SQL Server using a SQL account and specify the Login Name and Password.

  6. Click the Data Mart tab to set the following properties:

    • Data Mart SQL Server: Displays the name of the SQL Server used for the Data Mart database. You cannot edit this option.

    • Database name: Displays the name of the Data Mart database. You cannot edit this option.

    • Authentication: Select the authentication method for the Data Warehouse SQL Server Agent to connect to the Data Mart SQL Server:

      • Use Windows authentication: Select this option to have the Data Warehouse SQL Server Agent connect to the Data Mart SQL Server using Windows authentication. In this case, the SQL Server Agent service account is used.

      • Use SQL Server authentication: Select this option to have the Data Warehouse SQL Server Agent connect to the Data Mart SQL Server using a SQL account and specify the Login Name and Password.

    • Default database file location: Displays the Data Mart data file location in the format: drive:\path. You cannot edit this option.

    • Database initial size: Displays the initial size, in MB, for the Data Mart data file. You cannot edit this option.

    • Default database log file location: Displays the Data Mart log file path in the format: drive:\path. You cannot edit this option.

    • Log initial size: Displays the initial size, in MB, for the log file. You cannot edit this option.

    • Daily aggregate after: Displays the age, in months, at which data should be aggregated daily in the Data Mart. To modify the age, select a new value.

      Daily aggregation provides less granularity than the actual raw data, but requires less storage space. In daily aggregation, data generated during the same 24-hour time period is compressed to two rows of data in the Data Mart: one for the local time zone and another for the UTC time zone.

    • Hourly aggregate after: Displays the age, in months, at which data should be aggregated hourly in the Data Mart. To modify the age, select a new value.

      Hourly aggregation provides less granularity than the actual raw data, but requires less storage space (although more than daily aggregation). In hourly aggregation, data generated during the same one-hour time period is compressed to one row of data in the Data Mart.

  7. Click OK.

Enabling and Disabling a Data Source

  1. Launch the Admin Console.

  2. In the left navigation pane, click Configuration > Data Source View.

  3. (Conditional) To enable the Data Source, right-click the Data Source and click Enable.

  4. (Conditional) To disable the Data Source, right-click the Data Source and click Disable.

When you disable a Data Source, you can still create reports for data collected up to the time the last ETL job ran for that Data Source. After a Data Source is disabled, the related ETL job no longer runs and data is no longer copied from the AppManager database to the Data Warehouse.

Deleting a Data Source

  1. Launch the Admin Console.

  2. In the left navigation pane, click Configuration > Data Source View.

  3. Right-click the Data Source and click Delete.

  4. (Conditional) To remove the associated Data Mart databases, click Yes.

    If you do not choose to delete the Data Mart database associated with the Data Source, you can later use the SQL Server Enterprise Manager to delete the database.

Deleting a Data Source performs the following actions:

  1. Displays the status of Data Source as Pending Delete in the Data Source View.

  2. Creates and executes the Data_Source_Name Create Data Source Data Marts job to disable both the Data Source and the Data_Source_Name ETL job on the Data Mart database SQL Server.

  3. Initiates the Analysis Center (OLAP Processing) job as per the schedule to perform the following actions:

    • Removes the physical and regular cubes for the Data Source from the AC_OLAP database.

    • Deletes the Data Source ETL job on the Data Mart SQL Server.

    • Deletes the SSIS packages of the ETL job on the Data Mart SQL Server.

    • Deletes the Data Mart database if you chose to remove it.

    • Redefines the virtual cubes so that the physical and regular cubes for the Data Source are no longer included.

    • Marks the Data Source as Deleted, although it still exists in the database.

      NOTE: If no ETL jobs have run against the Data Source, then the Data Source was not fully loaded into the OLAP Data Source dimension. As a result, the OLAP Processing job completely removes the Data Source.

8.1.2 Managing Jobs

  1. Launch the Admin Console.

  2. In the left navigation pane, click Configuration > Jobs View.

    The Jobs View displays the following information. In the user interface, you can drag a column header above the view to group the information by that column:

    • Status: Displays the job status, such as success or failure.

    • Job Name: Displays the name of the SQL Server Agent job.

    • Data Source Name: Displays the name of the Data Source.

    • Message: Displays a short description of the SQL Server Agent job.

    • Run Time: Displays the last date and time when the job ran.

    • Elapsed Time: Displays the duration of the last job run.

You can do the following:

  • To start, stop, enable, or disable a job, right-click the job and select the appropriate option.

  • To view the detailed log information of the job, select the job and click Detail Log. If a job fails, you can use this log information to find out more details about the job failure.

  • To view the detailed log information about a job step, select the step and click Step Detail Log.

8.1.3 Understanding ETL Filters

ETL filters let you control the amount of data that Analysis Center collects from AppManager. For example, if the AppManager repository contains a large amount of data, you can define a filter to specify which data to include in or exclude from Analysis Center reports. If Analysis Center does not display data that you expect to see, verify whether any filters are applied to that data.

You can view or modify existing ETL filters and also define new filters. To disable filters, select the filters and click Disable ETL Filters in the top navigation pane. You cannot re-enable a disabled filter; you must recreate it. To display disabled filters, select the Display disabled ETL filters option. Disabled filters are highlighted in grey.

Review the following sections:

Understanding ETL Filter Limitations

  • When you apply an ETL filter, it affects only new data. Existing data already processed into Analysis Center is not affected. To apply the filter to the existing data, you must redeploy the job in AppManager. Before redeploying a job, ensure that Analysis Center stops processing that data until the job is complete.

  • When you remove an ETL filter, Analysis Center begins processing the data that it was previously filtering on the next ETL run. It does not process data from before the filter was applied.

Defining New Filters

Analysis Center provides two predefined ETL filters. If you define and save a new filter, Analysis Center applies this new filter when the ETL process runs successfully the next time.

To add a new filter:

  1. Launch the Admin Console.

  2. In the left navigation pane, click Configuration > ETL Filters.

  3. Click New Filter to create a new filter entry in the list and do the following:

    1. In the Data Source option, select the Data Source on which you want to apply the filter.

    2. In the Object Type option, select the type of object on which you want to apply the filter.

      To create a filter for UNIX, NT, or AppManager Knowledge Scripts, select Knowledge Script as the Object Type and specify the Knowledge Script filter accordingly. For example:

      • For UNIX Knowledge Script, the filter is UNIX%.

      • For NT Knowledge Script, the filter is NT%.

    3. In the Object Name option, specify the name of the object.

      The following are examples of object names based on the selected object type:

      • Application: NT, SQL, IIS

      • Knowledge Script: NT_CpuByProcess, SQL_MemUtil, IIS_CpuHigh

      • Server: AGENT1, SQLINSTANCE, DELL_MACHINE

      To enable the filter to use the object name as a wildcard, use the percent sign (%) in the name and set the Object Wildcard option to Yes.

    4. In the Object Wildcard option, select one of the following to specify whether the filter treats the object name as a wildcard:

      • Yes: The filter treats the object name as a wildcard and matches any object whose name fits the pattern specified in the Object Name option. The object name you specify in the Object Name option must contain the percent sign (%) wildcard character. For example, NT_Cpu% matches NT_CpuByProcess.

      • No: The filter does not treat the object name as a wildcard and matches only objects whose names exactly match the text specified in the Object Name option.

    5. In the Filter Flag option, select one of the following:

      • INCLUDE: The data from the specified object should be included in the data load.

      • EXCLUDE: The data from the specified object should be excluded from the data load.

      • LEGEND: The data from the specified object should be processed as a static legend even if the metric carries a dynamic legend.

    6. Click Save.