Category Archives: Cognos / Insight Reporting

How to configure Cognos for Microsoft Office

Cognos for Microsoft Office is a plugin that integrates your IBM Cognos BI, IBM Rational Insight or RRDI environment with Microsoft Office so that you can view Cognos reports through Office applications. The tool has also been called "Cognos Office" or "Cognos Go-Office" in the past. As of this post, it is called "Cognos for Microsoft Office".

I have found the best use for this tool is with a Powerpoint presentation. Here is an example. Say an employee has a weekly meeting where they present a specific set of slides. The structure of their slides is the same, but the data in the reports being shown (which come from Cognos) varies from week to week. So each week, this employee must manually regenerate the reports by copying and pasting screenshots into the Powerpoint presentation. It's not the most enjoyable task.

However, if this employee installs the "Cognos for Microsoft Office" plugin, then every week they can simply launch their existing Powerpoint slide deck and click a "Refresh" button, which automatically updates the existing reports to show the most current data. That is all, in one simple click.

This plugin is a local install, so each user who wants to use it must install it on their local machine, where their Microsoft Office software resides. It is a pretty straightforward installation and configuration process. Once you have installed the software, here is how to configure it on your local machine. (Please note that I am using IBM Cognos 8 in this example.)

1) After installation of the Cognos for Microsoft Office plugin, launch Microsoft Powerpoint and you will find a new tab named “IBM Cognos 8”. Click on this tab.

2) Now, in the “Application” section, click on the IBM Cognos 8 button, then select “Options”

3) Here, specify your Report Server that Office will communicate with. Enter the “System Gateway URI” and give it a “Friendly name”.

4) Click "Test Connection" to verify that your Office software can communicate with your Report Server. If the connection is successful, you will see a confirmation message.

5) Now, click “Add” then “Ok”

6) From the toolbar, click "Logon" and select the Report server you just added (in my case, it's "My Insight Server")


7) Login with your Report Server credentials (if applicable)

8) Now, on the right hand side of your screen, click on “IBM Cognos 8 Go! Office” (Depending on your version, you may not see this identical text)


9) You should now see your Report server directory structure in your Microsoft Office Powerpoint application


10) You can now drag and drop your Cognos reports into your Powerpoint presentation
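The Test Connection check in step 4 can also be scripted. Here is a minimal Python sketch that probes a Cognos gateway URI before you configure the plugin; the URI shown is only a placeholder in the typical Cognos 8 gateway format, not a real server.

```python
# Sketch: verify a Cognos gateway URI responds before configuring the
# plugin. The URI below is a placeholder -- substitute your own server.
import urllib.request
import urllib.error

def gateway_alive(uri, timeout=5):
    """Return True if the gateway URI answers an HTTP request at all."""
    try:
        urllib.request.urlopen(uri, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        # The server answered (even with an error status), so it is reachable.
        return True
    except (urllib.error.URLError, OSError):
        return False

# Placeholder gateway URI in the typical Cognos 8 format:
print(gateway_alive("http://myserver/cognos8/cgi-bin/cognos.cgi"))
```

This is only a reachability check; the plugin's own Test Connection button remains the authoritative verification.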

Best Practices on Planning / Scoping a Cognos Reporting Environment

I thought I would share my expertise and experience on best practices in planning a new deployment for a reporting environment. I have had a ton of experience with many different organizations and industries, worldwide. I have seen a lot of successful deployments and I have seen a lot of…well…unsuccessful deployments.

While much of what I am about to discuss could be applicable to any software deployment, I will focus specifically on an IBM Cognos BI, IBM Rational Reporting for Development Intelligence and IBM Rational Insight deployment. There are many different aspects to take into consideration when you are trying to scope and plan a new reporting environment. Here is my advice on what you should do.

Start small and grow big!

You cannot address all of your needs or solve all of your problems in the first phase of your deployment. It is a common mistake to onboard too many users too quickly or to over-scope the initial deployment. This can lead to many issues, including performance problems, server instability, user frustration, an abundance of problem tickets and even cancellation of the entire project effort.

You should take an incremental approach to your deployment, and you should do this in an iterative fashion. First, you need to scope a type of topology based on your current, and what you believe to be your future, needs (eg clustered, non-clustered, IHS server/Cognos Gateway, or a standard distributed environment). To do this, you need to review your overall business goals, translate those into operational objectives and then break those down into individual tasks.

Here are some questions you should address when attempting to plan for a new Reporting environment:

  • What is my expected number of users in the first 3 months, 6 months and 1 year?
  • What types of reports will these users be running (live, trend, saved)?
  • Will the reports be run individually, through dashboards, or scheduled and emailed to stakeholders?
  • Will my users be accessing the Report server directly or through another interface (eg Rational Team Concert dashboard)?

Once you have understood these items and identified your topology type based on this, you now have the foundation for your initial deployment (note: it is possible to change your deployment topology after the fact, such as going from a vertical to horizontal cluster). Remember, start small and grow big.

The next thing you need to do is identify a small set of users (eg ~50) who will access the deployment as early adopters. You also need to identify a small set of reports (eg 5-10) to develop and deploy into production for use. Again, do not try to do too much too fast. History and experience have shown me that this approach has a significantly lower chance of success.

Once you go live with your initial deployment, you need to closely monitor it and look at these key elements:


Server performance

  • How are my servers performing?
  • What are my CPU and memory utilization levels?
  • Are my servers crashing?
  • How is the network behaving?

Usage model / User activity

  • Are my users consuming the solution as expected?
  • Are they logging into the Report server portal directly?
  • Are they accessing reports through other means such as a Rational Team Concert dashboard widgets?
  • Are they scheduling reports and having them emailed to stakeholders?
  • Are they running individual reports or viewing a dashboard of reports? How many reports (and what type) are being run in each dashboard?

User feedback

  • What are my users saying? What is their experience with the deployment?
  • Am I getting a lot of problem tickets or emails about errors?
  • Are they getting any value from the deployment?
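Some of the server-health checks above can be automated. Here is a minimal Python sketch; the thresholds (80% CPU, 85% memory) are illustrative only, not official guidance, so tune them to your environment.

```python
# Sketch: encode the monitoring questions above as simple threshold
# checks. Thresholds are illustrative assumptions, not official guidance.
def health_alerts(cpu_pct, mem_pct, servers_up, servers_total):
    """Return a list of human-readable alerts for out-of-range readings."""
    alerts = []
    if cpu_pct > 80:
        alerts.append("CPU utilization high: %d%%" % cpu_pct)
    if mem_pct > 85:
        alerts.append("Memory utilization high: %d%%" % mem_pct)
    if servers_up < servers_total:
        alerts.append("%d of %d servers down" % (servers_total - servers_up, servers_total))
    return alerts

print(health_alerts(cpu_pct=92, mem_pct=60, servers_up=3, servers_total=4))
```

In practice you would feed this from your monitoring tooling rather than hard-coded numbers.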

I highly recommend that you set milestones for your deployment. Here is an example of a high level roadmap. This is just an example! You can scale this however you deem appropriate so long as you start small and expand incrementally.

As you progress through milestones, you should be making adjustments based on your observations of the items I mentioned you need to monitor. You may find that you need double the number of servers you had scoped. In this case, you would either need to expand your topology more quickly or onboard users at a slower rate. You may also find that you have more than enough power within your server environment. In this case, you could scale back a few servers, downgrade some of the machine specs or onboard more users at a faster rate.

The key to success is to start small and grow big!

Links to my IBM Innovate Conference presentations

Here are links where one can view or download my IBM Innovate presentations over the last five years.

Innovate 2010: Best Practices and Lessons Learned on Our IBM Rational Insight Deployment

Innovate 2011: The IBM Rational Insight Reporting Solution

Innovate 2012: Deploying Rational Insight into a Heterogeneous Environment

Innovate 2013: Improving Predictability and Efficiency with Kanban metrics using IBM Rational Insight

Innovate 2014: Unleash Your Metrics Outside the Box: Customizing Your IBM Rational Insight Deployment (workshop)

Configuring Rational Insight to extract data from Rational DOORS

In this post, I will show you how to configure IBM Rational Insight to extract data from IBM Rational DOORS using the RIF integration method. In this scenario, I am using Rational Insight v1.0.1.1 and Rational DOORS v9.3.0.4. This is based on documentation I created on September 9th, 2011 and it is still valid as of this post. However, if you are using Rational DOORS v9.5.1 or higher, you should leverage the more optimal REST interface integration provided by the IBM Rational DOORS team.

The way to achieve this integration is to point Insight to a DOORS RIF file (exported from DOORS) that contains the necessary data to be reported on. Before I begin, let's review some key things to keep in mind and to communicate to your DOORS developer, who will be defining the content of the DOORS RIF file that you will point to from Insight.

  • Ensure that all of the necessary fields that you are looking to capture in Insight are available in the DOORS view in the RIF file.
  • Only String and Integer DOORS field types are consumable in the RIF format. If, for example, a field you require is stored in a Text type field, then you must create a new field of type String and write a DXL script to copy the Text field's data into the new field. It will then be consumable in the RIF format.
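Before pointing Insight at the file, it can help to confirm that every field you need actually made it into the export. Here is a hedged Python sketch that lists the attribute definitions in a RIF file; the element names and the sample XML are simplified assumptions (a real RIF file carries namespaces and much more structure), so treat this as a starting point only.

```python
# Sketch: list attribute definitions present in a RIF export so you can
# confirm the fields you need are there. The sample XML is a simplified,
# hypothetical stand-in for a real (namespaced) RIF file.
import xml.etree.ElementTree as ET

sample_rif = """
<RIF>
  <ATTRIBUTE-DEFINITION-STRING><LONG-NAME>Object Text (copy)</LONG-NAME></ATTRIBUTE-DEFINITION-STRING>
  <ATTRIBUTE-DEFINITION-INTEGER><LONG-NAME>Priority</LONG-NAME></ATTRIBUTE-DEFINITION-INTEGER>
</RIF>
"""

def rif_fields(xml_text):
    """Return (definition type, field name) pairs found in the document."""
    root = ET.fromstring(xml_text)
    fields = []
    for node in root.iter():
        if node.tag.startswith("ATTRIBUTE-DEFINITION"):
            name = node.find("LONG-NAME")
            if name is not None:
                fields.append((node.tag, name.text))
    return fields

print(rif_fields(sample_rif))
```

Note that only String and Integer definitions should appear for fields you expect Insight to consume, per the constraint above.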

Now, let's start with the integration.

Configuring the XDC file – There are two different methods for pointing to a RIF file

Method 1 – Pointing to a static RIF file

1) Let's say that the DOORS developer has created and exported a RIF file and placed it on the C drive (Windows OS) of your Insight developer machine. In this scenario, here is how you would configure the XML Data Configuration tool (XDC) to point to this RIF file, called myRIF3.xml.

2) Launch the XDC tool and open the doors.xdc configuration file (by default, located here: (Insight_install_directory)\etl\configs\Configurations)

3) Traverse through the Resources directory until you reach the Resource Group level. Double-click on the DOORS resource group and ensure the URL Type is set to RIF, the URL points to the correct directory of the RIF file on your local machine (in this case, the C drive), and the Authentication type is set to None. (Note: You can optionally test the connection at this point by clicking on Test Connection, but this does not mean that you are successfully integrated with the RIF file just yet.)

4) Next, double-click on the Resource named doors and ensure that the relative path references the name of the RIF file

5) Now, you must verify that you can review the contents of that RIF file from Insight. Double-click on one of the Data Mapping Tables, let’s select Header. Then click on the Columns tab at the bottom left of the workspace window

6) Now click on the Modify link to view the XML schema of the RIF file against your data mapping table

7) If you have completed all of the previous steps successfully, then you should now see the XML schema of the RIF file under the section labeled XML Schema. If that section comes up blank, then there is either an issue with the source RIF file or you may have missed one step above.


Method 2 – Pointing to the RIF Exporter Service

The other method that could be used to extract data from DOORS into Insight would be to point to the RIF Exporter Service. The RIF Exporter Service is an IBM Services asset that automates the generation of RIF files. For more information, contact your local sales representative.

Assuming you have the RIF Exporter Service deployed, here is how you would configure Insight to leverage it.

1) Launch the XDC tool and open the doors.xdc file (as shown in the steps above). Traverse through the Resources directory until you reach the Resource Group level. Double-click on the DOORS resource group and ensure the URL Type is set to RIF.

The URL must point to the RIF file located at the DOORS RIF Exporter service. The sample syntax is: http://&lt;server&gt;:&lt;port&gt;/DoorsServices/DOORS/InsightP?rifdef=&lt;definition_name&gt;. Here is my example: http://localhost:9080/DoorsServices/DOORS/InsightP?rifdef=myRif3

The Authentication type should be set to Basic and you must enter credentials that have access to the source DOORS deployment. You may now click on Test Connection to verify that you can communicate with that file.
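The service URL above is easy to assemble programmatically. Here is a small Python sketch of that syntax; the host and port are whatever your RIF Exporter deployment uses (localhost:9080 and myRif3 match the example above).

```python
# Sketch: assemble a RIF Exporter service URL from its parts. Host and
# port depend on your own RIF Exporter deployment.
def rif_exporter_url(host, port, rif_definition):
    """Build the RIF Exporter URL in the syntax shown in the post."""
    return "http://%s:%d/DoorsServices/DOORS/InsightP?rifdef=%s" % (host, port, rif_definition)

print(rif_exporter_url("localhost", 9080, "myRif3"))
```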

2) Next, click on the Resource doors and verify that the Relative path box is empty. You have already specified the file name at the Resource Group level, which is used as part of the URL for validation.

3) Now, double-click on the Data Mapping Table named SpecObject (which provides the basic data around DOORS objects).

4) Click on the Modify link at the top right of the workspace

5) If you can now view the XML schema on the left without any error messages, then everything that is mapped in your relational table columns exists in the RIF file. You can move on to Step 2.

6) However, if you receive an error message and see fields highlighted in red, this means that something in your Insight mappings does not exist in the RIF file. You will need to verify the contents of the RIF file with your DOORS administrator.

7) In this example, you will modify the XDC mappings to capture only the fields that you would like to extract from DOORS. Go back to the Data Mapping Table view (in the Columns tab), select the columns that you do not currently want to capture, and click on the Unload link at the top right.


8) Click the Save button. Also, make sure to save any other windows you have open in the XDC tool if applicable.

Configuring the Extract Transform Load Process

1) Launch IBM Cognos Data Manager and open your current ETL catalog.

2) Navigate to the Operational Data Store (ODS) section, and expand the DOORS folder. These are the out-of-the-box DOORS fact builds that ship with Insight.

3) In this example, you need to configure the DOORS object, so click on the DOORS_Stg fact build and then double-click on the DataSource1 node in the workspace on the right.

4) Click on the Query tab

5) Here you find the DOORS Data Mapping Tables (as defined in the XDC tool) on the left hand side, and on the right, you see the out of the box SQL Query for the SpecObject table.

You need to modify the SQL Query so that it matches what data is currently available from DOORS, based on your previous mappings.

6) The simplest way to do this is to right-click on the SpecObject node and select Add Table select statement.

This will automatically remove the existing SQL query and replace it with one that extracts everything in your current SpecObject mapping table.
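As a sketch of what "Add Table select statement" effectively produces, here is a small Python helper that builds that style of SELECT from a column list. The schema and column names here are hypothetical; the real generated query reflects whatever is in your mapping table.

```python
# Sketch: build a SELECT over every mapped column, mimicking what
# "Add Table select statement" generates. Names below are hypothetical.
def table_select(schema, table, columns):
    """Return a quoted SELECT statement over the given columns."""
    cols = ", ".join('"%s"' % c for c in columns)
    return 'SELECT %s FROM "%s"."%s"' % (cols, schema, table)

print(table_select("DOORS", "SpecObject", ["id", "lastModified", "text"]))
```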

7) You should validate the data here, so click on the green triangle button to return all rows. This will query the DOORS RIF file and return the results. Review these results to ensure they are what you expect.

8) Now, click on the Result Columns tab, and then click on the Refresh button on the bottom left to refresh the result columns that you now will return. Then click on Ok.

9) At this point, you would continue to configure the ETL just as you would for any other data source (business as usual). You will need to repeat the above steps for each Data Mapping table or entity that you would like to extract into Insight.


This blog post demonstrated how to configure Rational Insight to extract data from Rational DOORS. First, I showed how to integrate Insight with the DOORS RIF file and verify that it can see the source data. Second, I modified the XDC file and exposed the DOORS data that we were interested in extracting into Insight. Finally, I configured the Data Manager ETL fact build to retrieve the source data via the SQL Query window.

Why is my Role & Membership data Missing in the Rational CLM / RRDI or Insight Data Warehouse?

Some folks may experience no data being returned when running reports containing role or membership data from CLM (JTS) using Cognos Report Studio with RRDI or Rational Insight. Some examples would be a report design that contains data from these sections of the framework model:

Operational Data Store > Project Area > Project Role
Operational Data Store > Project Area > Project Related members
Operational Data Store > Team Area > Team Role
Operational Data Store > Team Area > Team Related members

If you are seeing no data coming from these query subjects, it could be because of one of these reasons:

1) You are not using a JTS that is version 4.0.5 or higher (the ETLs for membership and role data for projects and teams were added in version 4.0.5)
2) Your CLM user, whose credentials you use to log in to RRDI, does not have permissions to access those project/team areas
3) Your membership/role ETLs are disabled.

I am going to provide more information on #3. By default, the ETLs for membership and role data are disabled. The reason for this is likely that not every user or deployment needs to report on this data. Hence, to avoid additional ETL processing overhead, it is disabled by default. If a user wants to report on that data, then they need to enable these ETLs.

Here is how to do it.

Enabling the membership/role ETLs for RRDI in the Jazz Team Server

If you are using RRDI, then you can enable the Java ETLs by logging into the Jazz Team Server and navigating to the Administration section. Then, go to Server > Advanced Properties. It is easiest to search for the "Ignore Roles and Membership Data" property, which is set to "true" by default. Change this to "false" and save. Run a full load of the 'Common' job and the membership/role data will now be loaded into the data warehouse.

JTS/Admin Advanced Properties

Enabling the membership/role ETLs for Rational Insight in Cognos Data Manager

If you are using Rational Insight, then you can enable the Data Manager ETLs by logging into your Data Manager client. Then, navigate to Builds and JobStreams > Jobs > JFSJobs > JFS4.0.5Jobs > JFS_Common4.0.5. You need to enable the following fact builds:


Data Manager - JFS_Common4.0.5

To enable each fact build, double-click on each build shown in the image above and ensure the “Exclude this node from processing” property is unchecked. Then click “Ok” and save the catalog. Run a full load of the ‘JFS_Common4.0.5’ job. The role/membership data will now be loaded into the data warehouse.

Configure fact build properties

IBM Rational Insight Enablement Links

A lot of folks ask me how to do things or whether certain things can be done. Many ask for presentation collateral. Here is a list of links to collateral I have created over time that contains a lot of great information about IBM Rational Insight and will address many of your questions:

IBM Rational Insight Overview 2014

The IBM Rational Insight Reporting Solution

Deploying IBM Rational Insight in a Heterogeneous Environment

Improve Predictability and Efficiency with Kanban Metrics using IBM Rational Insight (presentation)

Improve Predictability and Efficiency with Kanban Metrics using Rational Insight (article version)

Integrating Rational Insight with HP Quality Center and other Third Party tools

Configure Rational Insight with an additional Jazz Team Server

Configure LDAP for Rational Insight when Integrated with Multiple Jazz Team Servers

How to implement delta loads using Rational Insight

Integrating a Microsoft Excel spreadsheet with Rational Insight (live data model)

Define Aggregate Rules for Semi-Additive Measures in Framework Manager

IBM Rational Insight – Setting up your environment for remote execution using Data Manager Network Services

Configuring Rational Insight for Rational Team Concert custom attributes (for RTC version 3.0 and earlier)

Here are some helpful IBM links:

IBM Rational Insight 1.1.1.x Infocenter

IBM Rational Insight Data Models (ETLs, Data Warehouse, Reports)

IBM Rational Insight & CLM Data Dictionaries

Insight / CLM Integration

IBM Rational Insight & IBM Rational Reporting for Development Intelligence (RRDI) – how they compare

This is another one of those situations where there can be a lot of confusion between these two IBM report offerings. I did a similar comparison between IBM Rational Insight and IBM Cognos BI in a previous post, IBM Rational Insight & IBM Cognos BI – How they compare. I suggest reading that post, which has descriptions for several of the components that will be mentioned here. Please note that everything I mention here is accurate as of the date of this post and things are always subject to change in the future.

First, I am going to discuss what these two solutions have in common, and then I will talk about some of the key differences. Finally, I will provide some common use cases I have come across and which solution would be the most appropriate.

RRDI is an optional component when someone owns the Rational Collaborative Lifecycle Management solution or any piece of the solution. RRDI is a subset of Insight, or rather, Insight is an extension that is built on top of RRDI. The core binaries are the same. When you install the Insight report server, you will have two sets of bits to lay down, the RRDI components and the Insight extension. If you read my other post referenced above, you would already know that Insight is built using the IBM Cognos BI technology.

Both solutions contain a Report Server, which is deployed using the same bits in each, hence identical. The web UI navigation and experience is identical. The report authoring tools (Report Studio, Query Studio) are identical. Insight comes with some additional tools that appear in the report server, which I will mention below.

Both solutions report off of a data warehouse. The data warehouses are almost identical for both solutions, as they contain all the same core schemas. There is only one difference. The data warehouse that RRDI uses comes from the Rational CLM solution, which contains an RICALM schema for certain specific CLM data points (Team Concert, Quality Manager, Requirements Composer). Insight contains an RIBA schema for some scorecard metrics; RRDI does not contain this RIBA schema. If you integrate Insight with one of the Rational CLM tools, then the RICALM schema is added to the existing Insight data warehouse.

Now I’d like to discuss differences between the two solutions. I will only elaborate on the major differences, however I’ve provided a list here to display all of the key differences:

Differences between RRDI & Insight
*Both solutions can achieve cross-project reporting.

  1. RRDI can only report off of data that resides in the CLM data warehouse. This is typically limited to Rational Team Concert, Rational Quality Manager and Rational Requirements Composer. Please note that there are ways to load other tools’ data into the CLM data warehouse but that will not be discussed here. Insight is not limited to just CLM product data and can integrate with almost any data source in several ways (eg ODBC, REST APIs, XML).
  2. RRDI is limited to reporting on application data coming from one Jazz Team Server. Insight can report on application data from multiple Jazz Team Servers.
  3. RRDI’s framework data model and the CLM Java ETLs are non-customizable and no developer tools are provided. Insight’s framework data model and Data Manager ETLs are fully customizable and a set of developer tools are provided.

Here I have listed some common use cases that I have come across in working with clients and which reporting solution best meets those use cases.

  1. An organization has certain teams that use the IBM Rational CLM suite of tools and they also have other teams using some tools from HP and Microsoft to capture similar data. They are looking to create enterprise-level reports that aggregate data across all teams to provide a high level view to their executives.

    Appropriate Solution: Rational Insight
  2. An organization is using IBM Rational Team Concert for their change and configuration management (several CCMs with only one Jazz Team Server). They have a business need to create metrics based on their CCM data only.

    Appropriate Solution: RRDI
  3. An organization has several teams using Rational Quality Manager, Rational Team Concert and Rational Requirements Composer. Each team has their own Jazz Team Server (multiple Jazz Team Servers) and multiple CLM applications.

    Appropriate solution: Rational Insight

Workaround for RTC 4.0.5 Data Manager ETLs when run against a 4.0.3 CCM App server

I found this bug today and wanted to share my solution to the problem. My scenario: I have Insight v1.1.1.3 deployed and integrated with two CLM deployments, one of which is version 4.0.5 and the other version 4.0.3. The RTC Data Manager ETLs run fine against the 4.0.5 CCM server; however, when running the 4.0.3 DM ETLs against the 4.0.3 CCM server, errors are thrown.

Viewing the ri_jdbc.log file, I was able to detect the first error in the “RTC_WorkItemApprovalState4.0.3” fact build (see below for full error). It is complaining about a column it is trying to query that does not exist. The column is ‘projectAreaArchived’.

When I checked the XML data configuration file in the XDC tool, the respective resource (WorkItemApprovalState) was indeed missing this data element.


The solution to this is to expose the required data element which indeed does exist in the RTC reportable REST API.

  1. Launch the XML Data Configuration tool
  2. Load the workitem403.xdc file
  3. Navigate to the data mapping table: “Resources” > “RTCWorkItem” > “WORKITEM_CCM” > “WorkItemApprovalState” > “WorkItemApprovalState”
  4. Double-click on the data mapping table
  5. Select the “Columns” tab and then select “Modify”
  6. Navigate through the RTC reportable REST API to find the ‘/archived’ data element. Highlight it and select “Create”. This will add the new element to the data mapping template.
  7. Rename it to ‘projectAreaArchived’ by clicking in the “Table Column Name” cell and modifying the text
  8. Save the configuration
  9. Close the “WorkItemApprovalState” data mapping template tab
  10. You should receive a dialog that tells you the template has been changed and asks you to update it. Select “Ok”
  11. Save your configuration once again
  12. Your new column will appear faded. You have added the column but now need to load it. Highlight your new column and click on “Load”.
  13. Save your configuration again and this new column is now available to Insight, so the ETL will no longer complain about it.

There are 4 different places where this error will be thrown. You will only see one error at a time as the ETL halts upon the first error in this case. Here are the 4 fact builds this affects:


Follow the steps I provided above for each of these 4 fact builds and you can proceed with your deployment without error.

For reference, here is the full error found in the ri_jdbc.log:

query => SELECT "id",
FROM "WORKITEM_CCM"."WorkItemApprovalState"
WHERE modifiedsince='1899-12-31 00:00:00' AND projectAreaArchived=false
03/06/2014 13:27:27,444 ERROR Thread-5 : CRRRE1203E: The column 'projectAreaArchived' does not exist.
03/06/2014 13:27:27,444 ERROR Thread-5 : CRRRE1203E: The column 'projectAreaArchived' does not exist. CRRRE1203E: The column 'projectAreaArchived' does not exist.
Caused by: CRRRE1203E: The column 'projectAreaArchived' does not exist.
... 1 more
03/06/2014 13:27:27,459 INFO main : CRRRE1228I: Close this statement
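If you want to find every affected fact build in one pass rather than rerunning the ETL after each failure, a log scan helps. Here is a hedged Python sketch that pulls the missing column names out of CRRRE1203E messages; the embedded excerpt is a shortened stand-in for a real ri_jdbc.log.

```python
# Sketch: scan an ri_jdbc.log excerpt for CRRRE1203E messages and pull
# out the missing column names, so you know which XDC mappings to fix.
import re

log_excerpt = """
03/06/2014 13:27:27,444 ERROR Thread-5 : CRRRE1203E: The column 'projectAreaArchived' does not exist.
03/06/2014 13:27:27,459 INFO main : CRRRE1228I: Close this statement
"""

def missing_columns(log_text):
    """Return the sorted, de-duplicated column names reported missing."""
    return sorted(set(re.findall(r"CRRRE1203E: The column '([^']+)' does not exist", log_text)))

print(missing_columns(log_excerpt))
```

In practice you would read the real log file from disk; note the ETL halts on the first error, so the log only accumulates one missing column per run.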

RRDI / Insight setup wizard error when deploying Insight reporting components

I was running the RRDI setup wizard and got all the way to the last step when I encountered an error.

RRDI setup error

I enabled TRACE level logging for the rrdi_setup.log and focused on these errors:

02/21/2014 16:11:14,549 DEBUG : Exit BIServerUtil.executeCMD
02/21/2014 16:11:14,549 DEBUG : Enter WASUtil.createWSAdminCMDHead
02/21/2014 16:11:14,549 DEBUG : Exit WASUtil.createWSAdminCMDHead
02/21/2014 16:11:14,549 DEBUG : Enter BIServerUtil.executeCMD
02/21/2014 16:11:14,549 TRACE : Command:
02/21/2014 16:11:14,549 TRACE : [/home/marc/IBM/WebSphere/AppServer/bin/, -profileName, RationalReporting, -c, $AdminApp install "/home/marc/IBM/RRDI/setup/insight/UA/insight.war.war" {-MapWebModToVH {{.* .* default_host}} -contextroot help -appname "help" -target WebSphere:cell=rhelsweatNode01Cell,node=RationalReportingNode01,server=RationalReportingServer}]
02/21/2014 16:12:24,664 TRACE : Exit value = 103 Output = WASX7209I: Connected to process "RationalReportingServer" on node RationalReportingNode01 using SOAP connector; The type of process is: UnManagedProcess
WASX7015E: Exception running command: "$AdminApp install "/home/marc/IBM/RRDI/setup/insight/UA/insight.war.war" {-MapWebModToVH {{.* .* default_host}} -contextroot help -appname "help" -target WebSphere:cell=rhelsweatNode01Cell,node=RationalReportingNode01,server=RationalReportingServer}"; exception information: org.eclipse.jst.j2ee.commonarchivecore.internal.exception.SaveFailureException: WEB-INF/plugins/ [Root exception is org.eclipse.jst.j2ee.commonarchivecore.internal.exception.SaveFailureException: WEB-INF/plugins/]
org.eclipse.jst.j2ee.commonarchivecore.internal.exception.SaveFailureException: org.eclipse.jst.j2ee.commonarchivecore.internal.exception.SaveFailureException: WEB-INF/plugins/

After doing some research, I found several links leading to different solutions, which included fixpacks, memory issues and more. Nothing seemed to match my exact error/scenario. I also consulted a few colleagues, who leaned towards memory or disk space issues.

Well, I had already checked the amount of free disk space beforehand and all looked well. My memory was at 50% usage, so it was fine. However, after reviewing the disk space again once I received these errors, I found that I had run out of disk space on the install partition. So first, I increased the amount of free disk space and reran, but still got the errors. I then completely uninstalled Insight, re-installed the bits, ran RRDI setup and everything worked like a champ.
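A quick pre-flight check of free space on the install partition can catch this before the wizard fails. A minimal Python sketch, where both the path and the 5 GB floor are illustrative assumptions rather than documented requirements:

```python
# Sketch: check free space on the install partition before running the
# setup wizard. The path and the 5 GB floor are illustrative only.
import shutil

def enough_disk(path, min_free_gb=5):
    """Return True if the partition holding `path` has enough free space."""
    free_gb = shutil.disk_usage(path).free / (1024 ** 3)
    return free_gb >= min_free_gb

print(enough_disk("/"))
```

Point the path at wherever your install partition is mounted, and consult the product documentation for the actual space requirement.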

Hope this can help others with this scenario.

FYI, here was my environment:

RHEL 5.10 x64
Dual Core Proc
Insight 1113/RRDI 205
WAS 8.5.5
Oracle 11gR2 32-bit client

Timesheet entry data availability in RTC & Insight Data Warehouse

CLM included new ETLs for Timesheet Entry data in RTC version 4.0.4.

First, with respect to RTC itself, I found out that the RTC 4.0.4 Timesheet ETL requires a patch to get the Java ETLs to work correctly.

With respect to using Rational Insight, a user will only see the Timesheet ETLs in the Data Manager ETL catalog if they have downloaded the Data Manager PKG ETL files from RTC version 4.0.4 or higher. What I found to be a bit confusing was that they are labeled with version 4.0.3:



Development confirmed that the reason they are labeled 4.0.3 and not 4.0.4 is that these ETL jobs were originally scheduled for RTC version 4.0.3 but were actually delivered in 4.0.4. So, if you have integrated Insight with RTC 4.0.4 or higher, expect to see these Timesheet ETLs under the RTC 4.0.3 jobs folder.

This Timesheet ETL job will run against RTC version 4.0.3. I have implemented this successfully. If you have integrated Insight (v1.1.1.2) with RTC v4.0.3 and want the Timesheet entry data job, you will need to do the following:

  1. Obtain the etl PKG file for RTC 4.0.4
  2. Create a new “test” Insight ETL catalog in Data Manager
  3. Import the Insight OOTB ETL catalog, then import the JTS/RTC PKG files from version 4.0.4.
  4. Once imported, create a new package file and select only the “RTC_Timesheet403” & “RTC_Timesheet403_FullLoad” jobs (located in RTCJobs > RTC4.0.3Jobs).
  5. Now, open the ETL catalog where you require the Timesheet jobs and import the new PKG file you just created in the previous step.

Please note that IBM considers this a customization and that you will own this solution as it is not officially supported by IBM. My legal disclaimer applies here.

IBM Rational Insight & IBM Cognos BI – How they compare

A lot of people have asked me how these two solutions stack up next to each other. Questions such as “What is the difference between them?”, “Which one is a better solution?” or “Which one will meet my specific business needs?”. The truth is, there is no one correct answer. It all depends on several factors. I’m going to talk about some of the key differences to help everyone better understand this.

IBM Cognos’ main components are:

  • BI server – this is where users view, run and create reports and dashboards. This is also referred to as the “Report server”.
  • ETL server – this handles the ETL processing of the data. It extracts the data from the specified data sources, transforms it based on business logic and loads it into the Data Warehouse.
  • Data Warehouse – the schema/structure of the database where all of the data will reside. Reports are run against the data that lives here.
  • Data Manager – this is the ETL development tool where users specify what their data sources are, how to transform the data and where to load it. All of this is specified in what is called the ETL “catalog”. Once this is defined, it is published as a “data movement task” that the ETL server executes. Note: for development/testing, Data Manager also has a built-in engine that can run the ETL.
  • Framework Manager – this is the framework modeling development tool used to create a metadata model, the layer between the raw structure of the Data Warehouse and what Report Authors see. It presents a cleaner view of the data and organizes it based on business needs.
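The extract-transform-load flow that Data Manager automates can be sketched in miniature. The table and column names below are purely illustrative (they are not the real Insight warehouse schema), and SQLite stands in for the actual source and warehouse databases:

```python
import sqlite3

# Hypothetical source database, standing in for a real data source.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE defects (id INTEGER, summary TEXT, state TEXT)")
src.executemany("INSERT INTO defects VALUES (?, ?, ?)",
                [(1, "Crash on save", "open"), (2, "Typo in label", "closed")])

# Hypothetical warehouse table, standing in for the Data Warehouse.
dw = sqlite3.connect(":memory:")
dw.execute("CREATE TABLE request_fact (request_id INTEGER, summary TEXT, is_open INTEGER)")

# Extract from the source...
rows = src.execute("SELECT id, summary, state FROM defects").fetchall()

# ...transform per business logic (here: derive an is_open flag)...
transformed = [(rid, summary, 1 if state == "open" else 0)
               for rid, summary, state in rows]

# ...and load into the warehouse.
dw.executemany("INSERT INTO request_fact VALUES (?, ?, ?)", transformed)
dw.commit()

print(dw.execute("SELECT * FROM request_fact ORDER BY request_id").fetchall())
```

In a real deployment, each of these three steps is defined graphically in the Data Manager catalog rather than hand-coded, but the underlying flow is the same.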

Cognos also comes with these end-user tools: Report Studio, Query Studio, Analysis Studio, Event Studio, Workspace and Workspace Advanced. You can just google them to see what they do.

Please note that there are many tools bundled into this solution and I have only explained the key infrastructure components. There are also additional databases needed. I do not want this post to flood you all with too much info 🙂

In addition to the servers, components and tools, Cognos also comes with an optional sample download called “The Great Outdoors”. This provides a simple example of some of the capabilities of the Cognos offering. This sample does not include any ETL samples, so it is not an end-to-end example, but it helps show some of what Cognos can do. I have not personally used the sample; I focus more on SDLC/ALM metrics, and this does not appear to be an example of that.

IBM Rational Insight is built on the Cognos BI platform, so it also has the same server components “under the hood”: the Cognos BI server & ETL server. Insight also leverages the Data Manager & Framework Manager developer tools. Rather than list out all of what Insight provides, I am going to list the differences between Insight and Cognos:

  • Out of the Box Collateral – Insight provides very extensive out of the box deployment collateral. This includes XML/XDC mappings, a vast ETL catalog (containing fact builds, dimensions and data marts), a complete Data Warehouse schema, framework data models and several reports and dashboards.

To me, this is one of the key benefits of Insight. It provides all the collateral needed to get deployed and realize value quickly. The effort to create this collateral is not trivial and, in my estimate, would take thousands of man hours to recreate. We can use the examples “as is”, customize them to meet our needs or simply use them as a guide to create our own collateral. The OOTB collateral takes into consideration all of the key elements of the SDLC, including: Program, Project, Iteration, Release, Requirement, Test Case, Defect and much more. My suggestion and best practice is to leverage as much of this as possible to avoid rework.

Additional components created by IBM Rational for Insight:

  • IBM Rational Data Services Server – This server is used when extracting data from IBM Rational ClearQuest, IBM Rational ClearCase, IBM Rational RequisitePro and IBM Rational Test Manager. Each of these tools uses an “adapter”, which is basically a set of Java classes that plug into the Data Services API. The Data Services server then communicates with the point product API through XML to expose the source data.
  • IBM Rational XML Data Configuration – This developer tool is used to expose the source data through XML/Reportable REST APIs. It can be used against any data source (IBM or third party) that has a natively supported Reportable REST API, and it is also used for the tools referenced above that leverage the Data Services server.
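To give a feel for what a Reportable REST-style feed looks like to the consuming side, here is a small parsing sketch. The XML payload below is invented for illustration; real feeds differ per product, and in Insight the element-to-column mapping is defined in the XDC file rather than hand-coded:

```python
import xml.etree.ElementTree as ET

# A made-up XML payload in the spirit of a Reportable REST response;
# real element names vary by product and are captured in the XDC mapping.
payload = """
<workitems>
  <workitem><id>101</id><summary>Login fails</summary><state>Open</state></workitem>
  <workitem><id>102</id><summary>Slow report</summary><state>Closed</state></workitem>
</workitems>
"""

root = ET.fromstring(payload)

# Flatten each <workitem> element into a row-like dict, the kind of
# tabular shape an ETL can then load into the warehouse.
records = [
    {"id": wi.findtext("id"),
     "summary": wi.findtext("summary"),
     "state": wi.findtext("state")}
    for wi in root.findall("workitem")
]
print(records)
```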

*Insight does not come with the Analysis Studio tool.

Insight and Cognos can both extract from IBM and third party tools in a variety of ways. As an existing IBM Rational customer, one would likely get more value from Insight, as several of the point products today (Rational Team Concert, Rational Quality Manager, Rational DOORS Next Generation and more) provide their own Insight collateral with each release, so it’s “plug and play” between Insight and their tools. That is huge value. This leverages the IBM Rational XML Data Configuration tool and driver. I work with a lot of IBM Rational tools, so Insight is very beneficial to me and gives me a huge head start with reporting.

If I were an existing Cognos customer with an established deployment (DW schema, ETL catalog, etc.), then perhaps I would not see as much value in the OOTB collateral that Insight provides, as I would already have my foundation for all this collateral in place. It really all depends on what tools one is using, what the current process is and what the business goals are. Each case is different.

A few other important things to know:

  • There are many other cool Cognos tools that can be purchased and plugged into your existing deployments, such as Cognos Office and Cognos Mobile. These are applicable to both Cognos and Insight. I have used both and they are great. Office for PowerPoint is cool because I can embed my reports in a slide deck once and reuse it for weekly/monthly meetings without having to recreate it each time. With mobile technology on the rise, Cognos Mobile becomes more valuable to people on the go.
  • IBM Cognos has an offering named “Cognos Insight”. This is not IBM Rational Insight; it is a different solution.

So keep this in mind: Insight & Cognos both use the same Cognos components and engine to accomplish the same reporting tasks. The key differences are what I have mentioned above. This should help you understand how they differ and determine which one may be the more suitable choice for your organization.

Regardless of your selection, they are both very powerful pieces of software.

Integrating HP Quality Center (or HP ALM/other Third Party tools) with IBM Rational Insight

I’ve had many people ask me about this integration and if it is possible, how it works, effort required to make it work and various other questions. IBM Rational Insight can integrate with IBM and Third-Party data sources in a variety of ways including REST APIs, ODBC and generic XML.

I have a detailed document that describes this integration published on IBM’s site.

Is it possible? Yes.

How does it work? When working with HPQC or HPALM, you first need to decide how you will be integrating. I am not an HP expert and am not familiar with their [REST] APIs, which would be a more ideal approach. In my experience, I have helped others integrate HP with Insight via direct database access. The concept is simple: you gain read access to the HP repository with a valid user name and password. Once you have this, Insight can extract data from it.

While extracting HP data, it is a best practice to map that data to as much of the existing Insight data warehouse structure as possible. It is also a best practice to review the contents of the data warehouse and ETL catalog beforehand so you are familiar with what already exists. Just focus on the areas that are important to you, not the entire structure. This will avoid a lot of rework. For example, if your business need is to report on test data, then map the HP “Test” entity to the Insight “Test Case” entity, which contains similar columns such as ID, name, verdict, date info and more. If you are reporting on defects, then map the HP “Defect” artifact to the Insight “Request” entity.
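The mapping idea can be sketched as a simple aliasing query. The HP table and column names below are hypothetical stand-ins (confirm the real names against your own HP repository schema), and SQLite takes the place of the actual HP database:

```python
import sqlite3

# Hypothetical stand-in for the HP "Test" table; real HP column names
# will differ and must be verified against your repository.
hp = sqlite3.connect(":memory:")
hp.execute("CREATE TABLE test (test_id INTEGER, test_name TEXT, exec_status TEXT, exec_date TEXT)")
hp.executemany("INSERT INTO test VALUES (?, ?, ?, ?)",
               [(1, "Login test", "Passed", "2014-01-10"),
                (2, "Export test", "Failed", "2014-01-11")])

# Alias the HP columns onto "Test Case"-style columns (ID, name,
# verdict, date) so the mapping is explicit in the extraction query.
mapped = hp.execute("""
    SELECT test_id     AS testcase_id,
           test_name   AS testcase_name,
           exec_status AS verdict,
           exec_date   AS verdict_date
    FROM test
""").fetchall()
print(mapped)
```

In practice this SELECT-with-aliases lives inside the Data Manager catalog as part of a fact build, but writing it out first is a handy way to agree on the mapping before touching the ETL.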

Of course, each organization will have its own custom attributes and terminology. You can customize the existing Insight artifacts or create new ones to accommodate this.

What is the effort required? This all depends on your business needs, goals and the skill level of the person implementing it. My suggestion is to take an iterative approach. Do not try to do too much too fast, or you will likely not be successful. After defining the target business requirements (in the form of reports), start with one or two fundamental artifacts of your requirements, such as project and user info. Map only the columns that are already available in the Insight data warehouse. Ensure the ETL has completed successfully to the point where you can see HP data in Insight.

When that is done, move on to your “Test” record type, for example. Just like the previous artifacts, only map the columns that are already available in the Insight data warehouse. Again, ensure the ETL completes successfully.

Once that is done, you can go back and customize the Insight ETL catalog (using Cognos Data Manager) to incorporate your custom attributes into those three entities (project, user, test). Again, do not try to do too much too fast. If you have 20 custom attributes attached to your test artifact, start with about 3-5 to make sure you understand the process and are achieving the right results.

I hope that this helps folks understand the integration a bit better and gives them an overview of the process. See my link above for a more detailed technical explanation. Note that this approach applies to any third party tool that can provide direct database access or another supported integration point.

IBM Rational Insight / Cognos – Unexpected crash when viewing reports

This is something new that I have not yet come across in an Insight (Cognos) deployment. When I was attempting to either run a report or load a Framework Manager model package, I got the following errors:

Error received when running a report in Cognos Connection
Loading a Framework Manager Package in Report Studio


This was an intermittent issue: sometimes I could run a report successfully and seconds later I could not. I observed the system processes and noticed that at times, the “BiBusTKServerMain.exe” process would spawn, terminate, respawn and terminate again when trying to either run a report or load an FM package.

I reviewed the cogserver.log, and found these errors/failures:

"Failure    DPR-ERR-2074 Failed to receive a timely response from an external process with a PID XXXX. Ensure that the process with a process ID XXXX is started. If the cause of problem cannot be determined, increase the logging level for the target service in the IBM Cognos Administration tool and reproduce the conditions that caused the error."

“Failure    RSV-SRV-0040 The report server encountered an internal error. Check additional information associated with this error message. If cause of problem cannot be ascertained, increase the logging level in the IBM Cognos administration tool and reproduce the conditions that caused the error.”

“Failure    CCL-SRV-0501 The BIBusTKServer process caught an unexpected exception.”

There were no DMP files in /cognos/bin or /cognos/bin64 directories. I turned on additional logging using “ipfBiBUSclientconfig.xml” however this did not really give me any additional clues to the resolution.

I ended up chatting with one of my colleagues and, after much investigation, we found the issue. In my Cognos Configuration > Environment entries, all of the URIs used the “localhost” reference. It is always a best practice to change that to a FQDN. I had not done this in this particular environment because it was a temporary test environment that I was using for a short period of time.

I replaced all “localhost” references in every URI property with the machine hostname (which was mapped in the hosts file), restarted the Cognos BI Report server, and the issue was resolved. This appeared to be an issue where the server could [intermittently] not resolve “localhost”.
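For anyone who prefers to script the check, Cognos Configuration persists its settings in `cogstartup.xml` under the configuration directory, so the “localhost” URIs can be found and replaced in bulk. The snippet below works on an inline sample in that shape (the hostname is hypothetical; always back up the real file and re-save through Cognos Configuration afterwards):

```python
import re

# Inline sample in the shape of Cognos Configuration URI values;
# in a real install these live in <install>/configuration/cogstartup.xml.
sample = """
<crn:value xsi:type="xsd:anyURI">http://localhost:9300/p2pd/servlet/dispatch</crn:value>
<crn:value xsi:type="xsd:anyURI">http://localhost/ibmcognos/cgi-bin/cognos.cgi</crn:value>
"""

fqdn = "reports.example.com"  # hypothetical; substitute your server's FQDN

# Replace only "//localhost" followed by a port or path, so other
# text containing the word "localhost" would be left alone.
fixed = re.sub(r"//localhost(?=[:/])", "//" + fqdn, sample)
print(fixed)
```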

Cognos Configuration > Environment


I had a chat with another colleague, and we discussed that this may also have had something to do with the IP version (IPv4 versus IPv6) of the particular environment I was using. There is a setting in Cognos Configuration that regulates this, and by default it is set to an IPv4 address. I did not go back to see if changing it would do the trick, because I had already resolved the issue and needed to proceed, but I wanted to mention it as this likely could have helped.

Cognos Configuration > Environment


Well, I hope that this helps you all save a few hours of troubleshooting time. It was definitely a good learning experience for me as a seasoned Cognos veteran.