Tag Archives: api

How to Migrate Watson Conversation service from Test to Production

You’ve created some intents, entities, answers, and dialog flows, and you have hopefully gone through some iterative testing cycles to validate them. Now you want to promote those changes from your Test environment to your Production environment, so what do you do?

Promoting Your Entire Workspace

Your Watson Conversation Service (WCS) configuration (your intents, entities, dialogs, etc.) is stored in a workspace. When you want to do a complete migration or promotion from your Test to your Production environment, you update your Production workspace using the WCS ‘updateWorkspace’ API. This takes your JSON export from the Test workspace and updates your Production workspace with everything it contains.

The benefit of using this method is that the ‘workspace_id’ remains the same, so any external component relying on that ‘workspace_id’ will not require changes.
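
To make this concrete, here is a minimal sketch of that flow using Python and the requests library. Everything in it (the gateway URL, version date, credentials, and workspace IDs) is an assumption for illustration only, so check it against the current WCS API reference before relying on it.

import requests

# Assumed values - replace with your own service credentials and workspace IDs.
GATEWAY = "https://gateway.watsonplatform.net/conversation/api"
VERSION = "2017-05-26"                       # API version date (assumption)
AUTH = ("your-service-username", "your-service-password")
TEST_WORKSPACE_ID = "xxxx-test"              # hypothetical IDs
PROD_WORKSPACE_ID = "yyyy-prod"

# 1. Export the full Test workspace (intents, entities, dialog nodes, etc.).
export = requests.get(
    "{0}/v1/workspaces/{1}".format(GATEWAY, TEST_WORKSPACE_ID),
    params={"version": VERSION, "export": "true"},
    auth=AUTH)
export.raise_for_status()
workspace = export.json()

# 2. Push that export into the Production workspace ('updateWorkspace').
#    The Production workspace_id is unchanged, so external callers keep working.
update = requests.post(
    "{0}/v1/workspaces/{1}".format(GATEWAY, PROD_WORKSPACE_ID),
    params={"version": VERSION},
    auth=AUTH,
    json={key: workspace.get(key) for key in
          ("name", "description", "language", "intents", "entities",
           "dialog_nodes", "counterexamples", "metadata")})
update.raise_for_status()
print("Production workspace updated:", update.json().get("workspace_id"))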

Other Options to Move Workspace Data

Another option for moving or sharing workspaces is to use the WCS tooling to export your Test workspace and import it into your Production environment. The challenge with this is that WCS will assign the imported workspace a new/different ‘workspace_id’, so any existing external components relying on the old one will need to be updated. So be careful here.

If you only made changes to your intents or entities and you only want to promote those specific changes, you can leverage the respective APIs, which are documented here, to do so.
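
For example, here is a hedged sketch of pushing a single updated intent (a hypothetical intent named 'greeting') into the Production workspace, assuming the same Conversation v1 endpoint, credentials, and version date as in the earlier sketch:

import requests

GATEWAY = "https://gateway.watsonplatform.net/conversation/api"
VERSION = "2017-05-26"
AUTH = ("your-service-username", "your-service-password")
PROD_WORKSPACE_ID = "yyyy-prod"   # hypothetical ID

# Update (overwrite) one intent in the Production workspace.
resp = requests.post(
    "{0}/v1/workspaces/{1}/intents/greeting".format(GATEWAY, PROD_WORKSPACE_ID),
    params={"version": VERSION},
    auth=AUTH,
    json={
        "intent": "greeting",
        "examples": [{"text": "hello"}, {"text": "good morning"}],
    })
resp.raise_for_status()
print(resp.json())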

Whichever of these approaches you use, please note that you still need to allow some time for the training on the updated workspace to complete.
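
One simple way to account for that, sketched below under the assumption that the workspace reports its training state in a 'status' field on the GET workspace call, is to poll the Production workspace until training finishes before sending it live traffic:

import time
import requests

GATEWAY = "https://gateway.watsonplatform.net/conversation/api"
VERSION = "2017-05-26"
AUTH = ("your-service-username", "your-service-password")
PROD_WORKSPACE_ID = "yyyy-prod"   # hypothetical ID

# Poll the workspace until it is no longer training.
while True:
    resp = requests.get(
        "{0}/v1/workspaces/{1}".format(GATEWAY, PROD_WORKSPACE_ID),
        params={"version": VERSION},
        auth=AUTH)
    resp.raise_for_status()
    status = resp.json().get("status")
    print("Workspace status:", status)
    if status != "Training":
        break
    time.sleep(30)   # wait between checks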

How to Check Your API Usage for IBM Bluemix Services

Ever wondered how to check the number of API calls that you have consumed with your IBM Bluemix services? If so, you already have that information right at your fingertips. Bluemix has a usage tracker that is easy to use and conveniently provides your API usage for the last 12 months. Please note that some services do not report usage here and are billed separately. You also need to make sure that you have the appropriate privileges to access this information.

To access the Usage Dashboard, select the “Manage” menu item at the top right of your display, then hover over “Billing and Usage” and select “Usage”.

You will now be looking at your Usage Dashboard, where you will see a bar chart showing your overall usage charges for the last 12 months. The current month will be highlighted.

Now scroll down to the “Services Charges” section and you will see a list of your services. Once you’ve located the service of interest, click on the twisty (little triangle) to the right to expand it, and you will see the number of API calls and the associated costs.

If you want to see usage for previous months, just scroll back up to the bar chart and select the month of interest; the services data below will refresh.

That’s all; it’s that simple!

How to configure Cognos for Microsoft Office

This is a Cognos plugin used to integrate your IBM Cognos BI, IBM Rational Insight, or RRDI environment with Microsoft Office so that you can view Cognos reports through your Office software. The tool has also been called “Cognos Office” or “Cognos Go-Office” in the past. As of this post, it is called “Cognos for Microsoft Office”.

I have found that the best use for this tool is with a PowerPoint presentation. Here is an example. Say an employee has a weekly meeting where they present a specific set of slides. The structure of the slides is the same, but the data in the reports being shown (which come from Cognos) varies from week to week. So each week, this employee must manually regenerate the reports by copying and pasting screenshots into the PowerPoint presentation. It’s not the most enjoyable task.

However, if this employee installs the “Cognos for Microsoft Office” plugin, then every week they simply launch their existing PowerPoint slide deck and click a “Refresh” button, which automatically updates their existing reports to show the most current data. That is all, in one simple click.

This plugin is a local install, so each user who wants to use it must install it on the local machine where their Microsoft Office software resides. It is a pretty straightforward installation and configuration process. Once you have installed the software, here is how to configure it on your local machine. (Please note that I am using IBM Cognos 8 in this example.)

1) After installing the Cognos for Microsoft Office plugin, launch Microsoft PowerPoint and you will find a new tab named “IBM Cognos 8”. Click on this tab.

2) Now, in the “Application” section, click on the IBM Cognos 8 button, then select “Options”

3) Here, specify your Report Server that Office will communicate with. Enter the “System Gateway URI” and give it a “Friendly name”.

4) Click “Test Connection” to verify that your Office software can communicate with your Report Server. If the connection is successful, you will see a success message.

5) Now, click “Add” then “Ok”

6) From the toolbar, click “Logon” and select the Report Server you just added (in my case, it’s “My Insight Server”)

7) Log in with your Report Server credentials (if applicable)

8) Now, on the right-hand side of your screen, click on “IBM Cognos 8 Go! Office” (depending on your version, you may not see this identical text)

9) You should now see your Report Server directory structure in your Microsoft Office PowerPoint application

10) You can now drag and drop your Cognos reports into your PowerPoint presentation

Links to my IBM Innovate Conference presentations

Here are links where one can view or download my IBM Innovate presentations over the last five years.

Innovate 2010: Best Practices and Lessons Learned on Our IBM Rational Insight Deployment

Innovate 2011: The IBM Rational Insight Reporting Solution

Innovate 2012: Deploying Rational Insight into a Heterogeneous Environment

Innovate 2013: Improving Predictability and Efficiency with Kanban metrics using IBM Rational Insight

Innovate 2014: Unleash Your Metrics Outside the Box: Customizing Your IBM Rational Insight Deployment (workshop)

IBM Rational Insight Enablement Links

A lot of folks ask me for information on how to do things or whether certain things can be done, and many ask for presentation collateral. Here is a list of links to collateral about IBM Rational Insight that I have created over time; it contains a lot of great information and should address many of your questions:

IBM Rational Insight Overview 2014

The IBM Rational Insight Reporting Solution

Deploying IBM Rational Insight in a Heterogeneous Environment

Improve Predictability and Efficiency with Kanban Metrics using IBM Rational Insight (presentation)

Improve Predictability and Efficiency with Kanban Metrics using Rational Insight (article version)

Integrating Rational Insight with HP Quality Center and other Third Party tools

Configure Rational Insight with an additional Jazz Team Server

Configure LDAP for Rational Insight when Integrated with Multiple Jazz Team Servers

How to implement delta loads using Rational Insight

Integrating a Microsoft Excel spreadsheet with Rational Insight (live data model)

Define Aggregate Rules for Semi-Additive Measures in Framework Manager

IBM Rational Insight – Setting up your environment for remote execution using Data Manager Network Services

Configuring Rational Insight for Rational Team Concert custom attributes (for RTC version 3.0 and earlier)

Here are some helpful IBM links:

IBM Rational Insight 1.1.1.x Infocenter

IBM Rational Insight Data Models (ETLs, Data Warehouse, Reports)

IBM Rational Insight & CLM Data Dictionaries

Insight / CLM Integration

Workaround for RTC 4.0.5 Data Manager ETLs when run against a 4.0.3 CCM App server

I found this bug today and wanted to share my solution to the problem. My scenario: I have Insight v1.1.1.3 deployed and integrated with two CLM deployments, one at version 4.0.5 and the other at version 4.0.3. The RTC Data Manager ETLs run fine against the 4.0.5 CCM server; however, when running the 4.0.3 DM ETLs against the 4.0.3 CCM server, errors are thrown.

Viewing the ri_jdbc.log file, I was able to detect the first error in the “RTC_WorkItemApprovalState4.0.3” fact build (see below for full error). It is complaining about a column it is trying to query that does not exist. The column is ‘projectAreaArchived’.

When I checked the XML data configuration file in the XDC tool, the respective resource (WorkItemApprovalState) was indeed missing this data element.

The solution is to expose the required data element, which does indeed exist in the RTC reportable REST API.
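
If you want to confirm that the element really is exposed before editing the XDC file, a quick and very hedged check is to pull the resource from the reportable REST API and look for an archived element. The URL and credentials below are illustrative only; take the real resource URL from the WorkItemApprovalState resource definition in your own XDC file.

import requests

# Illustrative values - substitute your actual CCM server and the resource URL
# shown in your XDC file for the WorkItemApprovalState resource.
CCM_REST_URL = "https://ccm-server:9443/ccm/rpt/repository/workitem"
AUTH = ("etl_user", "etl_password")   # hypothetical credentials

# verify=False only because CLM test servers often use self-signed certificates.
resp = requests.get(CCM_REST_URL, auth=AUTH, verify=False)
resp.raise_for_status()

# The reportable REST API returns XML; a simple substring check is enough
# to confirm whether an <archived> element is coming back at all.
print("archived element present:", "<archived>" in resp.text)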

  1. Launch the XML Data Configuration tool
  2. Load the workitem403.xdc file
  3. Navigate to the data mapping table: “Resources” > “RTCWorkItem” > “WORKITEM_CCM” > “WorkItemApprovalState” > “WorkItemApprovalState”
  4. Double-click on the data mapping table
  5. Select the “Columns” tab and then select “Modify”
  6. Navigate through the RTC reportable REST API to find the ‘/archived’ data element. Highlight it and select “Create”. This will add the new element to the data mapping template.
  7. Rename it to ‘projectAreaArchived’ by clicking in the “Table Column Name” cell and modifying the text
  8. Save the configuration
  9. Close the “WorkItemApprovalState” data mapping template tab
  10. You should receive a dialog that tells you the template has been changed and asks you to update it. Select “Ok”
  11. Save your configuration once again
  12. Your new column will appear faded. You have added the column but now need to load it. Highlight your new column and click on “Load”.
  13. Save your configuration again; this new column is now available to Insight, so the ETL will no longer complain about it.

There are 4 different places where this error will be thrown. You will only see one error at a time as the ETL halts upon the first error in this case. Here are the 4 fact builds this affects:

“RTC_WorkItemApproval4.0.3”
“RTC_WorkItemApprovalDescr4.0.3”
“RTC_WorkItemApprovalState4.0.3”
“RTC_WorkItemApprovalType4.0.3”

Follow the steps I provided above for each of these 4 fact builds and you can proceed with your deployment without error.

For reference, here is the full error found in the ri_jdbc.log:

query => SELECT "id",
"projectAreaItemId",
"name",
"DATASOURCE_ID"
FROM "WORKITEM_CCM"."WorkItemApprovalState"
WHERE modifiedsince='1899-12-31 00:00:00' AND projectAreaArchived=false
03/06/2014 13:27:27,444 ERROR Thread-5 com.ibm.rational.drivers.jdbc.xml.internal.PageFetcherThread : CRRRE1203E: The column 'projectAreaArchived' does not exist.
03/06/2014 13:27:27,444 ERROR Thread-5 com.ibm.rational.drivers.jdbc.xml.internal.PageFetcherThread : com.ibm.rational.etl.common.exception.ETLException: CRRRE1203E: The column 'projectAreaArchived' does not exist.
com.ibm.rational.etl.common.exception.ETLException: com.ibm.rational.etl.common.exception.ETLException: CRRRE1203E: The column 'projectAreaArchived' does not exist.
at com.ibm.rational.drivers.jdbc.xml.internal.PageFetcherThread.exeFetch(PageFetcherThread.java:568)
at com.ibm.rational.drivers.jdbc.xml.internal.PageFetcherThread.run(PageFetcherThread.java:136)
Caused by: com.ibm.rational.etl.common.exception.ETLException: CRRRE1203E: The column 'projectAreaArchived' does not exist.
at com.ibm.rational.drivers.jdbc.xml.util.ExtractionUtil.getExtractionURLOfVDS(ExtractionUtil.java:185)
at com.ibm.rational.drivers.jdbc.xml.internal.PageFetcherThread.exeFetch(PageFetcherThread.java:227)
... 1 more
03/06/2014 13:27:27,459 INFO main com.ibm.rational.drivers.jdbc.xml.RDSStatement : CRRRE1228I: Close this statement

IBM Rational Insight & IBM Cognos BI – How they compare

A lot of people have asked me how these two solutions stack up against each other, with questions such as “What is the difference between them?”, “Which one is a better solution?” or “Which one will meet my specific business needs?”. The truth is, there is no one correct answer; it all depends on several factors. I’m going to talk about some of the key differences to help everyone better understand this.

IBM Cognos’ main components are:

  • BI server – this is where users view, run and create reports and dashboards. This is also referred to as the “Report server”.
  • ETL server – this handles the ETL processing of the data. It extracts the data from the specified data sources, transforms it based on business logic, and loads it into the Data Warehouse.
  • Data Warehouse – the schema/structure of your database where all of the data will reside. Reports are run against the data that lives here.
  • Data Manager – this is the ETL development tool where users specify what their data sources are, how to transform the data, and where to load it. All of this is specified in what is called the ETL “catalog”. Once this is defined, it is published as a “data movement task” that the ETL server executes on its behalf. Note: for development/testing, Data Manager also has a built-in DM engine that can run the ETL.
  • Framework Manager – this is the framework modeling development tool used to create a metadata model, which is the layer between the raw structure of the Data Warehouse and what the Report Authors see. It creates a cleaner view of the data and organizes it based on business needs.

Cognos also comes with these end-user tools: Report Studio, Query Studio, Analysis Studio, Event Studio, Workspace and Workspace Advanced. You can just google them to see what they do.

Please note that there are many tools bundled into this solution and I have only explained the key infrastructure components. There are also additional databases needed. I do not want this post to flood you all with too much info 🙂

In addition to the servers, components, and tools, Cognos also comes with an optional sample download called “The Great Outdoors”. This provides a simple example of some of the capabilities of the Cognos offering. The sample does not include any ETL samples, so it’s not an end-to-end example, but it helps show some of what Cognos can do. I have not personally used the sample; I focus more on SDLC/ALM metrics, and this does not appear to be an example of that.

IBM Rational Insight is built on the Cognos BI platform. This means that it also has the same server components “under the hood”: the Cognos BI server and ETL server. Insight also leverages the Data Manager and Framework Manager developer tools. Rather than list out everything that Insight provides, I am going to list out the differences between Insight and Cognos:

  • Out of the Box Collateral – Insight provides very extensive out-of-the-box deployment collateral. This includes XML/XDC mappings, a vast ETL catalog (containing fact builds, dimensions, and data marts), a complete Data Warehouse schema, framework data models, and several reports and dashboards.

To me, this is one of the key benefits of Insight. It provides us with all the collateral needed to get deployed and realize value quickly. The effort to create this collateral is not trivial and, in my estimate, would take thousands of man-hours to recreate. We can use the examples in “as is” form, customize them to meet our needs, or simply use them as a guide to create our own collateral. The OOTB collateral takes into consideration all of the key elements of the SDLC, including: Program, Project, Iteration, Release, Requirement, Test Case, Defect, and much more. My suggestion and best practice is to leverage as much of this as possible to avoid rework.

Additional components created by IBM Rational for Insight:

  • IBM Rational Data Services Server – This server is used when extracting data from IBM Rational ClearQuest, IBM Rational ClearCase, IBM Rational RequisitePro, and IBM Rational Test Manager. These tools use an “adapter”, which is basically a set of Java classes that plug into the Data Services API. The Data Services server then communicates with the point product API through XML to expose the source data.
  • IBM Rational XML Data Configuration – This developer tool is used to expose the source data through XML/Reportable REST APIs. It can be used against any data source (IBM or third party) that has a natively supported reportable REST API. It is also used for the tools referenced above that leverage the Data Services server.

*Insight does not come with the Analysis Studio tool.

Insight and Cognos can both extract from IBM and third-party tools in a variety of ways. As an existing IBM Rational customer, one would likely get more value from Insight, as several of the point products today (Rational Team Concert, Rational Quality Manager, Rational DOORS NextGen, and more) provide their own Insight collateral with each release, so it’s “plug and play” between Insight and their tools. That is huge value. This leverages the IBM Rational XML Data Configuration tool and driver. I work with a lot of IBM Rational tools, so Insight is very beneficial to me and gives me a huge head start with reporting.

If I were an existing Cognos customer with an established deployment (DW schema, ETL catalog, etc.), then perhaps I would not see as much value in the OOTB collateral that Insight provides, as I would already have my foundation for all of this collateral in place. It really all depends on what tools one is using, what the current process is, and what the business goals are. Each case is different.

A few other important things to know:

  • There are many other cool Cognos tools that can be purchased and plugged into your existing deployments, such as Cognos Office and Cognos Mobile. These are applicable to both Cognos and Insight. I have used both of these and they are great. Office for PowerPoint is cool because I can embed my reports in a slide deck once and reuse it for weekly/monthly meetings without having to recreate it each time. With mobile technology on the rise, Cognos Mobile becomes more valuable to people on the go.
  • IBM Cognos has an offering named “Cognos Insight”. This is not IBM Rational Insight; it is a different solution.

So keep this in mind: Insight and Cognos both use the same Cognos components and engine to accomplish the same reporting tasks. The key differences are the ones I have mentioned above. This should help you understand how they differ and help determine which one may be the more suitable choice for your organization.

Regardless of your selection, they are both very powerful pieces of software.

Integrating HP Quality Center (or HP ALM/other Third Party tools) with IBM Rational Insight

I’ve had many people ask me about this integration and if it is possible, how it works, effort required to make it work and various other questions. IBM Rational Insight can integrate with IBM and Third-Party data sources in a variety of ways including REST APIs, ODBC and generic XML.

I have a detailed document that describes this integration on IBM’s Jazz.net.

Is it possible? Yes.

How does it work? When working with HPQC or HPALM, you first need to decide how you will integrate. I am not an HP expert and am not familiar with their [REST] APIs, which would be the more ideal approach. In my experience, I have helped others integrate HP with Insight via direct database access. The concept is simple: you gain read access to the HP repository with a valid user name and password. Once you have this, Insight can extract data from it.

While extracting HP data, it is a best practice to map that data to as much of the existing Insight data warehouse structure as possible. It is also a best practice to review the contents of the data warehouse and ETL catalog beforehand so you are familiar with what already exists. Just focus on the areas that are important to you, not the entire structure. This will avoid a lot of rework. For example, if your business need is to report on test data, then map the HP “Test” entity to the Insight “Test Case” entity, which contains similar columns such as ID, name, verdict, date info, and more. If you are reporting on defects, then map the HP “Defect” artifact to the Insight “Request” entity.
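
As a loose illustration of that mapping idea, here is a sketch using Python and pyodbc over a read-only database connection. The HP table and column names (TEST, TS_TEST_ID, and so on) and the Insight-side target fields are assumptions for the example only; verify them against your own HP schema and the Insight data warehouse before using anything like this.

import pyodbc

# Hypothetical read-only connection to the HP Quality Center project database.
conn = pyodbc.connect(
    "DRIVER={SQL Server};SERVER=hpqc-db;DATABASE=qc_project;"
    "UID=readonly_user;PWD=readonly_password")

# Assumed HP "Test" columns mapped onto Insight "Test Case"-style fields.
rows = conn.cursor().execute(
    "SELECT TS_TEST_ID, TS_NAME, TS_EXEC_STATUS, TS_CREATION_DATE FROM TEST")

for row in rows:
    test_case = {
        "EXTERNAL_ID": row.TS_TEST_ID,        # maps to the Test Case ID
        "NAME": row.TS_NAME,                  # maps to the Test Case name
        "VERDICT": row.TS_EXEC_STATUS,        # maps to the verdict
        "CREATION_DATE": row.TS_CREATION_DATE,
    }
    # In a real deployment this mapping lives in the Data Manager fact build,
    # not in a script; the dictionary is only to show the column-to-column idea.
    print(test_case)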

Of course, each organization will have its own custom attributes and terminology. You can customize the existing Insight artifacts or create new ones to accommodate this.

What is the effort required? This all depends on your business needs, goals, and the skill level of the person implementing it. My suggestion is to take an iterative approach. Do not try to do too much too fast, or you will likely not be successful. After defining the target business requirements (in the form of reports), start with one or two fundamental artifacts of your requirements, such as project and user info. Map only the columns that are already available in the Insight data warehouse. Ensure the ETL has completed successfully to the point where you can see HP data in Insight.

When that is done, move on to your “Test” record type, for example. Just like the previous artifacts, only map the columns that are already available in the Insight data warehouse. Again, ensure the ETL completes successfully.

Once that is done, you can go back and customize the Insight ETL catalog (using Cognos Data Manager) to incorporate your custom attributes into those three entities (project, user, test). Again, do not try to do too much too fast. If you have 20 custom attributes attached to your test artifact, start with about 3-5 to make sure you understand the process and are achieving the right results.

I hope that this helps folks understand the integration a bit better and gives them an overview of the process. See my link above for a more detailed technical explanation. Note that this approach applies to any third party tool that can provide direct database access or another supported integration point.