THINK 2018 Must-Attend Sessions

As with every year, there are far more sessions I want to see than time slots to see them in. But let’s go through my agenda, day by day.

Sunday.

I’m landing Sunday, and heading straight to the conference, and hopefully I’ll make it to the IBM Cognos Analytics Jam Session. I’m not 100% certain what a “jam session” is, but it sounds sweet. You should be super jelly if you can’t attend.

Monday.

08:30 – IBM Cognos Analytics 11: Discover What’s New.
There are several instances of this session, but I figure I might as well get it over with at the beginning.
10:30 – What’s New and What’s Next for Business Analytics.
The speakers will go over the recent and planned enhancements for Watson and Cognos. They better not change the name to IBM Analytics Cloud Watson; that would make me IBMAnalyticsCloudWatsonPaul and that’s a mouthful.
12:30 – Interactive Reporting for Business Users
With the focus on self-service reporting, I may soon find myself obsolete. But, if I can help train the end users to better build their own dashboards, I could retire early.
13:30 – Being GDPR-Ready with Cognos Analytics
The General Data Protection Regulation comes into effect on May 25 in the EU. I view this as applied data security, which is always interesting.
13:30 – Proven Architecture Styles with Analytics with Real-World Demos
Double-booking for me! This is hard: both 13:30 sessions run only once, and they’re both very interesting. This session goes into several real-life example analytics patterns. Learning how to avoid the pitfalls other people have encountered is incredibly important.
14:30 – How to Get the Most from Embedding Analytics in Your Applications
I’ve had projects where we’ve embedded Cognos into existing applications. The description downplays the technical aspects, which I’m especially interested in, but it looks like it will show some practical examples.
15:30 – Certifications!
One of my goals here is to get some more certifications. I’ve been crunching the SPSS manuals, and I _WILL_ get at least one cert.

Tuesday.
11:30 – Upgrading to IBM Cognos Analytics.
I’m working on an upgrade project right now, and I’m hoping to get some insights to make the process a little smoother.
11:30 – FleetPride and Cresco International Change the Supply Chain Game with IBM’s Cycle of Analytics
My good friend Sanjeev is going to be giving a talk on the work he did with FleetPride!
13:30 – Ready for Business: Building a Spectrum of Capabilities for Analytics at a Milwaukee Manufacturer
That’s a mouthful of a session title. This appears to be a very quick session, lasting only 20 minutes. But I know what they manufacture and I’m interested to see what they’ve done. Also, a little bit of self-promotion never hurt anyone: if they use Cognos, I live not far away!
14:30 – All Aboard! Experience Onboarding to Cognos Analytics on Cloud
The cloud is the future. Let’s be honest, maintaining your own servers is a pain in the neck, and in the wallet. I’m going to learn how to ease the pain of cloudifying (enclouderating?) Cognos environments.
15:30 – User Experience: Future Design for IBM Business Analytics—Moving from Optional to Essential
This sounds intriguing, “Soon, the way you interact with IBM Business Analytics will be tailored to your cognitive profile”. Exactly what does this entail? Will the front screen of Cognos be a dynamically generated dashboard that matches my most commonly viewed objects? I’m honestly very curious about this.
19:30 – Barenaked Ladies!
’nuff said.

Wednesday
09:30 – Front Row Seat to the Future: Ticketmaster’s Journey to Cognos Analytics on Cloud
Real examples of cloudifying Cognos. There are a lot of pros to doing this, and plenty of possible pitfalls. I’m looking forward to seeing how they did it.
10:30 – Roadmap Alert! See What IBM Business Analytics Has in Store for You
The description has, “augmented intelligence for personalized analytics” and “with features such as conversational analytics and chatbots” and “infuse traditional analytics with capabilities like machine learning”. How will this actually work with Cognos? I’m very excited to see this session.
11:30 – Meet the Designers Behind IBM Business Analytics
Very important. The people behind the latest UI changes to Business Analytics. I’m planning on talking, calmly and rationally, about the removal of the toolbars in CA.
11:30 – Communicating Effectively with Reporting, Dashboarding and Storytelling: Now and the Future
Another double-booking! Very painful, because I really want to see this too. They’ll be talking about how they plan on making dashboard content easier to build, and visualizations more effective. Someone send me notes from this session!
12:30 – Contextual Analytics Using SPSS Text Analytics and Cognos 11 Visualizations in the Airline Industry
This is fascinating. I’m currently studying SPSS, so I’m looking forward to seeing real examples of taking SPSS and converting the data into dashboards.
15:30 – How to Flat Pack the Cognos Analytics Migration—The IKEA Way!
The description doesn’t say what IKEA was migrating _from_. An earlier Cognos build? A different platform altogether? I’m curious about the thought processes that went into the decision. There are so many cases where companies migrate to a new tool, but want everything to remain exactly the same.
16:30 – Gamifying Success: How GameStop Shifted from BI Producers to Analytics Enablers
Giving end users access to self-service tools is never an easy process. There’s always a learning curve, resistance, and many false steps. It sounds like they have a successful self-service environment, and I’m looking forward to hearing how they implemented it.

Thursday
08:30 – Upgrading to Cognos Analytics: Secrets from Nordic Customer Deployments
Another upgrade session. This looks like it’s more technically focused than the others. They go into how they handled deprecated features, like portal pages.
09:00 – Ready for IBM Analytics: What’s in It for Me? What’s in It for My Clients
I might miss this one, but I’d prefer not to. Gartner has shown that Cognos has taken a massive approval hit with the latest version, and this session might provide some answers.
11:30 – Increasing ROI in Pharmaceuticals with an IBM Cognos Analytics Upgrade
One of my previous clients was a pharmaceuticals company, so I’m always curious about how other companies structure their reports. Also, the session goes into the complexities of upgrading to CA with CQM-DQM and JavaScript migration. Both issues I’ve dealt a great deal with recently.
12:30 – Building a Self-Service Environment Command Center in IBM Cognos Analytics
A client of PMsquare! I haven’t had the opportunity to work at Baxter, but several of my colleagues have. From what I understand they built a supercharged auditing environment to track usage. It sounds awesome!

Did I miss anything important? You’ve got about 24 hours to let me know!

IBM Think! (And dimensional report design)

I haven’t been posting much recently, and you can always tell how busy I am by the frequency of my posts. Never fear, the crazy project I’m on will be ending soon and I can get back to posting semi-regularly.

Before anything else, a very important announcement.

I’ll be attending IBM Think! Do you want to meet me? Tell me how awesome I am and how my solutions have saved your life? Now’s your chance! Send a message to the PMsquare people here and we can set something up! Want to tell me that one of my solutions destroyed your Cognos setup? We can talk it over a beer or two in a well-lit public area. During the expo, I’ll either be hanging around booth #716, or wandering around restocking my supply of branded pens and office swag.

Now to the meat of the post.

Let’s talk about dimensional report design.

Building dimensional reports is one of the more complex tasks a Cognos developer can face. Relational reports over a few tables are easy, but the skills learned there don’t necessarily transfer over. While similar in appearance, the queries use a very different style.

When building a query, it might be tempting to explicitly create a tuple for each item. Case in point, consider the following requirement:

- Year in the columns, alongside the prior year and a Year over Year %
- 4 measures
- Org hierarchy in the rows

(desired output)

It might be tempting to make each column a separate tuple.

currentYearSales: tuple([Sales],[Current Year])
priorYearSales: tuple([Sales],[Prior Year])
and so on.

But let’s take a look at the underlying MDX that’s generated.

[sourcecode]
WITH
  MEMBER [Measures].[XQE_V5M_Reseller Order Quantity YoY_CM13] AS ((([Date].[Calendar].[Calendar Year].&[2013], [Measures].[Reseller Order Quantity])-([Date].[Calendar].[Calendar Year].&[2012], [Measures].[Reseller Order Quantity]))/([Date].[Calendar].[Calendar Year].&[2012], [Measures].[Reseller Order Quantity])), SOLVE_ORDER = 4, CAPTION = 'Reseller Order Quantity YoY'
  MEMBER [Measures].[XQE_V5M_Reseller Sales Amount YoY_CM14] AS ((([Date].[Calendar].[Calendar Year].&[2013], [Measures].[Reseller Sales Amount])-([Date].[Calendar].[Calendar Year].&[2012], [Measures].[Reseller Sales Amount]))/([Date].[Calendar].[Calendar Year].&[2012], [Measures].[Reseller Sales Amount])), SOLVE_ORDER = 4, CAPTION = 'Reseller Sales Amount YoY'
  MEMBER [Measures].[XQE_V5M_Internet Order Quantity YoY_CM15] AS ((([Date].[Calendar].[Calendar Year].&[2013], [Measures].[Internet Order Quantity])-([Date].[Calendar].[Calendar Year].&[2012], [Measures].[Internet Order Quantity]))/([Date].[Calendar].[Calendar Year].&[2012], [Measures].[Internet Order Quantity])), SOLVE_ORDER = 4, CAPTION = 'Internet Order Quantity YoY'
  MEMBER [Measures].[XQE_V5M_Internet Sales Amount YoY_CM16] AS ((([Date].[Calendar].[Calendar Year].&[2013], [Measures].[Internet Sales Amount])-([Date].[Calendar].[Calendar Year].&[2012], [Measures].[Internet Sales Amount]))/([Date].[Calendar].[Calendar Year].&[2012], [Measures].[Internet Sales Amount])), SOLVE_ORDER = 4, CAPTION = 'Internet Sales Amount YoY'
  MEMBER [Measures].[XQE_V5M_Reseller Order Quantity 2012_CM17] AS ([Date].[Calendar].[Calendar Year].&[2012], [Measures].[Reseller Order Quantity]), SOLVE_ORDER = 4, CAPTION = 'Reseller Order Quantity 2012'
  MEMBER [Measures].[XQE_V5M_Reseller Sales Amount 2012_CM18] AS ([Date].[Calendar].[Calendar Year].&[2012], [Measures].[Reseller Sales Amount]), SOLVE_ORDER = 4, CAPTION = 'Reseller Sales Amount 2012'
  MEMBER [Measures].[XQE_V5M_Internet Order Quantity 2012_CM19] AS ([Date].[Calendar].[Calendar Year].&[2012], [Measures].[Internet Order Quantity]), SOLVE_ORDER = 4, CAPTION = 'Internet Order Quantity 2012'
  MEMBER [Measures].[XQE_V5M_Internet Sales Amount 2012_CM20] AS ([Date].[Calendar].[Calendar Year].&[2012], [Measures].[Internet Sales Amount]), SOLVE_ORDER = 4, CAPTION = 'Internet Sales Amount 2012'
  MEMBER [Measures].[XQE_V5M_Reseller Order Quantity 2013_CM21] AS ([Date].[Calendar].[Calendar Year].&[2013], [Measures].[Reseller Order Quantity]), SOLVE_ORDER = 4, CAPTION = 'Reseller Order Quantity 2013'
  MEMBER [Measures].[XQE_V5M_Reseller Sales Amount 2013_CM22] AS ([Date].[Calendar].[Calendar Year].&[2013], [Measures].[Reseller Sales Amount]), SOLVE_ORDER = 4, CAPTION = 'Reseller Sales Amount 2013'
  MEMBER [Measures].[XQE_V5M_Internet Order Quantity 2013_CM23] AS ([Date].[Calendar].[Calendar Year].&[2013], [Measures].[Internet Order Quantity]), SOLVE_ORDER = 4, CAPTION = 'Internet Order Quantity 2013'
  MEMBER [Measures].[XQE_V5M_Internet Sales Amount 2013_CM12] AS ([Date].[Calendar].[Calendar Year].&[2013], [Measures].[Internet Sales Amount]), SOLVE_ORDER = 4, CAPTION = 'Internet Sales Amount 2013'
SELECT 
  {CROSSJOIN({[Date].[Calendar].[Calendar Year].&[2013]}, {[Measures].[XQE_V5M_Internet Sales Amount 2013_CM12], [Measures].[XQE_V5M_Internet Order Quantity 2013_CM23], [Measures].[XQE_V5M_Reseller Sales Amount 2013_CM22], [Measures].[XQE_V5M_Reseller Order Quantity 2013_CM21]}), CROSSJOIN({[Date].[Calendar].[Calendar Year].&[2012]}, {[Measures].[XQE_V5M_Internet Sales Amount 2012_CM20], [Measures].[XQE_V5M_Internet Order Quantity 2012_CM19], [Measures].[XQE_V5M_Reseller Sales Amount 2012_CM18], [Measures].[XQE_V5M_Reseller Order Quantity 2012_CM17]}), ([Date].[Calendar].DEFAULTMEMBER, [Measures].[XQE_V5M_Internet Sales Amount YoY_CM16]), ([Date].[Calendar].DEFAULTMEMBER, [Measures].[XQE_V5M_Internet Order Quantity YoY_CM15]), ([Date].[Calendar].DEFAULTMEMBER, [Measures].[XQE_V5M_Reseller Sales Amount YoY_CM14]), ([Date].[Calendar].DEFAULTMEMBER, [Measures].[XQE_V5M_Reseller Order Quantity YoY_CM13])} DIMENSION PROPERTIES PARENT_LEVEL,  PARENT_UNIQUE_NAME ON AXIS(0), 
  DESCENDANTS([Sales Territory].[Sales Territory].[All Sales Territories], 3, SELF_AND_BEFORE) DIMENSION PROPERTIES PARENT_LEVEL,  PARENT_UNIQUE_NAME ON AXIS(1)
FROM [Adventure Works]  CELL PROPERTIES CELL_ORDINAL,  FORMAT_STRING,  LANGUAGE,  VALUE
[/sourcecode]

Looking at this in the profiler, I can see that it takes 20 ms to process and generates 67 subcubes.

In this case it’s better to use the years as a set, and rely on the implicit grouping of a crosstab. This results in significantly fewer data items.
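To make that concrete, here’s a quick sketch in plain JavaScript (with stand-in member names of my own; the real ones come from the cube) of what the crosstab’s implicit grouping buys us. The column edge becomes the cartesian product of the year set (plus the calculated YoY member) with the measures, so twelve columns fall out of just a handful of data items:

```javascript
// Stand-in names; the real members come from the cube's time hierarchy.
var years = ["2013", "2012", "YoY %"]; // a set of two years plus one calculated member
var measures = ["Internet Sales Amount", "Internet Order Quantity",
                "Reseller Sales Amount", "Reseller Order Quantity"];

// The crosstab nests the measures under each column member: a cartesian product.
var columns = [];
for (var y = 0; y < years.length; y++) {
  for (var m = 0; m < measures.length; m++) {
    columns.push("(" + years[y] + ", " + measures[m] + ")");
  }
}

console.log(columns.length); // 12 columns from only 3 + 4 source items
```

Compare that with the explicit-tuple approach above, which needed a separate hand-built data item for every one of those twelve columns.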

But what about the Year over Year% columns? Manually calculating each column would certainly be easy to do, but again, it’s not necessary. We can create a calculated member in the time hierarchy that calculates it for us.
member(([Current Year] - [Prior Year]) / [Prior Year], 'YoY', 'YoY', [Cube].[Time Dim].[Time Hier])

It might look silly to people coming from a relational background: naively subtracting the year labels would give (2017 - 2016)/2016 ≈ 0.05%, which is nonsense. In reality, the calculation is applied to the nested measures. We can then select the member fact cells of the calculated member and format all of the cells as a percentage.
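A tiny illustration of that point, using hypothetical sales figures of my own invention (not from the real cube): the same formula that is noise on the year captions is exactly right on the nested measure values.

```javascript
// Hypothetical figures of my own; the real values come from the measure.
var salesByYear = { "2017": 1050000, "2016": 1000000 };

function yoy(current, prior) { return (current - prior) / prior; }

// Applied to the nested measure values, the calculated member gives a sensible answer:
console.log(yoy(salesByYear["2017"], salesByYear["2016"])); // 0.05, i.e. 5%
// Applied naively to the year captions, it gives noise:
console.log(yoy(2017, 2016)); // ≈ 0.000496
```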

Let’s take a look at the underlying MDX:

[sourcecode]
WITH
MEMBER [Date].[Calendar].[XQE_V5M_CM1] AS ((([Date].[Calendar].[Calendar Year].&[2013])-([Date].[Calendar].[Calendar Year].&[2012]))/([Date].[Calendar].[Calendar Year].&[2012])), SOLVE_ORDER = 4, CAPTION = 'YoY %'
SELECT
{CROSSJOIN({[Date].[Calendar].[Calendar Year].&[2013], [Date].[Calendar].[Calendar Year].&[2012]}, {[Measures].[Internet Sales Amount], [Measures].[Internet Order Quantity], [Measures].[Reseller Sales Amount], [Measures].[Reseller Order Quantity]}), CROSSJOIN({[Date].[Calendar].[XQE_V5M_CM1]}, {[Measures].[Internet Sales Amount], [Measures].[Internet Order Quantity], [Measures].[Reseller Sales Amount], [Measures].[Reseller Order Quantity]})} DIMENSION PROPERTIES PARENT_LEVEL, PARENT_UNIQUE_NAME ON AXIS(0),
DESCENDANTS([Sales Territory].[Sales Territory].[All Sales Territories], 3, SELF_AND_BEFORE) DIMENSION PROPERTIES PARENT_LEVEL, PARENT_UNIQUE_NAME ON AXIS(1)
FROM [Adventure Works] CELL PROPERTIES CELL_ORDINAL, FORMAT_STRING, LANGUAGE, VALUE
[/sourcecode]

In this case the runtime is 12 ms with 31 subcubes generated.

That’s 40% faster. While in this example the actual difference is insignificant, in real life the runtime difference for complex queries can be profound. The report that instigated this post was taking over an hour to process, but with the changes I mentioned it dropped down to 3 minutes.

The end result here is a faster, more compact query. Much easier to maintain, much easier to change, and much easier to hand off to clients so you never have to look at it again.

Report XML Below
OLAP Report Design

JavaScript in Cognos Analytics, and I need your help!

This is a repost of an article I wrote for the PMsquare journal, with permission of course. The original can be found here. Make sure you subscribe to their newsletter for other great articles!

In every new release of Cognos, there are some ups, and there are some downs. And while some people may have a lot to complain about in the new version, there are a few shining advances that force me to forgive all the questionable design decisions (even the loss of the menu and button bars in Report Studio).

Today, and in the next few articles, we’ll be talking about JavaScript. They don’t call me JavaScriptPaul for nothing, y’know (nobody does yet, but someone might some day).

JavaScript and Cognos has always been a touchy subject. Historically unsupported, incompatible with most libraries, and with a cryptic undocumented internal API, JavaScript has been a major challenge to implement in a Cognos report. In Cognos 10.2, IBM started officially recognizing that people wanted more, creating the basic Prompt API. While limited, it was a start to making truly interactive reports.

And now, in Cognos 11, we finally have a fully supported JavaScript control.

The new JavaScript control is for use with the new Interactive Mode. Non-interactive mode appears to work the same way as C10, and inline JS will only work with non-interactive mode. The big problem I have with this is that you have to save a JS file onto a server somewhere. This makes development a problem, especially if you’re a lowly dev who doesn’t have direct access to save files on the server. On the flip side, if you are a lowly dev, all you need to know is where these JS files are and what to pass to them.

The Interactive Mode will dynamically download the files and cache them in the browser. This makes for a slightly faster user experience.

Unlike the C10 API, everything available is documented. On the positive side, this means that all the JS functions are fully supported. On the negative side, it does mean that there aren’t a lot of them yet. All of the undocumented and unsupported functions, like oCV_NS_.getSelectionController().isDrillLinkOnCrosstabCell() (yes, this is a real function and yes, I’ve used it), have been compiled into random strings of letters and numbers.

2. Randomized functions

I’ll touch on a few of the new features briefly, then show a working example.

In C10 and earlier, there were three ways of getting data into a JavaScript object. The easiest was to associate a value prompt with one, but then we’re limited to only two attributes. The second was to dump everything into a list, but then we need to loop through a table: slow and annoying. The third was to use repeaters to inline the JS. The big problem with that is there’s no formatting option for numbers, and some strings are problematic.

In C11 the JavaScript controls can be assigned to a specific dataset from a query. This circumvents the problem with excess data AND the issue with invalid characters. In addition to datasets, we can pass a JSON string to the control containing additional configuration information.

3. Data and Configuration

Calling specific report elements, such as blocks and lists, can be done with a simple call to the page, stacking .getControlByName. Once you have the control, there are a few basic things you can do: setting visibility, width, height, colors. But you CAN get the HTML element, and with that you can do a lot.
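To show the chaining pattern with something you can run outside Cognos, here’s a sketch using minimal mock objects of my own design; the real page, control, and element objects come from the Cognos API and carry far more than this.

```javascript
// Mock objects of my own; the real ones are supplied by Cognos at runtime.
var list1 = {
  element: { style: { display: "" } }, // stand-in for the control's HTML element
  // Mock helper, not a documented Cognos call: flips the element's display style.
  setDisplay: function (show) { this.element.style.display = show ? "" : "none"; }
};
var oControlHost = {
  page: {
    getControlByName: function (name) { return name === "List1" ? list1 : null; }
  }
};

// Stack the calls: host -> page -> control, then drop down to the raw element.
var ctrl = oControlHost.page.getControlByName("List1");
ctrl.setDisplay(false);                  // basic property twiddling: visibility
console.log(ctrl.element.style.display); // "none"
```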

An often requested function is the ability to select visible columns in a list. In fact, IBM even has an example of this on their demo server.
4. IBM showing and hiding columns

Personally I don’t like that solution. End users don’t want to type the column index, and when they page down it doesn’t remember the selection. I solved that using sessionStorage, but let’s focus on the Cognos centric code.

The JavaScript starts by defining the function.

[sourcecode language="javascript"]
define( function() {
"use strict";
function columnSelector(){};
[/sourcecode]

Next, we initialize the function.

[sourcecode language="javascript"]
columnSelector.prototype.initialize = function( oControlHost, fnDoneInitializing )
{
  var o = oControlHost.configuration;
	this.m_sListName = o ? o["List name"] : "List1";
  this.m_aStatic = o ? o["Static choices"] : [];

  if(!window.sessionStorage.getItem(this.m_sListName+'SelCols')) window.sessionStorage.setItem(this.m_sListName+'SelCols','[]');

  if(!window.sessionStorage.getItem(this.m_sListName+'SelColsFR')) window.sessionStorage.setItem(this.m_sListName+'SelColsFR','1');

  fnDoneInitializing();
};
[/sourcecode]

oControlHost is the object Cognos passes to the script; among other things, it carries any extra configuration data defined in Report Studio. The list name is optional, so long as you have List1 in the output. The static choices are also optional. Next we have sessionStorage, which is what lets the page navigation remember what the user selected.
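Here’s the configuration-with-defaults pattern from initialize() pulled out on its own, with a plain object standing in for the real oControlHost, so you can see the fallbacks fire:

```javascript
// A plain object stands in for the real oControlHost (which Cognos supplies).
function readConfig(oControlHost) {
  var o = oControlHost.configuration;
  return {
    listName: o ? o["List name"] : "List1",      // falls back to "List1"
    staticChoices: o ? o["Static choices"] : []  // falls back to []
  };
}

console.log(readConfig({ configuration: null }).listName); // "List1"
console.log(readConfig({ configuration: { "List name": "Orders" } }).listName); // "Orders"

// One quirk, shared with the report code above: if a configuration object is
// present but omits "Static choices", the value is undefined, not [].
console.log(readConfig({ configuration: {} }).staticChoices); // undefined
```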

I believe fnDoneInitializing instructs Cognos that it’s actually ready to go to the next step.

Next we can actually start building the control on the page. Notice these functions are attaching themselves to the parent. This allows us to use other variables attached to it, like this.m_sListName, across the various functions.

[sourcecode language="javascript"]
columnSelector.prototype.draw = function( oControlHost )
{
	var elm = oControlHost.container,
      list = oControlHost.page.getControlByName( this.m_sListName ).element,
      listHeaders = list.rows[0].childNodes,
      listHCount = listHeaders.length,
      selArr = JSON.parse(window.sessionStorage.getItem(this.m_sListName+'SelCols')),
      firstRun = window.sessionStorage.getItem(this.m_sListName+'SelColsFR'),
      sel = document.createElement('select');

  sel.multiple=true;
  sel.style.height="100%";
  sel.style.width="100%";
    
  for (var i = 0;i<listHCount;++i){
    var selected = listHeaders[i].style.display !== 'none',
        opt = document.createElement('option');

    opt.value = i;
    opt.text = listHeaders[i].innerText;
    
    if(firstRun==0){
      if(selArr.includes(i)) {
          opt.selected=true;
          oControlHost.page.getControlByName( this.m_sListName ).setColumnDisplay(i,true)
        } else {
          opt.selected=false;
          oControlHost.page.getControlByName( this.m_sListName ).setColumnDisplay(i,false);
        }
    }
    else{opt.selected=selected};
     
    sel.appendChild(opt);
  };

	elm.appendChild(sel);
  
  window.sessionStorage.setItem(this.m_sListName+'SelColsFR',0);
	this.elm = sel;
	this.elm.onchange = this.onChange.bind( this, oControlHost );
};
[/sourcecode]

We define the select prompt, find the selected list, and loop through the first row. If the cell style is set to display:none, then it’s hidden and the option in the select prompt should not be selected. The important thing though is the select is defined using the list as a source. This makes it easier for the developer.

The sessionStorage bit is to ensure the first run works as expected, and the page down remembers what’s selected.
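The first-run dance is easier to see in isolation. This sketch swaps the browser’s sessionStorage for a Map-backed stand-in of my own, so it runs anywhere; the keys and the flag flip mirror the code above.

```javascript
// Map-backed stand-in for the browser's sessionStorage, so this runs anywhere.
var store = new Map();
var sessionStorage = {
  getItem: function (k) { return store.has(k) ? store.get(k) : null; },
  setItem: function (k, v) { store.set(k, String(v)); }
};

function drawState(listName) {
  // Seed the keys exactly once, as initialize() does.
  if (!sessionStorage.getItem(listName + "SelCols"))
    sessionStorage.setItem(listName + "SelCols", "[]");
  if (!sessionStorage.getItem(listName + "SelColsFR"))
    sessionStorage.setItem(listName + "SelColsFR", "1");

  var firstRun = sessionStorage.getItem(listName + "SelColsFR") === "1";
  // After the first draw, flip the flag so page up/down restores saved state.
  sessionStorage.setItem(listName + "SelColsFR", "0");
  return firstRun;
}

console.log(drawState("List1")); // true  - first render, take state from the list
console.log(drawState("List1")); // false - page navigation, restore saved columns
```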

Next we have to define what happens when the select is changed.

[sourcecode language="javascript"]
columnSelector.prototype.onChange = function( oControlHost ){
	var ctrl = oControlHost.page.getControlByName( this.m_sListName ),
      selOpts = this.elm.options,
      selArr = [],
      selLen = selOpts.length;
    
  for (var i=0;i<selLen;++i){
    
    if(this.m_aStatic.includes(i)) {
      selOpts[i].selected=true;
    };
  
    if(selOpts[i].selected) selArr.push(i);
    ctrl.setColumnDisplay( i, selOpts[i].selected );
  };
  
  window.sessionStorage.setItem(this.m_sListName+'SelCols', JSON.stringify(selArr));
  
};
[/sourcecode]

And finally, let's close off the function.

[sourcecode language="javascript"]
return columnSelector;
});
[/sourcecode]

Using this in Cognos is fairly easy. First, we need to make sure it’s saved somewhere accessible. In this case I’m keeping it in \cognos\analytics\webcontent\javascript. Referencing it in the report isn’t as smooth as I’d like: the developer will actually have to enter the path to the file.
5. JS Path

Next we define the configuration object manually.
6. Configuration

The last bit here is the UI Type.
7. UI Type

For a control like this, where we’re creating an input, we’d want to use “UI without event propagation”. If we were setting up a Prompt API script, one that interacts with the page without creating an object on the page, we’d use “none”. Something that requires bubbling gets “UI with event propagation”.

And now when we run it, everything works! As an added bonus, when a user pages down to a new page, it will use the previous page’s columns. When paging up, it remembers the state of that page.

8. It works

Now, the savvy reader may have noticed some negativity I have towards the way report developers select the desired control. I need YOUR help to fix it! I have submitted an RFE to the IBM Request For Enhancement site. IBM prioritizes fixes and changes based on demand, so I need everyone to click here to vote. You will need to log in with your IBM account. Vote early, and vote often, and I’ll personally send you 10 CognosPoints for your vote!