THINK 2018 – Monday

The first day was interesting. IBM merged three separate conferences into a single mega-conference, and the difference is palpable. Unfortunately, this is not exactly a good thing.

In previous years there were a number of advanced classes. A session offering insights into advanced logging, and how IBM analyzes those files, was one of the best I’ve attended. Sadly, this year IBM seems to have eschewed the advanced stuff in favor of introductory, marketing-type classes.

What’s worse, the rooms are all tiny compared to demand. I actually missed the first two sessions of the day due to extreme overbooking. The “what’s new and coming” session apparently had 500 people register for a room with a capacity of 70. The same thing happened over and over: each Cognos-centric session had far more people waiting in line than the room could hold. Hopefully Tuesday will be better.

The sessions I did manage to attend were interesting, and the self-service tools are incredibly impressive. Some of the features I’ve asked for are being planned, but no promises were given.

In the expo I spoke to a few very interesting vendors.

1. InfluxData
An open-source database built from the ground up to handle large volumes of timestamped data. Think telecoms, with hundreds of thousands of rows being written every second. It uses an SQL-like query language and can return data in JSON format. They also have a tool that can aggregate and load the data on the fly, making it an interesting option for high-volume environments.

2. Sauce Labs
Automated testing for application development. Imagine you’re building an app for iPhone and Android. You want to make sure it works on as many phones as possible. How many do we have? A few iPhones, and half a million devices from Samsung, Huawei, HTC, and a host of others. Sauce Labs will test your app on each and every environment you want. It can compare outputs (pixel-level differences) and system performance. Maybe it works great on a OnePlus, but on the Nexus 6P there’s a memory leak. What’s going on? Sauce can help you narrow down exactly where the problem is. Also, they were giving away hot sauce. No swag could ever compete.

3. Code42
Endpoint data security. This stuff has always fascinated me. They can track a document: Excel, PDF, whatever. Who’s seen it? Where is it being mailed? Is it being pulled off-site? Security always has a special place in my heart. When other people do it, I get to try to break it. When I do it, I get to try to break it even more. But I’ve never considered what happens AFTER the data is exported. These people have, which is awesome.

And finally, a big shout out to Lisa from Motio. Motio is currently making my life significantly easier on an upgrade project, and everyone attending THINK should stop by their booth (716) to say howdy. PMsquare is, coincidentally, also at booth 716, so stop by there and tell them how awesome I am.

THINK 2018 Must-Attend Sessions

Like every year, there are so many sessions I want to see. Let’s go through my agenda, day by day.

Sunday

I’m landing Sunday, and heading straight to the conference, and hopefully I’ll make it to the IBM Cognos Analytics Jam Session. I’m not 100% certain what a “jam session” is, but it sounds sweet. You should be super jelly if you can’t attend.

Monday

08:30 – IBM Cognos Analytics 11: Discover What’s New.
There are multiple instances of this session, but I figure I might as well get it over with at the beginning.
10:30 – What’s New and What’s Next for Business Analytics.
The speakers will go over the recent and planned enhancements for Watson and Cognos. They better not change the name to IBM Analytics Cloud Watson; that would make me IBMAnalyticsCloudWatsonPaul and that’s a mouthful.
12:30 – Interactive Reporting for Business Users
With the focus on self-service reporting, I may soon find myself obsolete. But, if I can help train the end users to better build their own dashboards, I could retire early.
13:30 – Being GDPR-Ready with Cognos Analytics
The General Data Protection Regulation comes into effect on May 25 in the EU. I view this as applied data security, which is always interesting.
13:30 – Proven Architecture Styles with Analytics with Real-World Demos
Double-booked! This is a hard choice: both 13:30 sessions run only once, and they’re both very interesting. This one walks through several real-life analytics architecture patterns. Learning how to avoid the pitfalls other people have encountered is incredibly important.
14:30 – How to Get the Most from Embedding Analytics in Your Applications
I’ve had projects where we’ve embedded Cognos into existing applications. The description downplays the technical aspects, which I’m especially interested in, but it looks like it will show some practical examples.
15:30 – Certifications!
One of my goals here is to get some more certifications. I’ve been crunching the SPSS manuals, and I _WILL_ get at least one cert with it.

Tuesday
11:30 – Upgrading to IBM Cognos Analytics.
I’m working on an upgrade project right now, and I’m hoping to get some insights to make the process a little smoother.
11:30 – FleetPride and Cresco International Change the Supply Chain Game with IBM’s Cycle of Analytics
My good friend Sanjeev is going to be giving a talk on the work he did with FleetPride!
13:30 – Ready for Business: Building a Spectrum of Capabilities for Analytics at a Milwaukee Manufacturer
That’s a mouthful of a session title. This appears to be a very quick session, lasting only 20 minutes, but I know what they manufacture and I’m interested to see what they’ve done. Also, a little bit of self-promotion never hurt anyone: if they use Cognos, I live not far away!
14:30 – All Aboard! Experience Onboarding to Cognos Analytics on Cloud
The cloud is the future. Let’s be honest, maintaining your own servers is a pain in the neck, and in the wallet. I’m going to learn how to ease the pain of cloudifying (enclouderating?) Cognos environments.
15:30 – User Experience: Future Design for IBM Business Analytics—Moving from Optional to Essential
This sounds intriguing, “Soon, the way you interact with IBM Business Analytics will be tailored to your cognitive profile”. Exactly what does this entail? Will the front screen of Cognos be a dynamically generated dashboard that matches my most commonly viewed objects? I’m honestly very curious about this.
19:30 – Barenaked Ladies!
’nuff said.

Wednesday
09:30 – Front Row Seat to the Future: Ticketmaster’s Journey to Cognos Analytics on Cloud
Real examples of cloudifying Cognos. There are a lot of pros to doing this, and plenty of possible pitfalls. I’m looking forward to seeing how they did it.
10:30 – Roadmap Alert! See What IBM Business Analytics Has in Store for You
The description has, “augmented intelligence for personalized analytics” and “with features such as conversational analytics and chatbots” and “infuse traditional analytics with capabilities like machine learning”. How will this actually work with Cognos? I’m very excited to see this session.
11:30 – Meet the Designers Behind IBM Business Analytics
Very important: these are the people behind the latest UI changes to Business Analytics. I’m planning on talking, calmly and rationally, about the removal of the toolbars in CA.
11:30 – Communicating Effectively with Reporting, Dashboarding and Storytelling: Now and the Future
Another double-booking! Very painful, because I really want to see this too. They’ll be talking about how they plan on making dashboard authoring easier and visualizations more effective. Someone send me notes from this session!
12:30 – Contextual Analytics Using SPSS Text Analytics and Cognos 11 Visualizations in the Airline Industry
This is fascinating. I’m currently studying SPSS, so I’m looking forward to seeing real examples of taking SPSS output and turning it into dashboards.
15:30 – How to Flat Pack the Cognos Analytics Migration—The IKEA Way!
The description doesn’t say what IKEA was migrating _from_. An earlier Cognos build? A different platform altogether? I’m curious about the thought processes that went into the decision. There are so many cases where companies migrate to a new tool, but want everything to remain exactly the same.
16:30 – Gamifying Success: How GameStop Shifted from BI Producers to Analytics Enablers
Giving end users access to self-service tools is never an easy process. There’s always a learning curve, resistance, and many false steps. It sounds like they have a successful self-service environment, and I’m looking forward to hearing how they implemented it.

Thursday
08:30 – Upgrading to Cognos Analytics: Secrets from Nordic Customer Deployments
Another upgrade session. This looks like it’s more technically focused than the others, going into how they handled deprecated features like portal pages.
09:00 – Ready for IBM Analytics: What’s in It for Me? What’s in It for My Clients
I might miss this one, but I’d prefer not to. Gartner has shown that Cognos took a massive approval hit with the latest version, and this session might offer some answers.
11:30 – Increasing ROI in Pharmaceuticals with an IBM Cognos Analytics Upgrade
One of my previous clients was a pharmaceuticals company, so I’m always curious about how other companies structure their reports. Also, the session goes into the complexities of upgrading to CA, including CQM-to-DQM and JavaScript migration. Both are issues I’ve dealt with a great deal recently.
12:30 – Building a Self-Service Environment Command Center in IBM Cognos Analytics
A client of PMsquare! I haven’t had the opportunity to work at Baxter, but several of my colleagues have. From what I understand they built a supercharged auditing environment to track usage. It sounds awesome!

Did I miss anything important? You’ve got about 24 hours to let me know!

IBM Think! (And dimensional report design)

I haven’t been posting much recently, and you can always tell how busy I am by the frequency of my posts. Never fear, the crazy project I’m on will be ending soon and I can get back to posting semi-regularly.

Before anything else, a very important announcement.

I’ll be attending IBM Think! Do you want to meet me? Tell me how awesome I am and how my solutions have saved your life? Now’s your chance! Send a message to the PMsquare people here and we can set something up! Want to tell me that one of my solutions destroyed your Cognos setup? We can talk it over a beer or two in a well-lit public area. During the expo, I’ll either be hanging around booth #716, or wandering around restocking my supply of branded pens and office swag.

Now to the meat of the post.

Let’s talk about dimensional report design.

Building dimensional reports is one of the more complex tasks a Cognos developer can face. Relational reporting over a few tables is easy, but the skills learned there don’t necessarily transfer. While similar in appearance, dimensional queries use a very different style.

When building a query, it might be tempting to explicitly create a tuple for each item. Case in point, consider the following requirement:

- Year, prior year, and Year over Year % in columns
- 4 measures
- Org hierarchy in rows

[Image: the desired output crosstab]

It might be tempting to make each column a separate tuple.

currentYearSales: tuple([Sales], [Current Year])
priorYearSales: tuple([Sales], [Prior Year])

...and so on, with one data item for every measure and column combination.
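Carrying that pattern to its conclusion (a sketch, with assumed data item names), every Year over Year% column also becomes its own hand-built calculation:

salesYoY: (tuple([Sales], [Current Year]) - tuple([Sales], [Prior Year])) / tuple([Sales], [Prior Year])

With 4 measures across the current year, prior year, and YoY% column groups, that’s 12 separate data items to build and maintain, which is exactly what shows up in the generated MDX below.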

But let’s take a look at the underlying MDX that’s generated.

WITH
  MEMBER [Measures].[XQE_V5M_Reseller Order Quantity YoY_CM13] AS ((([Date].[Calendar].[Calendar Year].&[2013], [Measures].[Reseller Order Quantity])-([Date].[Calendar].[Calendar Year].&[2012], [Measures].[Reseller Order Quantity]))/([Date].[Calendar].[Calendar Year].&[2012], [Measures].[Reseller Order Quantity])), SOLVE_ORDER = 4, CAPTION = 'Reseller Order Quantity YoY'
  MEMBER [Measures].[XQE_V5M_Reseller Sales Amount YoY_CM14] AS ((([Date].[Calendar].[Calendar Year].&[2013], [Measures].[Reseller Sales Amount])-([Date].[Calendar].[Calendar Year].&[2012], [Measures].[Reseller Sales Amount]))/([Date].[Calendar].[Calendar Year].&[2012], [Measures].[Reseller Sales Amount])), SOLVE_ORDER = 4, CAPTION = 'Reseller Sales Amount YoY'
  MEMBER [Measures].[XQE_V5M_Internet Order Quantity YoY_CM15] AS ((([Date].[Calendar].[Calendar Year].&[2013], [Measures].[Internet Order Quantity])-([Date].[Calendar].[Calendar Year].&[2012], [Measures].[Internet Order Quantity]))/([Date].[Calendar].[Calendar Year].&[2012], [Measures].[Internet Order Quantity])), SOLVE_ORDER = 4, CAPTION = 'Internet Order Quantity YoY'
  MEMBER [Measures].[XQE_V5M_Internet Sales Amount YoY_CM16] AS ((([Date].[Calendar].[Calendar Year].&[2013], [Measures].[Internet Sales Amount])-([Date].[Calendar].[Calendar Year].&[2012], [Measures].[Internet Sales Amount]))/([Date].[Calendar].[Calendar Year].&[2012], [Measures].[Internet Sales Amount])), SOLVE_ORDER = 4, CAPTION = 'Internet Sales Amount YoY'
  MEMBER [Measures].[XQE_V5M_Reseller Order Quantity 2012_CM17] AS ([Date].[Calendar].[Calendar Year].&[2012], [Measures].[Reseller Order Quantity]), SOLVE_ORDER = 4, CAPTION = 'Reseller Order Quantity 2012'
  MEMBER [Measures].[XQE_V5M_Reseller Sales Amount 2012_CM18] AS ([Date].[Calendar].[Calendar Year].&[2012], [Measures].[Reseller Sales Amount]), SOLVE_ORDER = 4, CAPTION = 'Reseller Sales Amount 2012'
  MEMBER [Measures].[XQE_V5M_Internet Order Quantity 2012_CM19] AS ([Date].[Calendar].[Calendar Year].&[2012], [Measures].[Internet Order Quantity]), SOLVE_ORDER = 4, CAPTION = 'Internet Order Quantity 2012'
  MEMBER [Measures].[XQE_V5M_Internet Sales Amount 2012_CM20] AS ([Date].[Calendar].[Calendar Year].&[2012], [Measures].[Internet Sales Amount]), SOLVE_ORDER = 4, CAPTION = 'Internet Sales Amount 2012'
  MEMBER [Measures].[XQE_V5M_Reseller Order Quantity 2013_CM21] AS ([Date].[Calendar].[Calendar Year].&[2013], [Measures].[Reseller Order Quantity]), SOLVE_ORDER = 4, CAPTION = 'Reseller Order Quantity 2013'
  MEMBER [Measures].[XQE_V5M_Reseller Sales Amount 2013_CM22] AS ([Date].[Calendar].[Calendar Year].&[2013], [Measures].[Reseller Sales Amount]), SOLVE_ORDER = 4, CAPTION = 'Reseller Sales Amount 2013'
  MEMBER [Measures].[XQE_V5M_Internet Order Quantity 2013_CM23] AS ([Date].[Calendar].[Calendar Year].&[2013], [Measures].[Internet Order Quantity]), SOLVE_ORDER = 4, CAPTION = 'Internet Order Quantity 2013'
  MEMBER [Measures].[XQE_V5M_Internet Sales Amount 2013_CM12] AS ([Date].[Calendar].[Calendar Year].&[2013], [Measures].[Internet Sales Amount]), SOLVE_ORDER = 4, CAPTION = 'Internet Sales Amount 2013'
SELECT 
  {CROSSJOIN({[Date].[Calendar].[Calendar Year].&[2013]}, {[Measures].[XQE_V5M_Internet Sales Amount 2013_CM12], [Measures].[XQE_V5M_Internet Order Quantity 2013_CM23], [Measures].[XQE_V5M_Reseller Sales Amount 2013_CM22], [Measures].[XQE_V5M_Reseller Order Quantity 2013_CM21]}), CROSSJOIN({[Date].[Calendar].[Calendar Year].&[2012]}, {[Measures].[XQE_V5M_Internet Sales Amount 2012_CM20], [Measures].[XQE_V5M_Internet Order Quantity 2012_CM19], [Measures].[XQE_V5M_Reseller Sales Amount 2012_CM18], [Measures].[XQE_V5M_Reseller Order Quantity 2012_CM17]}), ([Date].[Calendar].DEFAULTMEMBER, [Measures].[XQE_V5M_Internet Sales Amount YoY_CM16]), ([Date].[Calendar].DEFAULTMEMBER, [Measures].[XQE_V5M_Internet Order Quantity YoY_CM15]), ([Date].[Calendar].DEFAULTMEMBER, [Measures].[XQE_V5M_Reseller Sales Amount YoY_CM14]), ([Date].[Calendar].DEFAULTMEMBER, [Measures].[XQE_V5M_Reseller Order Quantity YoY_CM13])} DIMENSION PROPERTIES PARENT_LEVEL,  PARENT_UNIQUE_NAME ON AXIS(0), 
  DESCENDANTS([Sales Territory].[Sales Territory].[All Sales Territories], 3, SELF_AND_BEFORE) DIMENSION PROPERTIES PARENT_LEVEL,  PARENT_UNIQUE_NAME ON AXIS(1)
FROM [Adventure Works]  CELL PROPERTIES CELL_ORDINAL,  FORMAT_STRING,  LANGUAGE,  VALUE

Looking at this in the profiler, I can see that it’s taking 20 ms to process, and generates 67 subcubes.

In this case it’s better to use the years as a set, and rely on the implicit grouping of a crosstab. This results in significantly fewer data items.
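Concretely (a sketch using assumed member and measure names), the twelve explicit data items collapse into two nested ones:

years: set([Current Year], [Prior Year])
measures: set([Internet Sales Amount], [Internet Order Quantity], [Reseller Sales Amount], [Reseller Order Quantity])

Put years on the crosstab columns, nest measures underneath it, and the crosstab resolves every year/measure intersection implicitly; no hand-written tuples required.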

But what about the Year over Year% columns? Manually calculating each column would certainly be easy to do, but again, it’s not necessary. We can create a calculated member in the time hierarchy that calculates it for us.
member(([Current Year] - [Prior Year]) / [Prior Year], 'YoY', 'YoY %', [Cube].[Time Dim].[Time Hier])

This might look silly to people coming from a relational background. After all, read literally, (2017 - 2016) / 2016 = 1/2016, or about 0.05%. But the calculation isn’t applied to the year numbers; it’s applied to each measure nested under the members. We can then select the member fact cells of the calculated member and format all of those cells as percentages.
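To get it onto the report (again assuming the data item names from the sketches above), the calculated member simply joins the years set as a third column:

YoY: member(([Current Year] - [Prior Year]) / [Prior Year], 'YoY', 'YoY %', [Cube].[Time Dim].[Time Hier])
years: set([Current Year], [Prior Year], [YoY])

The nested measures then evaluate under [YoY] exactly as they do under the real year members.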

Let’s take a look at the underlying MDX:

WITH
MEMBER [Date].[Calendar].[XQE_V5M_CM1] AS ((([Date].[Calendar].[Calendar Year].&[2013])-([Date].[Calendar].[Calendar Year].&[2012]))/([Date].[Calendar].[Calendar Year].&[2012])), SOLVE_ORDER = 4, CAPTION = 'YoY %'
SELECT
{CROSSJOIN({[Date].[Calendar].[Calendar Year].&[2013], [Date].[Calendar].[Calendar Year].&[2012]}, {[Measures].[Internet Sales Amount], [Measures].[Internet Order Quantity], [Measures].[Reseller Sales Amount], [Measures].[Reseller Order Quantity]}), CROSSJOIN({[Date].[Calendar].[XQE_V5M_CM1]}, {[Measures].[Internet Sales Amount], [Measures].[Internet Order Quantity], [Measures].[Reseller Sales Amount], [Measures].[Reseller Order Quantity]})} DIMENSION PROPERTIES PARENT_LEVEL, PARENT_UNIQUE_NAME ON AXIS(0),
DESCENDANTS([Sales Territory].[Sales Territory].[All Sales Territories], 3, SELF_AND_BEFORE) DIMENSION PROPERTIES PARENT_LEVEL, PARENT_UNIQUE_NAME ON AXIS(1)
FROM [Adventure Works] CELL PROPERTIES CELL_ORDINAL, FORMAT_STRING, LANGUAGE, VALUE

In this case the runtime is 12 ms with 31 subcubes generated.

That’s 40% faster. While in this example the actual difference is insignificant, in real life the runtime difference for complex queries can be profound. The report that instigated this post was taking over an hour to process, but with the changes I mentioned it dropped down to 3 minutes.

The end result here is a faster, more compact query. Much easier to maintain, much easier to change, and much easier to hand off to clients so you never have to look at it again.

Report XML below:
OLAP Report Design