Hacking datasets, everything IBM doesn’t want you to know!

Excuse the clickbait, couldn’t resist. I submitted to Analytics University but it looks like it wasn’t accepted. Maybe IBM doesn’t want you to know about this one amazing trick.

Datasets are amazing. I can't fully express my love for them. In-memory local data. Fast and, when modeled with data modules, incredibly versatile. The only downside is the development environment for datasets. It's basically a cut-down Report Studio. You can drag in model items, and maybe add a few detail filters. But what if you want to do something more complex? Say you want to build a dataset over a cube and use some advanced MDX functions. What if you want to union a few queries? Or what if, for some really crazy reason, you want to write a pure SQL statement?

Let’s take a look at the basic structure. This is on the latest release version:

It looks simple, and you can double-click on the individual items to perform simple calculations. Personally, that's not enough for me. I want full control over the dataset. How can we do this? It turns out that the dataset GUI is really Report Studio that's been cut down. Crippled. Mutilated. Let's fix this. In this example I want to add some advanced calculations to the dataset. I want to count the orders and find the total days between the close and ship dates. Then I'll make another query and join the two queries locally.

First we need to find the storeID of the dataset we want to modify. Navigate to the dataset, click the ellipsis (the three dots), and click Properties.

Finding the ID

That's the ID we need. An interesting note: sometimes the ID has an extra zero-width space character immediately after that first "i". If it does, you need to remove it before this trick will work.
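If you want to double-check the ID you copied, a quick paste into the F12 console (or any JavaScript scratchpad) will strip the invisible character for you. This is just a convenience sketch, using the ID from this example; substitute your own.

const copiedId = 'i0CDF1E1C6B4E4CAAAAA39CD0DCFCD242'; // paste the storeID you copied here
// \u200B is the zero-width space; also strip \uFEFF (zero-width no-break space) just in case
const cleanId = copiedId.replace(/[\u200B\uFEFF]/g, '');
console.log(cleanId, cleanId.length === copiedId.length ? '(no hidden characters found)' : '(hidden character removed)');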

Now let's take a look at the URL used to open reports in Report Studio:

https://SERVER/bi/?perspective=authoring&id=i8B1B499D47484395A991D046ABB75437&isViewer=false&isNewFromModule=false&isNewFromPackage=false&isNewDataSetFromModule=false&isNewDataSetFromPackage=false&isTemplate=false&isDataset=false&UIProfile=Titan&cmProperties%5Bid%5D=i8B1B499D47484395A991D046ABB75437&rsFinalRunOptions%5Bformat%5D=HTML&rsFinalRunOptions%5Ba11y%5D=false&rsFinalRunOptions%5Bbidi%5D=false&rsFinalRunOptions%5BrunInAdvancedViewer%5D=false&rsFinalRunOptions%5BDownload%5D=false&rsFinalRunOptions%5Bprompt%5D=true&rsFinalRunOptions%5BisApplication%5D=false&isPreview=false&promptParameters=%5B%5D

Lots of useless flags, but one stands out to me: isDataset. Toggle it to true and the saved file becomes a dataset! Let's cut that URL down and use the ID from our dataset:


https://SERVER/bi/?perspective=authoring&id=i0CDF1E1C6B4E4CAAAAA39CD0DCFCD242&isDataset=true&UIProfile=Titan&cmProperties%5Bid%5D=i0CDF1E1C6B4E4CAAAAA39CD0DCFCD242
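If you'd rather not hand-edit the address bar every time, a throwaway console snippet can assemble the trimmed URL for you. A minimal sketch, using the same parameters shown above; SERVER and the ID are the placeholders from this example.

const server = 'SERVER'; // your Cognos gateway host
const datasetId = 'i0CDF1E1C6B4E4CAAAAA39CD0DCFCD242'; // the cleaned storeID of the dataset
const params = new URLSearchParams({
  perspective: 'authoring',      // open the full authoring (Report Studio) perspective
  id: datasetId,
  isDataset: 'true',             // the flag that keeps the saved object a dataset
  UIProfile: 'Titan',
  'cmProperties[id]': datasetId,
});
console.log(`https://${server}/bi/?${params.toString()}`);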

That's a much more reasonable URL, and look what happens when we open it!

It's beautiful! We have the page explorer, we have the query explorer. It's all there! Let's make the changes we want. I go to the query and add a new data item, close to ship, with the aggregation set to total:

_days_between([Sales (query)].[Time (close date)].[Date (close date)],[Sales (query)].[Time (ship date)].[Date (ship date)])

Then I drag order number in and set its aggregation to count distinct; we can perform the calculation in the data module later. Finally I add a new data item called Previous Year Month, defined as [Month key] - 100. This will allow me to build an identical query and pull the same measures for last year.


Finally, I can go back to the report page, set the query to that joined query, and add in the new data items.

But there's a problem! Take a close look at that screenshot a few paragraphs above. The data items tab is missing from the insertable objects pane! Oh the humanity. Fortunately it's nothing a little skullduggery can't fix. Press F12 to open the dev tools. Use the element selector to select the space next to "Source" and take a look at the HTML. The data items tab is actually there, but it's set to display:none! Change that to display:block, or remove the style, and it will come back.
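If digging through the markup by hand gets tedious, the same fix can be scripted from the console. A minimal sketch, assuming the hidden tab's text contains "Data items"; check the actual label with the element picker if it doesn't find anything.

// Run in the F12 console on the authoring page: find elements hidden with an inline
// display:none whose text mentions "Data items", and show them again.
Array.from(document.querySelectorAll('[style*="display"]')).forEach((el) => {
  if (el instanceof HTMLElement
      && el.style.display === 'none'
      && /data items/i.test(el.textContent || '')) {
    el.style.display = 'block'; // same effect as editing the style attribute by hand
  }
});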

Important note: before you attempt to add the items, make sure to turn off "Automatic group and summary behavior for lists". With that option on, you'll get a weird and unhelpful error message when attempting to add the new data items.

Let's add the items and save. Then go back to the welcome page and try to run it.

It works! And now let’s see if the new things we added work in dashboards.

And the dashboard works exactly as I would expect. In the next version of Cognos there is going to be an easier way to handle relative time periods, but there are plenty of other use cases for doing local joins in datasets.

THINK 2018 Thursday – and a quick recap

It’s been a hectic week for me, so my apologies for not posting this earlier.

Thursday was the last day of the conference, and it had the absolute best session of the week. While the sessions up to that point had been mostly introductory or soft, Thursday finally delivered a hard technical session.

Sadly I missed most of the morning sessions, so I only have two to report on.

Merck Pharmaceuticals upgraded to Cognos 11, moving from Series 7 authentication to Active Directory. While this would normally be a nightmare-and-a-half, Merck brought in the IBM AVP group to help handle the migration. Apparently they have tools that ease the process and map from one auth source to the other. Sadly, the cost of these tools is bundled with hiring AVP to come and help; it's not something I can download and play with.

Baxter wins the best technical session of the week. Alex used the Cognos audit model, modified it to his needs, and then built a dataset on top of it. In the presentation, which I'm helpfully providing a link to, he added a number of calculated fields to better group the data. He also did a count distinct on the request ID; as the audit package stands, it counts the timestamp of the run, which changes for each prompt selection. Once that was done, he wrapped the entire model in a dataset, giving tremendous improvements in speed and usability. The presentation can be found here: https://1.dam.s81c.com/m/374561c83aa32d1c/original/Think_2018_Session7034_Pataky_20180322v2-pdf.pdf

Ultimately the conference was somewhat disappointing this year. The technical sessions were few and far between, and the few Cognos sessions there were tended to emphasize the same, albeit good, lesson: "When upgrading Cognos, plan twice, implement once". Due to the extreme number of attendees, coupled with the dearth of BA sessions, rooms were overcrowded and occasionally impossible to enter. Several people I talked to had to choose between eating lunch and staying in the room to make sure they could attend the next session.

The roadmap was good, as was the session with the designers. Wonderful news on that front: we'll be able to "pin" the popover toolbar to the top. Basically, we're getting the toolbar back! I'm hoping to get early access, in some way or another, so I can write about the awesome new features.

Think 2018 – Wednesday

Wow, today was busy. The crowds are much smaller; the rooms still fill up all the way, but everyone can actually attend.

I started the day with "Front Row Seat to the Future: Ticketmaster's Journey to Cognos Analytics on Cloud". It's interesting that many of the Cognos user success sessions involve moving to the cloud. It certainly seems to have worked for them. Part of the process involved a good amount of planning: auditing users, reports, and usage, and migrating only the things that were actually used. By embracing the new self-service capabilities, Ticketmaster managed to migrate successfully and improve user adoption of Cognos.

Next was the roadmap. This was the session I'd been waiting for. It was set up as a game-show type thing, with Rachel Su, Kevin McFaul, and Jason Tavoulis taking turns presenting changes in their products (Report Studio, Dashboards, and Exploration) to a panel of judges. BI execs from WestJet, GameStop, and QuadReal all took part and gave hearts, thumbs up, or indifferent emojis in response to the new features.

Rachel demoed the changes to Report Studio. Dragging in new objects essentially creates a table on the fly, making positioning much easier. Formatting is much easier too, including multi-select formatting! Navigating between pages is also simpler, with a menu at the top letting you switch without having to go to the insertable objects pane.

Kevin showed off dashboards: copying from existing reports, a nice grid background, the ability to see the underlying data, and better formatting. He got a hug from WestJet over the dashboard formatting. I think users could very quickly copy/paste from existing reports into dashboards, though I'm curious how this would work with really complex reports. There are also custom polygons on Mapbox maps, you can lasso points in a map to filter other objects, you can lasso in other visualizations too, and there's a new visualization, the Watson spiral.

Jason showed off something new. From any visualization, you can now expand it and EXPLORE. Remember the old Watson Analytics stuff? Watson, now in Cognos, will suggest related visualizations and use natural language to talk about points in the visualization, including things that may not be readily apparent. Any insight it makes can be clicked on and explored further. You can even see the decision tree used to make that insight. Another interesting thing is that Watson will accept natural language questions. In the demo, Jason asked "show me the hourly capacity rates by day" and Cognos/Watson returned a graph. In every case, looking at a new graph saves the previous graph in a list on the left. And, again, you can copy a graph from there and paste it into a dashboard.

Kevin did talk about the future, including some (incredibly light on details) hints about user data modelling and a new tool to extend and embed dashboards in applications.

I also spoke with the design team: not the developers building Cognos, but the people making it look nice and easy to use. I had an interesting conversation with one of them about some of the issues I've encountered with Report Studio. She'll get back to me on some of it, but she did say that in some future release the super annoying popover menu will be pinnable to the top of the screen. Almost like a toolbar.

I wasn't able to make some of the afternoon sessions due to time constraints, but I did get to the IKEA and GameStop sessions: "How to Flat Pack the Cognos Analytics Migration—The IKEA Way!", on IKEA's upgrade to Cognos Analytics, and "Gamifying Success: How GameStop Shifted from BI Producers to Analytics Enablers". In both cases I was super impressed. Ultimately, the lesson from both sessions was clear: make a plan, set dates, and clean up the environment before making huge changes.

The GameStop session had a really great session doc here: https://1.dam.s81c.com/m/23abe93d624d0749/original/GameStop_IBM_Think_2018-pdf.pdf