#ibmiod Wednesday

Dynamic Cubes was the word of the day for me. It’s an exciting new feature of Cognos that lets you build cubes on top of relational systems. By using these cubes, you can get tremendous improvements in your report runtimes. The method of building them seems fast and easy, and while I have a few minor misgivings about some of the design decisions, I am really looking forward to getting an opportunity to play with them.

At the most basic level, Dynamic Cubes are an extension of the DQM engine. As I’m sure everyone in the world is aware, DQM is the 64-bit query engine released with Cognos 10.1. Dynamic Cubes sit on top of relational data sources and work by caching the contents of your data warehouse in memory.

First the cube needs to be modeled. I didn’t get a chance to see the cube modeler, but the developers say it looks and feels like a cleaned-up Transformer. Attributes can be defined (something sadly lacking in Transformer), and dynamic dates are easy to build. It is important to note that the new modeling tool is far less forgiving of nonsense than Transformer: member unique names (MUNs) must be unique, or it will throw an error. The data warehouse must also be set up as a star schema or a snowflake for the cube to work.
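To make that uniqueness requirement concrete, here is a minimal sketch in Python (my own illustration, not the modeling tool) of how member unique names might be built from the levels of a dimension table and checked for duplicates. The table, levels, and values are all hypothetical.

```python
from collections import Counter

# Hypothetical rows from a product dimension table in a star schema;
# each row carries a value for every level of the hierarchy.
dim_product = [
    {"line": "Camping", "type": "Tents", "product": "Star Dome"},
    {"line": "Camping", "type": "Tents", "product": "Star Gazer"},
    {"line": "Camping", "type": "Packs", "product": "Star Dome"},  # same leaf name, different path: still unique
]

def member_unique_name(row, levels=("line", "type", "product")):
    """Build a MUN-style path from the hierarchy levels of one row."""
    return "->".join(f"[{row[level]}]" for level in levels)

muns = [member_unique_name(row) for row in dim_product]
duplicates = [mun for mun, count in Counter(muns).items() if count > 1]

if duplicates:
    # This is the situation the new modeler refuses to accept:
    # two members resolving to the same unique name.
    raise ValueError(f"Member unique names are not unique: {duplicates}")
```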

Once the cube has been built and published, it needs to be started. On startup it begins building the various caches. The member cache holds all of the members from the dimension tables. Aggregate caches are populated as the cube runs; these can be contextually aware aggregates, so the cube will benefit if you have aggregate tables. The data cache is populated as reports are run: if a report has been run before, its result is already in the cache and it renders almost instantly. There is also an expression cache, so any matching expression is resolved instantly.
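My rough mental model of how those layers fit together, sketched in Python. The class and method names are mine, not Cognos APIs, and the real engine is obviously far more sophisticated; this just shows the lookup order described above.

```python
class DynamicCubeSketch:
    """Toy illustration of the cache layering (not Cognos code)."""

    def __init__(self, run_sql):
        self.run_sql = run_sql        # callable that queries the relational source
        self.member_cache = {}        # all dimension members, loaded at startup
        self.aggregate_cache = {}     # pre-summarized rollups, filled as the cube runs
        self.data_cache = {}          # results of previously run reports
        self.expression_cache = {}    # previously evaluated expressions

    def run_report(self, query: str):
        # 1. A report that has run before is answered straight from the data cache.
        if query in self.data_cache:
            return self.data_cache[query]
        # 2. Otherwise, see whether an in-memory aggregate can answer it...
        if query in self.aggregate_cache:
            result = self.aggregate_cache[query]
        else:
            # 3. ...and only fall back to the data warehouse as a last resort.
            result = self.run_sql(query)
        self.data_cache[query] = result
        return result


# Usage: the first run hits the (slow) warehouse, the second is served from cache.
cube = DynamicCubeSketch(run_sql=lambda q: f"rows for {q!r}")
cube.run_report("revenue by region 2011")   # goes to the warehouse
cube.run_report("revenue by region 2011")   # instant, from the data cache
```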

As reports are run, it will be possible to determine whether the aggregates being used are optimal. The Dynamic Query Analyzer has a new option for the cubes: you can have it analyze the run history for specific reports or users, and it will tune the cube's cached aggregates to ensure the best performance.
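As I understand it, the idea is essentially workload-driven aggregate advice. Here is a hedged sketch of that idea in Python; the log format, threshold, and function are invented for illustration and are not the Dynamic Query Analyzer's actual algorithm.

```python
from collections import Counter

# Hypothetical run history: which grouping levels each report actually used.
run_history = [
    ("Year", "Product Line"),
    ("Year", "Product Line"),
    ("Year", "Region"),
    ("Month", "Product Line"),
    ("Year", "Product Line"),
]

def recommend_aggregates(history, min_hits=2):
    """Suggest aggregates for the grouping combinations that appear most often."""
    counts = Counter(history)
    return [levels for levels, hits in counts.most_common() if hits >= min_hits]

print(recommend_aggregates(run_history))
# [('Year', 'Product Line')] -- a rollup worth keeping in the aggregate cache
```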

The developers responsible for this innovation referenced a case study in which a report that took roughly a day to run went down to 3 seconds. From what I understand, the data warehouse stayed the same; they just needed to add some more RAM.

Dynamic Cubes take a fair amount of RAM. IBM will release a white paper discussing sizing and RAM requirements.

It seems that real life is creeping its way back in. Today was the final day of the Expo, and unfortunately I didn’t have a chance to meet with all of the people manning the booths. Over the next week I’ll try to go over the remaining documentation that I pulled from the various booths.