#IBMIOD Monday

The last few days have been completely exhausting, and there are still three more days of the conference. As part of my deal with PerformanceG2, I’ll be posting more details of the conference on their blog. They should be up soon here.

Like last year, I’ve had a lot of fun going to the various booths at the expo and hearing what people have to offer. Pens and other toys are just a bonus.

An extremely interesting technology is coming from Servergy. They make cool-running, low-power servers. Their big selling point is the savings on electricity, from powering the servers themselves to the air conditioning you wouldn’t need. Their demo unit, running at 100% CPU, drew 0.56 amps. I am by no means a hardware guy, but this all seemed very impressive, and I suspect a few of my clients may be interested in learning more.

An Ernst and Young rep chatted with me about the importance of Forensic Data Analytics – fraud identification and prevention. http://www.ey.com/

Fusion-io is offering a flash adapter for System X. From what I understand, it will cache a database in memory, providing significant performance improvements. http://www.fusionio.com/

Esri was giving out their latest map book, and I was lucky enough to snag one of the last ones they had. GIS has always fascinated me, and the book shows their maps are as much art as information.

I visited a few IBM booths. IBM consistently blows me away with the new tools they’re developing: end users can now build complex statistical models with only a few clicks, there is software that will send alerts when it detects imminent infrastructure failure, and natural language recognition will automatically search through data sets to try to answer your questions.

There were a few other technology vendors I spoke with, but unfortunately those guys didn’t have any documentation handy! Guys, with all the flashing lights of the Expo, make sure I have some papers to remind me what you’re all doing! (I did manage to score a hat from PerformanceG2’s arch nemesis, but I won’t mention their name since I don’t want to upset my wonderful hosts).

Did I miss anyone? Are there any groups I should make a special effort to visit? The expo is open until Thursday, so drop a comment and I’ll make a special visit.

Information On Demand 2013!

Well, it’s once again time to start planning for the IOD. It was worth every penny last year, and I am certain it will only be better this year. The IOD is a great place to connect with other IBM business partners and customers. Where else can you talk with people from Boeing and Costco while eating breakfast? One of my most memorable moments was waiting for the flight back, talking to a woman from the Miami-Dade Police about the way they use Cognos and BI to catch criminals.

The sessions are broken out by tracks of interest. Each track can have multiple sessions at the same time, and to my despair, it’s impossible to attend more than one simultaneously. In addition to the sessions, there are hands-on workshops about what you can do with the latest technology. The people manning the workshops are knowledgeable and will very cheerfully talk about how their tools work.

The early bird special ends June 28 for customers (September 13 for business partners), so I recommend registering earlier rather than later. You may want to contact your Cognos business partner of choice to find out if they have a promo code this year. So far I am aware of only a few companies offering early bird registration specials. The following list is informational and in alphabetical order; I am not endorsing any one company over another, and more will be added as more companies come out with deals. Feel free to leave a comment if you know of any others that should be mentioned.

  1. BSP – G13BSPSOFT
  2. Ironside – G13IRONSDE
  3. Motio – G13MOTIO
  4. PerformanceG2 – G13PERFG2

I’m holding off registering until IBM lets me know if my session proposal has been accepted (I hear they received over 2,500 session proposals this year, so chances are slim).

Obviously, I am planning on going this year, and I have an offer. If any company wants to sponsor my trip (partially or fully) I will very happily give their products a completely biased review and will randomly tell people how awesome my sponsors’ products are. (Or I could do work for them, or something. I’m open to suggestions.)

And remember – PerformanceG2 for all your training needs!

#ibmiod Wednesday

Dynamic Cubes was the word of the day for me. It’s an exciting new feature of Cognos that lets you build cubes on top of relational systems. By using these cubes, you can get tremendous improvements in your report runtimes. The method of building them seems fast and easy, and while I have a few minor misgivings about some of the design decisions, I am really looking forward to getting an opportunity to play with them.

At the most basic level, Dynamic Cubes are an extension of the DQM engine. As I’m sure everyone in the world is aware, DQM is the 64-bit query engine released with Cognos 10.1. Dynamic Cubes are based on relational data sources and work by caching the contents of your data warehouse.

First the cube needs to be modeled. I didn’t get a chance to see the cube modeler, but the developers say it looks and feels very similar to a cleaned-up Transformer. Attributes can be defined (a feature sadly lacking in Transformer), and dynamic dates are easy to build. It is important to note that the new modeling tool is far less forgiving of nonsense than Transformer: MUNs must be unique, or it will throw an error. The data warehouse must be set up as a star schema or a snowflake for the cube to work.

Once the cube has been built and published, it needs to be started. After it starts up, it will begin building the various caches. The member cache consists of all of the members in the dimension tables. Aggregate caches are populated as the cube runs; these can be contextually aware aggregates, so the cube will benefit if you have aggregate tables. The data cache is populated as reports are run: if a report has been run before, its results will already be in the cache and will be rendered instantly. There is also an expression cache, so any matching expression will be resolved instantly.
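To make the caching behavior concrete, here is a minimal sketch (in Python, purely illustrative and not actual Cognos code) of the layered lookup idea described above: a query is answered from the data cache when a previous run already populated it, and only falls back to the slow warehouse on a miss. The `LayeredCache` class and `slow_warehouse` function are hypothetical names for demonstration.

```python
# Illustrative sketch of a report-result cache in the spirit of Dynamic Cubes.
# A query first checks the data cache (results of previously run reports) and
# only falls back to the slow warehouse path on a miss, caching the result.

class LayeredCache:
    def __init__(self, warehouse_query):
        self.warehouse_query = warehouse_query  # slow path: hits the database
        self.data_cache = {}                    # results of previously run queries
        self.hits = {"data": 0, "warehouse": 0}

    def run(self, query):
        if query in self.data_cache:            # cache hit: rendered instantly
            self.hits["data"] += 1
            return self.data_cache[query]
        self.hits["warehouse"] += 1             # cache miss: query the warehouse
        result = self.warehouse_query(query)
        self.data_cache[query] = result         # cache populates as reports run
        return result

# Hypothetical stand-in for the data warehouse.
def slow_warehouse(query):
    return f"rows for {query}"

cube = LayeredCache(slow_warehouse)
cube.run("sales by region")   # first run: goes to the warehouse
cube.run("sales by region")   # second run: served from the data cache
```

The real engine layers several such caches (member, aggregate, data, expression), but the principle is the same: repeated work is answered from memory instead of the database.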

As reports are run, it is possible to determine whether the aggregates used are optimal. The Dynamic Query Analyzer has a new option for the cubes: you can have it check the run history for specific reports or users, and it will optimize the aggregate caches to ensure the best performance.

The developers responsible for this innovation referenced a case study in which a report that took roughly a day to run went down to 3 seconds. From what I understand, the data warehouse was the same, but they needed to add more RAM.

The dynamic cubes take a fair amount of RAM. IBM will release a white paper discussing the various sizing and RAM requirements.

It seems that real life is creeping its way back in. Today was the final day of the Expo, and unfortunately I didn’t have a chance to meet with all of the people manning the booths. Over the next week I’ll try to go over the remaining documentation that I pulled from various booths.