Consulting musings #1, Don’t Panic

Firstly, a very happy new year to all. All the best for your 2012, wherever you are!

I have been working as a consultant for around 9 years, and I have experienced a fair amount in that time. From my first client assignment (where I was almost too nervous to get out of my car in their parking lot!), right up to now where I have the pleasure of leading a consulting team… It’s interesting that the memories that stand out are those of the people I have been able to work with over the years. Some that were fantastic, and some that were less than fantastic (!)… I have been able to learn heaps from them all.

So, I wanted to begin this new year with a short series of lessons I’ve learnt from others, and I hope they will be a little amusing and maybe even helpful too…

So, let me begin with my consulting musing #1, whatever you do, “Don’t Panic”!
This really summarises my shuffles and changes through various roles. Every time I would get comfortable in a certain scenario, like actually being able to get out of my car at client sites, another challenge would appear that always seemed slightly beyond my current skills.
Like the day I was asked to visit a client to assist with a particular MDX issue they were experiencing. Truthfully, at the time I could only spell MDX, never mind consult on it… But I had at least mastered the art of getting out of the car, so I bravely stepped into their offices and was led directly into a small room with about 10 people, and one chair and computer (which was for me)… Talk about the hot seat!!
An hour later, I had solved their issue, they were grateful, and I was relieved. Two key lessons from this one:
1. Listen. Yes, the basics, just listen. I sat for half an hour and simply let them explain their issue, asked leading questions to clarify, and kept listening.
2. Remember that no man is an island. Once I understood the issue, I jumped on the phone to a senior colleague, distilled the issue for him, and was then able to convey his answer to the client (which fortunately worked!). This one gets tough, as it occasionally forces us to swallow our pride and make use of the network around us. However, everyone wins in these situations: I learnt, my senior colleague got to apply his own troubleshooting skills, and above all, the client got a good result.

So, relax, don’t panic, and have a little fun while you’re about it 🙂


DW != BI?

For the past few years, ITWeb in South Africa has run a number of annual Business Intelligence (BI) conferences. In addition, they have run separate conferences for Data Warehousing (DW).

Now, personally I have never really understood this approach of separating the concepts, as it implies that possibly BI != DW.

Based on my experience delivering BI projects, I have always found that a good BI project absolutely requires a good DW component. Despite Analysis Services’ ability to conform data through the virtual world of the UDM (and I include PowerPivot and other 3rd-party BI front-end tools with similar claims in this discussion), pragmatism and the usual dodgy data quality issues have always dictated that best-practice ETL and DW techniques are needed before we expose data to end users.

In fact, I have always found the relationship to be so tight, that my internal framing of the world has dictated that BI == DW.

As with any good internal assumption or belief, it is currently being challenged. I’m involved in assisting an international company in developing a group-wide reporting data warehouse. Now, the programme is so large that our current scope of work is “simply” to consolidate the data from the inevitable wide variety of sources into a central data model. This stage of the project has dictated a very different approach from what I am used to: I am no longer directly involved in data delivery, so my usual toolkit of Excel, Strategy Companion, PerformancePoint and a bit of good old business strategy conversation is simply not in the picture.

So, maybe there is space in the world for the idea that DW != BI. (i.e. maybe ITWeb has a point after all)

So, maybe if we flip my formulas into logical implication statements, we could say:

BI -> DW

And although my current project does not necessarily dictate that DW -> BI, I would argue that for the long-term success and acceptance of the data we’re consolidating, BI projects will need to be initiated to allow end users to dive into the data asset we’re creating.

Looking forward to being involved in that 🙂


Master Data Services – configuration glitch

A fairly high percentage of any developer’s life is spent troubleshooting. And yes, I found myself back in that position with a recent install of Master Data Services.

Symptom

When trying to load the MDS IIS site through the browser, the default.aspx page would simply not render: the loading wheel would keep spinning in the tab without ever returning any content. Occasionally I would be fortunate enough to get a ‘page could not be loaded’ timeout response, but most of the time the wheel would literally keep spinning.

Factors

This install was distributed, i.e. the SQL database was on a separate server to IIS. Also, SQL Server was configured on a non-standard port for security.
The install of MDS ran successfully.
The config tool also ran successfully, creating both the site and the database.
But still, no page would load.

Resolution

Turns out that the config tool drops the SQL port specified when storing the connection string to the web.config XML file. Update this string in the web.config file by hand, and suddenly the page starts loading as expected.
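To make the fix concrete, here is a rough sketch of the kind of hand-patching involved. The function, connection string and port below are all illustrative, not the exact MDS web.config schema; the point is simply that the Data Source needs the "host,port" form when SQL Server listens on a non-standard port.

```python
import re

def patch_connection_string(conn_str: str, port: int) -> str:
    """Append the SQL port to the Data Source/Server token if it is missing."""
    def add_port(match):
        key, host = match.group(1), match.group(2)
        # Leave it alone if a port is already present (host,port form)
        if "," in host:
            return match.group(0)
        return f"{key}={host},{port}"
    return re.sub(r"(Data Source|Server)=([^;]+)", add_port, conn_str)

# Hypothetical server name and port, as the config tool might have written it
fixed = patch_connection_string(
    "Data Source=SQLHOST;Initial Catalog=MDS;Integrated Security=True", 2433
)
print(fixed)  # Data Source now reads SQLHOST,2433
```

In practice I just edited the string in the web.config directly, but the same "host,port" rule applies.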


PowerPivot Gallery – Snapshot Creation Error – IE9 culprit!

When uploading a PowerPivot model into an existing SharePoint 2010 PowerPivot Gallery, I received the infamous “an error occurred while capturing snapshots” error, where the hourglass changes into a disturbing red cross.

Quite odd seeing as I had used this exact library in the past and successfully created snapshots.

Now, there were errors relating to security and the underlying services that were quite prevalent in the CTP builds of SharePoint. So I worked through the checkpoints in Dave Wickert’s blog, http://bit.ly/eQGCWD, and my server passed all of them, i.e. everything was set up correctly. I did check the ULS logs, and only found the generic Timeout Expired error.

So, in a move of desperation, I uninstalled Internet Explorer 9 Beta.

Rebooted.

And the snapshots started working again!

Another reason not to experiment with IE9 just yet…


New Adventure Beginning in 2011

I am excited to announce that I have accepted a position with Avanade Australia, as Group Manager for Business Intelligence, as of January 2011. That means I currently have 46 nights left under African skies before my family and I jet off to the great Down Under.

I will be based in Sydney, and will be looking after customers in Sydney, Melbourne and Brisbane.

I’m really looking forward to the opportunity to see more of our beautiful planet, meet new people, and have a hand in growing the Microsoft BI business in a new part of the world. (and to enjoy the New Year’s fireworks over the harbour bridge!!)


PowerPivot DAX + Financial Time?

So, I had a fantastic time at TechEd Africa last week. I had the privilege of presenting two sessions, one of them being a session on DAX, the expression language for PowerPivot. In that session, I made the following statement:

“DAX time calculation functions; such as YTD, QTD, MTD; only understand Calendar time, with no support for custom Financial Time hierarchies”

In this blog post, I’d like to correct myself. Because, I wasn’t entirely correct 🙂

———————————————————————————————————————————

In fact, DAX does provide some support for custom Financial Time Hierarchies; as it allows for the last day of the financial year to be specified in a number of the time calculation functions.

For example, let’s look at the typical Year-to-Date calculation for Sales Amount in Contoso.

The DAX version of this would look something like:

=TOTALYTD(SUM(FactSales[SalesAmount]), DimDate[DateKey])

However, TOTALYTD also provides support for two optional parameters, namely Filter and Year_End_Date.

So, if we wanted to calculate the YTD for a financial year which is April – March, the DAX would look something like:

=TOTALYTD(SUM(FactSales[SalesAmount]), DimDate[DateKey], ALL(FactSales), "31 Mar")

(So the Filter basically does nothing, and the Year_End_Date is simply the day and month – yes, I specify the month name as text to avoid regional setting issues! Also, including the year in the parameter is not required and wouldn’t really make sense, as we want the expression to evaluate for any year.)

The result of these statements looks as follows:

DAX Financial Time Calcs in action

You’ll notice that the Financial YTD resets itself in April as required.
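For anyone who wants to see the mechanics outside of DAX, here is a minimal sketch of what TOTALYTD with a "31 Mar" year-end is doing: a running total that resets at the start of each financial year (April–March). The figures are made up for illustration, not the Contoso numbers.

```python
def financial_ytd(monthly_sales):
    """monthly_sales: list of (month_number, amount) in date order.
    Returns the running YTD, resetting when the month rolls into April."""
    ytd, results = 0, []
    for month, amount in monthly_sales:
        if month == 4:          # April: new financial year, reset the total
            ytd = 0
        ytd += amount
        results.append((month, ytd))
    return results

sales = [(2, 100), (3, 150), (4, 120), (5, 80)]
print(financial_ytd(sales))  # the YTD drops back to 120 in April
```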

———————————————————————————————————————————

So, yes, there is support for Financial time.

However, I wasn’t entirely incorrect in my statement earlier, as this logic doesn’t strictly go down to the day level. So, retail environments which have custom trading months, such as 26th – 25th, are not currently supported by the time calculation functions.
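To illustrate what the built-in functions can’t do, here is a sketch of how a custom 26th–25th trading month would have to be derived by hand: each date is assigned a (year, month) trading-period key, with anything from the 26th onwards rolling into the following month’s period. The cut-off day is just an example from the retail scenario above.

```python
from datetime import date

def trading_period(d: date) -> tuple:
    """Map a calendar date to its (year, month) trading period,
    where each period runs from the 26th to the 25th."""
    if d.day >= 26:
        # 26 December onwards already belongs to January of the next year
        if d.month == 12:
            return (d.year + 1, 1)
        return (d.year, d.month + 1)
    return (d.year, d.month)

print(trading_period(date(2010, 3, 25)))  # (2010, 3): last day of the March period
print(trading_period(date(2010, 3, 26)))  # (2010, 4): first day of the April period
```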


PowerPivot DAX Session Notes 2 – Previous Row

Those that are familiar with PowerPivot and DAX, will be aware that DAX provides the ability to reference columns.
However, it does not provide support for Cell based references (such as A1, E23).
 
This is normally no big deal, until I hit a small design challenge this week… I had the requirement to refer to a previous row of data within the PowerPivot model.
And yes, it is possible, using the EARLIER DAX function.
 
Let me talk through the data quickly… I received some stock futures trading data, where each trade had two rows. One row would be the opening position at a point in time, and when the future actually closed, a second row would come through with the closing position. The profit made would be the difference between the opening position and the closing position.
 
Something like the following table…
 
 
The basic process to solve this problem was to do the following:
1. For every row, determine the previous Date that applies to that particular Client Code.
2. Once that Date is established, determine the Value at that point in time.
3. Profit is then simply Current Value – Previous Value for CLOSE rows.
 
1. Using the following DAX statement, I could calculate the [DatePrevious] column

=CALCULATE(MAX(Table1[Date]), (FILTER(Table1, EARLIER(Table1[Code])=Table1[Code] && EARLIER(Table1[Date])>Table1[Date])))

Note: I’m still coming to grips with the EARLIER function, seems to work semantically the same as an INNER join… will hopefully post a better explanation when I understand some more…

2. Using similar thinking, I could then join back to [Table1] and return the Value at the point in time established in the previous step.

(I would love any feedback if it’s possible to merge this with the previous step, may be more efficient)

=CALCULATE(SUM(Table1[Value]), (FILTER(Table1, EARLIER(Table1[Code])=Table1[Code] && EARLIER(Table1[DatePrevious])=Table1[Date])))

3. Finally, Profit is then just the difference between the current value and the previous value for the CLOSE rows.

=IF(Table1[Position]="CLOSE", Table1[Value]-Table1[ValuePrevious], BLANK())

Results in the table below.

 
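For anyone who thinks more easily in procedural terms, the three DAX steps above can be sketched in plain Python. The sample rows are invented for illustration; the logic mirrors the EARLIER-based filters: find the latest earlier date for the same client code, pull its value, then compute profit on CLOSE rows only.

```python
rows = [
    {"Code": "A", "Date": 1, "Position": "OPEN",  "Value": 100},
    {"Code": "B", "Date": 2, "Position": "OPEN",  "Value": 50},
    {"Code": "A", "Date": 3, "Position": "CLOSE", "Value": 130},
    {"Code": "B", "Date": 4, "Position": "CLOSE", "Value": 45},
]

for row in rows:
    # Step 1: latest earlier date for the same code (the EARLIER filter)
    earlier = [r for r in rows if r["Code"] == row["Code"] and r["Date"] < row["Date"]]
    row["DatePrevious"] = max((r["Date"] for r in earlier), default=None)
    # Step 2: the value at that previous date
    row["ValuePrevious"] = next(
        (r["Value"] for r in earlier if r["Date"] == row["DatePrevious"]), None
    )
    # Step 3: profit only on CLOSE rows
    row["Profit"] = (
        row["Value"] - row["ValuePrevious"] if row["Position"] == "CLOSE" else None
    )

print([(r["Code"], r["Profit"]) for r in rows])  # A closes +30, B closes -5
```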


PowerPivot DAX Session Notes 1 – Dates

Part of my preparation for TechEd Africa this year has required me to really get into PowerPivot and DAX, and I thought I’d share some of the lessons learnt along the way…
 
Dates
 
So, in standard Analysis Services models, the concept of a Date dimension is quite accepted… and the world of PowerPivot is not much different.
Part of creating this type of Dimension / Lookup table involves adding attributes like ‘Month’, ‘Year’ and ‘Day of Week’ to a given date. Given the date expressions in Excel, it is quite straightforward to add these attributes using standard Excel functions.
 
One function that I didn’t find out of the box, though, was MonthName(<date>).
So, approaches to getting this right are:
1. Using a massive switch / case statement; yuck 🙂
2. Using a lookup table of sorts to map MonthNumber to MonthName; less yuck.
3. Use Excel date formatting features to achieve the same result… (and yes, I only learnt about this this morning, always more to learn)
 – =TEXT([@Date],"mmmm")
 
Another common hindrance is the sort order of months.
By default sorting is alphabetical, meaning April suddenly appears before February. (and oddly enough this does confuse end users!)
 
 
Again, the more SQL-based approaches would involve injecting some kind of number into the month name and sorting accordingly. (But 02-February is also not so user-friendly on reports!)
Fortunately, standard Excel sorting functions understand month names. So, if you sort the months A-Z, Excel will sort them correctly.
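The same trap and the same fix can be sketched in a few lines of Python: a plain alphabetical sort puts April before February, so you sort with the calendar position as the key instead (the equivalent of the lookup-table approach in option 2).

```python
import calendar

months = ["February", "April", "January", "March"]

# The naive sort is alphabetical, which confuses end users
print(sorted(months))  # ['April', 'February', 'January', 'March']

# Build a month-name -> month-number lookup and sort on that instead
month_order = {name: i for i, name in enumerate(calendar.month_name) if name}
print(sorted(months, key=month_order.get))  # ['January', 'February', 'March', 'April']
```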
 
 


Excel 2010 – Save & Send to SharePoint Issue

So, I have just finished installing (yet another) BI 2010 Virtual Machine, on Windows Server 2008 R2 RTM.
 
On this one, I was unable to use Excel’s “Save & Send” feature to publish a workbook to SharePoint.
When I tried to use the “Browse for a Location” dialog, I would type in the SharePoint URL and get a message along the lines of “this application can’t open this location”.
 
Quite odd.
 
Turns out that you need the Desktop Experience feature turned on in Windows Server 2008 R2 for this to work.
Thanks Alan Whitehouse! http://bit.ly/bk1VKW


PowerPivot – Compression Stat

Today I found myself needing to load a .csv file into PowerPivot.
This .csv file has around 11 million rows, 45 columns, is fully denormalised, and is around 3.4 GB.
 
So, to test and validate Microsoft’s claims that the Vertipaq engine does a decent job at compression, I loaded the .csv file as-is directly into a PowerPivot model.
 
And yes, Microsoft were right. The same data in PowerPivot compressed to a 140 MB .xlsx file!
That’s a compression ratio of around 4.1%!
 
 
