Archive for category Technical Ramblings

DW != BI?

For the past few years, ITWeb in South Africa has run a number of annual Business Intelligence (BI) conferences. In addition, it has run separate conferences for Data Warehousing (DW).

Now, personally I have never really understood this approach of separating the concepts. Basically, it implies that possibly BI != DW.

Based on my experience in delivering BI projects, I have always found that a good BI project absolutely requires a good DW component. Despite Analysis Services’ ability to conform data through the virtual world of the UDM (and I include PowerPivot and other third-party BI front-end tools with similar claims in this discussion), pragmatism and the usual dodgy data quality issues have always dictated that best-practice ETL and DW techniques are needed before we expose data to end users.

In fact, I have always found the relationship to be so tight, that my internal framing of the world has dictated that BI == DW.

As with any good internal assumption or belief, it is currently being challenged. I’m involved in assisting an international company in developing a group-wide reporting data warehouse. The programme is so large that our current scope of work is “simply” to consolidate the data from the inevitable wide variety of sources into a central data model. This stage of the project dictates an approach very different to what I am used to, i.e. I am no longer directly involved in data delivery, so my usual toolkit of Excel, Strategy Companion, PerformancePoint and a bit of good old business strategy conversations is simply not in the picture.

So, maybe there is space in the world for the idea that DW != BI. (i.e. maybe ITWeb has a point after all)

And if we flip my formulas into logical implication statements, we could say:

BI → DW

And although my current project does not necessarily dictate that DW → BI, I would argue that for the long-term success and acceptance of the data we’re consolidating, BI projects will need to be initiated to allow end users to dive into the data asset we’re creating.

Looking forward to being involved in that 🙂


Master Data Services – Configuration Glitch

A fairly high percentage of any developer’s life involves troubleshooting. And yes, I found myself back in that position with a recent install of Master Data Services.

Symptom

When trying to load the MDS IIS site through the browser, the default.aspx would simply not render. The loading wheel would keep spinning in the tab, without returning any content. Occasionally I would be fortunate enough to get a ‘page could not be loaded’ timeout response, but most of the time the wheel would just keep spinning.

Factors

• The install was distributed, i.e. the SQL database was on a separate server from IIS. The SQL Server instance was also configured on a non-standard port for security.
• The install of MDS ran successfully.
• The config tool ran successfully, creating the site and database without a hitch.
• But still, no page loading?

Resolution

Turns out that the config tool drops the specified SQL port when storing the connection string in the web.config file. Update this string in web.config by hand, and suddenly the page starts loading as expected.
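
For reference, a SQL Server connection string specifies a non-default port with a comma after the server name. A sketch of what the corrected entry should carry (the entry name, server, database and port below are purely illustrative, and the exact element in your web.config may differ; the comma syntax in Data Source is the important bit):

<connectionStrings>
  <!-- hypothetical example: host MYSQLHOST listening on non-standard port 2866 -->
  <add name="MDSConnection"
       connectionString="Data Source=MYSQLHOST,2866;Initial Catalog=MasterDataServices;Integrated Security=True" />
</connectionStrings>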
Posted from WordPress for Windows Phone


PowerPivot Gallery – Snapshot Creation Error – IE9 Culprit!

When uploading a PowerPivot model into an existing SharePoint 2010 PowerPivot Gallery, I received the infamous “an error occurred while capturing snapshots” error, where the hourglass changes into a disturbing red cross.

Quite odd seeing as I had used this exact library in the past and successfully created snapshots.

Now, there were similar errors that were quite prevalent in the CTP builds of SharePoint, related to security and the underlying services. So, working along those lines, I went through the checkpoints on Dave Wickert’s blog, http://bit.ly/eQGCWD. My server passed all of them, i.e. everything was set up correctly. I did check the ULS logs, and only found the generic Timeout Expired error.

So, in a move of desperation, I uninstalled Internet Explorer 9 Beta.

Rebooted.

And the snapshots started working again!

Another reason not to experiment with IE9 just yet…


Excel 2010 – Save & Send to SharePoint Issue

So, I have just finished installing (yet another) BI 2010 Virtual Machine, on Windows Server 2008 R2 RTM.
 
On this one, I was unable to use Excel’s “Save & Send” feature to publish a workbook to SharePoint.
When I tried the “Browse for a Location” dialog, I would type in the SharePoint URL and get a message along the lines of “this application can’t open this location”.
 
Quite odd.
 
Turns out that you need the Desktop Experience feature turned on in Windows Server 2008 R2 for this to work.
Thanks Alan Whitehouse! http://bit.ly/bk1VKW
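
For what it’s worth, the feature can also be enabled from PowerShell on Windows Server 2008 R2 via the ServerManager module; a quick sketch (the feature install typically requires a reboot):

Import-Module ServerManager
Add-WindowsFeature Desktop-Experience -Restart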


PowerPivot – Compression Stat

Today I found myself needing to load a .csv file into PowerPivot.
This .csv file has around 11 million rows and 45 columns, is fully denormalised, and weighs in at around 3.4 GB.
 
So, to test and validate Microsoft’s claims that the VertiPaq engine does a decent job at compression, I loaded the .csv file as-is directly into a PowerPivot model.
 
And yes, Microsoft were right. The same data in PowerPivot compressed to a 140 MB .xlsx file!!
That’s a compression ratio of about 4.1% (140 MB out of roughly 3,400 MB)!!
 
 


PerformancePoint 2007 – MasterPage “Security” Issue

Interesting bug I ran into this morning…
 
Basically, when opening a PPM 2007 dashboard, I got the standard SharePoint permission-denied web page, offering to let me log on as a different user.
Even though I was site admin!!
 
So, some background…
 
I’m in the process of configuring PerformancePoint 2007 + MOSS 2007 Enterprise. (yip, I know 2010 is out, but sometimes we need to work on legacy software!!)
 
I’m setting up multiple SharePoint web applications on a single IIS server, to cater for a Dev and a QA environment.
 
So, I had deployed PerformancePointDefault.master to the /_catalogs/masterpage folder, and when testing as the original site admin, everything was OK.
However, when logging in as a different user (who was also a site admin), I got the SharePoint permission-denied error when trying to open a dashboard page.
 
Solution
 
So, it turns out that PerformancePointDefault.master was in a Pending state in the document library (/_catalogs/masterpage), meaning that the second user couldn’t see the master page even though he was a site admin. Once the master page was approved and checked in… problem resolved.
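
If you find yourself doing this across several web applications (a Dev and a QA environment, in my case), the approval can also be scripted against the SharePoint object model. A minimal sketch, assuming a small console app referencing Microsoft.SharePoint.dll and running on the MOSS server itself (the site URL is illustrative):

using Microsoft.SharePoint;

using (SPSite site = new SPSite("http://moss2007/sites/bi"))
using (SPWeb web = site.OpenWeb())
{
    // fetch the master page and approve it, clearing the Pending state
    SPFile master = web.GetFile("_catalogs/masterpage/PerformancePointDefault.master");
    master.Approve("Approved so that all site users can see the master page");
}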


Analysis Services – Data Driven Security Model

Preamble…

Analysis Services cubes provide a granular mechanism to maintain security rights, right from cube level through to individual dimensional elements. This security is maintained through the implementation of cube roles, where individual Windows Accounts or Groups can be assigned various access rights. These access rights are traditionally managed through SQL Management Studio, as and when required. See Technet for a full description on roles and how to maintain them.

There are cases however, where this maintenance can become tedious, especially in cases where a large number of individual users require granular rights against dimensions with many hundreds or thousands of members.

In traditional SQL-based systems, these types of scenarios are typically handled through SQL Server mapping tables. These tables map individual Windows logins against the corresponding entities that each user has rights to see. It is possible to re-use these types of mapping tables in the context of Analysis Services to manage security access (without writing extensive XMLA or C#-type code to manage roles!!).

Scenario

Let’s take a version of the Adventure Works cube, and assume that we need to manage security rights at the Product Subcategory level of the Product dimension. The process is as follows:

  1. Implement and Populate Mapping Table

As described above, let’s create a mapping table which maps an individual user against the dimension key that he/she has rights to access; something like:
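
Here is a minimal sketch of such a table (the table and column names are illustrative; “FactSecurity” ties up with the “Fact Security” measure group name used in the role definition later on):

CREATE TABLE dbo.FactSecurity (
    UserName              nvarchar(100) NOT NULL, -- Windows login, e.g. N'bi2010\bob'
    ProductSubcategoryKey int           NOT NULL  -- key of the subcategory this user may see
);

-- Bob is only allowed to see subcategory 1
INSERT INTO dbo.FactSecurity (UserName, ProductSubcategoryKey)
VALUES (N'bi2010\bob', 1);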

  2. Include Table into Cube Project

This table can then be incorporated into the cube Data Source View, linking to the ProductSubcategory table.

  3. User Dimension

From this table, we can create a User dimension, with one level, namely User. So, the user represented in the above table would have a key of [User].[User].[bi2010\bob].

  4. Link Dimension to Cube

Firstly, let’s add our mapping table into the cube as an additional measure group, with a basic measure.

We can then add our new User dimension to the cube, and ensure that the relationships have been setup correctly.

  5. Create Cube Role

The final step in the process is to create a role in the cube which will use our underlying security mappings to limit what each user can see.

This role relies on some predefined MDX functionality.

  • UserName. This MDX function returns the NT name of the user connected to the cube, in the form domain\username, e.g. bi2010\bob.
  • StrToMember. This function takes a string input, which can be a concatenated expression, and returns the underlying cube member with that name or key.
  • Exists. This lesser-used function limits a given set, returning only those members that have actual values in a given measure group.
    • Again, see Technet for a full description of Exists

We create a role, with the following properties:

  1. General → Read Definition
  2. Membership → NT AUTHORITY\Authenticated Users
  3. Cubes → Read access to the required cubes, in our case Sales.
  4. Dimensional Data → on the attribute against which we’re creating security, in our case Product Subcategory, add the following MDX (under the Advanced tab, Allowed Member Set):
    1. EXISTS([Product].[Subcategory].Members, STRTOMEMBER("[User].[User].[" + UserName + "]"), "Fact Security")
  5. Enable Visual Totals. This will depend on your security policy. If checked, only totals that apply to the restricted set will show. If unchecked, the All level will display totals as if security were not applied.

In other words: when Bob connects, UserName returns bi2010\bob, STRTOMEMBER resolves that string to his member in the User dimension, and EXISTS then keeps only those subcategory members that have rows against him in the Fact Security measure group.

  6. Test

All done! Now we’re ready to deploy, process and test.

The table below outlines the different results when querying Order Qty by Subcategory, as an Administrator and as Bob.

[Screenshots: the full subcategory list for Administrator vs. only the permitted subcategory for Bob]
Conclusion

Extending the Analysis Services security model to exploit SQL managed mappings is a tremendous way to leverage existing security mappings, as well as manage those scenarios which are simply too cumbersome to manage by hand.


BI Debugging Thoughts – Part I – SSIS

Color coding is an interesting thing. Through our lives, through traffic signals, and through the occasional story book we’ve learnt a few important things:
1. Green is good.
2. Red is bad.
3. Yellow we can live with…
 
This lesson is being enforced right now as I’m debugging some SSIS packages!
(See, the Red / Yellow / Green KPI indicator thinking runs deep in the Microsoft BI stack!)
 
This has got me thinking: is it possible to actually document an approach to debugging BI solutions? It’s quite a broad subject area to cover, as BI in the Microsoft sense includes a number of inter-related components.
 
So, I thought I’d start with the ETL pieces, and jot down a few areas that I tend to look into when the SSIS components go red:
1. Progress Tab.
 – Yip, the humble SSIS progress tab. This is an incredibly verbose logging area, and it normally gives a great indication as to the source of errors.
     – This is not always available, as sometimes we need to debug packages that ran through the SQL Agent… so it’s always advisable to implement good SSIS logging (which will give the same effect).
 
2. Metadata, Metadata, Metadata and the Intrepid Data Types
 – SSIS is incredibly sensitive to changes in schemas and data types, and in fact may not execute if some of the underlying data structures have changed (without the SSIS data flow components being properly updated to match).
 – Simple enough to fix, simply open the relevant Data Flow Component and SSIS normally refreshes its metadata on open.
 – In this space, it’s worth noting that SSIS doesn’t like inserting Unicode into Varchar data type fields, so you may need some data type conversions in your flows.
(don’t feel bad, the performance overhead is minimal compared to the overhead of dealing with NVarchar at the database level, in my experience)
 
3. Script Component Debugging
 – This is actually what I’m doing right now. Quite a tricky thing to do on an x64 platform, as the script components are precompiled, and you can’t step through them.
 – In this case we have resorted to an approach where all potentially dangerous commands are wrapped in a try {} catch {} block, and the catch {} logs the error through to a text file. (see the sketch after this list)
(of course this target could be the progress tab, a message box or the event log itself; either way it just needs to be set up.)
 – My initial rule of thumb here would be… don’t use the Script Component unless you really need to. SSIS has a variety of components that can be used to do things like FTP, file handling and cube processing… so, use what exists where possible before re-inventing the wheel!
 
4. 32 / 64 bit
 – Talking about bits, it’s important to note there are a few things in SSIS that don’t move very well from a 32-bit to a 64-bit platform.
 – A good example is the Excel Source, whose underlying provider historically has no 64-bit version, forcing the data flow to run in 32-bit mode.
 
5. Event Log
 – Often overlooked, yet an incredibly useful area to hunt through. It is amazing how many useful exception messages are found in the Application log.
 
6. Green isn’t always Good 🙂
 – Yip, even though a package may compile and execute with no visible structural errors, it is always important to balance the end results by hand or by script, to make sure that things haven’t gone a little weird in translation.
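
As promised under point 3, here’s a minimal sketch of the try/catch logging pattern inside a Script Component (the column names and log path are purely illustrative; in a real package the path would typically come from a package variable):

public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    try
    {
        // the potentially dangerous work goes here, e.g. a parse that may blow up
        Row.Amount = decimal.Parse(Row.RawAmount);
    }
    catch (Exception ex)
    {
        // append the failure detail to a text file, since we can't step through on x64
        System.IO.File.AppendAllText(@"C:\Temp\ssis_errors.log",
            DateTime.Now + " : " + ex.Message + Environment.NewLine);
    }
}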
 
Oh look, my package is finally all green!! Wonderful.
 


Office 2010 RC – Activation Issues

So, I installed Office 2010 RC this morning… Really looking forward to seeing what has changed / become more stable.
 
When starting Outlook, I got a message indicating that I needed to activate the product. No problem: I simply clicked ‘Activate over the internet’, and waited for the success message as before…
 
Two errors:
 
– Firstly, I got an ‘Unexpected error has occurred’ message.
Not a very useful message, but it turns out that the power in our server room had gone down, and hence I had no internet connection.
Easily solved, I just popped in next door to Microsoft, hooked up, and tried again.
This is when I ran into error 2.
 
– When activating over the internet, I got a message indicating that the key I had used had been activated too many times, and I needed to use another key or call the contact centre.
Again, no big deal, right? Simply get a key from Connect, go to File → Help, and change the product key (as every MS article indicates you can do).
Well, that link simply isn’t there on my install!
 
So, to change the product key, I had to go to Add/Remove Programs, choose to change the Office 2010 install, and only then did I get the option to change the product key.
Ran through that wizard, and everything is working 100% now.


Reporting Services 2008 R2: Lessons learnt from the map control

Preamble

So, SQL Server Reporting Services 2008 R2 now supports maps as data layout controls. This allows us to create reports such as the one indicated below, with no custom development, natively within Reporting Services.

[Screenshot: a map report rendered natively in Reporting Services]

This opens up massive possibilities, as it supports the display of geospatial data in one of three map formats:

  • Built-in maps
    • Maps of the USA are supported
    • I saw a demo of world maps at SQL PASS; looking forward to CTP3 to see them in real life
  • ESRI
    • Any ESRI shapefile can be used. This is particularly powerful as you can represent data for your country down to any level supported by the particular shapefile employed at the time.
  • SQL 2008 Geospatial
    • This is the most exciting option for me. While SQL Server 2008 introduced geospatial data types in the database, there was no simple method to display these shapes to users. So, now it is possible to expose geography and geometry data types visually through reports. (see the query sketch after this list)
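
As a quick hypothetical sketch (table and column names invented), a dataset feeding a map layer could simply return the stored shape alongside the measure to plot:

SELECT c.CountryName,
       c.Shape, -- a geography column stored against each country
       s.SalesAmount
FROM dbo.Country AS c
JOIN (
    SELECT CountryKey, SUM(SalesAmount) AS SalesAmount
    FROM dbo.Sales
    GROUP BY CountryKey -- aggregate before the join; geography columns can't be grouped on
) AS s ON s.CountryKey = c.CountryKey;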
Lesson Learnt

With that preamble, I have been experimenting with the ESRI shapefile option. I needed to employ this approach as I wanted to represent country-level data for the world (and CTP2 doesn’t support non-US maps out of the box; really looking forward to CTP3). I was able to develop the report through the wizard, and assign the relevant data fields to the relevant shapefile fields with no problem. However, when rendering the report, I received the following error:

System.Web.Services.Protocols.SoapException: There was an exception running the extensions specified in the config file. ---> System.Web.HttpException: Maximum request length exceeded.
   at System.Web.HttpRequest.GetEntireRawContent()
   at System.Web.HttpRequest.get_InputStream()
   at System.Web.Services.Protocols.SoapServerProtocol.Initialize()
   --- End of inner exception stack trace ---
   at System.Web.Services.Protocols.SoapServerProtocol.Initialize()
   at System.Web.Services.Protocols.ServerProtocol.SetContext(Type type, HttpContext context, HttpRequest request, HttpResponse response)
   at System.Web.Services.Protocols.ServerProtocolFactory.Create(Type type, HttpContext context, HttpRequest request, HttpResponse response, Boolean& abortProcessing)

Turns out that the Report Server service limits the HTTP request size by default. I’m not exactly sure what the default limit is; however, it is definitely smaller than the 6 MB required for my shapefile. To fix this, simply update the httpRuntime tag in the web.config file in the ReportServer folder. Specify a maxRequestLength attribute of the size required and you’re in the game. So the tag should look something like:

<httpRuntime executionTimeout="9000" maxRequestLength="1024000" />

Watch this space for SQL 2008 R2 lessons learnt in the real world, as I use CTP3 in a prototyping environment over the next few weeks.
