Archive for category Technical Ramblings
For the past few years, ITWeb in South Africa has run a number of annual Business Intelligence (BI) conferences. In addition, they have run separate conferences for Data Warehousing (DW).
Now, personally I have never really understood this approach of separating the concepts. It implies that, possibly, BI != DW.
Based on my experience in delivering BI projects, I have always found that a good BI project absolutely requires a good DW component. Despite Analysis Services' ability to conform data through the virtual world of the UDM (and I include PowerPivot and other third-party BI front-end tools with similar claims in this discussion), pragmatism and the usual dodgy data quality issues have always dictated that best-practice ETL and DW disciplines are needed before we expose data to end users.
In fact, I have always found the relationship to be so tight, that my internal framing of the world has dictated that BI == DW.
As with any good internal assumption or belief, it is currently being challenged. I'm involved in assisting an international company in developing a group-wide reporting data warehouse. The programme is so large that our current scope of work is "simply" to consolidate the data from the inevitable wide variety of sources into a central data model. This stage of the project has dictated an approach very different to what I am used to: I am no longer directly involved in data delivery, so my usual toolkit of Excel, Strategy Companion, PerformancePoint and a bit of good old business strategy conversations is simply not in the picture.
So, maybe there is space in the world for the idea that DW != BI. (i.e. maybe ITWeb has a point after all)
So, maybe if we flip my formulas into logical implication statements, we could say:
BI → DW
And although my current project does not necessarily dictate that DW → BI, I would argue that for the long-term success and acceptance of the data we're consolidating, BI projects will need to be initiated to allow end users to dive into the data asset we're creating.
Looking forward to being involved in that 🙂
A fairly high percentage of any developer's life involves troubleshooting. And yes, I found myself back in that position with a recent install of Master Data Services (MDS).
When trying to load the MDS IIS site through the browser, default.aspx would simply not render: the loading wheel would keep spinning in the tab without ever returning any content. Occasionally I would be fortunate enough to get a 'page could not be loaded' timeout response, but most of the time the wheel would just keep spinning.
This install was distributed, i.e. the SQL database was on a separate server from IIS. The SQL Server instance was also configured on a non-standard port for security.
The install of MDS ran successfully.
The configuration tool ran successfully, creating both the site and the database.
But still, no page loading?
Turns out that the configuration tool drops the specified SQL port when storing the connection string in the web.config file. Update this string in web.config by hand, and suddenly the page starts loading as expected.
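For illustration, the fix amounts to restoring the comma-separated port in the Data Source portion of the connection string (the server name, port, database name and setting name below are placeholders; the exact layout of the MDS web.config may differ between builds):

```xml
<!-- web.config on the IIS server: restore the port the config tool dropped.
     SQL Server connection strings use "server,port" to target a non-standard port. -->
<connectionStrings>
  <add name="YourMdsConnectionName"
       connectionString="Data Source=MySqlServer,1499;Initial Catalog=MDS;Integrated Security=True" />
</connectionStrings>
```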
When uploading a PowerPivot model into an existing SharePoint 2010 PowerPivot Gallery, I received the infamous "an error occurred while capturing snapshots" error, where the hourglass changes into a disturbing red cross.
Quite odd seeing as I had used this exact library in the past and successfully created snapshots.
Now, there were errors that were quite prevalent in the CTP builds of SharePoint, related to security and the underlying services. So I worked through the checklist on Dave Wickert's blog, http://bit.ly/eQGCWD, and my server passed all of the checkpoints, i.e. everything was set up correctly. I did check the ULS logs, and only found the generic 'Timeout Expired' error.
So, in a move of desperation, I uninstalled Internet Explorer 9 Beta.
And the snapshots started working again!
Another reason not to experiment with IE9 just yet…
Analysis Services cubes provide a granular mechanism to maintain security rights, right from cube level through to individual dimensional elements. This security is maintained through the implementation of cube roles, where individual Windows Accounts or Groups can be assigned various access rights. These access rights are traditionally managed through SQL Management Studio, as and when required. See Technet for a full description on roles and how to maintain them.
There are cases however, where this maintenance can become tedious, especially in cases where a large number of individual users require granular rights against dimensions with many hundreds or thousands of members.
In traditional SQL-based systems, these scenarios are typically handled through SQL Server mapping tables. These tables map individual Windows logins to the corresponding entities that each user has rights to see. It is possible to re-use these mapping tables in the context of Analysis Services to manage security access (without writing extensive XMLA or C# code to manage roles!).
Let’s take a version of the Adventure Works cube, and assume that we need to manage security rights to the Product Sub Category level in the Product dimension. The process would be as follows:
- Implement and Populate Mapping Table
As described above, let’s create a mapping table, which maps an individual user against the dimension key that he/she has rights to access; something like:
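As a sketch, such a table and a sample row might look as follows (the table name, column names and the choice of ProductSubcategoryKey as the mapped key are assumptions for this example; the table name ties up with the "Fact Security" measure group used later):

```sql
-- Maps a Windows login to the product subcategory keys that user may see
CREATE TABLE dbo.FactSecurity (
    UserName              NVARCHAR(100) NOT NULL,  -- e.g. N'bi2010\bob'
    ProductSubcategoryKey INT           NOT NULL   -- FK to the ProductSubcategory table
);

-- Grant Bob access to one subcategory
INSERT INTO dbo.FactSecurity (UserName, ProductSubcategoryKey)
VALUES (N'bi2010\bob', 1);
```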
- Include table into Cube Project
This table can then be incorporated into the cube Data Source View, linking to the ProductSubcategory table.
- User Dimension
From this table, we can create a User dimension with one level, namely User. The user represented in the above table would then have a key of [User].[User].[bi2010\bob]
- Link Dimension to Cube
Firstly, let’s add our mapping table into the cube as an additional measure group, with a basic measure.
We can then add our new User dimension to the cube, and ensure that the relationships have been setup correctly.
- Create cube role
The final step in the process is to create a role in the cube which will use our underlying security mappings to limit each user's access.
This role relies on some predefined MDX functionality.
- UserName. This MDX function returns the Windows login of the user connected to the cube, in the form domain\username, e.g. bi2010\bob
- StrToMember. This function takes a string input, which can be a concatenated expression, and returns the underlying cube member with that name or key.
- Exists. This lesser-used form of the function limits a given set, returning only those members that have actual values in a given measure group. Again, see Technet for a full description of Exists.
We create a role, with the following properties:
- General → Read Definition
- Membership → NT Authority\Authenticated Users
- Cubes → Read access to the required cubes, in our case Sales.
- Dimensional Data → on the attribute against which we're creating security, in our case Product Subcategory, add the following MDX (under the Advanced tab, Allowed Member Set):
- EXISTS([Product].[Subcategory].Members, STRTOMEMBER("[User].[User].[" + UserName + "]"), "Fact Security")
- Enable Visual Totals. This will depend on your security policy. If checked, totals will only reflect the restricted set; if unchecked, the All levels will display totals as if no security were applied.
All done! Now we’re ready to deploy, process and test.
The table below outlines the different results when querying Order Qty by Subcategory, as an Administrator and as Bob.
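To test without actually logging in as Bob, the role can be impersonated from Management Studio via connection string properties (the role name below is an assumption, and the measure name follows the Order Qty measure mentioned above):

```mdx
-- Connect with an additional connection string property, e.g.:
--   Roles=ProductSecurityRole          (impersonate the role by name)
-- or, when connected as a server administrator:
--   EffectiveUserName=bi2010\bob       (impersonate Bob directly)
SELECT [Measures].[Order Qty] ON COLUMNS,
       [Product].[Subcategory].Members ON ROWS
FROM [Sales]
```

Run the same query twice, once with and once without the impersonation property, to see the allowed member set take effect.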
Extending the Analysis Services security model to exploit SQL managed mappings is a tremendous way to leverage existing security mappings, as well as manage those scenarios which are simply too cumbersome to manage by hand.
So, SQL Server Reporting Services 2008 R2 now supports maps as data layout controls. This allows us to create reports such as the one indicated below, with no custom development, natively within Reporting Services.
This opens up massive possibilities, as it supports the display of geospatial data in one of three map formats:
- Built-in maps
  - Maps of the USA are supported
  - I saw a demo of world maps at SQL PASS; looking forward to CTP3 to see them in real life
- ESRI shapefiles
  - Any ESRI shapefile can be used. This is particularly powerful, as you can represent data for your country down to any level supported by the particular shapefile employed at the time.
- SQL Server spatial data types
  - This is the most exciting option for me. While SQL Server 2008 introduced geospatial data types in the database, there was no simple method to display these shapes to users. Now it is possible to expose geography and geometry data types visually through reports.
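As a sketch of that last option, a report dataset can simply select a geography column alongside the measures, and the dataset is then bound to the map control in the report designer (the table and column names here are invented for illustration):

```sql
-- A dataset like this can feed the SSRS 2008 R2 map control:
-- the geography column supplies the shapes, SalesAmount drives the fill colour.
SELECT CountryName,
       SalesAmount,
       CountryShape   -- column of SQL Server geography data type
FROM dbo.CountrySales;
```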
With that preamble, I have been experimenting with the ESRI shapefile option. I needed this approach because I wanted to represent country-level data for the world (and CTP2 doesn't support non-US maps out of the box; really looking forward to CTP3). I was able to develop the report through the wizard and assign the relevant data fields to the relevant shapefile fields without a problem. However, when rendering the report, I received the following error:
System.Web.Services.Protocols.SoapException: There was an exception running the extensions specified in the config file. ---> System.Web.HttpException: Maximum request length exceeded.
--- End of inner exception stack trace ---
at System.Web.Services.Protocols.ServerProtocol.SetContext(Type type, HttpContext context, HttpRequest request, HttpResponse response)
at System.Web.Services.Protocols.ServerProtocolFactory.Create(Type type, HttpContext context, HttpRequest request, HttpResponse response, Boolean& abortProcessing)
Turns out that the Report Server service limits the HTTP request size by default. I'm not exactly sure what the default limit is, but it is definitely smaller than the 6 MB required for my shapefile. To fix this, simply update the httpRuntime tag in the web.config file in the ReportServer folder: specify a maxRequestLength attribute of the size required and you're in the game. So the tag should look something like:
<httpRuntime executionTimeout="9000" maxRequestLength="1024000" />
Watch this space for SQL 2008 R2 lessons learnt in the real world, as I use CTP3 in a prototyping environment over the next few weeks.