SAP + Microsoft BI: 6 Scenarios to set that data free!

The Microsoft BI front-end tools provide a fantastic mechanism to create reports, dashboards and free-form analytics. So, how do we use these tools when the underlying data is provided by SAP, either R/3 or BW?

Well, I’m in a session at the Microsoft BI Conference this year with Scott Cameron, and he’s discussing some scenarios on how to get data into PowerPivot. These techniques are quite generic, and can allow data to be exposed and consumed by a number of front ends, namely…

  1. Excel 2007 / 2010
  2. Reporting Services 2008 / 2008 R2
  3. Data Dumps
  4. Third Party Providers
    1. Theobald Xtract PPV
    2. ERP-Link iNet.BI
    3. Simplement Data Liberator

Interestingly, only 2 of the techniques involve Microsoft-provided drivers and techniques. The others are either custom data exports and imports, or third-party providers. Let’s discuss them a little further.

Scenario I: SAP BW → Excel 2010

Excel 2007 / 2010 natively supports browsing and reporting on SAP BW InfoCubes. There's no need for additional add-ins or third-party software; the only requirement is the installation and configuration of the SAP GUI front end.

This SAP GUI front end provides two things:

  1. SAP BW OLE DB Provider
    1. the limitation here is that it only supports 32-bit Excel.
  2. SAP Login (Description and Client Number)

Once the driver has been installed, and we’ve used SAP GUI to create an SAP Login, we can then use the Get External Data wizard in Excel to connect to SAP BW, in a very similar manner to connecting to Analysis Services cubes. Some differences, though:

  1. When connecting to external data, select Other Data Sources
  2. Select the SAP BW OLE DB Provider
    1. Again, if it’s not there, you’re probably running 64-bit Excel ;(
  3. Data Source = Description of the SAP Login created in SAP GUI
  4. User Name and Password are the SAP credentials
  5. Under the All tab, add an Extended Property for the Client Number, SCF_CLIENT = <Client Number>

And you’re done!

Check for further details.

I like this approach personally because I am quite a fan of Excel to analyse data. It also allows re-use of the existing BW cubes, and no data needs to be physically moved between systems.

Scenario II: SAP BW → Reporting Services

SQL Server 2008 and SQL Server 2008 R2 allow reports to be written directly against SAP BW data sources.

These reports use the Microsoft .Net Provider 1.1 for SAP NetWeaver Business Intelligence. This driver is installed as a part of the Client Tools install on the SQL Server install media.

To enable these queries on the BW server, you need to enable the QueryCube for external access. (This is done through the SAP BEx Query Designer.)

The connection string is the XML/A connection URL. This URL allows for the passing of parameters (as the dataset itself doesn’t support parameters). Once connected, the report design experience is again similar to reporting off of Analysis Services.
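As a rough sketch, the connection URL typically points at SAP’s standard XMLA web service path; the server, port and client number below are placeholders, not values from a real system:

```
http://<bw-server>:<port>/sap/bw/xml/soap/xmla?sap-client=<client>&sap-language=EN
```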

Again, a neat method, as tools like Report Builder 3.0 allow more people to create and consume these reports. And the data is not physically moved into a staging environment.

Check for further details.

Scenario III: SAP Open Hub → CSV File

SAP provides an Open Hub license, which allows data to be exported. These exports are natively written to CSV files; however, it is also possible to export directly to SQL Server (which would be preferred!)

Clearly, this is the most manual method of getting to the data. The biggest negatives here are:

  • the security is lost in translation
  • the Open Hub license is not a trivial purchase from my understanding
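Once the Open Hub extract lands as a CSV, pulling it into SQL Server is straightforward. The sketch below assumes a hypothetical two-column extract and file path; real Open Hub field layouts will differ:

```sql
-- Hypothetical staging table; the column list must match the Open Hub field layout.
CREATE TABLE dbo.OpenHubStage (
    MaterialNumber NVARCHAR(18),
    SalesAmount    DECIMAL(17, 2)
);

-- Load the exported file; adjust the terminators to the Open Hub settings used,
-- and FIRSTROW = 2 only if a header row was exported.
BULK INSERT dbo.OpenHubStage
FROM 'C:\OpenHub\sales_extract.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);
```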

Scenario IV: Third Party Providers

  1. Theobald Xtract PPV
    1. Very neat software. It basically creates a data feed that PowerPivot can consume to directly import data (without having to use the “traditional” Reporting Services middle tier).
    2. Check for further details
  2. ERP-Link iNet.BI
    1. Provides a host of tools that use a .Net Data Provider to extract data from SAP.
    2. The interesting one is the Data Source View Creator, as it creates an Analysis Services cube (which I’m a massive fan of).
    3. Check for further details
  3. Simplement Data Liberator
    1. A replicator to move data from SAP to SQL in real-time
      1. (Most SAP administrators are nervous to do this, as you need to enable replication in SAP)
    2. Neat solution to create a fully-fledged SQL Server data warehouse
    3. Check for further details





Thoughts when Installing PowerPivot – Client

So, I’m in a session at MS BI Conference 2010 with Dave Wickert, and he’s doing a great job of working us through things to look out for when installing the PowerPivot client.

This is an add-in that runs in the context of Microsoft Excel 2010. Any edition of Excel 2010 that supports add-ins will support the PowerPivot add-in (which is almost every edition!)

The download is available from

Make sure you download and install the version that matches your version of Excel (not your operating system). So, basically 32-bit Excel = 32-bit add-in (regardless of a 64-bit OS). And yes, you want to use the 64-bit version if at all possible. It allows you to create and manage larger datasets; any size that fits in RAM will work.

However, please note that only 4 GB files can be saved to disk, and only 2 GB files can be uploaded to SharePoint. Yes, only 🙂

Preconditions for the Add-in to work include:

  1. VSTO installed and running
    1. The Complete install of Office manages this by default
    2. If installing only Excel, be sure to select “Shared Components” as well to include VSTO
  2. The PowerPivot add-in is enabled. (once installed)
    1. Check under File → Options → Add-Ins
    2. Ensure that the add-in is not in the Disabled State
  3. If the add-in is still missing, even if both of the above are OK, it is possible to expose the VSTO exceptions on Excel start-up
    1. Create an environment variable, VSTO_SuppressDisplayAlerts, and set to 0
    2. Check for full details
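From a command prompt, that environment variable can be set machine-wide with setx (restart Excel afterwards); a sketch of the step described above:

```bat
:: Expose VSTO load failures when Excel starts up
setx VSTO_SuppressDisplayAlerts 0
```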

Debugging options for the client add-in can be set through PowerPivot → Settings. It’s possible to log events off to a log file, which can be opened and analysed in SQL Server Profiler. It’s also possible to create a snapshot of the current memory usage, by clicking “Snapshot”.

The settings window is also the equivalent of Help → About in most apps. The current RTM version of the add-in is 10.50.1600.1.



BI Conf 2010 – PowerPivot Management Dashboard

PowerPivot provides a feature in SharePoint Central Admin, where administrators can see the usage history of published PowerPivot workbooks. This PowerPivot Management Dashboard is essentially a custom web part web page that shows a number of views on the current state of the system, including:

  • Infrastructure – Server Health
    • Query Response Times
    • Average CPU Usage
    • Average Memory Usage
    • Activity
    • Performance
  • Workbook Activity – Chart and List
    • This is the killer time-based chart showing an animated history of workbook activity
  • Data Refresh – Activity and Failures
  • Miscellaneous Reports
    • Essentially a document library that has been exposed through a web part

Generally, this dashboard is quite fixed, both in terms of layout and content.

Elements that cannot currently be updated:

  1. Overall layout. The dashboard itself is not written in something like PerformancePoint, so the layout and content can’t be changed.
  2. The Silverlight control. Sigh. There are no plans to produce a separate reusable bubble chart control. Bit of a pity; it would be great to embed in other dashboards. Maybe next release 🙂

Elements that we can update:

  1. Any of the Excel based workbooks, including the Workbook activity book
    1. These workbooks are themselves PowerPivot workbooks, which feed data from the PowerPivot Shared Services database (the Usage schema tables).
  2. The Reports document library (bottom right section of the dashboard)
    1. This is simply a reference to a document library in All Site Content → PowerPivot Management → Instance ID → 1033



New Orleans – Post I

So I’m in New Orleans, Louisiana; attending the Microsoft Business Intelligence conference 2010. It starts officially tomorrow.

This is night #2 that we’ve been in this party city, and I figured that it’s about time I put down some of my initial thoughts about this place.

Firstly, this is definitely the city that you want to attend with some mates. There are simply too many parties going on in the evenings, and you really want to be there (and let’s be honest, no one likes to party alone.) It’s also a good thing to have a wingman to save you when offered things on Bourbon Street that you would rather avoid! (Thanks Mark, you legend you!)


Secondly, it’s a city where history and the present are joined at the hip. It’s great to see how the French Quarter has been preserved while the rest of the city has grown into palm-lined streets and mega hotels. Oh yeah, loving the palm-lined streets, Canal Street being a great example (and yes, we walked most of it in a desperate attempt to find a megamall for Mark)


Talking mega malls takes me to thought number three… New Orleans is not the city you come to for the mega Best Buy / Walmart type stores (unless you have a car of course, which we don’t for this trip). New Orleans itself is really driven by the French Quarter and the river attractions. These involve more artsy-type shops, where you can get a great massage, a tarot card reading, or a nice fridge magnet; but not an iPod… sadly.

So, the first two days have been great so far. We’ve done all sorts of things including:

  1. Eating at the Crescent City Brew House. Hands down the best rib place on planet earth.
  2. Chilling up the Mississippi (did I spell that right??) on the Steamboat Natchez. Seriously like going back in time. Old-school thick carpets in the lunch hall, honky-tonk jazz band, big steam wheel at the back. Definitely recommended!
  3. Eating waffles for breakfast. Who says they only need to be pudding!
  4. Watching 12 sports at once. Yip, one of the sports bars here has 12 massive screens going at once. A little ADD, but at least we got to see the French Open final. (nice one Nadal!)
  5. Drinking a Hurricane at Pat O’Briens. Legendary.
  6. Oh, and registering at the BI Conference, which is why we’re here 🙂


PerformancePoint 2007 – MasterPage “Security” Issue

Interesting bug I ran into this morning…
Basically, when opening a PerformancePoint 2007 dashboard, I got the standard SharePoint permission-denied web page, allowing me to log on as a different user.
Even though I was site admin!!
So, some background…
I’m in the process of configuring PerformancePoint 2007 + MOSS 2007 Enterprise. (yip, I know 2010 is out, but sometimes we need to work on legacy software!!)
I’m setting up multiple SharePoint web applications on a single IIS server, to cater for a Dev and a QA environment.
So, I had deployed the PerformancePointDefault.master to the /_catalogs/masterpage folder, and when testing as the original site admin, everything was ok.
However, when logging in as a different user (who was also a site admin) I got the Sharepoint Permissions Denied error when trying to open a Dashboard page.
So, it turns out that the PerformancePointDefault.master was in a Pending state in the document library (/_catalogs/masterpage), meaning that the second user couldn’t see the master page, although he was a site admin. Once the master page was approved and checked in… problem resolved.



Analysis Services – Data Driven Security Model


Analysis Services cubes provide a granular mechanism to maintain security rights, right from cube level through to individual dimensional elements. This security is maintained through the implementation of cube roles, where individual Windows Accounts or Groups can be assigned various access rights. These access rights are traditionally managed through SQL Management Studio, as and when required. See Technet for a full description on roles and how to maintain them.

There are cases however, where this maintenance can become tedious, especially in cases where a large number of individual users require granular rights against dimensions with many hundreds or thousands of members.

In traditional SQL based systems, these types of scenarios are typically handled through SQL Server mapping tables. These tables would map individual windows logins against the corresponding entities that that user had rights to see. It is possible to re-use these types of mapping tables in the context of Analysis Services to manage security access. (without writing extensive XMLA or C#-type code to manage roles!!)


Let’s take a version of the Adventure Works cube, and assume that we need to manage security rights to the Product Sub Category level in the Product dimension. The process would be as follows:

  1. Implement and Populate Mapping Table

 As described above, let’s create a mapping table, which maps an individual user against the dimension key that he/she has rights to access; something like:
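A minimal sketch of such a mapping table (the table and column names here are illustrative, not from a real system):

```sql
-- Maps a Windows login to the ProductSubcategory key(s) that user may see.
CREATE TABLE dbo.FactSecurity (
    UserName              NVARCHAR(100) NOT NULL,  -- NT name, e.g. N'bi2010\bob'
    ProductSubcategoryKey INT           NOT NULL
);

INSERT INTO dbo.FactSecurity (UserName, ProductSubcategoryKey)
VALUES (N'bi2010\bob', 1);
```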

  2. Include table into Cube Project

This table can then be incorporated into the cube Data Source View, linking to the ProductSubcategory table.

  3. User Dimension

From this table, we can create a User dimension, with one level, namely User. So, the user represented in the above table would have a key of [User].[User].[bi2010\bob]

  4. Link Dimension to Cube

Firstly, let’s add our mapping table into the cube as an additional measure group, with a basic measure.

We can then add our new User dimension to the cube, and ensure that the relationships have been setup correctly.

  5. Create cube role

The final step in the process is to create a role in the cube which will use our underlying security mappings to limit user roles.

This role relies on some predefined MDX functionality.

  • UserName. This function in MDX returns the NT name of the user that is connected to the cube, in the form domain\username. E.g. bi2010\bob
  • StrToMember. This function takes string input, which can be a concatenated expression, and returns the underlying cube member with this name or key.
  • Exists. This lesser-used function limits a given set, returning only those members that have actual values in a given Measure Group.
    • Again, see Technet for a full description of Exists

We create a role, with the following properties:

  1. General → Read Definition
  2. Membership → NT AUTHORITY\Authenticated Users
  3. Cubes → Read access to the required cubes, in our case Sales.
  4. Dimension Data → on the attribute against which we’re creating security, in our case Product Subcategory, add the following MDX (under the Advanced tab, Allowed Member Set)
    1. EXISTS([Product].[Subcategory].Members, STRTOMEMBER("[User].[User].[" + UserName + "]"), "Fact Security")
  5. Enable Visual Totals. This will depend on your security policy. If checked, it will only show totals that apply to the restricted set. If unchecked, the All levels will display totals as if security were not applied.

  6. Test

All done! Now we’re ready to deploy, process and test.
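One way to test without actually logging in as Bob is to connect with only the new role applied; the Roles connection-string property is standard Analysis Services functionality (the role and catalog names below are placeholders):

```
Data Source=localhost;Initial Catalog=Adventure Works DW;Roles=<Role Name>
```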

Querying Order Qty by Subcategory as an Administrator and as Bob demonstrates the difference: the Administrator sees all subcategories, while Bob sees only the subcategories he has been mapped to.



Extending the Analysis Services security model to exploit SQL managed mappings is a tremendous way to leverage existing security mappings, as well as manage those scenarios which are simply too cumbersome to manage by hand.


BI Debugging Thoughts – Part I – SSIS

Color coding is an interesting thing. Through our lives, through traffic signals, and through the occasional story book we’ve learnt a few important things:
1. Green is good.
2. Red is bad.
3. Yellow we can live with…
This lesson is being enforced right now as I’m debugging some SSIS packages!
(See, the Red / Yellow / Green KPI indicator thinking runs deep in the Microsoft BI stack!)
This has got me thinking: is it possible to actually document an approach to debugging BI solutions? It’s quite a broad subject area to cover, as BI in the Microsoft sense includes a number of inter-related components.
So, I thought I’d start with the ETL pieces, and jot down a few areas that I tend to look into when the SSIS components go red:
1. Progress Tab.
 – Yip, the humble SSIS progress tab. This is an incredibly verbose logging area, and it normally gives a great indication as to the source of errors.
     – This is not always available, as sometimes we need to debug packages that ran through the SQL Agent… so it’s always advisable to implement good SSIS Logging. (which will give the same effect)
2. Metadata, Metadata, Metadata and the Intrepid Data Types
 – SSIS is incredibly sensitive to changes in schemas and data types, and in fact may not execute if some of the underlying data structures have changed (without the SSIS data flow components being properly updated).
 – Simple enough to fix: open the relevant Data Flow component, and SSIS normally refreshes its metadata on open.
 – In this space, it’s worth noting that SSIS doesn’t like inserting Unicode into Varchar data type fields, so you may need some data type conversions in your flows.
(don’t feel bad, the performance overhead is minimal compared to the overhead of dealing with NVarchar at the database level, in my experience)
3. Script Component Debugging
 – This is actually what I’m doing right now. Quite a tricky thing to do on an x64 platform, as the script components are precompiled, and you can’t step through them.
 – In this case we have resorted to an approach where all potentially dangerous commands are wrapped in a try {} catch {} block, and the catch {} logs the error through to a text file.
(of course this text file could be the progress tab, a message box or the event log itself; either way it just needs to be set up.)
 – My initial rule of thumb here would be… don’t use the Script Component unless you really need to. SSIS has a variety of components that can be used to do things like FTP, file handling and cube processing… so, use what exists where possible before re-inventing the wheel!
4. 32 / 64 bit
 – Talking about bits, it’s important to note there are a few things in SSIS that don’t upgrade very well from a 32-bit to a 64-bit platform.
 – A good example is the Excel Source. (The underlying Jet OLE DB provider is only available as 32-bit.)
5. Event Log
 – Often overlooked, and yet incredibly useful area to hunt through. It is amazing how many useful exception messages are found in the Application Log.
6. Green isn’t always Good 🙂
 – Yip, even though a package may compile and execute with no visible structural errors, it is always important to balance the end results by hand or by script to make sure that things haven’t gone a little weird in translation.
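The catch-and-log approach from point 3 can be sketched as below. This fragment sits inside the Script Component’s generated class (the Input0_ProcessInputRow override is the default row handler); the log file path is an assumption, so point it anywhere the package’s execution account can write:

```csharp
// Wrap the row-level work; on failure, append the details to a text file
// rather than letting the precompiled script fail invisibly.
public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    try
    {
        // ... potentially dangerous transformation logic here ...
    }
    catch (Exception ex)
    {
        // Hypothetical log location: any writable path (or the event log) works.
        System.IO.File.AppendAllText(
            @"C:\Temp\SsisScriptErrors.log",
            DateTime.Now + " : " + ex + Environment.NewLine);
    }
}
```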
Oh look, my package is finally all green!! Wonderful.



Office 2010 RC – Activation Issues

So, I installed Office 2010 RC this morning… Really looking forward to seeing what has changed / become more stable.
When starting Outlook, I then got a message indicating that I needed to Activate the Product. No Problem, I simply clicked ‘Activate over the internet’, and waited for the success message as before…
Two errors:
– Firstly, I got an ‘Unexpected Error has Occurred’ message.
Not a very useful message, but it turns out that the power in our server room had gone down, and hence I had no internet connection.
Easily solved, I just popped in next door to Microsoft, hooked up, and tried again.
This is when I ran into error 2.
– When activating over the internet, I got a message indicating that the key I had used had been activated too many times, and I needed to use another key or call the contact centre.
Again, no big deal, right? Simply get a key from Connect, go to File → Help, and change the product key (as every MS article indicates you can do).
Well, that link simply isn’t there on my install!
So, to change a product key, I had to go to Add/Remove Programs. Change the Office 2010 install. And then I got the option to change the product key.
Ran through that wizard, and everything is working 100% now.



Excel 2007 / 2010 + Pivot Table Column Width

Well, I’m going to write this particular blog post at the risk of looking incredibly silly, however it may help someone…

I’m in the process of creating some internal reports using Excel 2010 Beta on top of an Analysis Services back end. This requires the creation of a number of Pivot Tables, and some super slinky Slicers, and the report is looking really nice.

One small glitch in the process is that whenever I change Slicer values, the Pivot Table columns automatically resize, leaving me with a number of cells with the inevitable “###” values. This requires me to auto-expand the column to see the values again. Not an ideal solution to present to my Financial Director 🙂

To overcome this small issue, simply go to the Pivot Table options, and unselect the “Autofit column widths on update” check box. Set the column widths to what you need, and you’re in the game.


SharePoint 2010 Beta – Configuration Failed

When installing and configuring SharePoint 2010 Beta yesterday, I ran into a small bug. No big surprise, seeing as it is beta software after all 🙂

I installed the SharePoint software successfully on a single server, on Windows Server 2008 R2 and SQL Server 2008 R2. However, when running the SharePoint 2010 Products Configuration Wizard, an error occurred at step 8.

The error reads as follows:

Configuration Failed

One or more configuration settings failed. Completed configuration settings will not be rolled back. Resolve the problem and run this configuration wizard again. The following contains detailed information about the failure:

Failed to create sample data.

An exception of type Microsoft.Office.Server.UserProfiles.UserProfileException was thrown. Additional exception information: Unrecognized attribute ‘allowInsecureTransport’. Note that attribute names are case-sensitive. (C:\Program Files\Microsoft Shared\Web Server Extensions\14\WebClients\Profile\client.config line 56)

The root cause seems to be some conflict with the WCF which requires a hotfix, and Ian Campbell has written a nice blog entry about the error here

However, I was unable to download the hotfix yesterday, so to resolve the problem I simply removed the following XML from the config file mentioned above…

allowInsecureTransport="true" />
I then reran the SharePoint 2010 Products Configuration Wizard, and it configured successfully. Probably not the recommended route, as I’m sure something will look for this tag in future, but it seems to be a stable short-term fix so far.
