Overview of Using Analysis Services Data in the new Power BI Preview

Welcome to part 1 of a 3-part series on using the SSAS Connector with the new Power BI Preview. This information is as of Feb 2015 and is subject to change as the Power BI Preview evolves.

Part I:   Overview of Using Analysis Services Data in the new Power BI Preview

Part II:  What You Need to Set Up the SSAS Connector in the new Power BI Preview {coming very soon!}

Part III: Monitoring the SSAS Connector in the Power BI Preview {coming very soon!}

Overview of How the SSAS Connector Works

The functionality to access SSAS data directly via the Power BI Preview works by using EffectiveUserName in the connection string. Using EffectiveUserName in an expression-based connection string has long been one alternative to configuring Kerberos delegation for implementing security in a BI environment.

Below is a representation of what's happening: the SSAS Connector sits in the middle as a proxy. The user query to the SSAS tabular database is issued as the Administrator (as configured by the SSAS Connector), but with an EffectiveUserName of the person who is running the report (i.e., issuing the query).
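To make that concrete, here is a rough sketch of the kind of connection string involved. The server, database, and user names below are made up for illustration; the connector connects as its configured administrator account but passes along the report viewer's identity:

```
Provider=MSOLAP;Data Source=SSASTABULAR01;Initial Catalog=SalesModel;EffectiveUserName=usera@contoso.com
```

SSAS then evaluates role membership and row-level security as if usera@contoso.com had connected directly, which is why the account the connector uses needs administrator rights on the SSAS instance.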

Note: In the diagrams, I show two machines in the domain; this could be one if you prefer to install the SSAS Connector directly on the SSAS server. More on that in the next part of this series.

What Happens When a Dashboard is Shared?

With EffectiveUserName, role-based security settings are honored based on the user running the report - that's great, and works as expected for the user who created the report in the first place (i.e., the report or dashboard owner). But what if User A shares a dashboard with User B, and these two users do *not* have the same security settings specified in the SSAS roles?

Unfortunately, at this point in the Power BI Preview, it is User A's security settings (i.e., those of the owner who shared the dashboard) that are inherited by User B. Put another way, anyone User A shares with will see User A's data. This is obviously not ideal if the users aren't normally permitted to see the exact same thing. At some point in the (hopefully near) future, User B (the recipient of the share) needs to be the one passed in the EffectiveUserName. Until that's the case, the recommendation has to be: don't share dashboards that use SSAS under the covers unless you know row-level security isn't a factor (i.e., you're using SSAS for higher data volumes, more frequent refreshes, or some other reason besides role-based security).

[Image: SSASConnector2.jpg]

Advantages of Using the SSAS Connector

  • No need to store a redundant set of data inside of a Power Pivot workbook. This keeps the data more secure & reduces the risk of error.
  • Because there's no need to store redundant data, there's also no need to set up a separate data refresh schedule in Power BI. The data is as fresh as the scheduling for the underlying SSAS Tabular model.
  • The reports and the data are now truly separated. This is huge!  It enables delivery of multiple reports, to multiple users, using the same source data. This wasn't previously possible without multiple Power Pivot models (which introduces more data redundancy and risk of misalignment between the multiple Power Pivot models).  **Note this won't really be possible until the issue with the EffectiveUserName for shared dashboards gets resolved.**
  • Honors row-level security specified inside of SSAS.  **Note: currently this applies only to the original report author.**
  • There are currently no documented restrictions on the tabular model's storage mode: in-memory (data stored in SSAS) or direct query. This means you should be able to use a tabular model to access a SQL Server relational database, or even APS, in direct query mode if desired. (Though you do want to aggregate and summarize the data whenever possible so queries don't try to pull back a huge data volume.)
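As a sketch of the "aggregate rather than pull detail" advice above, a DAX query against a hypothetical tabular model (the Sales table and column names here are invented for illustration) might summarize at the year level instead of returning detail rows:

```
-- Hypothetical model: a Sales fact table related to a Date dimension
EVALUATE
SUMMARIZE (
    Sales,
    'Date'[CalendarYear],
    "Total Sales", SUM ( Sales[SalesAmount] )
)
```

In direct query mode, a summarized query like this translates to an aggregate query against the relational source, rather than streaming back every detail row.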

Disadvantages of Using the SSAS Connector

  • As discussed above, the EffectiveUserName issue for shared dashboards.
  • Currently the SSAS Connector only supports Tabular Models, but Microsoft has publicly stated that support for Multidimensional is coming.
  • Q&A functionality is not yet available in Power BI when SSAS is the data source.
  • If Power Query was used in an Excel workbook that's being upgraded to an SSAS Tabular model, the components written in Power Query must be redone a different way (in SSIS, for instance). Currently there's no migration path from Power Query to SSIS, nor does Power Query integrate with SSAS. A Power Pivot workbook with its data loaded straight into Power Pivot (without Power Query) is eligible for upgrade.
  • Currently there's no scale-out method documented for the SSAS Connector like there is with the Data Management Gateway.

Part II:  What You Need to Set Up the SSAS Connector in the new Power BI Preview {coming very soon!}

Part III: Monitoring the SSAS Connector in the Power BI Preview {coming very soon!}

Finding More Information

Power BI Support Documentation - SQL Server Analysis Services Tabular Data

MSDN - Connection String Properties (Analysis Services)

Documenting Precedence Constraints in SSIS

Just sharing a quick tip about showing the names of precedence constraints within an SSIS package. This is something I tend to do when the precedence constraint is based on something besides a normal "success."

Here's an example of where I have 4 precedence constraints in my SSIS package. Each precedence constraint references the value of a different variable in order for it to be allowed to proceed:

It's a bit easier to see what the package is doing at a glance if the precedence constraint annotations are displayed. 

The first precedence constraint contains an expression and a constraint. It looks like this:

 

Basically it says that not only does the control flow task have to succeed, but a variable (populated by that same task) also needs to have a count > 0 before the package can proceed.
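For reference, the settings on that precedence constraint look roughly like this in the Precedence Constraint Editor (the variable name matches the one used later in this post; yours will differ):

```
Evaluation operation:  Expression and Constraint
Value:                 Success
Expression:            @[User::RowCount_ContractorEntity] > 0
```

With "Expression and Constraint," both conditions must be true for the downstream task to run; "Expression or Constraint" would require only one of them.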

By default, the "ShowAnnotation" property on a precedence constraint is set to "AsNeeded." The canvas does show the fx symbol, which means there's an expression placed on the precedence constraint. 

[Image: SSISPC_4.jpg]

Since I wanted the package to be a bit more self-documenting at a glance, I changed the "ShowAnnotation" property to ConstraintName. And then I gave the constraint a good name like "If RowCount_ContractorEntity > 0" that would make sense when displayed on the canvas.

 

You might also like the "ConstraintOptions" setting for the "ShowAnnotation" property. It's even more specific.

 

There's also a "ConstraintDescription" setting for the "ShowAnnotation" property, but it doesn't work for me the way ConstraintName does, so I don't use that one currently.

Managing Q&A Synonyms in Power BI for Office 365

Before diving into the details, I have a question for you, reader: What do you expect to happen with this sequence of events?

  1. Upload an Excel workbook to Power BI for Office 365. Enable it for Q&A.
  2. In the Q&A optimizations area of the Power BI Site Settings (in the web), set up some synonyms & refine the auto-generated synonyms. Perhaps you also create & refine phrasings.
  3. Re-upload the Excel workbook after some modifications were made to the Excel file.

Will the work you did on synonyms in the web still be there after the Excel workbook has been modified? Nope!  There's a way to handle this so you don't lose any work, though.

Note that this information is based on what I experienced with a client project in January 2015 using Power BI for Office 365. This does not currently apply to the new Power BI Preview & I don't know to what extent Q&A will be similar in the new Power BI Preview platform. With Power BI constantly evolving and improving, this information will become out of date so be sure to do your own testing.

Viewing Synonyms in the Excel Client vs. in Power BI Site on the web

The diagram above depicts that you can create synonyms in two places:

Inside of Excel you can only see user-generated synonyms - i.e., the synonyms you set up. You cannot currently see phrasings inside of Excel at all, nor can you see the system-generated synonyms. And there are a *lot* of system-generated synonyms!

In the Q&A optimizations area of the Power BI site, you can view all synonyms, both user-generated and system-generated. Here's where the system uses its learning algorithms and attempts to add synonyms and phrasings on your behalf. This week we had a few chuckles with Meagan Longoria at a client site: she had a stock-price type of column, and the system-generated synonyms added things like cows and sheep based on the word stock. Clearly, the system-generated synonyms need some refinement. (Sidenote: a great future feature would be a setting that tells Q&A whether it should try to generate synonyms automatically or whether the user prefers it didn't.) Note that even after you refine the system-generated synonyms, the system continues to learn, so clearing out synonyms it sets up that aren't actually relevant will take a few iterations.

Since you can view everything related to Q&A in the website but not in the Excel client, one way to handle things is to have a process where you use the web for all Q&A optimizations. Let's talk about how that might work...

Managing Q&A Optimizations - Export File

The above diagram highlights "XLSX File 1" but also illustrates how Q&A optimizations are handled on an individual file basis, and that the export files would be done on a per-file basis.

This technique presumes that you have a file share, folder, or source control somewhere where the Q&A optimizations export file would reside. The flow would look like this:

  1. Upload an Excel workbook to Power BI for Office 365. Enable it for Q&A.
  2. In the Q&A optimizations area of the Power BI Site Settings (in the web), set up some synonyms & refine the auto-generated synonyms. Perhaps you also create & refine phrasings.
  3. Export the Q&A optimizations to the designated location.
  4. Re-upload the Excel workbook after some modifications were made to the Excel file.
  5. Import the Q&A optimizations from the TXT file stored in the designated location.

As long as the person doing Q&A optimizations is diligent about steps #3 and #5 (especially #3), this should work.

Sidenote: Having versioning set on your Power BI document library is a great idea in case you ever need to revert to an older copy of the Excel workbook (be sure to set a limit, though; 3-5 versions is usually enough). You might be inclined to think that if you "lost" your Q&A optimizations you could restore a previous version to regain them, then re-upload the latest workbook again. I wish I could say this works, but it doesn't - we tried it this past week when a project team member lost hours of Q&A work. I can't explain why restoring an older version of the workbook doesn't bring back the Q&A optimizations from that point in time, but in my experience it didn't. I mention this so you know how very important the export file really is.

Re-Downloading the Excel File from Power BI Site to See User-Defined Optimizations

If you do want to re-download the file after Q&A optimizations have been done in the web, it appears that you need to wait a few minutes first for everything to finish processing. On a client project we tested a case where the Q&A optimizations were done in the web and then the workbook was immediately downloaded. The new Q&A synonyms created in the web were not contained in the downloaded file. After a few minutes we tried the download again, and then the work done on synonyms was visible in the Excel file. So I can't say what's happening from the back-end perspective, but there does appear to be a bit of a time delay.

Handling Multiple Power BI Sites - Such as UAT and Production

If the Power BI environment is being managed by IT (as is fairly often the case from what I've seen), you may end up with two Power BI sites for your subject area: one for User Acceptance Testing (UAT) and one for Production. That's a great way to roll out new changes for review & approval before impacting the 'live' version of the workbook. However, it introduces a few complications when it comes to Q&A.

Keep in mind that the 'scope' of Q&A optimizations is per file. The emphasis or focus will be different between the two sites:

  • In UAT, you would initially be working on refining synonyms and phrasings based on questions you initially believe will be asked of the model.
  • In Production, you would be focused on handling flagged items (i.e., items users flagged to indicate the Q&A results weren't helpful), handling unrecognized words, evaluating usage stats, as well as continuing to refine synonyms and phrasings as needed based on real-life usage.

So what does this mean for Q&A optimizations? Ideally, your flow in a two-tier environment would look like this:

  1. Upload an Excel workbook to Power BI for Office 365 (UAT). Enable it for Q&A.
  2. In the Q&A optimizations area of the Power BI Site Settings (UAT in the web), set up some synonyms & refine the auto-generated synonyms. Perhaps you also create & refine phrasings.
  3. Export the Q&A optimizations from UAT to the designated location.
  4. Import the Q&A optimizations to Production from the file stored in the designated location.
  5. Continue making Q&A optimizations regularly in the Production Power BI Site Settings.
  6. Export the Q&A optimizations from Production to the designated location.
  7. Import the Q&A optimizations back into UAT from the file stored in the designated location.

For this to work well:

  • Q&A optimizations cannot be occurring simultaneously in UAT and Production. If that ends up happening and you don't want to lose work from either environment, you could try to manually merge the data in the export file.
  • The export file in the designated location should be treated as "the file" to be shared by both sites. This means anyone involved needs to understand the process carefully.
  • It presumes you want Q&A to function the same in UAT as it does in Production. If that isn't a need, then do Q&A optimizations in Production only and ignore it for the UAT site. Whether you can simplify like this depends on your needs for the UAT environment.

Multiple Users Involved with Q&A Optimizations

We learned this one the hard way. It seemed like a good idea to go into the site and have team member A set up synonyms for table 1 while team member B set up synonyms for table 2. At this point it does not seem like multi-user support is there. So, the way I'm thinking about it now is: we know only one person can edit the Excel file at a time in the Excel client - treat it the same way in the web.

Summary of Q&A Export File

In conclusion, the big message here is to be sure to export any work done in Q&A each time it's complete. The export file can serve these purposes:

  • Ability to re-import Q&A optimizations in the event of a file overwrite that loses the Q&A optimizations done in the web.
  • Promotion of Q&A optimizations from one site to another (such as UAT to Production, or vice versa).
  • Import Q&A optimizations to other workbooks if the same file structure exists (for example: you have the same workbook structure in 10 sites with different data contained in each file, all secured separately for different user bases - I don't recommend this as it creates issues with keeping all the separate files in sync, but it can be done).

You Might Also Like...

Where to Find Q&A Optimizations in Power BI

Getting New Power Pivot Features

Displaying Power Pivot Calculated Measures in Alphabetical Order

Sharing a quick tip I got the other day from Meagan Longoria...

See in this screen shot how the calculated measures are being displayed in random order in the PivotTable Field List? If you have a long list of measures, this can be a wee bit aggravating.

 

To fix it, click the gear symbol towards the top right of the PivotTable Field List. Change "Sort in Data Source Order" to "Sort A to Z."

 

And with that, we have a much more usable list to work with. I've noticed this setting isn't retained; I've had to reset the sort order of a file over and over as I refine the model, so having it persist would be a really nice improvement at some point.