2014 Microsoft Most Valuable Professional (MVP) for SQL Server

1 April, 2014 (10:02) | Professional Development, SQL | By: Mark V

So… This happened:

Dear Mark Vaillancourt,
Congratulations! We are pleased to present you with the 2014 Microsoft® MVP Award! This award is given to exceptional technical community leaders who actively share their high quality, real world expertise with others. We appreciate your outstanding contributions in SQL Server technical communities during the past year.

I am so proud to receive this award from Microsoft. I can’t adequately explain how gratifying it is to know that my contributions to the SQL Community are seen as being so valuable as to be recognized by Microsoft.

I need to thank Microsoft as well as the many, many fine people in the SQL Community who set an excellent example for how to make an impact. There are too many people to name without risking missing someone.

Holy cow…

New Role: Business Intelligence Enterprise Consultant

27 March, 2014 (15:40) | Professional Development | By: Mark V

I am delighted to announce that I have taken on a new role at Digineer, the consulting firm I work for and adore.

As the Business Intelligence Enterprise Consultant, I take on responsibilities that align well with my strengths and overall desires. At a high level, my duties are laid out below:

1. I still get to do work for clients, which I really enjoy.

2. I am responsible for helping to drive the overall development of the members of the Information Management (SQL Server) Team. This means that I get to help my teammates continue to grow their skills/careers as it relates to SQL Server and related tools. This is something I LOVE doing anyway, so it fits really well.

3. I am to help Digineer with service offerings and intellectual property related to BI and SQL Server tools in general. This also includes helping to establish best practices and methodologies for use on client projects.

4. I am to help serve as a respected expert/leader in the company as well as the community at large when it comes to BI and SQL Server and related technologies. This is a goal I have long held and continue to strive for. I have to say that, over the past few years, I have made great strides in this area. This also includes volunteering in the community as a speaker, blogger, etc.

I am really excited about this new role. I have been doing many of these things of my own accord for a few years, and it is very gratifying to now have them be part of my job description. You often hear people talk about organizations growing their own experts instead of always hiring from outside. I am an example of that practice: I started at Digineer 7 years ago having never used SQL Server before. Digineer enabled me, set me up for success, and got the heck out of the way to let me do it. I count myself very fortunate to work for a company that gets that.

There are so many people in the SQL Community who have served as examples for me over the years. Some of them don’t even know they have inspired me. I do need to list a few people in the SQL Community who have played a huge role in helping me get where I am today.

Lara Rubbelke (Blog|Twitter): Lara was the person who originally hired me at Digineer. She was the Enterprise Consultant at that time while also serving as the overall manager for the Information Management Team. It was Lara who first encouraged me to start blogging and also got me involved with PASS (Professional Association for SQL Server).

Jason Strate (Blog|Twitter): Jason served as a mentor for me from the moment I started at Digineer. On my first project ever with SQL Server, Jason was my Top-Cover (overall advisor and teacher). Jason eventually became the Enterprise Consultant as well. It was Jason who really pushed me toward presenting and helped a lot on the blogging front as well.

I wish to say a hearty Thanks to Lara, Jason, Digineer, and to all the members of the #SQLFamily who have helped me in one way or another.

Upcoming Presentations Spring 2014

18 March, 2014 (10:00) | PASS, Presentations, Professional Development, SQLSaturday | By: Mark V

It has been a while since I posted a list of upcoming presentations. In fact, some have come and gone without a blog post.

Recent Past:

SQL Saturday #241 Cleveland, OH – February 8, 2014

MDX Trek: First Contact

DANGER: The Art and Science of Presenting

West Michigan SQL Server User Group – February 27, 2014

MDX Trek: First Contact

* I ended up having to cancel this one at the last minute. I am really bummed about that and will discuss lessons learned in a separate post. :(

Pragmatic Works Free Webinar Series – March 11, 2014

MDX Trek: First Contact

On the Horizon:


SQL Saturday #287 Madison, WI – March 29, 2014

DANGER: The Art and Science of Presenting

PASS BI/DW Virtual Chapter – April 2, 2014

MDX Trek: First Contact

Montreal BI User Group – April 16, 2014

MDX Trek: First Contact

SQL Saturday #291 Chicago, IL – April 26, 2014

MDX Trek: First Contact

DANGER: The Art and Science of Presenting

Minnesota SQL Server User Group (PASSMN) – May 20, 2014

Power Query: The Data Chemist’s Laboratory

This is what I have on my schedule so far. I have to say that I am really happy about how often I have been able to speak at events and user groups over the past few years. It is an important part of my career development and I just love doing it.

Updated MDX Trek: First Contact Downloads

13 March, 2014 (10:00) | MDX, Presentations | By: Mark V

Greetings. After delivering my MDX Trek: First Contact presentation as part of the Pragmatic Works Free Training series on 3/11, I got some great feedback from an attendee. He pointed out that the single zip file download on my home page for the presentation only contained the SQL Server 2008 R2 version and that I may want to include upgrade instructions for people who have SQL 2012. That was a great point. I had neglected to do much with that download, even after I started delivering this presentation in the SQL Server 2012 tools some time ago. While the MDX syntax is the same, the project would have to be upgraded to be opened in SQL Server Data Tools, since the project in the existing zip was created in BI Development Studio. This would require some extra steps and create more work for the target audience (which includes people just getting started with SSAS).

So, to rectify this situation, the home page for my MDX Trek: First Contact presentation now has separate downloads for SQL 2008 R2 and SQL 2012. You can get there now by clicking on the image below. I should have done this a long time ago and apologize for being so late.



Introduction To Analysis Services Extended Events

21 February, 2014 (12:00) | Extended Events, SSAS | By: Mark V

I started digging into using Extended Events to trace Analysis Services recently for a client. They wanted to do some tracing of their SSAS instances, and with the deprecation of SQL Profiler, Extended Events was the best long term solution.

I have to admit, when I first started looking at this topic, I was overwhelmed. Other than a few blog posts, which I will list out below, there was very little to go on. I believe, on the whole, SQL Server Books Online (MSDN, TechNet, etc.) has pretty solid content. But for using Extended Events on Analysis Services, I have to agree with Chris Webb (Blog|Twitter) that BOL provides little value. Note: Although the examples I have seen in the wild, as well as my example below, have used SSAS Multidimensional, I implemented this for SSAS Tabular at my client. So, it works for both.

I will not be advising you on what events to trace for different purposes. I am afraid that is beyond the scope of this post and not something I have deep knowledge about at this point.

In researching this topic, I used the following blog posts:

Chris Webb (Blog|Twitter) – Using XEvents In SSAS 2012

Bill Anton (Blog|Twitter) – Extended Events For Analysis Services

Andreas Wolter (Blog|Twitter) – Tracing Analysis Services (SSAS) with Extended Events – Yes it works and this is how

Francesco De Chirico (Blog|Twitter) – Identify Storage Engine and Formula Engine bottlenecks with new SSAS XEvents

These posts were all helpful in one way or another. In some cases, I used a post as the source upon which I based the queries I used. When that is the case, I will make it clear where my base code came from. I do this because I am a vehement supporter of giving credit where it is due.

Extended Events for Analysis Services, unlike that for the database engine, lacks a graphical user interface. You have to work in code. Not only that, but the code happens to be XMLA. Yikes. I know there are people who are good with XMLA, but I am not among them. That was part of what gave me trepidation as I started down the path of working with Extended Events for SSAS.

For the CREATE script for my Extended Events trace, I turned to Bill Anton’s blog post listed above. That script not only includes the base syntax, but he also includes every event (I think it is all of them anyway) commented out. This allowed me to just uncomment the events I wanted to trace, but leave the others intact for easy use later. For this script, make sure you are connected to an SSAS instance in Management Studio, not Database Engine. Also, you will ideally be in an XMLA query window; I was able to run this code in an MDX window as well, but my examples below will assume an XMLA window.

Note: In XMLA, lines beginning with <!-- and ending with --> are comments. 

   1:  <!-- This script supplied by Bill Anton http://byobi.com/blog/2013/06/extended-events-for-analysis-services/ -->
   3:  <Create
   4:      xmlns="http://schemas.microsoft.com/analysisservices/2003/engine"
   5:      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
   6:      xmlns:ddl2="http://schemas.microsoft.com/analysisservices/2003/engine/2"
   7:      xmlns:ddl2_2="http://schemas.microsoft.com/analysisservices/2003/engine/2/2"
   8:      xmlns:ddl100_100="http://schemas.microsoft.com/analysisservices/2008/engine/100/100"
   9:      xmlns:ddl200_200="http://schemas.microsoft.com/analysisservices/2010/engine/200/200"
  10:      xmlns:ddl300_300="http://schemas.microsoft.com/analysisservices/2011/engine/300/300">
  11:      <ObjectDefinition>
  12:          <Trace>
  13:              <ID>MyTrace</ID>
  14:              <!--Example: <ID>QueryTuning_20130624</ID>-->
  15:              <Name>MyTrace</Name>
  16:              <!--Example: <Name>QueryTuning_20130624</Name>-->
  17:              <ddl300_300:XEvent>
  18:                  <event_session    name="xeas"
  19:                                  dispatchLatency="1"
  20:                                  maxEventSize="4"
  21:                                  maxMemory="4"
  22:                                  memoryPartitionMode="none"
  23:                                  eventRetentionMode="allowSingleEventLoss"
  24:                                  trackCausality="true">
  26:                      <!-- ### COMMAND EVENTS ### -->
  27:                      <!--<event package="AS" name="CommandBegin" />-->
  28:                      <!--<event package="AS" name="CommandEnd" />-->
  30:                      <!-- ### DISCOVER EVENTS ### -->
  31:                      <!--<event package="AS" name="DiscoverBegin" />-->
  32:                      <!--<event package="AS" name="DiscoverEnd" />-->
  34:                      <!-- ### DISCOVER SERVER STATE EVENTS ### -->
  35:                      <!--<event package="AS" name="ServerStateDiscoverBegin" />-->
  36:                      <!--<event package="AS" name="ServerStateDiscoverEnd" />-->
  38:                      <!-- ### ERRORS AND WARNING ### -->
  39:                      <!--<event package="AS" name="Error" />-->
  41:                      <!-- ### FILE LOAD AND SAVE ### -->
  42:                      <!--<event package="AS" name="FileLoadBegin" />-->
  43:                      <!--<event package="AS" name="FileLoadEnd" />-->
  44:                      <!--<event package="AS" name="FileSaveBegin" />-->
  45:                      <!--<event package="AS" name="FileSaveEnd" />-->
  46:                      <!--<event package="AS" name="PageInBegin" />-->
  47:                      <!--<event package="AS" name="PageInEnd" />-->
  48:                      <!--<event package="AS" name="PageOutBegin" />-->
  49:                      <!--<event package="AS" name="PageOutEnd" />-->
  51:                      <!-- ### LOCKS ### -->
  52:                      <!--<event package="AS" name="Deadlock" />-->
  53:                      <!--<event package="AS" name="LockAcquired" />-->
  54:                      <!--<event package="AS" name="LockReleased" />-->
  55:                      <!--<event package="AS" name="LockTimeout" />-->
  56:                      <!--<event package="AS" name="LockWaiting" />-->
  58:                      <!-- ### NOTIFICATION EVENTS ### -->
  59:                      <!--<event package="AS" name="Notification" />-->
  60:                      <!--<event package="AS" name="UserDefined" />-->
  62:                      <!-- ### PROGRESS REPORTS ### -->
  63:                      <!--<event package="AS" name="ProgressReportBegin" />-->
  64:                      <!--<event package="AS" name="ProgressReportCurrent" />-->
  65:                      <!--<event package="AS" name="ProgressReportEnd" />-->
  66:                      <!--<event package="AS" name="ProgressReportError" />-->
  68:                      <!-- ### QUERY EVENTS ### -->
  69:                      <!--<event package="AS" name="QueryBegin" />-->
  70:                      <event package="AS" name="QueryEnd" />
  72:                      <!-- ### QUERY PROCESSING ### -->
  73:                      <!--<event package="AS" name="CalculateNonEmptyBegin" />-->
  74:                      <!--<event package="AS" name="CalculateNonEmptyCurrent" />-->
  75:                      <!--<event package="AS" name="CalculateNonEmptyEnd" />-->
  76:                      <!--<event package="AS" name="CalculationEvaluation" />-->
  77:                      <!--<event package="AS" name="CalculationEvaluationDetailedInformation" />-->
  78:                      <!--<event package="AS" name="DaxQueryPlan" />-->
  79:                      <!--<event package="AS" name="DirectQueryBegin" />-->
  80:                      <!--<event package="AS" name="DirectQueryEnd" />-->
  81:                      <!--<event package="AS" name="ExecuteMDXScriptBegin" />-->
  82:                      <!--<event package="AS" name="ExecuteMDXScriptCurrent" />-->
  83:                      <!--<event package="AS" name="ExecuteMDXScriptEnd" />-->
  84:                      <!--<event package="AS" name="GetDataFromAggregation" />-->
  85:                      <!--<event package="AS" name="GetDataFromCache" />-->
  86:                      <!--<event package="AS" name="QueryCubeBegin" />-->
  87:                      <!--<event package="AS" name="QueryCubeEnd" />-->
  88:                      <!--<event package="AS" name="QueryDimension" />-->
  89:                      <!--<event package="AS" name="QuerySubcube" />-->
  90:                      <!--<event package="AS" name="ResourceUsage" />-->
  91:                      <!--<event package="AS" name="QuerySubcubeVerbose" />-->
  92:                      <!--<event package="AS" name="SerializeResultsBegin" />-->
  93:                      <!--<event package="AS" name="SerializeResultsCurrent" />-->
  94:                      <!--<event package="AS" name="SerializeResultsEnd" />-->
  95:                      <!--<event package="AS" name="VertiPaqSEQueryBegin" />-->
  96:                      <!--<event package="AS" name="VertiPaqSEQueryCacheMatch" />-->
  97:                      <!--<event package="AS" name="VertiPaqSEQueryEnd" />-->
  99:                      <!-- ### SECURITY AUDIT ### -->
 100:                      <!--<event package="AS" name="AuditAdminOperationsEvent" />-->
 101:                      <event package="AS" name="AuditLogin" />
 102:                      <!--<event package="AS" name="AuditLogout" />-->
 103:                      <!--<event package="AS" name="AuditObjectPermissionEvent" />-->
 104:                      <!--<event package="AS" name="AuditServerStartsAndStops" />-->
 106:                      <!-- ### SESSION EVENTS ### -->
 107:                      <!--<event package="AS" name="ExistingConnection" />-->
 108:                      <!--<event package="AS" name="ExistingSession" />-->
 109:                      <!--<event package="AS" name="SessionInitialize" />-->
 112:                      <target package="Package0" name="event_file">
 113:                          <!-- Make sure SSAS instance Service Account can write to this location -->
 114:                          <parameter name="filename" value="C:\SSASExtendedEvents\MyTrace.xel" />
 115:                          <!--Example: <parameter name="filename" value="C:\Program Files\Microsoft SQL Server\MSAS11.SSAS_MD\OLAP\Log\trace_results.xel" />-->
 116:                      </target>
 117:                  </event_session>
 118:              </ddl300_300:XEvent>
 119:          </Trace>
 120:      </ObjectDefinition>
 121:  </Create>

I modified Bill’s original script for my own purposes in a few places.

I used my own Trace ID and Trace Name in lines 13 and 15 respectively.

  12:          <Trace>
  13:              <ID>MyTrace</ID>
  14:              <!--Example: <ID>QueryTuning_20130624</ID>-->
  15:              <Name>MyTrace</Name>
  16:              <!--Example: <Name>QueryTuning_20130624</Name>-->

I uncommented the Query End event on line 70 as well as the AuditLogin event on line 101 since those were the events I wanted to trace, to keep things simple.

70: <event package="AS" name="QueryEnd" />

101: <event package="AS" name="AuditLogin" />

I put my own output file path on line 114.

114: <parameter name="filename" value="C:\SSASExtendedEvents\MyTrace.xel" />

I also added a comment on line 113.

113: <!-- Make sure SSAS instance Service Account can write to this location -->

I did this because I tripped over this myself. I initially got an Access Denied message when running the script above. Once I gave my SSAS instance service account rights to modify the C:\SSASExtendedEvents folder, I was good to go and the trace started just fine.

When you execute the query, your Results pane should look like the screenshot below. This indicates success. Gotta love XMLA, huh?


You can verify your Extended Events trace is running by executing the following query in an MDX query window connected to the same instance in which you started the trace. The query below is in all of the blog posts referenced above.

SELECT * FROM $system.discover_traces

My results for this query looked like this:


Note that the line highlighted in the red rectangle indicates “MyTrace” and the type is XEvent. Huzzah! You can also take a look at the destination folder specified for your output file. In my case, that is C:\SSASExtendedEvents, shown below.


There are two files here because I kept the output file from a test run earlier. I did that to show you that the function I will use to import this information into a tabular form in the database engine can iterate over multiple files easily. You will note that the engine added lots of numbers to the filename. I have not run this long enough to spill over into multiple files, but I am assuming the _0_ refers to the first file in a tracing session; the next file would have the same name but with _1_, the one after that _2_, and so on. But that is just a guess. The long string of numbers after that seems to be there just to make sure the trace file name is unique.

OK. So, we have an Extended Events trace running. Now what? Well, let’s run some queries. In my case, I just ran some of the MDX queries from my MDX Trek: First Contact presentation. The queries themselves don’t really matter. Just query a database in your SSAS instance in some fashion.

Reading from the xel file in code (as opposed to manually in Management Studio) involves one of two processes I am aware of.

1. The sys.fn_xe_file_target_read_file function followed by shredding some XML. This function was mentioned by Bill Anton and Francesco De Chirico in their posts.

2. Jonathan Kehayias (Blog|Twitter) mentioned to me, on Twitter, the use of the QueryableXEventData class via .Net code. He stressed that this is his preferred method as it is much faster than using the sys.fn_xe_file_target_read_file function and then the XML shredding.

Trusting Jonathan on Extended Events, among many other topics, is a good idea. However, not being a .Net person, and wanting to post this while it is fresh in my mind, I am going to demonstrate the first method. I did find that method 1 is not ultra speedy, to be sure. But for the moment, even at my client, it will serve. I do intend to dig into the .Net approach and perhaps blog about it when I do. :)

In Francesco De Chirico’s post, he not only discusses the use of the sys.fn_xe_file_target_read_file function to read in the xel files, but also provides great examples of the XML shredding. XML and I have an understanding: we both understand that I am horrible at XML. :) So, the fact that Francesco provided the XML shredding syntax was a great find for me.

   1:  /****
   2:  Base query provided by Francesco De Chirico 
   3:  http://francescodechirico.wordpress.com/2012/08/03/identify-storage-engine-and-formula-engine-bottlenecks-with-new-ssas-xevents-5/
   5:  ****/
   7:  SELECT
   8:        xe.TraceFileName
   9:      , xe.TraceEvent
  10:      , xe.EventDataXML.value('(/event/data[@name="EventSubclass"]/value)[1]','int') AS EventSubclass
  11:      , xe.EventDataXML.value('(/event/data[@name="ServerName"]/value)[1]','varchar(50)') AS ServerName
  12:      , xe.EventDataXML.value('(/event/data[@name="DatabaseName"]/value)[1]','varchar(50)') AS DatabaseName
  13:      , xe.EventDataXML.value('(/event/data[@name="NTUserName"]/value)[1]','varchar(50)') AS NTUserName
  14:      , xe.EventDataXML.value('(/event/data[@name="ConnectionID"]/value)[1]','int') AS ConnectionID
  15:      , xe.EventDataXML.value('(/event/data[@name="StartTime"]/value)[1]','datetime') AS StartTime
  16:      , xe.EventDataXML.value('(/event/data[@name="EndTime"]/value)[1]','datetime') AS EndTime
  17:      , xe.EventDataXML.value('(/event/data[@name="Duration"]/value)[1]','bigint') AS Duration
  18:      , xe.EventDataXML.value('(/event/data[@name="TextData"]/value)[1]','varchar(max)') AS TextData
  19:  FROM
  20:  (
  21:  SELECT
  22:        [FILE_NAME] AS TraceFileName
  23:      , OBJECT_NAME AS TraceEvent
  24:      , CONVERT(XML,Event_data) AS EventDataXML
  25:  FROM sys.fn_xe_file_target_read_file ( 'C:\SSASExtendedEvents\MyTrace*.xel', null, null, null )
  26:  ) xe


In line 25, note that the file target indicates MyTrace*.xel. This is because the latter part of the file name(s) will not necessarily be known. The wildcard tells the function to iterate over all files matching that spec. Thus, when I run this query, it will pull the data from both of the files shown earlier in my C:\SSASExtendedEvents folder.
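That wildcard matching can be sketched in a few lines of Python (not from the original post; the file names below are invented for illustration, and only the MyTrace*.xel pattern comes from the query above):

```python
from fnmatch import fnmatch

# Invented file names in the style of the engine-generated ones:
# base trace name, a file-sequence number, and a long numeric
# suffix that keeps each file name unique.
files = [
    "MyTrace_0_130352323180540000.xel",
    "MyTrace_1_130352323180540001.xel",
    "OtherTrace_0_130352323180540002.xel",
]

# Same idea as passing 'C:\SSASExtendedEvents\MyTrace*.xel' to
# sys.fn_xe_file_target_read_file: pick up every file in the session.
matched = [f for f in files if fnmatch(f, "MyTrace*.xel")]
print(matched)  # both MyTrace files match; OtherTrace does not
```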

In line 24, we are converting the Event_Data column, which the function returns as an nvarchar(max), into XML to enable us to use the value() method.

Please note that I am not pulling all of the information available in the xel file. I am just pulling the fields I cared about for my purposes. There is more in there. And that will vary depending on the events you choose when creating the trace.
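To make the shredding pattern concrete outside of T-SQL, here is a small Python sketch (not from the original post; the sample event XML is a hand-built approximation of what an Event_data value can look like) that extracts fields the same way the value() calls above do:

```python
import xml.etree.ElementTree as ET

# Hand-built approximation of one event's XML. The element/attribute
# shape mirrors the paths used in the value() calls above; the actual
# values here are made up.
sample_event = """
<event name="QueryEnd" package="AS" timestamp="2014-02-21T12:00:00.000Z">
  <data name="ServerName"><value>MYSERVER</value></data>
  <data name="DatabaseName"><value>AdventureWorksDW</value></data>
  <data name="Duration"><value>1234</value></data>
  <data name="TextData"><value>SELECT [Measures].DefaultMember ON 0 FROM [Adventure Works]</value></data>
</event>
"""

def field(event_xml, name):
    # Equivalent of EventDataXML.value('(/event/data[@name="..."]/value)[1]', ...)
    node = event_xml.find(f"./data[@name='{name}']/value")
    return node.text if node is not None else None

event = ET.fromstring(sample_event)
print(field(event, "DatabaseName"))  # AdventureWorksDW
print(field(event, "Duration"))      # 1234
```

As in the T-SQL version, a field that a given event type does not carry simply comes back as NULL (None here) rather than raising an error.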

When I run this query, I get the following:


I can use this, like I did at my client, to insert into a SQL Server database table for later analysis. We are actually planning a Tabular model on this data to help track usage of their BI offerings in their organization. That will be fun to play with.

Once you are ready to stop the trace, execute the following XMLA:

   1:  <Delete xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
   2:      <Object>
   3:          <TraceID>MyTrace</TraceID>
   4:      </Object>
   5:  </Delete>


That’s about it. I hope this proves to be a meaningful addition to what is available for working with Extended Events on Analysis Services. It was certainly a great learning experience for me.

“Winning” The Power BI Demo Contest

7 February, 2014 (15:29) | Power BI, Professional Development | By: Mark V

First things first. According to the official rules, I did not win. My video did not even make it to the Top 15 Semi-Finalists. Not even close. The number of votes I got was laughable compared to others. But it was never about the votes for me. I never really had any illusions of winning the contest. However, this contest had #winning all over the place for me anyway. I shall explain.

#winning : I got to play with some really exciting tools. From Power Query to Power Pivot to Power View to the Power BI Team Site I played with for my demo, I had a total blast.

#winning : Holy crap is Power Query awesome! Even the base options in the tool’s ribbon make common things really easy. I only dabbled a tiny bit with M (Officially: The Power Query Formula Language), but that was really cool as well. I will certainly be delving more deeply into Power Query and M.

#winning : I went from never having done a video to recording and editing a video I can be proud of. I used Camtasia Studio (got a 30-day free trial) and LOVED that software. I watched about 30 minutes worth of training videos and then went to town. That was a great experience I would love to repeat. One day I shall get my own license and make some videos… ideas are already churning.

NOTE: Techsmith, the maker of Camtasia Studio, has not compensated me in any way for saying these things. I have used their SnagIt software for years and love it. And I loved using Camtasia Studio as well. This is my own honest assessment.

#winning : I feel the need to mention Power Query again.

#winning : I learned my DAX needs some attention. After posting my video, I got a Tweet from the mighty Dan English (Blog|Twitter): “I think all three of your DAX calcs i would have done differently:)” This turned into a little back and forth discussion about how I could have done them differently. And it was not just about the awesome DIVIDE() function that I only remembered after submitting my entry. As such, I have recommitted myself to really digging deeper on this exciting expression language. I want to thank Dan for sparking that again.

NOTE: I beg you not to be afraid of posting your work for fear of embarrassment. The feedback we get from others helps us grow and improve. When I post my work/code, I have learned NEVER to say “This is the BEST way” when I do so. I leave it open to others to provide different suggestions. I sometimes flat out ask for exactly that. This is on purpose and part of what keeps me learning.

#winning : The mighty Paul Turley (Blog|Twitter) included my demo in his list of his favorites. See his Power BI Contest post. That was a great compliment from someone I have long respected.

#winning : More Power Query.

#winning : My entry got 218 views. That is a paltry sum when compared to others, to be sure. But from my perspective, that is 218 people who may not have seen my work otherwise. That is 218 people who may choose to come to a session of mine at a SQL Saturday, PASS Summit, or other event at which I speak. That is 218 people who may not have known I exist before that have now been introduced to me via something I am really proud of.

With the Winter Olympics in Sochi having just gotten under way, I remembered a swimmer in the 2000 Summer Olympics in Sydney. Eric Moussambani represented Equatorial Guinea in the 100 Meter Freestyle. His two competitors both had false starts and were disqualified. Eric swam alone and put in a time that, while more than double the fastest times for that event, set a national record for Equatorial Guinea. That race, at the Olympic Games, was the first time he had been in an Olympic size swimming pool. When he finished, the crowd cheered like mad. He was interviewed afterward and asked how he felt. He replied, “I’m happy.” Eric’s definition of #winning was different from that of the others. I would encourage you to watch this video about this race and Eric’s #winning attitude. I am not comparing myself to Eric. Rather, I am calling attention to the idea that the only way to really lose is to stop learning and stop having worthwhile experiences.

I really want to encourage you to jump at opportunities like the Power BI Demo Contest. There are great experiences waiting for you. There are great learning opportunities waiting for you. And don’t be afraid to create your own definitions of #winning.

Upcoming Presentations: SQL Saturday #241 Cleveland

27 January, 2014 (10:17) | MDX, PASS, Presentations, Professional Development, SQLSaturday | By: Mark V





It is with great joy that I announce that I will be presenting at SQL Saturday in Cleveland on February 8th. I have driven through Cleveland before, but never stopped for long. So, this will be my first real visit. Although, if it helps, I used to love to play as the Cleveland Browns in Tecmo Bowl back in my Nintendo days. I will be giving two sessions.

MDX Trek: First Contact

Cube space: the final frontier. In this Star Trek themed introduction to MDX, we will discuss the fundamentals of cube structure and vocabulary, including tuples, members, sets, hierarchies, and more. We will introduce and demonstrate the basic syntax of MDX with queries that include navigating hierarchies and even some time-based expressions. This session will give you the tools you need to write simple, yet meaningful, MDX queries in your own environment.

Session Level: Intermediate

I love this MDX session. I have given it many times over the past few years. The feedback has been overwhelmingly positive. It turns out that my view of the Cube space is a bit revolutionary. I have heard that writing MDX was like trying to solve a Rubik’s Cube in your head. When I first started dealing with MDX, I understood what that meant. But I soon found that it need not be that hard. In this session, before diving into code, I explain my model of looking at the Cube space that is much easier to deal with and understand. The Star Trek theme also keeps this really fun.

DANGER: The Art and Science of Presenting

Is there a great difference in the brain chemistry of someone fleeing a hungry mountain lion and someone presenting to a group of colleagues in a corporate board room? The answer is: NO. Over the past decade, a lot has been learned about the chemistry of the brain and why humans react the way we do to events in our environment. The concept of EQ (Emotional Intelligence) is a compelling and growing concept that applies this knowledge in a set of learnable, improvable skills for leading human beings. While EQ is often applied to corporate leadership, the parallels to presenting are fantastic. This session will explain the basics of EQ and demonstrate how you can apply it to make your presentations better in the following areas:

* Crafting better slide decks
* Preparing yourself for presenting
* Delivering your content
* Dealing with the unexpected

Understanding and practicing the concepts of EQ can make your presentations a better experience for everyone in the room, including you.

Session Level: Beginner

In this session, which I gave at the PASS Summit in Charlotte, I introduce the concepts and skills of Emotional Intelligence as they relate to presenting. This, too, has been incredibly well received and the feedback has been spectacular. Presenting is definitely a strength of mine and this session shows some of the mechanics behind my philosophy. This session can not only help you with presentations and their delivery, but also lays a great foundation for leadership and working with other humans.

I am also excited to announce that Digineer, the consulting firm I work for and adore, is a Gold Sponsor for this SQL Saturday. As such, I will also be giving a shorter presentation during lunch. This presentation, “Keeping The Business In Business Intelligence” lays out our philosophy around BI. While this session will touch a bit on Digineer and who we are, it will also be grounded in solid content for achieving success in Business Intelligence initiatives.

SQL Saturday has been a hugely successful program. I have participated in as many SQL Saturdays as I could over the past several years. You can read about many of my experiences in previous posts on this blog. I have to say that SQL Saturdays have been a hugely important part of my growth in working with SQL Server and related tools. The idea of members of the SQL Community (dubbed SQLFamily with good reason) sharing their expertise with others at free events is just exciting and inspiring. I am proud to be a part of these events. I also consider it part of my own personal mission to help encourage new speakers. If you have questions about speaking (or blogging), please come chat with me. I love helping people get started. The more people we have sharing their knowledge and passion, the stronger a community we are.

Oprah And The 2014 PASS Business Analytics Conference

24 January, 2014 (10:00) | Business Analytics, PASS | By: Mark V

After the success of the 2013 PASS Business Analytics Conference, PASS is doing another one. The 2014 PASS Business Analytics Conference will take place May 7-9 in San Jose, CA.


Last year, I was a speaker as well as part of the official Blogger Core for the event. You can read my posts on this topic:

Who’s Got Two Thumbs And Is Speaking At The PASS Business Analytics Conference?

Business Analytics And PASS: Yes, Please!

PASS Business Analytics Conference – Live Blogging – Keynote Day 1

PASS Business Analytics Conference – Live Blogging – Keynote Day 2

PASS Business Analytics Conference Recap

Alas, I am unable to attend this year. But I wanted to help spread the word about what I feel is a hugely valuable learning opportunity.

In the 1990s, you often heard people talk about the Information Age. This was essentially the revolution of computerization and the adoption of our new digital world. You could argue that we are still in the Information Age, but I think we have transcended that simple definition. Even in the Information Age, information was something to be tightly controlled and protected as an asset; something to be used by a privileged few.


Analytics solutions were there to be used by senior people in companies in order to drive strategic decisions, etc. It was not something to be shared with just anyone, even within those organizations. What we have seen over the past several years is the adoption of the idea that everyone should have access to better information. The concepts of the Democratization of Data and bringing BI to the Masses have taken root and are driving a lot of the innovation that we have been seeing. With this movement, people are truly realizing that it is not only CXOs and senior managers that need better information to make better decisions.

I picture Oprah standing before all of us, as her audience, saying “YOU get access to better information! And YOU get access to better information! And YOU! And YOU! You ALL get access to better information!”

Image Source: http://www.flickr.com/photos/puroticorico/2129229071/sizes/l/

From the release of Power Pivot for Excel 2010 to the incorporation of Power View into Excel 2013 to the launch of Power BI for Office 365, Microsoft has certainly embraced this viewpoint. Anyone who needs to make decisions can benefit from better information. As such, the role of the Information Worker has expanded to more and more people as the tools of the trade have become much simpler to use. What is key, though, is that people understand how to use this information, and the tools involved, effectively. I have to applaud PASS for creating a Business Analytics Conference at such an important time and continuing to help us make better use of such a highly prized asset.

Although I cannot attend PASSBAC this year, I really want to encourage you to do so if you can. My own experience last year was just fantastic. PASS consistently puts on quality events with great speakers and networking opportunities. And I have no doubt the 2014 PASS Business Analytics Conference will live up to expectations.

NOTE: If you had told me back when I first started blogging that I would feature Oprah in a post, I never would have believed you. But, here we are…

Survey: Changing Model.bim Filename In SSAS Tabular Projects

22 January, 2014 (13:16) | SSAS, Survey, Tabular | By: Mark V

I am working for a client that has several Tabular models and is developing more. Even though the process of developing Tabular models in SSDT could use some improvement, I am happy to see this exciting technology being adopted.

I noticed that the models here are pretty much all called Model.bim in the project. I have typically renamed mine to provide better context and have never encountered an issue. My thinking comes from the multidimensional world, in which a Cube called Cube is pretty ambiguous as to what information it contains; likewise a table called Table or a database called Database. Those examples are a little different, though, since a Tabular project can only contain ONE .bim file at the moment.

William Weber (Blog|Twitter), with whom I am working on this project, pointed out that Books Online indicates that the name of the bim file should not be changed.

There is so little detail in Books Online as to make me question what could happen. I reached out in general on Twitter, and no one seemed to have a good explanation. Today I asked SSAS Program Manager Kasper de Jonge (Blog|Twitter) directly. Kasper knew of no specific issue, either, and suggested it was probably just never tested. Fair enough.

Still, there does seem to be some gray area here. With this post, my hope is that we can eliminate some of that gray and provide better clarity for all of us. I would appreciate responses in the comments.

1. Do you rename your Model.bim file and why/why not?

2. If you do rename it, have you had issues as a result? If so, what issues?


Power BI Demo Contest Entry

15 January, 2014 (00:55) | Business Analytics, DAX, Power BI, Power Pivot, Power Query, Power View | By: Mark V

Behold! I hereby present my entry into the Power BI Demo Contest! I am really pumped about this set of tools and hope this demo helps show off what Power BI can do.

You can view it here on my YouTube channel.

Getting a prize would be cool, but I have to say the fun I had making this video and learning more about Power BI was awesome.