PASS Voting Extension

30 September, 2014 (11:31) | PASS | By: Mark V

Greetings. PASS announced over the weekend that they were extending the deadline for updating your PASS Profile in order to be eligible to vote in the current Board of Directors Election. See the announcement from PASS President Thomas LaRock.

It is important that we all have our chance to cast our vote for the open Board positions. In case people are having issues receiving emails about this from PASS, I have pasted the steps present in the email I received. Please take a moment to make sure your voice is heard! #WeArePASS


Extension to Voting Period for 2014 PASS Board of Directors Elections

To address issues around voting eligibility for the current PASS Board of Director elections, the PASS Board voted today to extend this year’s registration and voting period to ensure that as many members as possible have the ability to cast their ballot. Here are the key dates in the extension:

October 5: All PASS members as of June 1, 2014, who have not already registered to vote by updating their myProfile on myPASS will now have until October 5 to update their profile.

October 6: We will pull an updated list of registered voters and filter for duplicates.

October 7: We will send all newly registered voters an email with their ballot link.

October 14: All voting will close at noon PDT.

How does this decision affect you?

If you already voted

You don’t need to do anything. All votes currently cast will remain valid; there is no need to recast any votes.

If you already received a ballot but haven’t voted yet

You can cast your vote any time until the deadline on October 14 at noon PDT.

If you established your PASS profile after June 1, 2014

You are not eligible to vote in the current election, but you are already eligible to vote in next year’s elections if you have an up-to-date profile.

If you had a profile prior to June 1, 2014, but have not yet updated your profile

Please follow these steps in order to register and receive a ballot to vote in this year’s elections:

Step 1: Update your PASS membership profile by October 5, 2014, 11:59pm PDT.
New mandatory fields were added in January 2014. Please log in to confirm your PASS profile information is correct and fill out the new mandatory fields, including: Job Function, Industry, Country, Region, City, Zip Code/Postal Code, and Time Zone.

Step 2: Click ‘Save’ and make sure you see the words ‘Profile Updated and Voting Eligibility Recorded’ appear underneath the ‘Save’ button.
If you update your profile by October 5, 2014, you will receive a ballot on October 7, 2014.

Thanks for your patience and feedback as we work through these voting changes, and we apologize for any inconvenience.

To help PASS communicate important news and reminders, and in order to receive future voting ballots, please take the following steps to ensure our messages can reach you:

  • Ensure the PASS IP address is whitelisted on your company or email address provider’s servers.
  • Add to your safe sender list and as a contact.
  • Check spam folders regularly for important messages.
  • If you believe that you are not receiving PASS emails, please contact to let us know!


27 September, 2014 (09:10) | PASS | By: Mark V

If you have been following the recent difficulties around the PASS election of Board Members, you will no doubt have noticed a climate of negativity and sometimes bashing. This has made me really sad to see in a community that I care a lot about. Rather than provide yet another blog post that goes into the details, I instead want to say Thank You to a lot of people.

Thank You to the people that have been raising issues and offering ideas. While I may not agree with the way in which these issues were raised, or the tone present, there were some good points made. You Are PASS.

Thank You to the PASS Board of Directors that worked through/around so much negativity to keep the Community in mind while striving to implement solutions. You took a lot of heat and, from what I saw, you handled yourselves with professionalism and patience. It was clear to me during this entire process that you care deeply about the PASS community and work hard to make it as strong as possible. You Are PASS.

Thank You to the people who have been understanding of the difficulties and stayed positive about the willingness of PASS BOD to work through these difficulties. You Are PASS.

PASS has been a huge asset to me in my career in working with SQL Server and related technologies. I feel that I owe a lot to the existence of this organization and the people within it who work so hard to keep it strong and help it grow. That includes the PASS members throughout the world. Thank you. You Are PASS.

My great hope with these recent events around the election and the events surrounding the PASS Summit selection process is that we, as a community, will learn something. The key idea that I think has been lacking on the part of some folks within the community is that everyone involved here, from the BOD to the various committees to the PASS members, ALL want this community to thrive. I would ask that, in the future, those of us who want to raise issues and make sure our voices are heard do so while keeping one very important idea in mind: WE ARE PASS.

Show Me The Data

18 August, 2014 (11:00) | Presentations | By: Mark V


In the 1996 film Jerry Maguire, sports agent Jerry, portrayed by Tom Cruise, is given a very clear demand by his sole remaining client, portrayed by Cuba Gooding Jr. (who won the Oscar for Best Supporting Actor for this role): SHOW ME THE MONEY. It does not take much time or effort to see that this is the essence of what businesses must do to survive. If they cannot show the money in some way, they will go away. The culture we have today, at least here in the United States, is very much driven by money: on Wall Street, on Main Street, in Politics, in the Legal system to an extent, in Journalism to an extent… That SHOW ME THE MONEY idea is everywhere. But we have been seeing a bit of a shift in HOW that money gets shown. It is not just about selling more widgets or getting higher ratings anymore. More and more organizations are realizing the untapped potential that is their DATA. Some have understood this for years. Facebook, for example, is in essence a data aggregator and reseller, using what we share, and the metadata around what we share, as a PRODUCT to sell. Have no illusions, my friends: we are NOT Facebook’s customers; we are the producers of their product, data. That data has been and will continue to be crucial to the strategic decisions of Facebook. They are not alone in relying heavily on data.

Think about how Netflix transformed how we watch movies (and TV). They rely immensely on data to drive their entire business model. They use data to drive how movies are displayed to their users. They don’t leave that up to people sitting in a board room playing politics over their own ideas. It is driven by data flowing through algorithms. And it WORKS. The rise of Netflix and the demise of Blockbuster can attest to that. Netflix prevailed, in my opinion, largely because they focused on staying ahead of their customers instead of trying to keep up with their competitors. How many companies spend so much time trying not to fall too far behind their competitors? Organizations that do this are, unconsciously, working to achieve mediocrity.

How often do people make decisions about their Business based on information they believe to be true? How often do we, as consumers, make purchases based on believing information provided to us by manufacturers or advertisers? On the whole, as consumers, we tend not to use data all that well. Shouldn’t we demand more? Shouldn’t we demand of the people trying to sell us their products and their ideas: SHOW ME THE DATA? We can. We just have to choose to do so. And as business leaders, we need to rely less on that gut feel and intuition and more on the reality that is before us. We, as global citizens today, need to stop looking to the Media or the Internet for information that makes us feel good about the assumptions we have already made and like. We need to, instead, seek information on the reality of our situation. That reality, for organizations, is often sitting in storage arrays and hard drives as flat files and databases. In some cases, showing the money even just boils down to putting your data to work. I have worked on projects for clients that held large amounts of data on their clients in their industry. The solutions I helped create allowed these organizations to provide their clients’ own data back to them, along with helpful analysis, as a premium service. These organizations turned their DATA into a revenue stream. They stopped just storing their data and started using it.

There is a presentation I have given before titled Keeping the Business in Business Intelligence. In that presentation, I boil down what Business Intelligence is to four simple, yet impactful, words: BETTER DECISIONS THROUGH EVIDENCE. When all is said and done, the real point behind any data-driven application or solution, particularly BI, is to make better decisions. As an example, organizations measure performance in order to DECIDE what activities to stop, continue, start, increase, decrease, etc. The word EVIDENCE there is key. Making better decisions is not just about Data. It is about the RIGHT data. It is about accurate, timely data. It is about data that has been deemed trustworthy. It is not really about getting more data, either. To me, the promise of Big Data is not that it lets us use ever larger, more diverse sources of data as a whole, but rather that those technologies help us comb through ever more vast sets of data to find the bits that we need. It seems like a subtle difference, but I think it is an important one.

As the Subject Matter Expert in the development of the new BI offering we are rolling out at Digineer, the consulting firm I work for and adore, I was able to fold these ideas and this philosophy into the foundation of our point of view on BI. I am proud of the way I have been able to drive the overall story. I will be co-presenting with a colleague on Wednesday, August 27 at 11am CDT on the concepts laid out here, along with more examples from the real world (some Digineer clients, some not). The presentation will focus on how organizations like Netflix have been able to use data effectively in driving their strategy. It will NOT be a big long sales pitch. It will be very much about the concepts above and will hopefully inspire people to make better use of an asset that has huge potential to be transformative for their organizations. If you would like to join in the discussion, I would encourage you to follow the link, Imagine What You Can Do With Data, and register. I am confident you will take away some valuable ideas on how you can prepare yourself for the next time someone makes this demand of you: SHOW ME THE DATA.

My First Pre-Con: SQL Saturday 332–Minnesota

21 July, 2014 (09:47) | Pre-Cons, Presentations, Professional Development, SQLSaturday | By: Mark V

It is with tremendous joy (and a little trepidation) that I announce that I will be doing my very first Pre-Con as part of SQL Saturday 332 in Minnesota in October. I have been presenting for several years now and feel that it is time to take this next big step. Since I love presenting so much, the idea of presenting for a whole day is just awesome. There is also quite a bit more pressure in this scenario. But that is part of what makes this a great growth experience, stretching myself like I have never done before.

Over the past several months, I have done a lot of client work in Excel dealing with Power Pivot and Pivot Tables. I have also been doing a bit using Power View up in SharePoint, the vast majority of which carries to Excel 2013 as well. I have always been a fan of enabling users to do more with data and learn to be more self-sufficient. My experiences in Excel have reinforced the idea that Excel is a fantastic platform in the Self-Service BI movement. The past several months working with data in Excel have been some of the most fun in my career. So, when I needed a topic for a BI Pre-Con, the choice was easy.

You can find information on all the SQL Saturday 332 Minnesota Pre-cons here. It is an impressive line-up, to be sure. The abstract for mine is below.

Microsoft Excel: The Business Intelligence Platform For The Masses

From gathering and shaping source data through data modeling and visualizations, it is staggering how much you can accomplish in Excel. This Pre-Con will walk you through creating an interesting and powerful BI solution in Microsoft Excel 2013. Whether you are a business user or a technical developer, you will get good value from attending.

1. Power Query 

• Using Power Query to gather source data from various sources, both on-premises and in the cloud.

• Use various transformations on the Ribbon

• Travel back and forth through time via Query Steps

• The basics of Power Query Formula Language (M)

2. Modeling Data With Power Pivot

• Importing Data from various sources

• Linking Tables to data residing directly in Excel sheets

• The importance of Date Tables

• Best Practices

3. DAX 101

• Introduce DAX syntax beginning with Calculated Columns including the mighty Related function

• The basics of the Calculated Fields (Measures)

• Row context/filter context

• The power of the CALCULATE function

• More…

4. Power Pivot/DAX Design Patterns

• Solving real-word problems with Power Pivot

• Many to Many relationships

• Parent-Child Hierarchies

• Segmentation

• More…

5. Excel Pivot Tables/Charts

• Connecting Excel to data sources like SSAS Cubes, Tabular Models, and the internal Power Pivot model

• Pivot Table basics

• Filtering methods and Slicers

• Conditional formatting

• Pivot Charts

• More…

6. Power View

• The basic visualizations (Bars, Columns, Matrix, etc.)

• Advanced visualizations (Multiples, Cards, Scatter/Bubble Charts, etc.)

• Filtering views or the entire report

• Design tips to take great advantage of Power View’s capabilities
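
As a small taste of the DAX topics in section 3, here are two illustrative formulas. This is only a sketch: the Sales and Product tables, their relationship, and the column names are all invented for the example.

```dax
-- Calculated column on a hypothetical Sales table: RELATED follows the
-- relationship to pull a value from the related Product table
Product Category = RELATED ( Product[Category] )

-- Calculated field (measure): CALCULATE modifies the filter context,
-- here removing any filter applied to Sales[Region]
All Region Sales :=
CALCULATE (
    SUM ( Sales[SalesAmount] ),
    ALL ( Sales[Region] )
)
```

The second formula is the classic demonstration of the power of CALCULATE: the same measure returns different results depending on which filters you choose to keep, override, or remove.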


I will be focusing on using Excel 2013 on my machine. There is so much to cover that trying to add in Power BI-specifics is just not in the cards. But I think that makes sense as SO many more people have Excel on their machines than are using Power BI right now. And SO few of those people are taking advantage of even a tiny subset of what Excel has to offer. A major goal of this Pre-Con is to help change that.

My PASS Summit 2014 Submission Feedback

3 July, 2014 (16:50) | PASS, Presentations, Professional Development, Summit | By: Mark V

Speakers have been asking PASS for feedback regarding their Summit submissions for a few years. This year, following a bit of a heated “discussion,” PASS announced that session feedback would be available upon request. I, like so many other speakers, applaud this decision. PASS did make it clear that both the quantity and the quality of the feedback vary widely.

I am a big proponent of learning from the experiences of others. As such, in the hopes that someone can learn something from the feedback I got, I hereby share what I received. I want to thank the reviewers who took the time to make these comments.

Analysis Services Terms and Concepts For The DBA (REGULAR SESSION – NOT SELECTED)


Despite some overlapping concepts, the worlds of the Relational engine and Analysis Services really are quite different. With more and more organizations realizing the power of Analytics, there is a good chance a BI initiative will come your way at some point.


This session is intended for the DBA that wants/needs to learn more about SQL Server Analysis Services. The goal is to provide a meaningful base of knowledge that will allow you to effectively participate in discussions of Analysis Services in your organization.


Through both slides and demos, you will learn:

– The differences between SSAS Multidimensional and SSAS Tabular

– Key terms like Measures, Dimensions, and Hierarchies

– Storage options such as MOLAP, HOLAP, ROLAP, and Direct Query

– Monitoring with Extended Events

– Overviews of MDX, DAX, and XMLA

– And more


Come take a few steps into the exciting world of Business Intelligence with SQL Server Analysis Services.


Seems this may be a 100 level session
Good topic, sounds more like a 100-level session to me.
No need of prerequisites to be SQL Server Administrator. should also focus on OLAP DW part and schema concept, slice and dice part of SSAS OLAP cube if someone wants to show the power of BI Analytics using SQL server analysis services.
Excellent and useful topic!


DANGER: The Art and Science of Presenting (REGULAR SESSION – NOT SELECTED)


Over the past decade, we have learned a lot about the chemistry of the brain and why humans react the way we do to events in our environment. The idea of Emotional Intelligence – EQ – is a compelling concept that applies this knowledge in a set of learn-able, improvable skills for leading others. Although EQ is often applied to corporate leadership, this session will explain the basics of EQ and demonstrate how you can use it to make your presentations better in the following areas:


• Crafting better slide decks

• Preparing yourself for presenting

• Delivering your content

• Dealing with the unexpected


Understanding and practicing the concepts of EQ can make your presentations a better experience for everyone in the room – including you.


This session was chosen as an Alternate last year and I ended up presenting. It was greatly successful (narrowly missed being in the Top Ten sessions) so I submitted it again, noting to the committee why I was doing so. That should provide some additional context to some of the feedback.


Excellent topic. Excellent consistency across session name, abstract, topic and goals. Perhaps, given the topic, some real examples should have been added. Reference to PASS is 2013 should have been avoided.
While the abstract and topic are great I’m not sure that we would want to see a repeat session from last year.
Delivered too recently at the past Summit. Very targeted audience.
The abstract goes too much into EQ and feels disconnected from the title.


Keeping the "Business" in Business Intelligence (REGULAR SESSION – NOT SELECTED)


It is no accident the term “Business Intelligence” starts with “Business.” Any Business Intelligence initiative should, likewise, start with the needs of the Business. For many years, BI was seen as a technology project. This is one reason why so many BI initiatives fail. Rather than a Technology Project, BI is a Business Program. It must grow and evolve as the Business grows and evolves.


In this session, we will discuss the following:

– Why BI is a worthwhile investment (using case study examples)

– What criteria to use in determining the success of a BI initiative

– Several reasons why BI initiatives fail

– Critical Success Factors for BI


So much of the success for BI happens before the requirements are even gathered. Come learn how you can set yourself up for success with Business Intelligence.


Could be an interesting approach to a rather dry topic
The abstract is clear about what will be discussed as for failures of BI projects. If it has real examples, maybe you can get some demo to demonstrate. You can demo the results in chart, as time and effort, even the results.
Thanks for the abstract.


Power Query: Data Chemistry for The Masses (REGULAR SESSION – SELECTED)


ETL Developers have been doing chemistry with data for years in tools like SQL Server Integration Services. These tools require training, experience, and time that few business users have. But in the age of self-service BI, those business users need a way to shape data to support their analysis.


This session will show how Power Query can be easily used to take advantage of data’s properties to drive the change we need to support our goals.


We will discuss/demonstrate:

– The simple process of accessing a wide variety of data sources

– The ease with which simple transformations can be achieved using the Power Query Ribbon

– Power Query’s fantastic ability to travel through time to see every step taken with the data

– The foundations of the Power Query Formula Language, informally known as "M"

– Using "M" to take Power Query WAY beyond what the Ribbon has to offer.


Come learn about what may well be the most exciting member of the Power BI family.
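
To give a hedged sketch of how those pieces fit together, a small “M” query might look like the following. The URL and column names are purely illustrative; any of the many supported sources follows the same pattern.

```m
let
    // Pull a delimited file from the web (other sources work similarly)
    Source = Csv.Document(Web.Contents("https://example.com/sales.csv")),
    // Promote the first row to column headers
    Promoted = Table.PromoteHeaders(Source),
    // Set column types explicitly; each step is recorded as a named,
    // revisitable Query Step you can travel back and forth through
    Typed = Table.TransformColumnTypes(Promoted, {{"Amount", type number}}),
    // A row filter written directly in "M", beyond what the Ribbon generates
    Filtered = Table.SelectRows(Typed, each [Amount] > 0)
in
    Filtered
```

Everything the Ribbon does is generated as steps like these, which is exactly why learning a little “M” takes you WAY beyond the Ribbon.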


seems like too much to cover in 75




As with so many aspects of life, a solid foundation makes a huge difference. This Star Trek themed introduction to MDX leads you on a voyage through the terms and concepts necessary for a solid foundation for learning this fascinating language. Terms covered include:

– Measures and Measure Groups

– Attributes and Dimensions

– Hierarchies

– Members

– Tuples

– Sets


This session also shows how you can think about the cube space in a way that is very easy to understand. The word "cube" suggests a 3 dimensional object. That way of thinking is fraught with confusion. Forget about the Rubik’s Cube. It doesn’t help.


With that foundation, we then dive into MDX syntax and fundamentals including:

– Query Axes

– Slicer Axis

– Tuples and Sets

– Hierarchy Navigation Functions

– Crossjoin

– Functions allowing us to travel through time


Come join us for a fun voyage through the cube space and boldly go where no MDX presentation has gone before.
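
To make those fundamentals concrete, a minimal query against the familiar Adventure Works sample cube could look like the sketch below (the measure, hierarchy, and member key are illustrative):

```mdx
SELECT
    { [Measures].[Internet Sales Amount] } ON COLUMNS, -- query axis 0
    [Date].[Calendar Year].MEMBERS ON ROWS             -- query axis 1
FROM [Adventure Works]
WHERE ( [Product].[Category].&[1] )                    -- slicer axis: a one-member tuple
```

Even this tiny query exercises several of the terms above: a set on each query axis and a tuple on the slicer axis.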


Is the topic about MDX or DAX? Just got a little bit confused. The abstract states what will be discussed and what the analogy comes from. About the level, it may be better to be at level 100 since it is an introduction of MDX.


Getting Started with SSAS Extended Events (LIGHTNING TALK – SELECTED)


With SQL Server Profiler on its way to retirement, our friends on the relational database side of the house have already been taking great advantage of the power of Extended Events (XE). There is a lot of great info out there for using XE against the database engine. For Analysis Services, there is a lot less.


This Lightning Talk will demonstrate how easy it is to get started very quickly with SSAS XE once you have some basic information.


We will demonstrate:

– Creating an SSAS Extended Events Trace which outputs to a .xel file

– Ensuring your trace is running via the DISCOVER_TRACES rowset

– Importing the contents of that .xel file into a SQL Server db engine table for analysis

– Deleting the SSAS Extended Events trace
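
As a rough sketch of the first step, the XMLA Create command that defines an SSAS Extended Events trace follows the shape below. The object names, the captured event, and the file path are placeholders; the events you actually capture will vary.

```xml
<!-- Illustrative XMLA: define an SSAS XE trace writing to a .xel file -->
<Create xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <ObjectDefinition>
    <Trace>
      <ID>SSASXETrace</ID>
      <Name>SSASXETrace</Name>
      <ddl300_300:XEvent xmlns:ddl300_300="http://schemas.microsoft.com/analysisservices/2011/engine/300/300">
        <event_session name="SSASXESession" dispatchLatency="0"
                       eventRetentionMode="AllowSingleEventLoss" trackCausality="true">
          <!-- Capture query completions; add other AS events as needed -->
          <event package="AS" name="QueryEnd" />
          <!-- Write captured events to a .xel file -->
          <target package="package0" name="event_file">
            <parameter name="filename" value="C:\Traces\SSASXETrace.xel" />
          </target>
        </event_session>
      </ddl300_300:XEvent>
    </Trace>
  </ObjectDefinition>
</Create>
```

From there, the database engine’s sys.fn_xe_file_target_read_file function can import the .xel contents into a table for analysis, and a matching XMLA Delete command referencing the TraceID removes the trace.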


Good topic and the abstract explains exactly what the attendee can expect from the session
Great abstract with details on what will be presented and what to expect to learn!
Thanks for the abstract,It’s good to have someone talk on the  power of Extended Events (XEvents) part.


Reporting Services Pagination Triple Play (LIGHTNING TALK – NOT SELECTED)


The ability to have some control over the pagination of Reporting Services reports has been around a while. But it never hurts to review the fundamentals.


This demonstration will cover:

– Basic pagination in Reporting Services using Rectangles (Love these)

– Adding a page name that carries to Excel exports

– Adding a basic Table of Contents to your multi-page report using Bookmarks

– Adding a more dynamic, data driven Table of Contents to your report using Bookmarks and expressions


Come on out to this ballgame where we hit on  SSRS pagination with a report about three of the most famous infielders in the history of Baseball.



Great abstract
Excellent topic that people always ask about in classes
100 demo!
Level appropriate to content
lots to cover in 10 minutes


My Takeaways

Given that there seems to be a wide range in the quality and quantity of feedback provided to speakers, I have to say that I feel I made out pretty well here. I am quite happy with both the quantity and the quality of what I received.

I am a little puzzled about the confusion over whether my MDX session is on MDX or DAX. And I think MDX is complex enough that any session on it is at least a 200 level, particularly given that almost everyone learns TSQL first and must “unlearn” some things in order to grasp MDX.

Given that Keeping the “Business” in Business Intelligence is about concepts and ideas, not technology, I am not sure how I could add a demo that would not be contrived, included just so I could say there was a demo.

Overall, I am pretty happy with this feedback and glad PASS made the decision to make it available.


Interview with Biml Creator Scott Currie

2 July, 2014 (10:00) | Biml, Interviews | By: Mark V

On June 12th, I had the pleasure of presenting to the Greenville Business Intelligence User Group in Greenville, South Carolina. I had a fantastic time and I have to say that the people of Varigence (a major sponsor of this group) showed wonderful hospitality. Part of my trip included a day of hanging out with Biml creator Scott Currie (Blog|Twitter) to learn about Biml, Mist, etc. I hadn’t had a chance to play with Biml before heading down there and had not yet been to a Biml-related presentation, so I made it clear to Scott that I was a green field. I told him this at dinner the night before and he just smiled and said, “I’m going to change your life tomorrow.” I have to say that it was not an empty promise. I was blown away by the current functionality of Biml as well as the potential for what is possible. If you have not had a chance to look into Biml, I highly recommend you do so. It’s brilliant.

I also had the chance to sit down for an interview with Scott Currie for this little blog of mine to talk about Biml, Mist, and the future of the BI ecosystem. Since I had not used Biml before, I reached out to some great members of our SQL Community that have been using Biml in order to get some of their questions for Scott. I want to thank Catherine Wilhelmsen (Blog|Twitter) and Samuel Vanga (Blog|Twitter) for helping me out.

Below is my interview with Scott. Please note, as with my previous interviews, edits were made, with Scott’s permission, to remove the byproducts of casual conversation for better flow in writing.

Scott Currie Interview


When it comes to Biml, some of the stuff I hear in the market, and some of the perceptions I had before I came here, were that Biml was about creating a lot of SSIS packages at once. But, I’ve never been in a situation where I needed to create 100 packages or 200 packages. What do you say to someone who says, “Well, I don’t think Biml is for me because I don’t have to do that?”


Yeah. That’s something we’ve heard from people in the community, as well, who aren’t core Integration Services developers and aren’t creating tons of staging environments and things. And I think the reason that perception has come about is because it’s the easiest, most obvious example you can show anybody in a half hour to an hour presentation. You don’t need to provide a lot of context to create a staging environment from scratch, for example. And what you get out of it is, just as you noted, hundreds of packages or one big package with hundreds of data flows in it. So, it’s the 101 example that everybody sees and they think, sometimes, that’s all there is to it. Whereas, what we’re seeing people do in the real world with it is usually to start with that because there’s definitely value in being able to automatically create staging environments and other sorts of very rote automation. But then they start to take it further. They start implementing their patterns and practices on top of it in really clever ways. They start adding additional metadata stores. And sometimes their semi-technical or even non-technical people start adding configuration information. And that configuration information can be used to create complex business logic, all of the patterns, all of the logging, unit testing; all of the stuff that normally is the plumbing that takes a lot of time to do is now being auto-generated around configuration information that’s actually adding value to the business. What we see happening is people start with that rote automation and then they start to move into having custom business logic and injecting their patterns into it. Really, the way to think about Biml, after you’ve gotten the core concepts, is to think of it as a patterns and frameworks engine that allows you to automate the plumbing, but doesn’t restrict you to a specific approach for that automation. You can implement whatever patterns you want to.
You have to, of course, do that implementation. But, once you’ve implemented it, you can do whatever you want to. The sky is the limit. And you can have those patterns interact with custom business logic and you’re not constrained on either side.


For those that may not be familiar with Biml or really just see the SSIS facets of Biml, can you talk a little bit about some of the offerings you already have outside of SSIS, and maybe a little bit about what you can share about what’s coming?


The Bids Helper add-in to BIDS [Business Intelligence Development Studio] and SSDT [SQL Server Data Tools], which is a free and open-source add-in that is available on Codeplex actually includes some of the Biml functionality. It does include a subset, though. It has all the stuff that we have for relational modeling and being able to manage your relational assets. It, additionally, has most of the Integration Services features. You do have to purchase a product in order to get some of the additional stuff. Some of the additional things include Analysis Services functionality. Currently, we support all of SSAS Multidimensional. In our upcoming release, which will be coming later this Summer, we do have SSAS Tabular as well. We also have the ability, in the upcoming release, to do things like metadata modeling and being able to construct, in a very reusable way, some of that metadata that I mentioned that becomes very useful in your more complicated scripting. We have the ability to do things called Transformers. Actually, they’re present right now. They allow you to, in a very modular fashion, specify what a pattern looks like. You can say, in one or multiple Transformers, here’s how I do logging. You automatically add Row Counts on your OLE DB Destinations, including creating the variable to store those, and including the execution of stored procedures to go ahead and write those to the database. This includes Event Handlers. You can put all that into these little Transformers and then you can have the tools actually inject that into your custom logic. There are some very powerful things there. Also, on the Analysis Services side, you can use Transformers to automatically control your measure formats, for example. You can add that to any other automation you already have in place to also automatically build a cube off of your dimensional model. We have a lot of options. 
In a lot of cases, it’s difficult to talk about individual features because the way we built this out is to provide you with the tools you need to build anything. It’s hard for us to be prescriptive and say “Here’s what you ought to go and build” or “Here’s what you can build” because the answer is, essentially, you can build anything. It’s just that you have to make the decision as to which parts you want to automate and which parts you want to keep custom and manual. And that’s going to be a different analysis that’s done by every single organization that is approaching the tool.


You talked about Bids Helper. This question comes from Catherine and I thought it was a great question: What are the future plans for Biml support in Bids Helper?


In Bids Helper, we’re going to continue to update things so that the subset of functionality that is in Bids Helper is going to be up-to-date. As we bring in additional utility methods to be able to bring in your metadata more quickly, and as we bring in additional methods to be able to very easily construct SQL queries (we already have some)… We’re adding additional helpers all the time to make queries easier to write. There’s also SQL 2014 support. All of that is going to just come along for the ride. As it is implemented in Biml, it is there in Bids Helper. We also know that there are some usability issues with Biml in Bids Helper right now in terms of how strong a development environment you have inside of BIDS and SSDT for Biml. One of the things we are definitely doing in one of the upcoming releases of Biml for Bids Helper is improving the error messaging story so you can get a much clearer picture of exactly what your errors are. And you can navigate your errors a bit more easily than you can today. The other thing that we know is a big issue for Biml in Bids Helper is the code editing story. Right now, when you open up a Biml file in Bids Helper, what you get is essentially the standard XML editor that ships with Visual Studio. And that works OK as long as you’re doing flat Biml. But as soon as you start to put in code nuggets to do your automation, the Visual Studio XML editor doesn’t know how to interpret those. It gets very confused and you lose all of your Intellisense and you get error squiggles saying there are problems when there actually aren’t problems. We are 100% aware that this is a problem and we’re looking at a bunch of different options there. We’ll probably have some announcements to make later on about that. We’re definitely thinking about it and working on it, but we don’t have anything to share just yet, unfortunately, about that piece of the story.
Outside of the error messages and the code editing, there are a bunch of value-add services that we could, potentially, build into the BIDS Helper story around being able to more easily share scripts and share frameworks. We’re also thinking about those. And we will have some announcements around those in the future, too.


These next few questions come from Samuel. How did the idea for Biml, and the foundation for Varigence and Mist and everything else, get formed?


The original idea actually came when I was working at Microsoft. I was on the Developer Tools team there working on Visual Studio. Almost by accident, I fell into what became a data warehousing project. So, I was an application developer essentially working on developer tools who became an accidental DBA in a very real way. And one of the things I noticed, with that particular blend of experience, is that most of what we have learned about doing application and web development really well over the past several decades didn’t find its way into data development. And I think there are a variety of very good reasons, historical reasons, why that happened. But I thought there might be some benefit to trying to re-imagine those things that we have learned doing application and web development in terms of data development and see if something interesting fell out of it.

So, to make a long story short, essentially, what we did was keep in mind there are a lot of parallels in what you can do in web development and the types of problems you try to solve in data development. What if you could have an HTML-like language that would describe your Business Intelligence or data warehouse solution? And then, once you’ve got that, take an ASP.NET type approach in putting code nuggets in to automate it. With that as a foundation, almost all of the things that you normally like to do in application or web development just light up and start working. Source control becomes valuable again. Builds become very, very powerful, and continuous integration becomes something that’s very useful. Being able to do automation and patterns-based development and best practices that are enforced for your team; all of these things just start lighting up, almost for free, once you move to that human-readable, writable, declarative HTML-like language with code nuggets interspersed. So that was the original insight.
If we had this, we could go ahead and start turning out all these interesting things and start doing data development more efficiently and in a more maintainable way. Of course, it was a long journey to get to the place where we actually implemented all that stuff, which is where we are now. But that was the original insight, and it took place while I was building out developer tools for application and web development.


Given that history, and how you got started, what are your short, medium and long-term goals for Varigence? Where do you see this going?


Short-term goals are all about, you could say, finishing the engine. As I mentioned a little bit ago, we’re adding in support for SSAS Tabular in the next version, 4.0. So, with that addition, and some of the enhancements we’re making, we’ve essentially got full feature coverage for Relational, Integration Services, and Analysis Services, including Power Pivot. And that’s going to be a great story. Now we’ve got the basis for building out any solution on top of those technologies.

So, the medium-term goal is going to be a combination of two things. One is that we may start biting off additional pieces of the stack and not limiting ourselves to just Relational, Integration Services, and Analysis Services (Power Pivot, too). There are some really interesting things happening elsewhere in the Microsoft stack when you look at things like DQS and MDS and some of the Power BI stuff that’s happening. And we still hear a lot of requests for Reporting Services. Those are all things we’re looking at potentially building in the medium term, expanding out that engine into entirely new areas.

In the medium and also long term, we’re looking at leveraging the engine in different ways as well. So, once you’ve got that core engine, people start asking the next set of questions like “How can you make it easier for me to manage my metadata?” This is one we’re already working on for 4.0. “How can you make it easier for me to take the solution I’ve built and package it for a hosted solution offering that my professional services company can offer to its clients? How can you make it so it’s easier to put this in the Cloud?” So, there are all these additional ways of repackaging this engine and providing additional services around it which enable entirely new scenarios. In the medium to long term, that’s where we’re going to have a lot of focus: enhancing all of the services around the engine instead of the engine-level focus that we’ve largely had thus far.


So, with that vision in mind, not just for Varigence, but for that “better way,” what would you say my job, as a BI Developer, would look like in five years?


I think that’s a very interesting question, and I would have to stop and ask what you mean by BI Developer in terms of a day-to-day job. Because I think one of the issues that we’ve all faced in this industry is that, for a lot of reasons, some of them cultural, some of them historical, some of them because of the tools that we work with, we’ve all been forced to wear multiple hats. So, in any given day, I might do Data Analyst work. I might do Data Architecture. I might do BI Developer work, BI Architect work… I kind of have to jump back and forth. Even if I have an Architect at my company who’s providing me with patterns that I should use, generally speaking, they’re giving me some sort of template file that I then have to go and understand and customize for my particular solution.

What we think is going to happen, and one of the things we are really trying to enable, is to allow those roles to separate. So that if I’m an Architect, I can provide a pattern, not just in a template, but in a completely reusable code file that the tools can then apply to anything. And as a BI Developer, I might focus on implementing business logic or implementing the bespoke, custom parts, the complexity that you can’t automate away. And then let that automatable complexity get handled by the tools that the Architect has driven. I don’t have enough confidence in how things progress after that to say what your day looks like. But I think one of the key things that you can expect to happen, especially if we’re successful in some of the approaches we’re taking, is to have your job be more focused so that you’re not having to wear all of these multiple hats at the same time. If we have our druthers, what we’d really like to do is also make your job a little bit more fun.
Part of what we’ve heard from BI Developers in the past is that they never got into doing Business Intelligence because they thought it was fun to drag and drop onto a design surface and implement logging rules for particular regulatory compliance in a particular industry. They got into it because they love data and they love insight and they love being the first person in the world to know something interesting or important about their job or about their world. So, what starts happening is that once we start doing this as our day job, we end up spending more of our time doing the plumbing and doing this sort of “not fun” work and less of our time on the insight generation. And hopefully, through technologies like what we’re building and what we’re seeing happening elsewhere in the industry, that’ll start to shift to where I can have that role separation and focus on the parts of the job that I actually do love. And at the same time, I can have fun doing that again because, in the job that I’m doing, I don’t have to spend the time on the drudgery. More specialization and more fun is hopefully what is in the future for BI Devs.


So, here’s another question from Samuel. With so much emphasis on Self Service, including self service ETL with tools like Power Query that make it easier to move data around, what do you see happening to traditional ETL and SSIS? What do you see evolving from that enablement of end users while, at the same time, retaining that Enterprise Class complexity where it is needed?


That’s one of the challenging things with the message around Self Service BI. I’ve certainly seen people polarize into different camps where some folks will say that Self Service BI is the future and traditional ETL and IT developed solutions are going to be a thing of the past; no one’s going to use them anymore. And, of course, I’ve seen the polar opposite where some people are saying that Self Service BI is a flash in the pan; you can’t solve any real problems with it; there’s no real organizational benefit to having it and it causes more problems than it’s worth. I’ve definitely heard people saying both of those things. I think that the challenge in all of that, or the issue with all of that, is that people are trying to apply those technologies or those solutions to problems that they’re really not well suited to solve. Self Service BI has an excellent and important role in the organization for being able to empower individual decision makers to get their questions answered at the speed of the Business rather than at the speed of the technology. Often times, there’s a mismatch there. But at the same time, your Self Service BI tools are never going to work well if your data is not already in decent shape. If you’re plugging bad data into Self Service BI, you’re going to get bad insights out. And if you’re taking data that is not well formatted or well aligned to the types of questions you’re trying to ask, you’re not going to be able to get your questions answered unless you transform that data in a way that actually aligns it with those answers. And any tool that can do that needs to be complex enough to handle the complexity of those transformations. So, if I have a Self Service BI tool that is intended to solve ANY ETL problem, it’s very rapidly going to become as complex as Integration Services because you need that level of complexity in order to solve those data integration problems. 
There’s no getting around it: you can’t wave a magic wand and make complexity go away. And that’s a good thing. If you could do that, we’d all be out of a job. So, what I think the future looks like is that you’re going to have a very strong presence in Self Service. Self Service is going to be part of that last mile story for data. But that’s going to make the person whose day job it is to build that Enterprise Class data warehouse or Enterprise Class Tabular model even more important, because their work is going to be much more heavily leveraged. Instead of just leveraging that work through canned reports, we’re also going to be leveraging that work through all of these additional Self Service models. That, I think, is great for both sides of the fence. But you have to pick and choose your solution for the right business problem and not say, “I’ve got a solution, so, I am going to go solve every problem with it.”


Here’s another question from Catherine. How has the Effektor/Mist integration impacted Biml? And please talk a bit about what that integration is.


Absolutely. We’ve got a great partner that’s based in Copenhagen, Denmark, though they have offices all throughout the Nordic countries. In addition to being a great consulting and professional services partner, they also have a product called Effektor which enables nontechnical users, through configuration, to take a well-prepared data warehouse or data mart and build out a whole bunch of additional features on top of it, including cubes and workflows. There’s a bunch of stuff that you can do with it. It’s consistent with the philosophy of Biml; i.e., it’s better, where possible, to have configuration instead of having to code everything up from scratch. And one of the things that they noticed as they were building out more and more functionality is that it made sense for them, instead of building out an engine that could do all of the Integration Services and Analysis Services code generation, to just use the Biml engine for that instead. So, they could focus on the parts that were important to the business and say here’s the type of metadata that we need, here’s the type of configuration UI that we need, and here’s how we’re going to translate that into our patterns and practices. Then they can plug all that into the Biml engine and let us do all the code generation bits. So, we announced recently that they are going to be integrating the Biml engine into Effektor for their code generation, and they’re also going to be providing Mist as an option for some of the custom logic generation that supplements what’s available inside the Effektor product. In terms of new directions for Mist and Biml, I think the interesting thing about that question is that the way we have architected Mist and Biml does not require that we change our direction in order for new and interesting things to happen. You don’t have to rely on US to do things for you.
So, the fact that Effektor is now enabling these new scenarios means that the Mist and Biml ecosystem is going to progress in this new direction without our having to make any product changes. So, I think there are definitely going to be things that we do to make the development of Effektor easier and let them focus even more on their areas of expertise and less on some of the code generation, which is our area of expertise. But really, I think the more interesting thing is that now that the Mist and Biml story is enhanced by this additional Effektor functionality, that’s a new direction in and of itself. And that’s the thing that’s the most exciting. We love the Effektor relationship and we love the relationships with our other partners as well because they start using the product in ways that we would never have the resources to enable ourselves if we were doing all development independently. And in a lot of cases, they do things that we never really anticipated or thought of. And that’s something that’s really rewarding, I think, as a tools developer, because it tells you that you got it right. If somebody is successful in using your tool in a way that you didn’t intend up front, that you didn’t plan for, that means you built a really robust and really useful tool. I’m definitely of the mind that nobody can solve all of the problems. But somebody can provide building blocks. And your building blocks are only as good as the problems that get solved with them. So, if we’re solving all these interesting problems, it means we built good building blocks, which is rewarding.


Here is another question from Catherine. Are there any plans down the road for Biml books?


Absolutely. We have a couple of books actually under development right now. We have a stable of authors that we have recruited. Actually, it wasn’t much of a recruiting effort. We were originally thinking about doing just one book. And we went and approached a list of authors because we wanted it to be sort of a community collaborative sort of thing where we had a recognizable author writing each chapter. And we had a list of authors that we wanted to approach, and we made the list a lot longer than we thought it needed to be because we thought maybe half or two thirds of them would say No. The funny part is that when we went and approached them, I think every single one said Yes. So, we had more authors than we were originally intending. The solution there was to go ahead and do two books. We’re still in the very early stages. It’s going to take a while; books take a long time to get out the door.

We are going to have one book whose working title is “Biml: The Definitive Reference,” which will be an end-to-end resource to learn everything you need to know to be effective with Biml. It will have reference material and conceptual descriptions as well. And then the other one is a Biml cookbook. So the chapters will be devoted to specific problems you need to solve, and then options for different patterns that you could use. For example, we might have a chapter on Unit Testing with all sorts of different recipes that you could use to do Unit Testing very effectively with Biml. Or another chapter on different patterns, like doing Type I and Type II slowly changing dimensions, etc., in Biml, and various different options for doing that. So that, I think, is going to be great. It’s going to cover both ends of it. For the people that just like to sit down and become experts on stuff, we’re going to have an option there. And for people that prefer to wait until they are confronted with a problem and then go read specifically about that, we’re going to have an option there, too.
Unfortunately, we’re not announcing a timeframe on that yet because we don’t want to get it wrong and we’re still a little too early to know exactly what the ship date is going to be. But, it’s definitely something that is an active area of work for us.


Catherine helped with this question, as well. So, right now, there are members of the community just stepping up to do presentations. And sometimes those presentations happen in places that might not be feasible for you guys to get to because of cost, and whatever else it may be, and there’s a limited number of people you have here at Varigence. Are there any plans to have kind of a Train the Trainer or some sort of certification program to say, “This person is a certified Biml Trainer,” and give them access to stuff? I’m thinking of the Microsoft Certified Trainer program. And obviously that’s a very robust thing. But are there plans for something like that?


Yeah. Absolutely. And in fact, we’re past the planning stage on that. We actually have something in Production right now on that. But, first, just a note about the Community talks. A lot of folks aren’t aware of this, and hopefully we’re going to start doing a better job of publicizing these talks that are happening worldwide, but if you look at just the past six months, January 1 through June 30, worldwide, we have had 84 talks across 15 countries with 22 distinct speakers. And that’s incredibly rewarding, and I just want to give a huge Thank You to the Community. That’s something where there is absolutely no way we could get that amount of reach on our own. And these are people who don’t work for Varigence; they do it because they love the technology and they love telling the story to others. So, I think we’ve already got something great there. Of course, your one-hour Community event isn’t a replacement for training where you’re actually able to go on site and say, “Here’s your full training program,” where, at the end of it, you’ve got the expectation that the Team’s just going to hit the ground running and start working directly. So, for that, we work through partners. We do have a training program we can offer directly, but we prefer to work through partners as often as possible because we don’t think there is any way, going back to some previous conversations, we can be as effective training somebody in, say, the Healthcare vertical as a professional services company that spends all their time in Healthcare. We would rather say, “Let’s get you set up on Biml” instead of saying “Let’s get our trainers able to talk Healthcare.” So, we already have a program; we have a Consulting Partnership program. If there are consulting companies out there that are interested in being able to work directly with Biml, and actually train with it as well, get in contact with us. We have a Train the Trainer program and a whole bunch of materials that are out there.
So, essentially, we have pre-canned offerings that you can start with and then tweak. We have one-hour, two-hour, half-day, full-day, three-day, and full-week sessions. So, we’ve got all of those pre-canned, and of course you can tweak them and you can supplement. They’re module based, so you can add in your own modules. You can tweak them to use your own branding. You can tweak them to change some of the messaging if there are particular aspects that you know would be more or less interesting to your particular client or your particular industry vertical. So, we’ve already got all that set up, and if anybody’s interested in engaging on that, please don’t hesitate to contact us.

Wrapping Up

That takes care of the interview with Scott. I have to thank Scott very much for his time on this and the time he spent showing me Biml and Mist. And I need to thank Catherine and Samuel for their help in coming up with appropriate questions for this interview. Overall, I was immensely impressed with the amazing work that Varigence, with Scott’s leadership and vision, has done with Biml and Mist. I am now planning how I can roll time for learning Biml into my schedule around everything else going on. I am sure I can justify the ROI.

I hope this helps shed some light on Biml and where it is headed.

Presenting at PASS Summit 2014

26 June, 2014 (10:00) | PASS, Presentations, Summit | By: Mark V

I am beyond delighted to announce that I will be presenting at PASS Summit 2014 in Seattle in November. I submitted a total of seven sessions (five regular sessions and two lightning talks). I ended up with one of each: one regular session and one lightning talk. This is a huge honor for me and a great step up from last year, for which I had one alternate session (which ended up getting promoted over the Summer, much to my joy).

Regular Session

Power Query: Data Chemistry for the Masses

ETL Developers have been doing chemistry with data for years in tools like SQL Server Integration Services. These tools require training, experience, and time that few business users have. But in the age of self-service BI, those business users need a way to shape data to support their analysis.

This session will show how Power Query can be easily used to take advantage of data’s properties to drive the change we need to support our goals.

We will discuss/demonstrate:
— The simple process of accessing a wide variety of data sources
— The ease with which simple transformations can be achieved using the Power Query Ribbon
— Power Query’s fantastic ability to travel through time to see every step taken with the data
— The foundations of the Power Query Formula Language, informally known as “M”
— Using “M” to take Power Query WAY beyond what the Ribbon has to offer.

Come learn about what may well be the most exciting member of the Power BI family.

Lightning Talk

Getting Started With SSAS Extended Events

With SQL Server Profiler on its way to retirement, our friends on the relational database side of the house have already been taking great advantage of the power of Extended Events (XE). There is a lot of great info out there for using XE against the database engine. For Analysis Services, there is a lot less.

This Lightning Talk will demonstrate how easy it is to get started very quickly with SSAS XE once you have some basic information.

We will demonstrate:
— Creating an SSAS Extended Events trace that outputs to a .xel file
— Making sure your trace is running via the DISCOVER_TRACES rowset
— Importing the contents of that .xel file into a SQL Server database engine table for analysis
— Deleting the SSAS Extended Events trace
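As a rough sketch of the import step above, the relational engine’s `sys.fn_xe_file_target_read_file` function can read a trace’s .xel files into a table for analysis. The file path and table name here are hypothetical placeholders, not specifics from the talk:

```sql
-- Read every event from the trace's .xel files (path is a placeholder)
-- and land the raw event XML in a table for later shredding.
SELECT CAST(event_data AS XML) AS event_xml
INTO dbo.SsasXeEvents
FROM sys.fn_xe_file_target_read_file(
         'C:\Traces\SsasQueryTrace*.xel',  -- file name pattern
         NULL,   -- metadata file (not needed for modern .xel files)
         NULL,   -- initial file name
         NULL);  -- initial offset
```

From there, the XML methods (e.g., `event_xml.value(...)`) can pull individual event fields out into relational columns.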

A Note To My Fellow Speakers

I wanted to take a moment to shout out to my fellow Summit Speakers, especially those for whom this will be your first time speaking at Summit. With the announcement today about speaker selection, there were quite a few Congrats going around the Twitterverse. That was good to see. As a speaker myself, I know how gratifying it is when people are excited for you at a time like this.

There was also a fair bit of negativity today. We in the SQL Community are a passionate bunch. We care deeply about what goes on and can be vocal when we think something isn’t the way we feel it should be. We are all humans as far as I am aware; humans are emotional creatures. So, sometimes, passions will get the best of us and we may not communicate it in the best way as a result. I think there was a fair amount of that today. There will always be people that are disappointed with certain choices made by PASS or any other organization. Sometimes that disappointment is justified in a real way, and not just via the perception of a few. Sometimes it is hard to see it as anything other than lashing out with disappointment. PASS, like any other organization, is not perfect. One thing we need to remember, though, is that while the process may be imperfect, there are a lot of people working very hard to do the best they can. I think some people forgot that today. Even if there are some legitimate concerns about the perception created by certain choices, I think we need to be careful not to jump too quickly into an accusatory posture. There are constructive ways to air concerns and there are destructive ways. Today, there was a little too much of the latter.

I beg you, fellow speakers, not to let that negativity dampen your excitement or pride over having a session accepted to such an incredible event as PASS Summit. I join with you in being just pumped about getting to speak again. Congratulations and I will see you in November!

PASS Business Analytics Conference 2014 Recap

14 May, 2014 (10:00) | Business Analytics, PASS, Professional Development | By: Mark V

Wow. That was quite a few days I had in San Jose last week. Since you follow my blog with rapt joy, you no doubt have already read my two Live Blogs of the keynote addresses. As you can see, there was a lot of great info flowing. It was really fun to participate in that flow and help people who could not be there to share in some of the experience. I actually enjoy the pressure of blogging something live and it is gratifying that people keep telling me how much they like it when I do.

A few people have commented on the nature of my live blog posts, how they auto-update themselves without the reader having to refresh. The key for me is a free WordPress plugin called Live Blogging. You can find info about it here. It is out of date now, but I like it so much that I will keep using it as long as I am able.

The overall pressure was a bit less for me this year since I was not speaking. I actually wasn’t expecting to go at all until a few things fell into place just right about a month ago. In addition to the missing Speaker ribbon this year, I was happy to wear a different ribbon for the first time. Do you see which one I mean? You found it. Yes. I am a Diva. I wore that proudly. It started some conversations, as I expected it would. Mission accomplished. Conversations, after all, are a HUGE part of what makes conferences like this so valuable. Networking and connecting with people is essential. I don’t really consider myself a Diva. But it did fit my Purple ribbon theme this year.

Last year, my PASS Business Analytics Conference Recap focused on stepping out of my comfort zone and the ROI involved with doing so. This year, the theme of my Recap is on Sharing. PASS marketing for events often includes the slogan: Connect. Share. Learn. That is really what we do at these events. The more I think about it, though, the more I see that Share is at the center of all of it (literally, too, for those paying attention). When we Connect with others, it involves Sharing something of ourselves. When we Learn, that involves someone sharing their knowledge with others. In the age of social media, a massive number of people share so much information with others, including what their lunch looked like, where they are now, where they are going, who they are with, where they want to be, which Lord of the Rings character they are, and “It’s Complicated.”

Sharing is at the root of so much that is happening today. We are sharing more within our organizations with the rather new Enterprise Social movement with Lync and Yammer and other similar technologies. We are sharing our passion for coding with the next generation with Reshma Saujani’s Girls Who Code and Lynn Langit’s Teaching Kids Programming. We are sharing our treasure to support interesting projects via Kickstarter. We are sharing our data and insights more effectively with Power BI (I couldn’t resist. I’m a BI guy after all). With the Self-Serve BI movement in general, BI professionals like me are sharing the experience of enabling others to explore data with an ever larger group of people. So many of today’s tools and technologies revolve around making insights easier to get. Why should the CXO be the only one with access to interesting data about our organizations? Why not share it with people at all levels that can make better decisions via that data? We are proving more and more that sharing what we have does not make us weaker, it makes us stronger. It is such an exciting time to be working with Data during what can easily be called a Renaissance in that respect.

Data is everywhere. It permeates (wow, fancy word, eh?) our society in ways you may not think about. We are learning more and more, as well, that it is not just THAT we share Data, but HOW we share it that makes a big difference. I was ECSTATIC to see the mighty Data Visualization expert David McCandless (Website|Twitter) was delivering a keynote. There is SO MUCH bad data viz out there. So many people LYING to us by presenting what might otherwise be good data in a way that is intended to mislead. It is not going to get better until we learn to see through it and the people LYING with data, who rely on our ignorance, see that ignorance evaporate. It is not just those extreme cases, though. There are so many people with great intentions who try to share data effectively but just don’t know how. Often, they mistakenly obscure the meaning of data with something shiny. They don’t realize that it is the DATA that should POP, not the shiny. Data Visualization is an area that I am just digging into. I find it exciting and strewn (another fancy word) with possibilities.

We teach our children to share at school, at home, at the park, at parties, etc. We, as adults, are getting back into sharing. And, by sharing data, we get back into another activity from childhood: play. Just as children learn through play, so too do we, as data professionals, learn through playing with Data. It is through play that we can find the patterns and relationships we didn’t know were there. As David McCandless said, “There is so much data in the world, what else can you do but play with it?”

I have said it before, and I will say it again: events like the PASS Business Analytics Conference are an amazing opportunity. If you have never been to a conference like that, or a SQL Saturday, I implore you to make every effort to try it. I have NEVER been disappointed that I attended a conference. Ever. I have gotten something valuable out of each and every one. I have one last piece of advice to share: Go find something you are passionate about. Connect with others in your community. Share your passion with them. Learn together. Repeat.

I know that was more than one. But, hey. It’s my blog. I’ll do what I want. :)

<mic drop>

PASS Business Analytics Conference 2014 – Live Blog – Keynote Day 2

9 May, 2014 (10:00) | Business Analytics, Live Blog, PASS | By: Mark V

Good morning. It’s Day 2 here in San Jose, CA, for the PASS Business Analytics Conference. Today’s speaker is David McCandless, data visualization expert. Data Visualization is such an important field, in my opinion. There is a lot of junk out there that purports to be using data effectively but instead is just seeking to confuse. We need to do better. The insights of people like David can help us do that.



Getting ready for the Day 2 Keynote. People are filing in.


Lights are dimming. Here we go.


Denise McInerney taking the stage to introduce David McCandless.


David’s in the house…


David’s key phrase: Information is Beautiful. I have to agree.


Let’s start with Billions. Billions of dollars or something else. What do Billions look like?


David showing a Treemap. Hey, I saw one of those yesterday! #PowerBI


Americans give hundreds of billions of dollars per year to charitable causes.


David showing Daily Bread: Infographic on how much we, as Americans, spend per day on programs like NASA, Housing, etc.


David showing blocks falling into place representing various economic expenditures, etc: Debtris (Playing on Tetris)


David: I LOVE to play with data. Just like children learn through play, so can we if we play with our data. #Explore


David is showing a chart of Fears that look like mountains in our way. Nice touch.


Highlighting the Fear of violent video games. Spike in November. And a Spike in April (related to Columbine tragedy)


Columbine often associated in media with violent video games. #Garbage


Most common “Break Up” times according to Facebook status updates. Mondays are apparently a popular day to break up. #TheMoreYouKnow


“Data is the New Oil” #GreatMetaphor But David’s is better: Data is the New Soil


The best way to start really using your data? Start asking questions.


“Do horoscopes always say the same thing?”


David scraped most common words in horoscopes. Boiled down: Whatever the situation or secret moment, enjoy everything a lot.
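That horoscope experiment is easy to replay yourself. Here is a minimal sketch in Python, with made-up snippets standing in for the scraped text (these are not David’s actual corpus or code):

```python
from collections import Counter
import re

# Hypothetical snippets standing in for scraped horoscope text.
horoscopes = [
    "Enjoy the moment. Whatever the situation, a secret opportunity awaits.",
    "A lot can change. Enjoy everything the moment offers.",
    "Whatever happens, enjoy the situation and keep your secret close.",
]

# Tokenize, lowercase, and count every word across the corpus.
words = re.findall(r"[a-z']+", " ".join(horoscopes).lower())
common = Counter(words).most_common(5)
print(common)
```

`Counter.most_common` is all you need to "boil down" a pile of text to its most frequent words; swapping in real scraped horoscopes is just a matter of changing the input list.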


I learned a LOT in this process, just by playing


US Military Budget is MUCH larger than everyone else’s. But does that mean we are way more militaristic?


When you compare military budgets as a percentage of GDP, the US is 6th. #GoingOneMoreLayerToFindMeaning


Context is SO important to get a clear, true picture of our data. “Let the data set change your mindset”


Visualizing what a Million Lines of Code looks like. The Large Hadron Collider has more than 50 million lines of code… I can’t even count them.


Cool visualization: Our galaxy creates 7 new stars per year. Planets are the norm, rather than the rarity we thought they were.


Given all that, statistically speaking, there could be 46 communicating civilizations in our galaxy.
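That kind of estimate is a Drake-equation-style multiplication: a star-formation rate times a chain of fractions. Here is a hedged sketch with illustrative parameter values (every value below is my assumption, not David’s inputs; different plug-ins give wildly different answers):

```python
# Drake-style estimate of communicating civilizations in the galaxy.
# All parameter values are illustrative assumptions.
R_star = 7.0   # new stars formed per year (the figure from the talk)
f_p = 0.9      # fraction of stars with planets
n_e = 0.5      # habitable planets per planetary system
f_l = 0.5      # fraction of those where life emerges
f_i = 0.1      # fraction of those where intelligence emerges
f_c = 0.1      # fraction that develop detectable communication
L = 10_000     # years a civilization keeps communicating

N = R_star * f_p * n_e * f_l * f_i * f_c * L
print(f"{N:.0f} civilizations under these assumptions")
```

The interesting part is not the final number but how sensitive it is: nudge any one fraction and the estimate swings by an order of magnitude. #PlayWithIt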


“Pimp your data”


Of all the work I have done in my career, there is one thing I am most famous for: a Helicopter game that was ripped off by Flappy Bird.


Our brains are incredibly visual. Three out of four neurons in our brain are visual.


We process visual images far better than text.


David showing a visualization: Who’s Suing Whom in Telecoms? Crazy pattern. Begs the question: Do decreasing profits lead to an increase in lawsuits?


David showing most popular Internet Search Terms laid out in maps of the countries in which they were used. Brilliant.


Great Venn Diagram showing how Pigs, Birds, and Humans relate by Flu: Influ-Venn-Za #MyKindOfHumor


If Twitter were made up of 100 users, 20 would be Dead (Empty accounts), and 75% of the Tweets would come from 5 users (Loud Mouths).


Showing visualization of most commonly used PIN numbers (gathered from previous data breaches). You guys, please choose better ones.


Data Visualization lets you see the Invisible


Is there a relationship between the efficacy and the popularity of vitamin supplements? Not much.


87% of supplements have no research behind them that indicates they have value. #Marketing


Key aspect of Data Visualization: You can compress a massive amount of information into a small, meaningful space.


Data Visualization is SO important for Mobile where you have much less screen space to work with.


David: It used to take me three days to make that Billion dollar Treemap. Now it takes 3 milliseconds. #ProgressYo
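The layout behind a treemap is simpler than it looks: carve a box into rectangles whose areas are proportional to the values. Here is a minimal “slice” layout sketch in Python (my toy version; real tools like Power BI typically use the fancier squarified algorithm):

```python
def slice_layout(values, x, y, w, h):
    """Lay out rectangles with areas proportional to `values` inside
    the box (x, y, w, h) by slicing it vertically. Returns a list of
    (x, y, w, h) tuples -- a minimal treemap, not the squarified
    algorithm real tools use."""
    rects = []
    total = sum(values)
    for v in values:
        frac = v / total          # this value's share of the total area
        rects.append((x, y, w * frac, h))
        x += w * frac             # advance to where the next slice starts
    return rects

# Four spending categories laid out in a 100 x 100 box.
boxes = slice_layout([40, 30, 20, 10], 0, 0, 100, 100)
```

With values [40, 30, 20, 10] in a 100×100 box, the first rectangle comes out 40 units wide, i.e. 40% of the area. Alternating the slice direction at each level of a hierarchy gets you the classic “slice-and-dice” treemap.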


Qualitative Data is also Invisible. But you can Visualize it to fix that.


David: When visualizing some political comparisons, I REALLY wanted my viewpoint to look better than the other, but I couldn’t do that. #Honesty


There are organizations that use data visualization to LIE. Learn more about Data Viz so you can see the signs of impropriety.


Data Visualization of views can better reveal the places where opposing viewpoints actually overlap.


In the Newspapers, there is often an anxiety over white space: They Fear it. As a result, the meaning of the data is lost in noise.


In Data Visualization, Bling can create a barrier to understanding. It is the Data that should POP, not the shiny.


In Data Visualization, you should use your scissors more than your paint brush. #LessIsMore


Key takeaway from David: PLAY. Play leads to insight and understanding.


David showing visualization: Who is more Popular: Han Solo or Luke Skywalker? David: Interesting, no mention of either prior to 1977…


David: There is so much data in the world; what else can you do but play with it? #Exactly


Go to informationisbeautiful.net for David’s visualizations and great insights and data.


David: I never design ANYTHING until the data is sorted and clean. #FollowThisExample


David has designed a symbology for stars, so that they are not just points of light, but he lacks the resources to build it. Kickstarter?


David: Data is not this empirical thing that is immune from bias. It takes work and effort to keep bias out.


David: In terms of color, sometimes you need to come up with a meaningful metric that will be served well by color.


That wraps up the Day 2 Keynote here at the PASS Business Analytics Conference. Thanks much for following. I hope you found it helpful.

PASS Business Analytics Conference 2014 – Live Blog – Day 1 Keynote

8 May, 2014 (09:47) | Business Analytics, Live Blog, PASS, Power BI, SQL | By: Mark V

Good morning from the PASS Business Analytics Conference in San Jose, CA! The Day 1 Keynote by Microsoft’s Amir Netz and Kamal Hathi will begin soon after the opening by PASS President Thomas LaRock, which begins in about 15 minutes. If you want to follow along on Twitter, pay attention to the #passbac hashtag.


Attendees are filing into the Dell Grand Ballroom.


I have to apologize to my fellow live bloggers for being so handsome and witty. I can’t help it.


Attendees continue to file in. We should be getting started in a few minutes.


Here we go, yo.


“Business Intelligence and Business Analytics are at the forefront of business today.” #Truth


PASS President Thomas LaRock taking the stage.


Tom: We get paid to work with data every day. How awesome is that? #Truth


Regardless of your title, if you work with Data, you are the lifeblood of your organization.


Tom: @sqlpass has over 100,000 members! Awesome!


Tom: Find a way to get involved with @sqlpass somehow. There are so many ways to get involved.


Tom: Pick up networking buttons at the Community Zone. Minimum count of “pieces of flair” is 31.


Make sure you spend some time with the sponsors. They make events like this possible. Woohoo!


Tom: Give the sponsors a hug. #Embracing


Tom: Next year’s PASS BA Conference will be in Santa Clara, CA


John Wittaker of Dell taking the stage: Big Data, Predictive, & The Middle Market


John: Big Data projects are not just for the enterprise


John: Not only is Big Data happening, but it is being invested in at the appropriate level


John: When IT and Business COLLABORATE, we are much more likely to have a successful outcome. #NoBrainer


John: “Real Time” and Predictive Analytics are the most valuable Big Data tools, according to Dell’s survey


John: Big Data Challenges = Complexity, Volume, and Budget


John: Data Complexity, though, is the major challenge today.


John: You will see vendors like Dell and others abstracting that complexity to make things easier.


John: Dell’s goal is to offer an end to end Information Management solution to manage, integrate, and protect data


Microsoft’s Amir Netz and Kamal Hathi taking the stage. This is going to be awesome!


Kamal: It’s been an amazing year of innovation at Microsoft: Power BI, Power Query, Power Map, HDInsight, SQL 2014…


Amir: We have a new CEO, Satya Nadella. Satya introduced SQL 2014. A CEO has never introduced SQL like that before.


Kamal: While I like Satya for a lot of reasons… Satya has made Indian accents “cool” now. #WellPlayed


Kamal: 2 million+ #PowerPivot downloads. 100 thousand + #PowerQuery downloads since Feb 2014


Kamal: Let’s ditch this PowerPoint and get over to #PowerBI and look at data the way we want to.


Kamal: #PowerBI Tenants by country in Q&A: It is all over the world. Way cool


Kamal: In Iceland, #PowerBI is kind of hot. #MyKindOfHumor


Kamal: Q&A has been a huge driver for #PowerBI adoption


Kamal: Over 1 million questions asked in Q&A in April 2014. Wow. Way cool.


Kamal is using Q&A in #PowerBI to learn about usage of Q&A in #PowerBI.


Kamal is doing #PowerView style data exploration directly in Q&A: “We made this feature more discoverable” #Awesomesauce


Amir: Power BI In Action, discussing the #PowerBI Demo Contest. Winner: Michael Carper, who is taking the stage now.


Amir is walking through Michael’s contest entry about the relationship between Tweets and #NBA Game results


Amir could make watching paint dry sound exciting. Love his style.


Kamal now showing browser usage data by using #PowerBI in #Chrome #HTML5


Kamal: We’re doing this demo on a Samsung Chromebook. #HTML5 #PowerBI will just work on any platform


Amir: Native iOS app for #PowerBI will be available this summer.


Kamal: We want you to be able to take the investments you have already made on prem and use them in the Cloud #NotAllOrNothing


Amir: When was the last time you saw AdventureWorks in a keynote demo???


Amir: Reporting Services will become a natively integrated component of #PowerBI this Summer #ssrs Great News!


Kamal: The data exists in YOUR environment, but we can still render it with #SSRS in the cloud.


Kamal: On premises is still hugely important. We are working on bringing more of #PowerBI to on prem.


Kamal: YOUR BI ON YOUR TERMS. That is our promise to you. #LoveIt #MSBI


Amir: Today, we are on the verge of a new age. DATA CULTURE: Give the power to EVERYONE #BIToTheMasses


Amir: Data Culture is when everyone in an organization can use data


Amir: Aloha to a demo on Hawaii


Amir: Hawaii collects information forms from travelers on what they did in Hawaii just so we can have a cool demo today.


You guys, Q&A is just friggin cool. #NuffSaid


Amir: Moloka’i, known for its Leper Colony, is not at the top of the list of most visited Hawaiian Islands. #TheMoreYouKnow


Amir analyzing when people from various nations visit Hawaii in Q&A directly. Not in #PowerView. IN Q&A.


Amir: Hawaii is the Niagara Falls for the Japanese.


Amir: You want a great investment? Invest in Honeymoon Suites for the Japanese in Hawaii. #ActionableInsight


Amir introducing KPIs in #PowerBI #AweYeah


Amir created a cool dashboard in less than a minute just using natural language. NO CODE. #PowerBI


Amir: How can we make Predictive Analytics and Data Science so easy that anyone can use it?


Amir: Announcing FORECASTING in #PowerView! #PredictiveAnalytics


Amir: We have some of the best Data Scientists and Researchers at @Microsoft. We put them to work to make it easy.


Amir: Forecasting is available NOW in every Line Chart in #PowerView Just click and drag the line forward to forecast


Amir: You can even correct for one-time events by clicking and dragging in the chart.


Amir: You can even TEST your forecast against older data to compare with actual results. #IWorkInSuchAGreatIndustry
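Backtesting a forecast against held-out history is easy to sketch yourself. Here is a toy linear-trend version in Python; it only stands in for the idea of fit-then-compare, not for Power View’s actual forecasting models:

```python
def linear_forecast(series, steps):
    """Fit a least-squares line to `series` and extrapolate `steps`
    points ahead -- a toy stand-in for a real forecasting model."""
    n = len(series)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(series) / n
    slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, series))
             / sum((x - x_mean) ** 2 for x in xs))
    intercept = y_mean - slope * x_mean
    return [intercept + slope * (n + i) for i in range(steps)]

# Backtest: hold out the last 3 points, forecast them, compare to actual.
history = [10, 12, 14, 16, 18, 20, 22, 24, 26]
train, actual = history[:-3], history[-3:]
predicted = linear_forecast(train, 3)
print(predicted, "vs actual", actual)
```

The point of the demo is exactly this loop: train on the older data, forecast the window you held back, and measure how far off you were before you trust the model with the future.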


Amir and Kamal bring Julie (audience volunteer) on stage for live BI demo.


Amir: Here we have a #PowerView report created around shark attacks. What if the user has other questions NOT answered by the report?


You can now get the Field List for a report in #PowerBI. So, users can tweak an existing report for their needs


Treemap introduced for #PowerView


Julie is dragging fields from one chart to another to explore a report and change it on the fly. #BIToTheMasses


Kamal: It’s a good time to caveat, this is a pre-release item… #AsTheDemoHiccups


Wow. Drag items out of one chart into a blank area to create a NEW chart


Amir: Getting attacked by a shark while swimming is not a very good thing. #ActionableInsight


Amir: If you go into the Ocean, take a weapon with you. Shark attacks while spear fishing have higher survival rate.


Amir: This “Gender” field is about the victim, not the shark. No time to check while getting attacked. #NotRealTime


Amir: Which shark likes the ladies the most? The Bull Shark. #Duh


This demo with shark data is not just a fish story. #TellingStoriesWithData


Amir just did the “bump and shake” to combine two charts in #PowerView. Wow. Way cool.


Amir: The data does not lie – If you put sharks in swimming pools, people will learn to swim much faster #Truth


Amir: These features aren’t just for BI Pros, they are for EVERYONE! #BIToTheMasses


Amir: We are starting a new journey and we are so excited you are coming along with us. #MeToo


HUGE thanks to Microsoft’s Amir Netz and Kamal Hathi on another awesome keynote. #ILoveMyJob


That wraps it up for this morning’s Keynote. Thanks for following. I hope you found it helpful.