A shift in approach between Business Intelligence and Data Analytics

Over the last few years, Data Analytics (DA) has evolved substantially beyond traditional Business Intelligence (BI) approaches to supporting decision making within organizations.[1] In many cases, existing BI systems are not structured in a way that allows them to keep pace with the demands being placed on them.
The desire for more “evidence”[2] with which to make informed decisions is being felt across government. Increasingly, DA is seen as the primary way of remaining informed. The shift from BI to DA will require a substantial change in the way these systems are approached from a skills and resourcing perspective.

Expectations have been mounting on IM/IT teams to deliver dynamic analytics capabilities to the areas they support. Figure 1 illustrates the general levels of maturity that organizations are working to implement in terms of their data use; the expectation is that as data use matures, organizations will be able to derive more value from their data. Traditional BI and analytics models rely heavily on the continual assistance of IM/IT professionals, who are needed not only to implement interface layers but also, on an ongoing basis, to build reports for clients to use. This approach provides clients with standard views into historic data that give them insight into the performance of their programs. These solutions also allow for some exploration of structured data through ad hoc querying, but users often find these capabilities lacking. The reliance on IM/IT professionals to build views, combined with the increasing demand for new and more dynamic views, often leads to what feel like intolerable wait times for clients who need answers to pressing questions quickly.

This approach to analytics falls into the first maturity level of data-driven analytics.[3] This level is often referred to as Descriptive Analytics and is characterized by using past data to answer questions like “what happened?”, “where did it happen?” or “how often did it happen?”. DA has evolved beyond this to include two further levels of maturity, often labelled Predictive Analytics and Prescriptive Analytics.[4] These levels build on each other to develop more advanced capabilities that allow for answers to questions like “what might happen?” and “what is the optimal approach?”. The increasing demand for more evidence reflects a desire for answers to questions beyond those that Descriptive Analytics can answer. Strong BI capabilities form the foundation of Descriptive Analytics, but this provides only insight into, not answers to, the types of questions that should be asked about the future.

Unfortunately, a strong BI function does not guarantee a smooth transition from the realm of description to prediction. This is due partly to a fundamental shift in how skills are approached between standard implementations of the two levels of analytical maturity. Even at the height of the BI-centered approach, the heavy reliance on IM/IT professionals described previously created issues of scalability.[5] IM/IT teams were often hard pressed to keep up with the demand for new views. Predictive Analytics approaches encourage iteration and experimentation, which, under the traditional BI model, produces an influx of requests that makes it almost impossible for IM/IT teams to keep up without a significant, and possibly ill-advised, increase in resources. To address this resource demand, many organizations at or beyond the second level of maturity have instituted a self-service DA model.[6] A challenge that has existed for some time now across all sectors is a critical skills gap between IM/IT professionals and traditional program analysts. This gap is a major limiting factor when it comes to implementing higher maturity levels of analytics within any organization.

At this point, the idea of the Data Scientist as the solution to this challenge is well established. Unfortunately, these individuals remain in short supply. Organizations will need to find more creative ways of addressing this skills gap if increasing DA maturity is an immediate goal. This should include encouraging and supporting existing analysts to develop enough skills to get by with less direct assistance from IM/IT. Becoming familiar with self-service tools such as Tableau or Power BI can also help close the gap. Thanks to the explosion of interest in DA, a plethora of learning and development resources is available to help analysts build the skills they need. The strategy for filling the skills gap should also include more deliberate approaches to team composition: if the necessary skills cannot be found in one person, it is often much easier to build a team that collectively covers them. IM/IT will increasingly act as a coordinator and facilitator as these skills become more and more embedded in the program areas. Although this shift will be a challenge for the program areas, the end result should allow for much more meaningful analysis. Giving subject matter experts greater control over the analysis approach should lead to better findings than concentrating analysis expertise in a group that has no context for the data it works with.

[1] Jason Lewis, “2018 Planning Guide for Data and Analytics” (Gartner, 2017), 3-4.

[2] Note: Too often, all “evidence” provided in the form of data is treated equally. As we strive to use more data-driven evidence in our decision making, it is critical to remember that not all evidence is created equal. This is a topic I hope to explore further in subsequent posts.

[3] Thomas H. Davenport and Jeanne G. Harris, Competing on Analytics (Boston: Harvard Business Review Press, 2017), 26-27.

[4] Note: A fourth level of maturity is sometimes included, which is referred to as Automated Decision Making. For the purposes of this article the fourth level is far enough beyond the transition being discussed that it can be excluded.

[5] Lewis, “Data and Analytics”, 13.

[6] Lewis, “Data and Analytics”, 13.

Tyler Sinclair is a consultant with three years of experience working within the public sector. He has acted as a business analyst on a number of strategic initiatives and IM/IT-enabled projects for several departments within the Canadian Government. He is part of Systemscope's Strategic Business Consulting practice.
