Clear(er) Accountability: 4 Key Takeaways from the Policy on Results so far…

Treasury Board of Canada Secretariat’s Policy on Results (published July 2016) is intended to shift how departments report on, and communicate, their progress and value to Canadians, the country, and the world at large. The structure of the new policy (including the new Departmental Results Framework and Performance Information Profiles) provides an opportunity to link spending to real (read: tangible) results in a way that the previous structure didn’t.

With influences from Deliverology, the new policy addresses quite a few gaps in the previous approach, including: better consideration of the audience (parliamentarians and the Canadian public), increased attention to target setting, and new reporting components (delivery chains, for example) that communicate more comprehensive performance stories. Once departments are operating in compliance with the policy, there should be a much clearer picture of the impact being made by departmental spending.

We’ve had an opportunity to work with several departments that are in the process of developing their Departmental Results Frameworks and Performance Information Profiles, and we want to share some of our key takeaways so far:

Outside-In Perspective

For many programs, past performance reporting has relied on internally focused measures, typically volumetrics (# of benefits delivered, # of inspections conducted, etc.). As these programs develop PIPs and provide input into DRF-level “results”, they will be required to shift to more of an outside-in perspective and consider how their work (activities and outputs) provides value to external parties. This will be especially difficult for programs whose ultimate outcomes are more indirect, such as those that:

  • Provide funding through delivery partners (Provinces, non-profit organizations, etc.)
  • Seek to influence results in areas with many external factors (for example: pollution reduction and financial security)

The exercise is also shining a bright light on the fact that performance measures within programs have, to date, been very inward facing, and it’s going to take a while before programs become comfortable with (and good at) putting themselves in the shoes of their clients.

Complex Programs = Complex Results

Programs are not static entities. Their client groups, outputs, activities, and even intended outcomes change over time to account for different political priorities, legislation/regulations, funding, and client needs. As programs evolve, their delivery mechanisms change, especially where the scope of responsibility has been adjusted (typically adding or removing specific client segments). Additionally, some programs are more susceptible to demand fluctuations, and can suffer drops in performance as a result of external factors.

The challenge is in capturing clear and concise program outcomes and results without attempting to represent every inevitable exception.

Harmonized Measurement

For some time now, Departments have been working to improve organizational maturity in performance measurement, including data collection, management, and analysis. Progress has been positive, albeit incremental. One of the biggest barriers has traditionally been that each program or organizational unit collects and uses its own data, irrespective of what other (potentially similar) programs may be collecting.

The new Performance Information Profile requirement has the potential to act as a widespread “reset” for performance measurement, and an opportunity for better alignment of measurement approaches across departments. If multiple programs provide the same (or a similar) output to different client groups, serious consideration should be given to how they can report a similar performance story. The shift is going to cause frustration for programs in the short term, but it will benefit both the department and Canadians in the long run.

Policy-Results or Service-Results? (hint: Both)

There’s another TBS policy that departments are working to comply with: the Policy on Service. Where the Policy on Results is meant to improve results related to high-level policy objectives, and how departments report progress to external audiences, the Policy on Service is designed to improve…services. The problem we are seeing is that the Policy on Results doesn’t take service delivery into consideration, meaning that programs won’t be able to communicate service delivery as part of their performance story. There’s always been a bit of a divide between policy shops and service-delivery teams, and these two policies are no exception.

The new focus on high-level policy results is an important step for the Government of Canada. However, we suspect that the audiences (parliamentarians, Canadian public) care about how well services are being delivered as well as their overall impact. We’ve been working with a few departments to reconcile this divide and we’re curious to see how others tackle it too.

Those are our biggest observations at this point, and we’re sure we’ll have many more as the implementation of the Policy on Results continues to unfold.

The Government of Canada dedicates massive amounts of attention and resources to addressing many of the big (hairy) problems facing the country. The impact of these efforts is often the sum of different programs contributing to the same results, and the new Policy on Results should help improve planning, reporting, and adjustment going forward. There will surely be more challenges along the way, but we’re looking forward to seeing how these improvements are implemented, and how they will drive evidence-based decision making in the future.

