MAF 2022 to 2023 service and digital management

On this page

AoM Context

The 2022-23 Service and Digital Area of Management (AoM) assessment focused on advancing the delivery of services and the effectiveness of government operations through the strategic management of government information and data and the leveraging of information technology, as well as on other areas under the Policy on Service and Digital such as cyber security, open government and accessibility. As we continue to explore areas for innovation and experimentation, the questions in this year’s Management Accountability Framework (MAF) were designed to move the dial on the 2021-22 results by focusing on increasing the integration of service, information, data and technology in governance, planning and reporting; increasing client-centric design and delivery; and enhancing the strategic management of information, data, service and technology.

The 2022-23 Service and Digital Area of Management methodology examines large departments and agencies on data and information management, information technology, client-centric design and delivery, accessibility, open government and cyber security.

This cycle’s MAF methodology focuses on 4 strategic themes:

  1. Building the enterprise (Integration) 
    Increasing the integration of service, information, data and technology in governance, planning and reporting.
    This strategic theme encompasses the following questions: Q16, Q17 & Q18
  2. Focus on the user (Client-centricity)
    Increasing client-centric design and delivery
    This strategic theme encompasses the following questions: Q1, Q2, Q3, Q4, Q5, Q6, Q7, Q8 & Q14
  3. Strategic assets: data, information, technology and service (Strategic management)
    Enhancing the strategic management of information, data, service and technology
    This strategic theme encompasses the following questions: Q9, Q10, Q11, Q12, Q13 & Q15
  4. Workforce of the future (Capacity)
    Increasing workforce capacity relating to service, information, data and technology and supporting fully digital delivery by managing a government-wide culture shift
    This strategic theme is not assessed in this cycle of MAF.

Theme 1: Client-centric service design and delivery

Theme 1 Overview

The client-centric service design and delivery theme provides insight into service standards, the availability of online services, publishing of real-time performance and service improvement activities. This will allow a focus on key indicators, ensuring the quality and continuity of service to Canadians.

Theme 1 Sub-Category A: Service Standards Maturity

Theme 1 Sub-Category A Overview

Service standards reinforce government accountability by making performance transparent. They also increase the confidence of Canadians in government by demonstrating the government’s commitment to service excellence. Setting and meeting service standards are essential for good client service and effectively managing performance. They help clarify expectations for clients and employees and drive service improvement. Service standards also help clients make time-sensitive, important decisions about accessing services and other expectations relating to services.

Service standards should also be regularly reviewed and improved to ensure that they are comprehensive, meaningful and relevant. Reviewing service standards helps identify gaps or areas for improvement and courses of action to address key gaps in performance or to keep up with client expectations as services evolve.

Question 1: Existence of service standards Preserved (Element A of Q1 in 2021-22)

The question is: As service standards are required under the Policy on Service and Digital, what is the percentage of services that have service standards?

To answer:

  • TBS to answer
  • Department or Agency to answer
  • Both TBS and Department or Agency to answer

Rationale

Under section 4.2.1.4 of the Policy on Service and Digital, Deputy heads are responsible for ensuring that services have comprehensive and transparent client-centric standards, related targets, and performance information for all service delivery channels in use, and that this information is available on the department’s web presence.

Setting service standards is essential for good client service and effective management of performance and demonstrates the government’s commitment to service excellence. Service standards help clarify expectations for clients and employees and drive service improvement. Service standards also help clients make time-sensitive, important decisions about accessing services and other expectations relating to services. Services that do not have any standards cannot be managed effectively, thereby eroding trust in government.

Category

Expected Results

Target: 100%

Assessed Organizations

All LDAs will be assessed.

Period of Assessment

April 1, 2021-March 31, 2022.

Calculation Method

(Number of services that have at least one service standard / Total number of services) * 100

Rating Scale:

  • High: [80-100]
  • Medium: [50-79]
  • Low: [0-49]
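The calculation above, together with the rating scale, is the pattern used for most GC Service Inventory percentage indicators in this methodology (it recurs in Questions 2 through 8 and 14). A minimal sketch of the arithmetic follows; the record structure and the field name has_service_standard are illustrative assumptions, not part of the GC Service Inventory schema.

```python
# Minimal sketch (not an official TBS tool): computing a Service Inventory
# percentage indicator and its MAF rating band. Field names are illustrative.
from typing import Iterable, Mapping

def percentage(numerator: int, denominator: int) -> float:
    """Return a percentage, guarding against an empty inventory."""
    return 100.0 * numerator / denominator if denominator else 0.0

def rating(score: float) -> str:
    """Map a 0-100 score onto the High/Medium/Low rating scale used here."""
    if score >= 80:
        return "High"
    if score >= 50:
        return "Medium"
    return "Low"

def q1_existence_of_service_standards(services: Iterable[Mapping]) -> tuple[float, str]:
    """(Services with at least one service standard / Total services) * 100."""
    services = list(services)
    met = sum(1 for s in services if s.get("has_service_standard"))
    score = percentage(met, len(services))
    return score, rating(score)

# Example: 7 of 10 services have at least one standard -> (70.0, "Medium")
services = [{"has_service_standard": i < 7} for i in range(10)]
print(q1_existence_of_service_standards(services))
```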

Evidence Requirements

  • Department to provide evidence
  • TBS to provide evidence
  • Other evidence to be considered (please provide)

Data collection Method

Data from the GC Service Inventory will be used for the calculation.

Government-wide Comparison

  • Yes the results of the indicator will be used for comparison across departments
  • No (please provide an explanation)

Year-Over-Year Analysis

  • Yes
  • No

Departmental Results Framework (DRF) (TBS use only)

Is this indicator used in the TBS DRF?

  • Yes
  • No

Annex A

N/A

Annex B (if applicable)

N/A

Question 2: Service standards targets Preserved (Element B of Q1 in 2021-22)

The question is: What is the percentage of service standards that met their target?

To answer:

  • TBS to answer
  • Department or Agency to answer
  • Both TBS and Department or Agency to answer

Rationale

Under section 4.2.1.4 of the Policy on Service and Digital, Deputy heads are responsible for ensuring that services have comprehensive and transparent client-centric standards, related targets, and performance information for all service delivery channels in use, and that this information is available on the department’s web presence.

Measuring and meeting service standards is essential for good client service and effectively managing performance. Service standards help clarify expectations for clients and employees and drive service improvement. Clients gain confidence in the government when standards are met consistently. Departments are encouraged to allocate resources to meet any new or improved service levels. Services that do not meet their standards risk eroding trust in government.

Category

  • Policy Compliance
  • Performance
  • Baseline

Expected Results

Target: 80% (High)

The short-term target is 80% and the long-term target is 100% for policy compliance. Based on the last MAF cycle, the indicator result was assessed at a medium level. To set a realistic target for departments, the current target is 80%.

Assessed Organizations

All LDAs will be assessed.

Period of Assessment

April 1, 2021-March 31, 2022.

Calculation Method

(Number of service standards met / Total number of service standards) * 100

Rating Scale:

  • High: [80-100]
  • Medium: [50-79]
  • Low: [0-49]

Evidence Requirements

  • Department to provide evidence
  • TBS to provide evidence
  • Other evidence to be considered (please provide)

Data collection Method

Data from the GC Service Inventory will be used for the calculation.

Government-wide Comparison

  • Yes the results of the indicator will be used for comparison across departments
  • No (please provide an explanation)

Year-Over-Year Analysis

  • Yes
  • No

Departmental Results Framework (DRF) (TBS use only)

Is this indicator used in the TBS DRF?

  • Yes
  • No

Annex A

N/A

Annex B (if applicable)

N/A

Question 3: Real-time performance for service standards Updated (Q5 in 2019-20)

The question is: As real-time performance reporting is required under the Directive on Service and Digital, what is the extent to which real-time performance reporting for services is published?

To answer:

  • TBS to answer
  • Department or Agency to answer
  • Both TBS and Department or Agency to answer

Rationale

Real-time performance reporting is an essential tool for enabling the public to hold the Government of Canada to account for its service provision. Publishing real-time performance information also informs the public about the current level of service provision they can expect and supports them in making informed decisions. Real-time performance reporting is an integral part of service management and of implementing client-centric service design and delivery. Under subsection 4.2.1.6 of the Directive on Service and Digital, the requirement is to ensure that real-time performance information for the service standards of newly designed or redesigned online services is available on the department’s web presence. This indicator allows Deputy Heads to better understand how their department implements openness and transparency in its processes for providing services and how these contribute to client satisfaction.

Category

Expected Results

Target: 100%

Canadians know what to expect in terms of service wait times, and deputy heads can see how their departments are adhering to the policy.

Assessed Organizations

All LDAs will be assessed.

Period of Assessment

April 1, 2021-March 31, 2022.

Calculation Method

Percentage of services that have real-time performance results published on the department’s web presence.

(Number of services with real-time performance results published on the department’s web presence / Total number of services) *100

Rating Scale:

  • High: [80 – 100]
  • Medium: [50 – 79]
  • Low: [0 – 49]

Evidence Requirements

  • Department to provide evidence
  • TBS to provide evidence
  • Other evidence to be considered (please provide)

Data collection Method

Data from the GC Service Inventory will be used for the calculation.

Government-wide Comparison

  • Yes the results of the indicator will be used for comparison across departments
  • No (please provide an explanation)

Year-Over-Year Analysis

  • Yes
  • No

Departmental Results Framework (DRF) (TBS use only)

Is this indicator used in the TBS DRF?

  • Yes
  • No

Annex A

N/A

Annex B (if applicable)

Real-time application status: Information on the current standing of a request for a service or product.

Real-time performance information: Information on the current level of performance that clients can expect to be provided for a service, relative to an established standard. (Note: This is different from real-time application status that is the current status of a specific request).

Real-time service delivery performance information can be grouped into three categories based on the frequency of updates and the speed with which information is processed:

  • Timed updates: service delivery performance information is made available to clients based on timed or scheduled events
  • Near real-time updates: service delivery performance information is made available to clients with minimal delay
  • Instantaneous updates: service delivery performance information is made available to clients immediately and without delay

Question 4: Service standards reviews Updated (Element C of Q1 in 2021-22)

The question is: What is the percentage of service standards which have been reviewed?

To answer:

  • TBS to answer
  • Department or Agency to answer
  • Both TBS and Department or Agency to answer

Rationale

Once service standards have been developed, they should be regularly reviewed and improved to ensure that they are comprehensive, meaningful and relevant. Reviewing service standards helps identify any gaps or areas for improvement and courses of action to address key gaps in performance. Ensuring the regular review of service standards, related targets and performance information, for all services and all service delivery channels in use is a requirement in the Policy on Service and Digital.

Category

Expected Results

Target: 100%

Reviewing service standards regularly ensures that clients have up-to-date service standard information and helps them gain confidence in the government.

A regular review of whether service standards and operational targets are being met can help senior managers determine whether resource adjustments are required, particularly if the variance between service standard and actual performance is long-standing.

Assessed Organizations

All LDAs will be assessed.

Period of Assessment

April 1, 2021-March 31, 2022.

Calculation Method

(Number of service standards that have been reviewed / Total number of service standards) * 100

Rating Scale:

  • High: [80-100]
  • Medium: [50-79]
  • Low: [0-49]

Evidence Requirements

  • Department to provide evidence
  • TBS to provide evidence
  • Other evidence to be considered (please provide)

Data collection Method

Data from the GC Service Inventory will be used for the calculation.

Government-wide Comparison

  • Yes the results of the indicator will be used for comparison across departments
  • No (please provide an explanation)

Year-Over-Year Analysis

  • Yes
  • No

Departmental Results Framework (DRF) (TBS use only)

Is this indicator used in the TBS DRF?

  • Yes
  • No

Annex A

N/A

Annex B (if applicable)

N/A

Theme 1 Sub-Category B: Online Availability of Services

Theme 1 Sub-Category B Overview

Providing online client-based services is a policy requirement under the Policy on Service and Digital. Online services are convenient for many clients and are significantly more cost-effective than services delivered through more traditional channels such as in-person or telephone. Assessing the extent to which services can be completed online end-to-end and the extent to which applicable interaction points for services are online enables visibility on departmental progress towards this policy requirement.

Question 5: Online end-to-end Preserved (Element A of Q2 in 2021-22)

The question is: As online end-to-end availability of services is required under the Policy on Service and Digital, what is the percentage of applicable services that can be completed online end-to-end?

To answer:

  • TBS to answer
  • Department or Agency to answer
  • Both TBS and Department or Agency to answer

Rationale

Under section 4.2.1.2 of the Policy on Service and Digital, Deputy heads are responsible for maximizing the online end-to-end availability of services and their ease of use to complement all service delivery channels.

Services offered online end-to-end give clients options to interact with government services remotely and electronically; they are a key requirement in the Policy on Service and Digital and improve the accessibility of services. Services with partial or no online interaction points risk not meeting client expectations of accessing government services anytime and anywhere, and also risk increased intake and processing burdens.

Category

Expected Results

Target: 100%

Assessed Organizations

All LDAs will be assessed.

Period of Assessment

April 1, 2021-March 31, 2022.

Calculation Method

(Number of services that can be completed online end-to-end / Total number of applicable services) * 100

Rating Scale:

  • High: [80-100]
  • Medium: [50-79]
  • Low: [0-49]

Evidence Requirements

  • Department to provide evidence
  • TBS to provide evidence
  • Other evidence to be considered (please provide)

Data collection Method

Data from the GC Service Inventory will be used for the calculation.

Government-wide Comparison

  • Yes the results of the indicator will be used for comparison across departments
  • No (please provide an explanation)

Year-Over-Year Analysis

  • Yes
  • No

Departmental Results Framework (DRF) (TBS use only)

Is this indicator used in the TBS DRF?

  • Yes
  • No

Annex A

N/A

Annex B (if applicable)

N/A

Question 6: Online client interaction points Preserved (Element B of Q2 in 2021-22)

The question is: As online end-to-end availability of services is required under the Policy on Service and Digital, what is the percentage of client interaction points that are available online for services?

To answer:

  • TBS to answer
  • Department or Agency to answer
  • Both TBS and Department or Agency to answer

Rationale

In the Policy on Service and Digital, maximizing online availability is a policy requirement. While online end-to-end service is the goal, there are instances where the client has to move offline to complete a step in the process. While departments are encouraged to consider the possibility of providing all outputs online, this question measures departmental progress toward that goal, taking into account all enabled interaction points, even if a service is not yet fully online end-to-end.

See section 2.3 of the Guideline on Service and Digital for a more robust definition of online services.

Category

Expected Results

Target: 100%

Assessed Organizations

All LDAs will be assessed.

Period of Assessment

April 1, 2021-March 31, 2022.

Calculation Method

(Online enabled interaction points / Total number of applicable interaction points) * 100

Rating Scale:

  • High: [80-100]
  • Medium: [50-79]
  • Low: [0-49]

Evidence Requirements

  • Department to provide evidence
  • TBS to provide evidence
  • Other evidence to be considered (please provide)

Data collection Method

Data from the GC Service Inventory will be used for the calculation.

Government-wide Comparison

  • Yes the results of the indicator will be used for comparison across departments
  • No (please provide an explanation)

Year-Over-Year Analysis

  • Yes
  • No

Departmental Results Framework (DRF) (TBS use only)

Is this indicator used in the TBS DRF?

  • Yes
  • No

Annex A

N/A

Annex B (if applicable)

N/A

Question 7: ICT Accessibility New

The question is: As accessibility is required under the Policy on Service and Digital, what is the percentage of services available online that have been assessed for ICT accessibility?

To answer:

  • TBS to answer
  • Department or Agency to answer
  • Both TBS and Department or Agency to answer

Rationale

Ensuring that all services are free from barriers is a government priority and a key objective of the Accessible Canada Act. Given the digital government agenda to make services available online, it is critical that the online components of these services are fully accessible. To achieve this, services should be assessed against the Standard on Web Accessibility, the Guideline for Making Technology Usable by All and the Web Content Accessibility Guidelines (WCAG) to align with international requirements. A baseline is required to assess the state of Information and Communication Technologies (ICT) accessibility across the GC and to inform the ongoing monitoring of performance and compliance with the forthcoming Standard.

This will give Deputy Heads a better understanding of how their departments address the accessibility of their services: how they assist everyone, facilitate the inclusion of diverse segments of Canadians, and enable people with diverse functional needs and abilities to participate fully and productively in all aspects of life, including interacting effectively with the Government of Canada as service clients.

Category

  • Policy Compliance (Policy on Service and Digital, section 4.4.2.2)
  • Performance
  • Baseline

Expected Results

Target: 100%

This question is intended to raise awareness of the current state of ICT accessibility of Government of Canada services and to establish a baseline for oversight and for tracking progress as the community matures and implements accessibility measures.

Assessed Organizations

All LDAs will be assessed.

Period of Assessment

April 1, 2021-March 31, 2022.

Calculation Method

(Number of services with online enabled interaction points that have been assessed for ICT accessibility / Total number of services with one or more online enabled interaction points) * 100

Rating Scale:

  • High: [80-100]
  • Medium: [50-79]
  • Low: [0-49]

Evidence Requirements

  • Department to provide evidence
  • TBS to provide evidence
  • Other evidence to be considered (please provide)

Data collection Method

Data from the GC Service Inventory will be used for the calculation.

Government-wide Comparison

  • Yes the results of the indicator will be used for comparison across departments
  • No (please provide an explanation)

Year-Over-Year Analysis

  • Yes
  • No

Departmental Results Framework (DRF) (TBS use only)

Is this indicator used in the TBS DRF?

  • Yes
  • No

Annex A

N/A

Annex B (if applicable)

N/A

Theme 1 Sub-Category C: Service Review and Improvement

Theme 1 Sub-Category C Overview

A review of services consists of a systematic assessment of an organization’s services against a set of predetermined criteria to identify opportunities for service improvement, including greater effectiveness and increased efficiency. The regular review of services is a key practice in ensuring that services:

Client feedback is a critical input for ensuring that services meet the needs of clients and for supporting continual improvement. It serves several key purposes, including:

Question 8: Client feedback Preserved (Element D of Q3 in 2021-22)

The question is: As ensuring client feedback is used to inform continuous improvement of services is a requirement under the Directive on Service and Digital, what is the percentage of services which have used client feedback to improve services in the last year?

To answer:

  • TBS to answer
  • Department or Agency to answer
  • Both TBS and Department or Agency to answer

Rationale

Service improvement based on client feedback is an integral part of client-centric design and a requirement in the Policy on Service and Digital, with service improvement activities required to be reported in departmental strategic plans. Seeking client feedback and using it to improve services is a foundational element in ensuring client centricity and that the needs of clients are met. Departments must use the various tools at their disposal, such as in-service client feedback, client satisfaction surveys, and user experience testing, to improve their services.

Category

Expected Results

Target: 100%

Assessed Organizations

All LDAs will be assessed.

Period of Assessment

April 1, 2021-March 31, 2022.

Calculation Method

(Number of services which identified 2021-22 as the year in which the service was improved based on client feedback / Total number of services) * 100

Rating Scale:

  • High: [80-100]
  • Medium: [50-79]
  • Low: [0-49]

Evidence Requirements

  • Department to provide evidence
  • TBS to provide evidence
  • Other evidence to be considered (please provide)

Data collection Method

Data from the GC Service Inventory will be used for the calculation.

Government-wide Comparison

  • Yes the results of the indicator will be used for comparison across departments
  • No (please provide an explanation)

Year-Over-Year Analysis

  • Yes
  • No

Departmental Results Framework (DRF) (TBS use only)

Is this indicator used in the TBS DRF?

  • Yes
  • No

Annex A

N/A

Annex B (if applicable)

N/A

Theme 2: Data and Information Management

Theme 2 Overview

Enhanced strategic management of information, data, service and technology

The focus of this theme is to assess how well departments are managing information and data as strategic assets. Improvements in the management of information and data can help departments and agencies accelerate their operations, improve client services and meet policy and regulatory requirements.

Theme 2 Sub-Category A: Open and Strategic Management of Information

Theme 2 Sub-Category A: Overview

Managing data and information as strategic assets and using solutions for digital service delivery drives cross-government improvement.

The focus for this sub-category is section 4.3 of the Policy on Service and Digital: Open and strategic management of information, as well as the Directive on Automated Decision-Making.

Question 9: Standardized metadata Updated (Q6 in 2021-22)

The question is: As standardized metadata is a requirement under the Standard on Metadata, what percentage of active systems used by your department use standardized metadata?

To answer:

  • TBS to answer
  • Department or Agency to answer
  • Both TBS and Department or Agency to answer

Rationale

The use of standardized metadata to manage information makes the use of information and data more accessible, findable, reusable and interoperable. Understanding what percentage of active systems use standardized metadata provides a better understanding of where departments currently stand on their journey of managing information and data. The use of data and information provides Deputy Heads with opportunities to improve how their organization delivers on its mandate and enables the Government of Canada’s digital agenda. The MAF assessment will provide Deputy Heads with a better understanding of their departmental and Chief Information Officer (CIO) performance relative to the policy requirements outlined below.

As is currently required in the Standard on Metadata:

  • requirement 6.1.3: Ensuring that GC standardized metadata and value domains are incorporated in the design and implementation of departmental systems managing information resources.

Category

Expected Results

Target: Medium [50 – 79%]

Assessing the percentage of active systems which use standardized metadata will provide a better understanding of where departments currently stand on their journey of managing information and data as strategic assets.

Higher scores for use of standardized metadata indicate better accessibility, authenticity, sharing, reliability, integrity, findability and re-use through the consistent capture, retrieval, description, use, and maintenance of information and data resources.

Additionally, the validation of the results will enhance awareness within Treasury Board of Canada Secretariat (TBS) of departmental use of metadata in accordance with the instruments and standards.

Assessed Organizations

All Large Departments and Agencies (LDAs) will be assessed.

Period of Assessment

Snapshot as of MAF Portal close date for draft release.

Calculation Method

Departments will complete a template listing all active applications in their Application Portfolio Management (APM) as of the MAF portal close date for the draft release. Applications which use standardized metadata receive a score of one. Applications which do not use standardized metadata receive a score of zero. Applications whose standardized metadata is determined by a connected application count as having standardized metadata, even if technically the applications are separate. Rationale and name of connected application can be added in the additional notes section.

Standardized metadata is metadata which adheres to an institutional instrument or standard and is used consistently for departmental data.

Calculation: (Number of applications that receive a score of one for the use of standardized metadata / Total number of active applications that manage data and information in the department’s APM) * 100

Rating Scale:

  • High: [80 – 100]
  • Medium: [50 – 79]
  • Low: [0 – 49]

Note: TBS will compare the list of applications submitted by departments to the application inventory in APM. To be in scope, applications must be considered ’Active’ (that is, "yes" in the Active field, "in production" in the Application Status field, and "yes" in the Include in Portfolio Assessment field).

Note: To validate the scores in the template listing, departments will be required to provide TBS the institutional instrument(s) or metadata standard(s), international or otherwise, used for departmental use of standardized metadata.
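As a rough illustration of the scoring described above, the sketch below walks through the template logic under stated assumptions; the field names (active, in_production, include_in_portfolio_assessment, uses_standardized_metadata) are stand-ins for the actual APM and template fields, not their real names.

```python
# Illustrative sketch only of the Q9 scoring: one point per in-scope application
# that uses standardized metadata (including metadata determined by a connected
# application), expressed as a percentage of active applications managing data
# and information. Field names are assumptions, not the APM schema.
from typing import Iterable, Mapping

def q9_standardized_metadata_score(applications: Iterable[Mapping]) -> float:
    # In scope: Active, In Production and Included in Portfolio Assessment.
    in_scope = [
        a for a in applications
        if a.get("active") and a.get("in_production")
        and a.get("include_in_portfolio_assessment")
    ]
    if not in_scope:
        return 0.0
    scored = sum(1 for a in in_scope if a.get("uses_standardized_metadata"))
    return 100.0 * scored / len(in_scope)
```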

Evidence Requirements

  • Department to provide evidence
  • TBS to provide evidence
  • Other evidence to be considered (please provide)

Data collection Method

Departments will collect their data internally and complete a template.

Documentary Evidence (MAF Portal). To prove that standardized metadata is in use, departments are encouraged to submit limited documentary evidence (to a maximum of 5 pieces of documentary evidence per application), as necessary.

Government-wide Comparison

  • Yes the results of the indicator will be used for comparison across departments
  • No (please provide an explanation)

Year-Over-Year Analysis

  • Yes
  • No

Departmental Results Framework (DRF) (TBS use only)

Is this indicator used in the TBS DRF?

  • Yes
  • No

Annex A

N/A

Annex B (if applicable)

Glossary of terms:

Active applications: For the purpose of assessing the Service and Digital AoM Q9, applications which are considered ‘active’ in the Application Portfolio Management dataset, as well as an application status of ‘in production’.

Application Portfolio Management (APM): The application portfolio is a dataset completed by departments of all their software applications, along with an assessment of those applications across a multitude of fields. More information on APM can be found on GC Departmental Integrated Planning (gcconnex.gc.ca) and OCIO Application Portfolio Management - GCpedia.

Metadata: The definition and description of the structure and meaning of information resources, and the context and systems in which they exist.

Standardized metadata: Standardized metadata is metadata which adheres to an institutional instrument or standard and is used consistently for departmental data in an application.

System: For the purpose of assessing the Service and Digital AoM Q9-Q12, hardware or software used interdependently or networked to capture, transmit, store, retrieve, manipulate, or display information or data.

In contrast, the Security AoM methodology refers to ‘Production IT systems’: any equipment or system used in the acquisition, storage, manipulation, management, movement, control, display, switching, interchange, transmission, or reception of information or data, with scope limited to systems for which the department or agency is the system owner and that support the department’s or agency’s critical services.

Question 10: Automated decision systems - Algorithmic Impact Assessments New

The question is: As an Algorithmic Impact Assessment is required under the Directive on Automated Decision-Making, what percentage of automated decision systems deployed since April 2020, which support external services, have Algorithmic Impact Assessments completed and released to the Open Government Portal?

To answer:

  • TBS to answer
  • Department or Agency to answer
  • Both TBS and Department or Agency to answer

Rationale

There is a need to determine whether automated decision systems are being deployed in a manner that reduces risks to Canadians and federal institutions, and leads to more efficient, accurate, consistent, and interpretable decisions made pursuant to Canadian law. Algorithmic Impact Assessments (AIAs) allow users to identify, assess, and help mitigate the risks of these systems to clients and to the federal government. Publishing AIAs is critical to algorithmic transparency and accountability, and to compliance with the Directive on Automated Decision-Making. Under the directive, Assistant Deputy Ministers are responsible for completing and publishing an AIA for systems deployed to make or support decisions on external services. This supports the responsibility of Deputy Heads under the Policy on Service and Digital to ensure the responsible and ethical use of automated decision systems in accordance with TBS direction and guidance. When made public, Algorithmic Impact Assessments show Canadian society that the Government of Canada is committed to the responsible use of automated decision systems and to managing information and data in an open way.

Category

Expected Results

Target: 100%

  • Enhanced awareness in TBS of the automation landscape in the federal government to enable effective oversight.
  • Improved compliance with the Directive on Automated Decision-Making.
  • Improved algorithmic transparency and accountability.

Assessed Organizations

All LDAs will be assessed.

Period of Assessment

April 1, 2020 to March 31, 2022

This is a new question, and therefore no previous question has been asked to determine departmental performance. Extending the assessment period to the point at which departments were required to comply with the Directive will allow departments to demonstrate their compliance to date. TBS will be able to update its list of automation projects.

Calculation Method

(Number of AIAs completed and released to the Open Government Portal / Total number of automated decision systems determined to be within the scope of the Directive) * 100

Rating Scale:

  • High: [80-100]
  • Medium: [50-79]
  • Low: [0-49]

Evidence Requirements

  • Department to provide evidence
  • TBS to provide evidence
  • Other evidence to be considered (please provide)

Data collection Method

Departments will submit a list of automation projects supporting decision-making on external services that were launched between April 1, 2020 and March 31, 2022, along with copies of approved and/or published AIAs associated with these projects.

TBS will assess whether automation projects fall within the scope of the Directive to identify (or validate) the subset for which an AIA is needed. A review of AIAs may be conducted to ensure clarity and completeness, as appropriate.

Government-wide Comparison

  • Yes the results of the indicator will be used for comparison across departments
  • No (please provide an explanation)

Year-Over-Year Analysis

  • Yes
  • No

Departmental Results Framework (DRF) (TBS use only)

Is this indicator used in the TBS DRF?

  • Yes
  • No

Annex A

N/A

Annex B (if applicable)

Glossary of Terms (taken from the Policy on Service and Digital and Directive on Automated Decision-Making)

Algorithmic Impact Assessment: A framework to help institutions better understand and reduce the risks associated with Automated Decision Systems and to provide the appropriate governance, oversight and reporting/audit requirements that best match the type of application being designed.

Automated Decision System: Includes any technology that either assists or replaces the judgement of human decision-makers. These systems draw from fields like statistics, linguistics, and computer science, and use techniques such as rules-based systems, regression, predictive analytics, machine learning, deep learning, and neural nets.

External service: A service where the intended client is external to the Government of Canada.

Question 11: Automated decision systems – transparency measures New

The question is: As transparency measures for automation projects are required under the Directive on Automated Decision-Making, what percentage of automation projects launched since April 2020, which support decisions on external services, have adopted applicable transparency measures required under the Directive on Automated Decision-Making?

To answer:

  • TBS to answer
  • Department or Agency to answer
  • Both TBS and Department or Agency to answer

Rationale

There is a need to determine whether automated decision systems are being deployed in a manner that reduces risks to Canadians and federal institutions, and leads to more efficient, accurate, consistent, and interpretable decisions made pursuant to Canadian law. The Directive on Automated Decision-Making establishes a range of transparency measures to help achieve this goal. The measures require federal institutions to provide notice to clients regarding their use of automated decision systems; meaningfully explain decisions made or supported by an automated decision system; publish custom source code; and document automated decisions. These measures are critical to algorithmic transparency and accountability, and to compliance with the Directive on Automated Decision-Making. Under the Directive, Assistant Deputy Ministers, or persons named by the Deputy heads, are responsible for ensuring that their automation projects adopt these measures, which vary by a project’s impact level. This supports the responsibility of Deputy Heads, under the Policy on Service and Digital, to ensure the responsible and ethical use of automated decision systems in accordance with TBS direction and guidance. By evaluating departmental compliance with the transparency measures, TBS can help ensure that the federal government is fulfilling its commitment to the responsible development and use of automation technologies.

Category

Expected Results

Target: 100%

  • Enhanced awareness of whether and how departments are complying with the Directive on Automated Decision-Making
  • Improved algorithmic transparency and accountability
  • Improved understanding of the challenges and opportunities of implementing the directive’s transparency measures

Assessed Organizations

All LDAs will be assessed.

Period of Assessment

April 1, 2020 to March 31, 2022

This is a new question, and therefore no previous question has been asked to determine departmental performance. Extending the assessment period to the point at which departments were required to comply with the Directive on Automated Decision-Making will allow departments to demonstrate their compliance to date. TBS will be able to update its list of automation projects.

Calculation Method

As required by the Directive on Automated Decision-Making section 6.2, all automation projects must complete measures to ensure transparency in the use of automation.

For all automation projects launched since April 2020 to support decision-making on external services: (Total number of applicable transparency measures completed / Total number of applicable transparency measures) * 100.

Rating Scale:

  • High: [80-100]
  • Medium: [50-79]
  • Low: [0-49]

Evidence Requirements

  • Department to provide evidence
  • TBS to provide evidence
  • Other evidence to be considered (please provide)

Data collection Method

Departments will submit a list of automation projects supporting decision-making on external services that were launched between April 1, 2020 and March 31, 2022, along with documentary evidence of:

  • Notice statements (subsections 6.2.1-6.2.2) - a notice informs clients of the use of an automated decision system in the context of the service they are seeking. Notice requirements vary by a project’s impact level (see Appendix C of the Directive)
  • Explanation of decisions (subsection 6.2.3) - an explanation describes to clients how a decision made or supported by an automated decision system was reached. Explanation requirements vary by a project’s impact level (see Appendix C of the Directive)
  • Published custom source code (subsection 6.2.6) - government-owned or open-source code should be made publicly available unless one or more exceptions under subsection 6.2.6 apply (this requirement would not apply to proprietary software)
  • Documented decisions (subsection 6.2.8) - outputs of automated decision-systems should be documented in accordance with the requirements of the Directive on Service and Digital (e.g., subsections 4.3.1.6, 4.3.1.12). (These outputs may or may not constitute a decision.)

Evidence will be assessed based on a project’s level of impact, which is established through the Algorithmic Impact Assessment. The level of impact determines applicable transparency and other requirements under the Directive on Automated Decision-Making.
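To make the arithmetic concrete, here is a small, hedged sketch of the Q11 calculation under assumptions: each project carries the list of transparency measures applicable to its impact level and the list it has completed. The measure labels and the applicable_measures and completed_measures fields are placeholders, not terms from the Directive or the MAF portal.

```python
# Illustrative sketch only: completed applicable transparency measures over
# total applicable transparency measures, across all in-scope automation
# projects. Which measures apply depends on a project's impact level, as set
# out in the Directive's appendices; that mapping is assumed to be provided.
from typing import Iterable, Mapping

MEASURES = ("notice", "explanation", "source_code", "documented_decisions")

def q11_transparency_score(projects: Iterable[Mapping]) -> float:
    applicable = completed = 0
    for p in projects:
        for m in p.get("applicable_measures", MEASURES):  # varies by impact level
            applicable += 1
            if m in p.get("completed_measures", ()):
                completed += 1
    return 100.0 * completed / applicable if applicable else 0.0
```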

Government-wide Comparison

  • Yes the results of the indicator will be used for comparison across departments
  • No (please provide an explanation)

Year-Over-Year Analysis

  • Yes
  • No

Departmental Results Framework (DRF) (TBS use only)

Is this indicator used in the TBS DRF?

  • Yes
  • No

Annex A

N/A

Annex B (if applicable)

Transparency measures (as related to the Directive on Automated Decision-Making): The measures that departments must take to transparently explain their use of automated decision systems. These measures are found in the appendices of the Directive on Automated Decision-Making.

Question 12: Interoperable data transfer New

The question is: As data interoperability, reuse and sharing are required under the Directive on Service and Digital, what is the percentage of applications in your Application Portfolio Management (APM) which enable interoperable data transfer to and from other applications in your department?

To answer:

  • TBS to answer
  • Department or Agency to answer
  • Both TBS and Department or Agency to answer

Rationale

Interoperability of IT systems means that data can flow correctly between those systems as needed, expanding the ability to use and re-use data effectively and efficiently. Easily sharing data between IT systems allows the department to create highly accurate datasets which can act as a single source of truth on specified data. It also makes working with data much more efficient, allowing data to go where it is needed right away. This reduces manual copying and pasting of data from one system to another and reduces repetitive data collection from clients (data that the department might already have stored somewhere). System interoperability also lets departmental staff connect more pieces of information to inform their analysis, supporting better decision-making and reporting to Canadians.

Directive on Service and Digital 4.3.1.3: Ensure information and data are managed to enable data interoperability, reuse and sharing to the greatest extent possible within and with other departments across the government to avoid duplication and maximize utility, while respecting security and privacy requirements.

Category

Expected Results

Target: Medium [50 – 79%]

  • Enhanced awareness in TBS of the departmental management of IT applications to enable effective oversight.
  • Improved compliance with the Policy on Service and Digital.
  • Departments implement integrated and collaborative practices.

It is expected that departments will not have sufficiently high levels of interoperability, and therefore these results will encourage change in the departments, as well as in how TBS sets guidance and monitors the development of new systems.

Assessed Organizations

All LDAs will be assessed.

Period of Assessment

Data snapshot as of the MAF portal close date. Providing this extra time to departments will allow them to complete assessments on their applications.

Calculation Method

In APM, departments use an established assessment framework to determine their response to a large number of questions. One of those questions, Integration Score, determines an application’s integration with other systems through APIs and similar technology. In completing their APM submission, departments record an Integration Score out of 5 for each application: 0 means not applicable, 1 is very poor, 2 is poor, 3 is average, 4 is good and 5 is the best result, for an application which is easily able to integrate with other applications to share data. The assessment framework is called the Application Lifecycle Management Guide, downloadable from this link: Application Lifecycle Management guide. For technical condition scoring of the Integration Score, please see the table in section 3.2.5.

TBS will calculate the assessment score by averaging ‘Integration Score’ for all active applications in the department’s APM and multiplying by 20 (to get a score out of 100).

For example, if a department has 30 active applications which are applicable, with an average score of 3.4 (after any adjustments from the ‘Integration score adjustment’ field), their MAF score would be 3.4*20=68%

Rating Scale:

  • High: [80-100]
  • Medium: [50-79]
  • Low: [0-49]

Note: Applications marked as not applicable will not be counted.

Note: Adjustment criteria will be used to adjust an application’s score, to a minimum of 0 and a maximum of 5.

Note: Active applications are those which are Active, In Production and Included in Portfolio Assessment (three fields within APM).
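The sketch below illustrates this calculation under assumptions about field names (integration_score, integration_score_adjustment, and the three activity flags); it is not an extract of the TBS calculation itself.

```python
# Illustrative sketch only of the Q12 calculation: average the APM Integration
# Score (after any adjustment, clamped to 0-5) over active applicable
# applications, then multiply by 20 to express the result out of 100.
from typing import Iterable, Mapping

def q12_interoperability_score(applications: Iterable[Mapping]) -> float:
    scores = []
    for a in applications:
        # Active, In Production and Included in Portfolio Assessment only.
        if not (a.get("active") and a.get("in_production")
                and a.get("include_in_portfolio_assessment")):
            continue
        raw = a.get("integration_score", 0)
        if raw == 0:  # 0 means "not applicable": excluded from the average
            continue
        adjusted = raw + a.get("integration_score_adjustment", 0)
        scores.append(min(5, max(0, adjusted)))
    if not scores:
        return 0.0
    return (sum(scores) / len(scores)) * 20

# Worked example from the text: an average adjusted score of 3.4 gives 68%.
```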

Evidence Requirements

  • Department to provide evidence
  • TBS to provide evidence
  • Other evidence to be considered (please provide)

Data collection Method

APM is regularly updated by departments. The final submission date for the data extract will correspond to the MAF portal close date for the draft results.

Documentary Evidence (MAF Portal)

Government-wide Comparison

  • Yes the results of the indicator will be used for comparison across departments
  • No (please provide an explanation)

Year-Over-Year Analysis

  • Yes
  • No

Departmental Results Framework (DRF) (TBS use only)

Is this indicator used in the TBS DRF?

  • Yes
  • No

Annex A

N/A

Annex B (if applicable)

Application Portfolio Management (APM): The application portfolio is a dataset completed by departments of all their software applications, along with an assessment of those applications across a multitude of fields. More information on APM can be found on GC Application Portfolio Management Community and OCIO Application Portfolio Management - GCpedia.

Question 13: Data inventory Updated (Was element 4 of Q15 in 2019-20)

The question is: As strategic management of data and information is required under the Directive on Service and Digital, what is the percentage of sound management practices present in the department’s data inventory?

To answer:

  • TBS to answer
  • Department or Agency to answer
  • Both TBS and Department or Agency to answer

Rationale

Maximizing the use of data as an asset could help departments and agencies offer services, understand their clients, meet their regulatory requirements and be open and transparent with Canadians. Identifying and understanding the department’s data assets through a data inventory is an introductory step to implementing effective departmental governance.

Guidance from TBS on building a data inventory is found in the Guideline on Service and Digital, section 3.1.3.2: Information and data management. This section expands on requirements in the Directive on Service and Digital to manage and understand the department’s data and information.

The requirement to have a data inventory is specifically stated in section 6.3 of the Directive on Open Government

Category

Expected Results

Target: Medium [50 – 79%]

TBS expects every department to have a data inventory and to complete at least half of the management practices outlined in the Guideline on Service and Digital.

Assessed Organizations

All Large Departments and Agencies (LDAs) will be assessed.

Period of Assessment

TBS understands that data inventories are evergreen documents. Departments should save a snapshot of their inventories when they begin their MAF data collection process.

Calculation Method

Departments will submit their data inventories to TBS through the MAF portal. As explained in the Guideline, the inventory should collect all sources of information and data in the department, and include helpful information on them such as location, who is responsible, how to access the data, etc.

Result = (Number of management practices used in the data inventory / 9) * 100

Overall the inventory should:

  1. Completely list information and data held by the department.

The inventory should collect the following information per dataset/entry:

  1. Describe where information and data are located (i.e., in applications, in a data warehouse, on a shared drive, etc)
  2. Describe how data (e.g., datasets) are stored or the type of file/application used to store the data.
  3. List who stewards different datasets and has access to them
  4. Indicate whether datasets are shared (outside the organization, beyond borders or jurisdictions, etc)
  5. Explain to staff how to search for, and access the data
  6. State release eligibility for the open government portal
  7. Describe any privacy and security considerations associated with it (e.g., list security classification or privacy impact assessments)
  8. Link to its record on the Open Government Portal

TBS will analyse each data inventory to determine a result. For this exercise, it is not necessary to have 100% completeness for every management practice in each entry. TBS will discern whether a practice is being used based on whether fields exist in the inventory and whether they are in use.
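As a concrete illustration of this scoring, the sketch below treats the nine practices as a checklist; the practice labels are shorthand for the items listed above, not official field names.

```python
# Illustrative sketch only of the Q13 calculation: one point per management
# practice that the data inventory demonstrates, out of the nine listed above.
PRACTICES = {
    "complete_listing",          # completely lists information and data held
    "location",                  # where information and data are located
    "storage_type",              # how data are stored / file or application type
    "stewardship_and_access",    # who stewards datasets and has access to them
    "sharing",                   # whether datasets are shared
    "how_to_find_and_access",    # how staff search for and access the data
    "open_release_eligibility",  # release eligibility for the open government portal
    "privacy_and_security",      # privacy and security considerations
    "open_portal_link",          # link to the record on the Open Government Portal
}

def q13_data_inventory_score(practices_in_use: set[str]) -> float:
    """(Number of management practices used in the data inventory / 9) * 100."""
    return 100.0 * len(practices_in_use & PRACTICES) / len(PRACTICES)

# Example: an inventory demonstrating 5 of the 9 practices scores about 55.6.
print(round(q13_data_inventory_score({
    "complete_listing", "location", "storage_type",
    "stewardship_and_access", "privacy_and_security"}), 1))
```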

Rating Scale:

  • High: [80 – 100]
  • Medium: [50 – 79]
  • Low: [0 – 49]

Note:

A data steward is the person or role responsible for maintaining the data, determining access rules and data quality standards.

Evidence Requirements

  • Department to provide evidence
  • TBS to provide evidence
  • Other evidence to be considered (please provide)

Data collection Method

Documentary Evidence (MAF Portal)

Departments should submit a copy or extract of their data inventory to support their results. Documents submitted through the MAF portal should include the department’s name and extraction date. Up to 5 documents can be uploaded to the Portal.

Government-wide Comparison

  • Yes the results of the indicator will be used for comparison across departments
  • No (please provide an explanation)

Year-Over-Year Analysis

  • Yes
  • No

Departmental Results Framework (DRF) (TBS use only)

Is this indicator used in the TBS DRF?

  • Yes
  • No

Annex A

N/A

Annex B (if applicable)

N/A

Theme 3: Open Government

Theme 3 Overview

This theme seeks to understand the data and information that government institutions are releasing for public consumption, to better inform the public of their activities and to support the strategic reuse of data and information for social and economic benefits.

To demonstrate greater transparency and accountability, government institutions will demonstrate that they have an open government portal release process and are able to release Departmental Results Framework (DRF) eligible datasets.

To meet these expectations, the Government of Canada needs to transform from the analogue era to the digital era. This goes beyond the pure digitization of services to leveraging technology to inform and involve the public, including them in the choices we make and in the design of these services. Transforming the GC for the digital era must include a transformation of our transparency: how we bring the public into the policy development process and inform them so they can meaningfully contribute, and how we set up our systems and processes so that everything we develop is done in collaboration with users.

Question 14: Datasets released on Open.Canada.ca Preserved (Q16 in 2019-20)

The question is: As maximizing the release of departmental information and data as an open resource is required under the Policy on Service and Digital, what is the percentage of releasable Departmental Results Framework datasets released on Open.Canada.ca?

To answer:

  • TBS to answer
  • Department or Agency to answer
  • Both TBS and Department or Agency to answer

Rationale

The inclusion of this indicator supports greater transparency for government institutions by making them accountable for publishing datasets accessibly on the Open Government Portal. Institutions are required to be increasingly open by default and to demonstrate efforts to be transparent in decision-making and operations by releasing resources through open.canada.ca. Releasing data in this way increases trust in government institutions and helps fight the spread of misinformation and disinformation. It also provides social and economic impact through the reuse of government data to develop new products and services.

This indicator also demonstrates the institution’s ability to strategically manage information and data by reviewing or formatting resources to make them eligible for release, including review for privacy, security and confidentiality, and shows that the institution has an established release process to publish data and information on the Open Government Portal.

Category

Expected Results

Releasing all DRF eligible datasets helps to support greater transparency and accountability of the government institutions. Not meeting expectations risks lowering public trust in government institutions.

Policy requirements are to release all eligible information and data under an open and unrestrictive license (open by default).

Target: High (80-100)

Rating Scale:

  • High: [80-100]
  • Medium: [50-79]
  • Low: [0-49]

Assessed Organizations

All LDAs will be assessed.

Period of Assessment

April 1, 2021, to March 31, 2022.

Calculation Method

(Number of releasable DRF datasets released on open.canada.ca / Total number of releasable DRF datasets) *100

Reasoning as to why a dataset is not releasable based on existing exemptions within the Directive on Open Government, Guidelines on Service and Digital and Open Government Guidebook will need to be provided.

Evidence Requirements

  • Department to provide evidence
  • TBS to provide evidence
  • Other evidence to be considered (please provide)

Data collection Method

Documentary evidence submitted to the MAF Portal: links to the relevant datasets on the Open Government Portal; a list of all DRF datasets identifying whether each is releasable or not; and, for any dataset not eligible for release, a privacy, security or confidentiality rationale with a mitigation plan (e.g., aggregation, summary information) as to whether it can be released next year.

Government-wide Comparison

  • Yes the results of the indicator will be used for comparison across departments
  • No (please provide an explanation)

Year-Over-Year Analysis

  • Yes
  • No

Departmental Results Framework (DRF) (TBS use only)

Is this indicator used in the TBS DRF?

  • Yes
  • No

Annex A

N/A

Annex B (if applicable)

N/A

Theme 4: Cyber Security

Theme 4 Overview

Society – government, corporations, individuals – is more and more reliant on digital solutions for every facet of our daily lives. The importance placed on conducting online business in a safe, secure and reliable way has increased in concert with this need. Strong cyber security measures and practices in all aspects of the digital space are critical. As more Canadians interface electronically with the government, the amount of sensitive information transferred to and from government services will increase. To maintain maximum trust in online transactions, the government must protect them.

Implementing secure protocols by default, such as HTTPS, DNSSEC and DMARC, along with approved encryption algorithms, increases the level of confidence that users are accessing a legitimate service and that their communications remain private and free from interference while offering a level of security and privacy that users expect from government services.

Under Appendix G: Standard on Enterprise IT Service Common Configurations of the Directive on Service and Digital, TBS has established a Web Sites and Services Management Configuration Requirement which requires departments to ensure that all production websites and web services are configured to provide service only through a secure connection that is configured for HTTPS (and redirected from HTTP). TBS will continue to monitor departments’ progress in implementing the standard and will continue to develop guidance to ensure that web services that serve primarily non-browser clients, such as APIs, are also configured with HTTPS by default.

Question 15: HTTPS – Meeting online security protocols New

The question is: As ensuring that all production websites and web services are configured for HTTPS as is required under the Directive on Service and Digital Appendix G, what percentage of publicly available websites and web services are configured to provide service only through a secure connection?

To answer:

  • TBS to answer
  • Department or Agency to answer
  • Both TBS and Department or Agency to answer

Rationale

Delivering secure and trusted digital services requires systems and applications to be built with resiliency against cyberattacks from the outset, as part of their design, implementation, operation and management. This includes using secure protocols by default, such as HTTPS. HTTPS is recognized by industry standards as a vital tool that protects the privacy and security of websites from malicious attacks. Scanning GC websites for secure HTTPS connections informs the security posture for digital service delivery. This supports the responsibility of Deputy Heads under the Policy on Service and Digital to ensure that all production websites and web services are configured to provide service only through a secure connection that is configured for HTTPS.

Category

Requirement under the Web Sites and Services Management Configuration Requirements (IT configuration) and, prior to May 2022, under Implementing HTTPS for Secure Web Connections: Information Technology Policy Implementation Notice (ITPIN) (previously under the Policy on IM).

Expected Results

Target: 100%

Departments will have an opportunity to review public-facing websites, identify any missing sites and address gaps in website configurations, thereby enabling year-over-year improvements in their departmental compliance score.

A low score would indicate a high security risk exposure.

Assessed Organizations

Large departments and agencies.

Period of Assessment

April 1, 2021, to March 31, 2022. Snapshot of Tracker as of MAF Portal close date for draft release.

Calculation Method

(# of publicly available websites and web services configured to provide services only through a secure connection / total number of publicly available websites and web services) x 100

Rating Scale:

  • High: [80-100]
  • Medium: [50-79]
  • Low: [0-49]
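
For illustration, a short sketch of how the calculation and rating bands combine, using hypothetical counts rather than departmental data:

```python
# Illustrative only: Q15 compliance percentage and rating band, hypothetical counts.
def https_compliance_score(secure_sites: int, total_sites: int) -> float:
    return (secure_sites / total_sites) * 100

def rating(score: float) -> str:
    if score >= 80:
        return "High"
    if score >= 50:
        return "Medium"
    return "Low"

score = https_compliance_score(secure_sites=45, total_sites=50)  # hypothetical
print(f"{score:.0f}% -> {rating(score)}")  # 90% -> High
```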

Evidence Requirements

  • Department to provide evidence
  • TBS to provide evidence
  • Other evidence to be considered (please provide)

Data collection Method

Tracker automatically scans reported GC domains (all publicly available websites and web services) to report on and track adoption of secure, encrypted HTTPS-only communications. This tool tracks compliance with the characteristics of secure web communications. Scans are performed daily against a list of known GC domains hosting publicly available websites.

Government-wide Comparison

  • Yes, the results of the indicator will be used for comparison across departments
  • No (please provide an explanation)

Year-Over-Year Analysis

  • Yes
  • No

Departmental Results Framework (DRF) (TBS use only)

Is this indicator used in the TBS DRF?

  • Yes
  • No

Annex A

N/A

Annex B (if applicable)

N/A

Theme 5: Information Technology

Theme 5 Overview

The questions in this theme have been chosen to gain insight into the department’s management of applications, adherence to digital standards, adoption of cloud technologies, and IT planning maturity. This view will help Deputy Heads and CIOs to identify progress year-over-year. Recognizing progress contributes to the generation and sharing of best practices.

The expected outcomes from the measurement of this strategic theme are reflected in the sub-category overviews below.

Theme 5 Sub-Category A: Informed Investment Decisions

Theme 5 Sub-Category A Overview

Demonstrating best value, sound stewardship, and transparency, based on full life cycle costs, to strengthen the overall health of the GC’s application portfolio.

Question 16: Application portfolio health Updated (Element A from Q4 in 2021-22)

The question is: What is the health of the department’s application portfolio?

To answer:

  • TBS to answer
  • Department or Agency to answer
  • Both TBS and Department or Agency to answer

Rationale

Legacy and aging infrastructure and IT applications pose a significant risk to departments, Shared Services Canada (SSC) and the GC as a whole. By effectively monitoring and reporting on application health, departments can leverage the data to plan for the right IT investments. This indicator highlights for Deputy Heads and functional specialists how a department is managing its most vulnerable IT assets. It also provides TBS with a clear understanding of how well IT applications are being managed at the GC level.

Category

  • Policy Compliance
  • Performance
  • Baseline

Expected Results

Target: Medium [50-79]

  • Raise awareness of aging IT risks, including cyber security
  • Encourage departments to modernize their applications and migrate from legacy data centres
  • Enhance awareness in TBS of departmental management of IT applications to enable effective oversight
  • Improve compliance with the Policy on Service and Digital
  • Encourage departments to implement integrated and collaborative practices

Assessed Organizations

All LDAs will be assessed.

Period of Assessment

April 1, 2021, to March 31, 2022

Calculation Method

Application Portfolio Health Indicator (APHI) = (Number of applications with an Aging IT assessment of “No Attention Required” or “Minimal Attention Required” / Number of Applications) * 100
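
As a worked illustration of the formula above (the record layout, field names and non-healthy assessment labels are assumptions, not the Clarity APM schema):

```python
# Illustrative APHI calculation; the field names and some assessment labels
# below are assumed and do not reflect the actual Clarity APM data extract.
applications = [
    {"name": "App A", "aging_it_assessment": "No Attention Required"},
    {"name": "App B", "aging_it_assessment": "Minimal Attention Required"},
    {"name": "App C", "aging_it_assessment": "Attention Required (hypothetical label)"},
    {"name": "App D", "aging_it_assessment": "Attention Required (hypothetical label)"},
]

HEALTHY = {"No Attention Required", "Minimal Attention Required"}
healthy_count = sum(1 for app in applications if app["aging_it_assessment"] in HEALTHY)
aphi = (healthy_count / len(applications)) * 100
print(f"APHI = {aphi:.0f}%")  # 50% falls in the Medium band of the rating scale below
```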

Rating Scale:

  • High: [80-100]
  • Medium: [50-79]
  • Low: [0-49]

Evidence Requirements

  • Department to provide evidence
  • TBS to provide evidence
  • Other evidence to be considered (please provide)

Data collection Method

TBS Clarity APM data extract from March 2022.

Government-wide Comparison

  • Yes, the results of the indicator will be used for comparison across departments
  • No (please provide an explanation)

Year-Over-Year Analysis

  • Yes
  • No

Departmental Results Framework (DRF) (TBS use only)

Is this indicator used in the TBS DRF?

  • Yes
  • No

Annex A

N/A

Annex B (if applicable)

N/A

Theme 5 Sub-Category B: Integrated Planning

Theme 5 Sub-Category B Overview

The focus is on alignment between functions through engagement at the planning stage, collaboration, and decisions informed by performance information, with an emphasis on benefits and outcomes.

Priority: Provide horizontal prioritization and portfolio management

Question 17: IT expenditures Preserved (Element D of Q4 in 2021-22)

The question is: What is the department’s ability to accurately forecast its IT expenditures?

To answer:

  • TBS to answer
  • Department or Agency to answer
  • Both TBS and Department or Agency to answer

Rationale

This question measures the accuracy of the department’s IT planning by comparing planned IT expenditures to actual IT expenditures for the same period. The management of IT must be integrated into business planning within the department or agency, aligned to the GC CIO priorities, and balanced between enterprise and program-driven priorities. Close alignment between actual IT expenditures and IT plans indicates that the department has a mature planning organization capable of planning for, tracking and executing on its investments in a predictable, consistent fashion. Further, proper planning of investments is paramount for the stewardship of fiscal resources.

Category

  • Policy Compliance
  • Performance
  • Baseline

Expected Results

Target: High [80-100]

  • Improve the accuracy of IT investment forecasting
  • Strengthen IT investment alignment to CIO of Canada strategic outcomes and the departmental results framework
  • Support effective planning and better decision-making

Assessed Organizations

All LDAs will be assessed.

Period of Assessment

April 1, 2021, to March 31, 2022.

Calculation Method

Percentage difference between Planned IT expenditures and Actual IT expenditures = ((Actual IT Expenditures - Planned IT Expenditures) / Planned IT Expenditures) * 100

If the percentage difference between planned and actual IT expenditures is greater than +/- 100%:
Assessment result = 15%

If the percentage difference between planned and actual IT expenditures is less than or equal to +/- 15%:
Assessment result = 100%

Otherwise:
Assessment result = 100% - ABS (percentage difference between planned and actual IT expenditures) + 15%
ABS = absolute value
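
Purely as an illustration, the scoring rule above can be expressed as a small function; the dollar figures in the example are hypothetical.

```python
# Illustrative sketch of the Q17 assessment rule described above.
def forecast_accuracy_score(planned: float, actual: float) -> float:
    pct_diff = ((actual - planned) / planned) * 100  # percentage difference
    abs_diff = abs(pct_diff)
    if abs_diff > 100:
        return 15.0   # difference beyond +/- 100%
    if abs_diff <= 15:
        return 100.0  # within +/- 15% of plan
    return 100.0 - abs_diff + 15.0  # linear in between the two thresholds

# Hypothetical figures: planned $10.0M, actual $12.5M -> 25% over plan -> score of 90
print(forecast_accuracy_score(planned=10_000_000, actual=12_500_000))
```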

Rating Scale:

  • High: [80-100]
  • Medium: [50-79]
  • Low: [0-49]

Evidence Requirements

  • Department to provide evidence
  • TBS to provide evidence
  • Other evidence to be considered (please provide)

Data collection Method

  • 2021-2022 IT Expenditure Report
  • GC Enterprise Portfolio Management (EPM) Clarity IT investments data extract from Q2 Update FY 2021-22

Government-wide Comparison

  • Yes, the results of the indicator will be used for comparison across departments
  • No (please provide an explanation)

Year-Over-Year Analysis

  • Yes
  • No

Departmental Results Framework (DRF) (TBS use only)

Is this indicator used in the TBS DRF?

  • Yes
  • No

Annex A

N/A

Annex B (if applicable)

N/A

Theme 5 Sub-Category C: Informed Investment Decisions

Theme 5 Sub-Category C Overview

Demonstrating best value, sound stewardship, and transparency, based on full life cycle costs, to shape a digital value narrative.

Question 18: Cloud adoption Updated (Element C of Q4 in 2021-22)

The question is: What is the department’s percentage (%) of planned investments in the Cloud?

To answer:

  • TBS to answer
  • Department or Agency to answer
  • Both TBS and Department or Agency to answer

Rationale

Demand for digital services is driving the need for agile, rapid delivery of business value. Cloud adoption provides the ability to incrementally improve applications with speed while maintaining quality, security and stability. Cloud also provides the flexibility to scale and offers a wide range of tools and services that Canadians expect from the GC. Among other things, this indicator highlights for Deputy Heads how quickly a department can phase out legacy systems and adopt cloud to remediate technical debt and deliver business value. This indicator provides TBS with a clear understanding of how well departments are adhering to the “Cloud Smart” direction at a GC level and helps shape a digital value narrative.

Category

  • Policy Compliance
  • Performance
  • Baseline

Expected Results

Target: High [70-100]

  • Increase investment in cloud technology solutions
  • Reduce aging IT risks, which will improve overall application health
  • Shorten the product lifecycle and improve service delivery to Canadians

Assessed Organizations

Large departments and agencies.

Period of Assessment

April 1, 2021, to March 31, 2022.

Calculation Method

Percentage of Transformation IT projects with a Primary Public Cloud Target = (Number of Transformation IT Projects with a Primary Public Cloud Target or valid Non-Public Cloud Target Response / Number of Transformation IT Projects) * 100

Only the following are considered Valid Non-Public Cloud Target Responses:

  • Above Protected B
  • Deemed too high risk
  • Insufficient network capacity
  • No commercial cloud solution
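
As an illustrative sketch of how the formula and the valid non-public cloud responses combine (the project records and field names are assumptions, not the EPM Clarity extract format):

```python
# Illustrative only: Q18 cloud adoption calculation with hypothetical projects;
# field names are assumed and do not reflect the EPM (Clarity) extract.
VALID_NON_PUBLIC_CLOUD_RESPONSES = {
    "Above Protected B",
    "Deemed too high risk",
    "Insufficient network capacity",
    "No commercial cloud solution",
}

projects = [
    {"name": "Project 1", "public_cloud_target": True,  "non_cloud_response": None},
    {"name": "Project 2", "public_cloud_target": False, "non_cloud_response": "Above Protected B"},
    {"name": "Project 3", "public_cloud_target": False, "non_cloud_response": "Preference for on-premises"},
]

counted = sum(
    1 for p in projects
    if p["public_cloud_target"] or p["non_cloud_response"] in VALID_NON_PUBLIC_CLOUD_RESPONSES
)
cloud_adoption = (counted / len(projects)) * 100
print(f"Cloud adoption = {cloud_adoption:.0f}%")  # 67%, Medium under the rating scale below
```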

Rating Scale:

  • High: [70-100]
  • Medium: [45-69]
  • Low: [0-44]

Evidence Requirements

  • Department to provide evidence
  • TBS to provide evidence
  • Other evidence to be considered (please provide)

Data collection Method

  • GC Enterprise Portfolio Management (EPM) Clarity IT investments data extract from Q2 Update FY 2022-23

Government-wide Comparison

  • Yes, the results of the indicator will be used for comparison across departments
  • No (please provide an explanation)

Year-Over-Year Analysis

  • Yes
  • No

Departmental Results Framework (DRF) (TBS use only)

Is this indicator used in the TBS DRF?

  • Yes
  • No

Annex A

N/A

Annex B (if applicable)

This definition of a project is taken from the IT Investment Guide. “An activity or series of activities that has a beginning and an end. A project is required to produce defined outputs and realize specific outcomes in support of a public policy objective, within a clear schedule and resource plan. A project is undertaken within specific time, cost and performance parameters.”

Appendix A - Glossary

Active applications: For the purposes of assessing the Service and Digital AoM Q9, applications that are marked ‘active’ in the Application Portfolio Management dataset and that have an application status of ‘in production’.

Algorithmic Impact Assessment: A framework to help institutions better understand and reduce the risks associated with Automated Decision Systems and to provide the appropriate governance, oversight and reporting/audit requirements that best match the type of application being designed.

Application Portfolio Management (APM): The application portfolio is a dataset, completed by departments, of all their software applications, along with an assessment of those applications across a multitude of fields. More information on APM can be found on the GC Application Portfolio Management Community and OCIO Application Portfolio Management - GCpedia pages.

Automated Decision System: Includes any technology that either assists or replaces the judgement of human decision-makers. These systems draw from fields like statistics, linguistics, and computer science, and use techniques such as rules-based systems, regression, predictive analytics, machine learning, deep learning, and neural nets.

External services: As defined in the Policy on Service and Digital: "A service where the intended client is external to the Government of Canada."

IT project: This definition of a project is taken from the IT Investment Guide. “An activity or series of activities that has a beginning and an end. A project is required to produce defined outputs and realize specific outcomes in support of a public policy objective, within a clear schedule and resource plan. A project is undertaken within specific time, cost and performance parameters.”

Metadata: The definition and description of the structure and meaning of information resources, and the context and systems in which they exist.

Standardized metadata: Standardized metadata is metadata which adheres to an institutional instrument or standard and is used consistently for departmental data in an application.

System: For the purposes of assessing the Service and Digital AoM Q9-Q12, hardware or software used interdependently or networked to capture, transmit, store, retrieve, manipulate, or display information or data.

In contrast, the Security AoM methodology refers to ‘production IT systems’: any equipment or system used in the acquisition, storage, manipulation, management, movement, control, display, switching, interchange, transmission or reception of information or data, with scope limited to systems for which the department or agency is the system owner and that support its critical services.

Transparency measures (as related to the Directive on Automated Decision-Making): Departments must take several measures to transparently explain their use of automated systems. These measures are found in the appendices of the Directive on Automated Decision-Making.
