Experimentation Works
Supporting experimentation in the public service by:
- encouraging projects where we learn by doing
- publicly sharing our results and lessons learned
Experimentation Works encourages public servants to incorporate experimentation into their skills and practice. Through a unique model of learning by doing, the Government of Canada is showcasing small-scale experiments. Each project lead worked with an experimentation expert over a 12‑month period to develop and execute their vision. We followed the journey of 3 teams that conducted 4 projects and examined how they faced and overcame new challenges.
On this page:
- Experimenting with content design
- Experimenting with program design
- Experimenting with visual design
- Experimenting with message design
Experimenting with content design
Program: Consumer Product Safety Program
Department: Health Canada
The mandate of Health Canada’s Consumer Product Safety Program is to:
- protect Canadians from unsafe products
- ensure effective surveillance of the marketplace
- serve as an early warning system for possible or emerging risks and hazards related to cosmetics and consumer products
Research question: Do enhancements to the landing page, compared to the existing one, result in more Canadians proceeding to the consumer incident report form?
Before the redesign of the Cosmetic or Consumer Product Incident Report Form web page, the rate of incident reporting by users was low. A low rate of reporting reduces our ability to:
- identify hazards
- manage the risks posed by cosmetics and consumer products in the marketplace
The reasons for the low rate of reporting, which were identified in interviews, included:
- low web visibility
- difficulty filling out the cosmetic or consumer product incident report form
- lack of transparency, such as not receiving a response after an incident report form was submitted
What we did
The experiment was a randomized A/B test that used:
- the existing landing page for consumer incident reporting (Figure 1)
- a modified landing page (Figure 2)
The intent of the experiment was to determine if changes in language and presentation could successfully encourage users to complete the cosmetic or consumer product incident report form.
The modified landing page improved the user interface by providing:
- a button that makes it easier to submit a report form
- plain language instructions on how to fill out the report form
The modified landing page provided clearer direction to users about:
- which incidents should be reported to the Consumer Product Safety Program
- why incidents should be reported to the program
- how to report such incidents
For the experiment, users were sent to either the existing landing page (that is, the control group) or the modified landing page (that is, the experimental group).
Figure 1: existing landing page
Figure 2: modified landing page
What we found
The experiment ran for 3 months, from December 12, 2018, to March 11, 2019. During this time, there were 2,592 page visits to the existing landing page and 1,999 page visits to the modified landing page.
After reaching the landing page, visitors could click on a link to access the consumer incident report form. A greater proportion of visitors to the modified page accessed the cosmetic or consumer product incident report form (61%) than visitors to the existing page (27%). This difference of 34 percentage points is statistically significant, and it indicates that making the landing page more user-focused played an important role in encouraging users to access the incident report form.
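As an illustration of how a result like this can be checked, the sketch below runs a two-proportion z-test on the reported figures. The click counts are approximated from the rounded percentages and are therefore assumptions, not published data.

```python
# A minimal sketch of a two-proportion z-test for the landing-page A/B result.
# Click counts are approximated from the reported (rounded) percentages,
# so the exact figures are assumptions, not the program's published data.
from statsmodels.stats.proportion import proportions_ztest

visits = [1999, 2592]  # modified page, existing page
clicks = [round(0.61 * 1999), round(0.27 * 2592)]  # ~1,219 and ~700 click-throughs

z_stat, p_value = proportions_ztest(count=clicks, nobs=visits)
print(f"z = {z_stat:.2f}, p = {p_value:.3g}")
# A p-value far below 0.05 is consistent with the report's conclusion that the
# 34-percentage-point difference is statistically significant.
```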
What we learned
Data gaps
There were no data about the historical baseline of the incident report form web page because analytics for the form had not previously been tracked. The form is a single web page, and data are collected only when a user submits a form, so no information about user behaviour is available apart from the number of forms submitted.
Openness drives innovation
There was early support for improving the landing page from senior management. Their support resulted in brainstorming sessions to explore different ways of experimenting with the current website. There were many fruitful discussions on how to change the landing page. The sessions and discussions resulted in a landing page that:
- was informative
- enhanced the user experience
Foster collaboration
In an ideal world, all of the skills that are needed for a project would be known from the outset. In practice, issues arose throughout the project, and they required different expertise and inputs. Therefore, regular team meetings were held, which helped not only to clarify roles, but also to identify challenges before they became problems. The regular meetings:
- fostered teamwork and collaboration
- sustained optimism and focus
Listen to your users
There are many applications for the data collected in the incident reporting system. Staff used the data to identify, assess and manage risks in support of the Canada Consumer Product Safety Act. In this context, enlisting the help of internal stakeholders within the program was essential to bridge the gap between subject matter experts, programmers and communications officers.
Experimenting with program design
Program: Paul Yuzyk Youth Initiative for Multiculturalism
Department: Canadian Heritage
The Paul Yuzyk Award for Multiculturalism was created in 2009. In 2018, the award was repurposed as a youth micro-grant to support the government’s efforts to engage youth. To apply for a micro-grant, young Canadians were invited to submit proposals to help end racism and discrimination in their local communities.
The initiative sought to disburse a total of $30,000 in funding for youth-led projects that met the targeted eligibility criteria. Young Canadians could apply for micro-grants of $250, $500 or $1,000 to be used toward eligible projects.
Research question: What would be the impact on the application rate of adapting the Paul Yuzyk Lifetime Achievement Award for Multiculturalism into a youth micro-grant?
What we did
The project team faced constraints that prevented them from implementing a formal experimental design, such as a randomized controlled trial, an A/B test or a controlled before-after design. As a result, the project team:
- implemented an uncontrolled before-after design
- tracked changes in application rates before and after they implemented the new promotion strategy
The team’s analysis was supplemented by data collected in post-initiative surveys and in interviews with grant recipients.
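Because the design was uncontrolled, the core analysis reduces to comparing application counts across the two periods. The sketch below illustrates this with the baseline-for-success threshold standing in for the pre-change count, which the report does not publish.

```python
# A minimal sketch of an uncontrolled before-after comparison.
# The "before" count is a hypothetical placeholder: the report gives only the
# post-change figure (70 applications) and a success baseline of 7.
applications_before = 7   # assumed baseline count (the success threshold)
applications_after = 70   # completed applications after the new promotion strategy

uplift = applications_after / applications_before
print(f"Applications increased {uplift:.0f}x after the change.")
# Caution: without a control group, this design cannot attribute the increase
# to the promotion strategy alone; other factors may have changed as well.
```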
To reach youth, the project team used:
- the youth and youth stakeholder distribution list managed by the Privy Council Office’s Youth Secretariat (this list has about 17,000 contacts on it)
- an email blast to 600 youth stakeholders known to Canadian Heritage
- advertising on Twitter using accounts affiliated with Canadian Heritage
What we found
The project ended in June 2018 with the following results:
- more than 200 individuals downloaded the application form for the initiative
- 70 completed applications were received
- 33 separate projects were funded with $250, $500 or $1,000 grants as part of the initiative
The 70 completed applications were 10 times the minimum baseline for success, which was set at 7 or more applications.
The most impressive findings of the post-initiative survey were:
- 95.5% of respondents (21 of 22) strongly agreed that they would recommend the initiative to a friend
- 86.4% of respondents (19 of 22) were very satisfied with the initiative overall
Thus, in terms of participant satisfaction, the micro-grant initiative was a success.
Selected post-initiative survey questions and responses
“Was your proposed event or initiative created after discovering the Paul Yuzyk Youth Initiative for Multiculturalism, or was it planned before?”
There were 24 responses to this question:
- 16 (66.7%) responses reported that their proposed projects had been planned after discovering the Paul Yuzyk Initiative
- 8 (33.3%) responses reported that their projects had been planned before they had heard of the initiative
“Did you receive funding or support for this grant from any other sources (including personal contributions)?”
There were 24 responses to this question, of which 14 (58.3%) indicated that they had not received funding or support from any other source. Based on the responses to these two questions, we believe that 16 projects were funded and executed that would otherwise not have existed, a conclusion reinforced by the fact that more than half of the respondents secured no other funding for their projects.
The youth micro-grant experiment showed us that young people are willing to engage with the Government of Canada on important and complex issues. However, we need to address several administrative hurdles in order to make such engagements more youth-friendly, cost effective and transparent in terms of results.
What we learned
It is difficult to determine the impact of the projects that received funding. Demonstrating causal relationships for complex problems requires sophisticated data, such as baseline measurements and longitudinal data that show change over time. Such data were not available to a small, one-time project with limited reporting requirements.
Talking to our recipients was well worth the investment
Our post-initiative surveys and follow-up interviews were essential to understanding the end-user experience. In terms of recipient satisfaction, the micro-grant proved a definite success. However, our recipients also outlined areas for improvement.
Behavioural analysis is the key to good reporting requirements
As a condition of funding, applicants were required to complete the post-initiative survey and post a photo to social media. We were surprised that taking a mandatory approach did not result in 100% compliance. Clearly, making something mandatory is not enough. There are opportunities to study methods of increasing compliance with the post-initiative reporting requirements, including increasing the number of ways to report.
Iterative experiments
In the spring and summer of 2018, the project team met with assigned experts from Experimentation Works. Together, they developed multiple hypotheses and models for future testing, based on data gathered while the initiative was under way (such as the number of participants and the gender breakdown of applicants). By gathering pre-experimental data in the initiative’s first year and developing hypotheses for future testing, the team demonstrated what it takes for a project to become an experiment. This work will also help develop and improve programs across Canadian Heritage.
Experimenting with visual design
Program: EnerGuide
Department: Natural Resources Canada
Natural Resources Canada’s EnerGuide label uses a rating to:
- convey information about how much energy a home uses
- encourage homeowners to make their homes more energy-efficient
User research and a literature review of residential labelling identified some opportunities to improve the EnerGuide label so that the rating is better understood by Canadians.
Research question: Does the EnerGuide label effectively convey energy efficiency and consumption information to homeowners?
What we did
We used the Carrot Rewards mobile application to evaluate whether homeowners in British Columbia, Ontario and Newfoundland were able to understand information about the energy efficiency and energy consumption of their homes as depicted on the EnerGuide label. Comprehension was measured by testing if participants could correctly answer questions about energy efficiency and consumption after viewing various label scales that represented a fictional home’s efficiency rating. In total, approximately 30,000 users of the Carrot Rewards application participated in the online experiment; data were collected in November and December 2018.
Participants completed 3 modules that tested their understanding of energy efficiency using:
- Natural Resources Canada’s EnerGuide
- the United Kingdom’s Energy Performance Certificate
- the United States’ Home Energy Score
Participants were shown a random assortment of labels issued by organizations in the 3 jurisdictions listed above and were asked to interpret them. Data on how well participants interpreted each label informed our conclusions about the clarity and effectiveness of the labels.
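As an illustration, per-label comprehension rates could be tabulated as in the sketch below. The record layout and field names are assumptions for illustration only; they are not the actual Carrot Rewards data schema.

```python
# A minimal sketch of per-label comprehension scoring, assuming each record
# holds the label shown and whether the participant answered correctly.
# The records below are illustrative, not real experiment data.
from collections import defaultdict

responses = [
    {"label": "EnerGuide", "correct": True},
    {"label": "Energy Performance Certificate", "correct": False},
    {"label": "Home Energy Score", "correct": True},
    # ... one record per participant-question pair
]

totals = defaultdict(lambda: [0, 0])  # label -> [correct answers, total answers]
for r in responses:
    totals[r["label"]][0] += r["correct"]
    totals[r["label"]][1] += 1

for label, (correct, total) in totals.items():
    print(f"{label}: {correct / total:.0%} comprehension ({correct}/{total})")
```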
What we found
This experiment showed that, in some cases, homeowners understood quite well the energy efficiency information that the label conveyed. In other cases, however, they misinterpreted or did not understand what the label was trying to communicate. The EnerGuide label provides more information than the labels issued in the United Kingdom or the United States; however, the EnerGuide rating scale is not as clear. Our results strongly suggest that users have difficulty interpreting the energy efficiency rating relative to a reference home. Users may also have difficulty understanding energy efficiency when they consider the energy consumption information provided on specific EnerGuide labels.
The experiment identified opportunities to further analyze and make improvements to the design of the EnerGuide label. Future findings will inform and advance our efforts to improve labelling about energy efficiency in order to better inform Canadians.
Experimenting with message design
Program: EnerGuide
Department: Natural Resources Canada
Natural Resources Canada licenses home energy advisors across Canada to deliver:
- the EnerGuide rating system
- the ENERGY STAR® certification system for new homes
- the R‑2000 standard for energy-efficient homes
Research into behaviour suggests that the way we communicate and frame energy efficiency messaging to Canadians matters. How much does the framing matter? This experiment built on our previous work and on behavioural insights to test different ways of framing messages to users of the Carrot Rewards application, in order to learn what works when encouraging users to contact a home energy advisor in their area.
Research question: Do cost- or comfort-specific messaging interventions nudge more homeowners to seek out a home evaluation service organization than does generic energy efficiency messaging?
What we did
We used the Carrot Rewards application to:
- evaluate homeowners’ knowledge of the EnerGuide home evaluation process
- test whether more homeowners in British Columbia, Ontario and Newfoundland would seek out a home energy evaluation service provider when prompted with cost, comfort or conventional (the control) EnerGuide information
Participants were randomized into 3 groups, and all participants completed questions about their knowledge of the EnerGuide home energy evaluation process. After the participants completed the survey, their uptake of home evaluation services was measured by monitoring the click-through rate and the number of postal codes entered on Natural Resources Canada’s “service providers in your community” web page. In total, approximately 30,000 homeowners participated in the online experiment; data were collected in December 2018 and January 2019.
Participants received similar but subtly different reward offers related to the EnerGuide home evaluation process, each including 1 of 3 randomized message treatments (a sketch of the random assignment follows the list):
- The neutral-framed messaging (control) included language such as “an EnerGuide evaluation is a powerful tool!”
- The cost-framed messaging had the same information as the control, but it included messaging about cost, such as “an EnerGuide evaluation can cut your energy bills.”
- The comfort-framed messaging had the same information as the control, but it included messaging about comfort, such as “an EnerGuide evaluation can keep you warm this winter.”
- Both the cost- and comfort-framed offers also included 2 additional questions about cost (reduction in monthly bills) or comfort (heat loss in older homes) to further enhance the potential effect of message framing.
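For illustration, here is a minimal sketch of how participants might be randomized into the 3 message framings. The framing texts are taken from the list above; the assignment mechanism and function names are assumptions, as the report does not describe how Carrot Rewards implemented the randomization.

```python
# A minimal sketch of randomizing participants into the 3 message framings.
# The framing texts come from the report; the assignment mechanism is assumed.
import random

TREATMENTS = {
    "control": "An EnerGuide evaluation is a powerful tool!",
    "cost":    "An EnerGuide evaluation can cut your energy bills.",
    "comfort": "An EnerGuide evaluation can keep you warm this winter.",
}

def assign_treatment(participant_id: str, seed: int = 42) -> str:
    # Seeding with the participant ID keeps each participant in a stable group
    # no matter how many times the assignment is computed.
    rng = random.Random(f"{seed}:{participant_id}")
    return rng.choice(list(TREATMENTS))

group = assign_treatment("user-0001")
print(group, "->", TREATMENTS[group])
```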
What we found
There was little difference in how the messages performed. Cost-framed messaging generated a slightly higher click-through rate (78.9%) than the control group (77.8%), a small but statistically significant difference. However, the difference between the click-through rate generated by comfort-framed messaging (78.7%) and the rate generated by the control group (77.8%) was not statistically significant.
The distinction between cost-framed and comfort-framed messaging is very subtle, which may have been a limitation of the experiment, particularly one delivered through a mobile application. The statistical power of the experiment was substantially reduced because:
- only 23,551 of the 30,000 participants completed the click-through
- only 16,131 participants entered their postal code
This attrition rate further limits what distinctions can be inferred about the different messaging treatments. Interestingly, this experiment highlighted a postal code entry rate of roughly 54% when using the Carrot Rewards application as an outreach channel to encourage homeowners to look for an EnerGuide service provider.
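The reported counts form a simple funnel; the sketch below computes the rates they imply, including the roughly 54% postal code entry rate.

```python
# A minimal sketch of the funnel implied by the reported counts.
participants = 30_000         # approximate number of experiment participants
clicked_through = 23_551      # participants who completed the click-through
entered_postal_code = 16_131  # participants who entered a postal code

print(f"Click-through rate:   {clicked_through / participants:.1%}")      # ~78.5%
print(f"Postal code entry:    {entered_postal_code / participants:.1%}")  # ~53.8%, the "roughly 54%"
print(f"Drop-off after click: {1 - entered_postal_code / clicked_through:.1%}")
```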
This work has opened up opportunities for further message-framing experiments. However, developing an experiment that compares the impact of message framing on consumers will require:
- additional research
- a larger sample of responses
- stronger interventions to observe substantive and statistically significant differences in uptake for an EnerGuide home evaluation
What we learned
Beyond delivering the experiments, we were intentionally building our experimentation capacity and practice through our participation in Experimentation Works. What follows are our reflections on our experience, and our tips for those interested in experimentation in the public sector:
- Consider experimentation as a systematic process of learning by doing in order to guide, inform and evaluate policy and service actions.
- Know your own context and that of your users. Know where you and your organization, your potential partners and your users stand when it comes to:
- what you are trying to do
- how experimentation can help
- Do your research. We hit the ground running because we had been working on EnerGuide and knew where we could experiment to create value.
- Clearly define the problem and the research questions. We kept coming back to the problem and research questions throughout our journey.
- Identify existing tools or create new ones that support experimentation. For example, the EnerGuide label is an existing tool, and the Carrot Rewards mobile application, managed by Carrot Insights, was a new tool that enabled us to reach a sufficient number of Canadians. We needed to engage the users of our policies, services and tools directly, and we were able to do so by working with partners.
- Use experimentation to amplify the distinction between opinion and behaviour. When it comes to generating evidence and understanding what works, there is a difference between:
- sharing something with a stakeholder and asking their opinion about it
- designing an experiment to see if and how they understand it, interact with it, and use it to accomplish a task
- Be ambitious, but consider feasibility. If you are working in a new way and within specific timelines, it is likely better to run a clear, appropriately scaled experiment that will give you usable results than an overly complex one that is much more challenging to execute within your constraints.
- Experimentation takes time and requires the establishment of:
- relationships that make experimentation work (shared understanding, partnering, co-creating)
- managed logistical components (content design, production, testing)
- procurement and approvals (data sharing, content approval)
- continual learning to develop something new
- Assess the skills on the team and work as a team. If you have not conducted user research or run a randomized controlled trial before, then work with others who have experience. Take advantage of cross-functional and multi-disciplinary teams.
For more information on Experimentation Works or experimentation in the Government of Canada, or to learn how to participate in future learning events, follow us on social media.
© Her Majesty the Queen in Right of Canada, represented by the President of the Treasury Board, 2019
ISBN: 978-0-660-31556-0