1 of 22

COVID-19 analytics and evidence

Measuring what matters to drive decisions

2 of 22

Purpose of today’s presentation

  • Update on COVID-19 Analytics Working Group
  • Update on performance measurement framework and KPIs
  • Reading and applying data to decisions
    • Examples of data-driven decisions
  • Sharing our data with each other
  • Next steps

3 of 22

COVID-19 Analytics Working Group

  • Goals:
    • Ensure ROI on reporting through GC-wide collaboration
    • Support each other’s efforts and build capacity
  • Tactics:
    • Build a common methodology for COVID-19 KPIs
      • Include GC IP addresses or not?
      • Visits vs. visitors?
    • Develop guidance on collecting, analysing and reporting
    • Craft key messages that support data-driven decisions through meaningful analysis

4 of 22

Updates on the performance measurement framework

5 of 22

What we have learned

  • Different reporting styles
    • Complex dashboards, click-through maps, traffic over time, spreadsheets
  • Different methodologies
    • IP addresses excluded/included in analytics
  • Similar asks for data
    • Page views, comparisons vs. trend analysis, and identifying success
  • Similar needs to support data-driven decisions
    • Trend analysis
    • Meaningful comparable data
    • Tested approaches for usability improvements

6 of 22

Reflecting on the measurement journey

Goal setting stage

  • KPQ (key performance questions): ask questions around the strategic objective of the content and the measurement goals.
  • KPI (key performance indicators): identify the type of data that supports a story of success and answers the KPQ.

Measurement stage

  • Baseline (establish a baseline): gather data to identify current performance, giving the ability to measure against something tangible.
  • Measure (measure performance): use applicable measurement tools and techniques to support the selected KPI (e.g. analytics, user testing).

Analysis and action stage

  • Analyse (analyse the data): identify trends in behaviour or traffic that can tell a meaningful story to support decisions.

7 of 22

Applying context: What are we collecting data for?

KPQs (key performance questions):

  • Can Canadians do what they need to do on our site?
    • Can people find their task?
    • Once found, can they complete it?
  • How can we make our web presence more useful?

KPIs (key performance indicators) answer the above questions

8 of 22

Update to COVID KPIs

  1. Task success rate (prioritized)
  2. Reduction in calls and emails (cross-referenced with web traffic)
  3. Click-through rate
  4. Search success (new)

Supporting considerations:

  • Search terms (what words? what topics?)
  • Visits to content pages (changes in trends?)
  • User feedback (on-page widgets, task ID surveys)

9 of 22

Next steps for measuring against COVID KPIs

  1. Clearly defining each KPI in detail (methodology, data sources, frequency)
  2. Creating baselines and benchmarks around identified areas for improvement
    • What does performance look like now?
    • What does success look like?
    • Where do we need to improve?
  3. Creating specific goals for success under each KPI
  4. Prioritizing KPIs
    • Task success
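Step 2’s baseline comparison can be sketched in a few lines of Python; the visit counts below are invented for illustration and stand in for whatever metric a department baselines.

```python
# Minimal sketch: measuring current performance against a baseline.
# The numbers are hypothetical examples, not real traffic figures.

def change_vs_baseline(baseline, current):
    """Relative change of the current value against the baseline."""
    return (current - baseline) / baseline

baseline_visits = 120_000   # e.g. average weekly visits over the baseline period
current_visits = 138_000    # visits in the current reporting week

print(f"Change vs. baseline: {change_vs_baseline(baseline_visits, current_visits):+.1%}")
```

The same comparison works for any KPI once its baseline is defined, which is why clearly defining each KPI (step 1) has to come first.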

10 of 22

Task success indicator - definition

Indicator: Percentage of users who successfully complete a high-demand online task scenario, based on success criteria established for the scenario.

Rationale: Task success is a measure of how effectively a department’s public digital content is serving user needs.

Definitions:

“Task success” is defined as a task scenario at which a user succeeds or fails based on success criteria established for the scenario. The target success rate for tasks on Canada.ca is 80%.

A “task scenario” describes a concrete user goal for a task, or a piece of the task, that can reasonably be completed in less than 5 minutes. For example: “You live in Stephenville, Newfoundland. Is there a service centre in town where you could apply for Old Age Security pension (Yes or No)?”

Data source: Department

Methodology:

Calculated as: Overall task success rate across 3 task scenarios, measured by a task completion questionnaire run on your web content.

Time period for data collection: Fiscal year.

Frequency: Continuous; the questionnaire runs on an ongoing basis.

Content scope: Scenarios for testing should be derived from one or more tasks on the Canada.ca Top Task Inventory, or one or more of your department’s public top tasks.

A “task completion questionnaire” is a factual/behavioural survey conducted with visitors to the website. A random sample of visitors is asked to first complete what they came to the site to do. They are then asked to confirm what their task was by selecting from a pre-populated list of top tasks, with the option to write in another task if theirs isn’t included. Finally, they are asked whether they were able to complete their task. The percentage of people who indicated they were able to complete each task is the measure of task success. (Questionnaires can include additional questions; those listed here are the minimum required.)
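As a rough illustration of the calculation described above, the snippet below computes a task success rate from questionnaire responses and compares it against the 80% Canada.ca target. The response structure and field names are hypothetical, not the actual survey schema.

```python
# Hypothetical sketch: task success rate from questionnaire responses.
# Each response records the confirmed task and whether the visitor
# reported completing it (the minimum questions described above).

TARGET_SUCCESS_RATE = 0.80  # Canada.ca target for task success

def task_success_rate(responses):
    """Share of respondents who reported completing their task."""
    completed = sum(1 for r in responses if r["completed"])
    return completed / len(responses)

# Example responses for one task scenario (made-up data)
responses = [
    {"task": "Apply for OAS", "completed": True},
    {"task": "Apply for OAS", "completed": True},
    {"task": "Apply for OAS", "completed": False},
    {"task": "Apply for OAS", "completed": True},
]

rate = task_success_rate(responses)
print(f"Task success: {rate:.0%} (target {TARGET_SUCCESS_RATE:.0%})")
```

In practice the overall figure would aggregate responses across the 3 task scenarios rather than a single one.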

11 of 22

Reading and applying data to decisions

12 of 22

Remembering web is an iterative environment

FROM: Tile approach, Health focused

TO: Task-focused, mobile-first approach, whole-of-Government focused

13 of 22

HC/PHAC: Applying data to landing page decisions

Goal: Keep the landing page current

  • Click-through KPI selected: low-traffic items on the landing page would be removed and/or integrated with other topics to make way for new priorities. Links receiving less than 0.1% of traffic would be flagged for possible change.

Measurement tool: Click-through rate report

  • Daily ClickMap from PP provides click-through rates for every link on the landing page.
  • This was used to identify a performance baseline for specific links and traffic trends over time

Analysis: supporting decisions and taking action

  • A downward trend in clicks over time signals declining content relevancy and supports a recommendation for removal.
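The less-than-0.1% click-through flag described above can be sketched as follows. The link names and click counts are invented; the real figures would come from the daily ClickMap report.

```python
# Illustrative sketch of flagging landing-page links whose share of
# total clicks falls below the 0.1% review threshold. Data is made up.

FLAG_THRESHOLD = 0.001  # 0.1% of total landing-page clicks

def flag_low_traffic(link_clicks):
    """Return links whose click share falls below the threshold."""
    total = sum(link_clicks.values())
    return [link for link, clicks in link_clicks.items()
            if clicks / total < FLAG_THRESHOLD]

link_clicks = {
    "Symptoms": 52_000,
    "Financial support": 41_000,
    "Travel restrictions": 6_500,
    "Archived update": 80,   # below 0.1% of total, so flagged
}

print(flag_low_traffic(link_clicks))  # → ['Archived update']
```

A flagged link is a candidate for review, not automatic removal; the trend over time still has to support the decision.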

14 of 22

HC/PHAC: Applying data to landing page decisions

Improvements:

  • Reduced number of links
  • Updated “Travel and immigration” category to represent current information needs and based on user testing results
  • Created a new “How you can help” category to support a growing area of content

15 of 22

CRA: Applying UX improvements to CERB page

CERB 2.0 Improvements:

  • Navigation allows people to easily explore the content and find what they need
  • Plain language
  • Mobile friendly design
  • Use of field-flows to provide information specific to people who need it
  • UX testing and design improvements are ongoing

16 of 22

CRA: Applying UX to CERB eligibility page

Usability improvements

  • Easier to read and scan
  • Engaging interactive checkboxes
  • Clarity for requirements that confused people initially
  • Plain language

17 of 22

CRA: UX research and design for CESB

CESB Research and Design process

  • Leveraged the UX findings from CERB testing to launch with the improved design
  • UX testing during design allowed changes to be made before launch
  • Multi-page design supports the end-to-end user journey, from setting up accounts to reapplying or repaying the benefit
  • UX research involved throughout the design process

18 of 22

Sharing our data with each other

19 of 22

Cataloguing and sharing analytics data

Work in progress, but so far:

  • Data inventory in Google spreadsheet, links to reports
    • Web traffic over time
    • Click-through maps
    • Call and email — volume and drivers (topics)
    • ...more forthcoming
  • Wiki (public) will link to the Google folder (restricted access; requires email invitation)

20 of 22

Next steps

21 of 22

Next steps for working group

  • Defining consistent KPIs, methodology for whole-of-gov measuring and reporting
  • Streamline whole-of-gov reporting to best leverage skills and meet needs of partners (automate what we can, free up time for analysis)
  • Identify external influences on measurement results (e.g. trends in top tasks) and impact of unintended consequences of content changes
  • Explore future tools for streamlining collection, reporting and analysis of data

22 of 22

Action items

  • Share evidence that has helped decision making in your department & why (examples?)
  • Identify if there is interest to join the working group