1 of 43

Scan Events Content Audit

User Insights, Evaluations and Opportunities

April 2022

Catherine (Chea) Bryce  |  UX Writer

2 of 43

Overview

  1. Summary
  2. Methodology
  3. Findings 
  4. Opportunities

3 of 43

Summary

4 of 43

BACKGROUND

Most users access our website and app to view their tracking status, displayed via scan events. These scan events appear throughout our channels in their current form with no customization.

Here's the current landscape of the scan events: 

  • It's been 6+ years since the last content audit.
  • The scan events haven't been reviewed holistically.
  • New scan events are added to the repository, and existing ones are updated, on a case-by-case basis.

5 of 43

PROJECT GOAL

Identify content gaps by conducting a content audit of the scan events.

6 of 43

OVERALL FINDINGS

After reviewing the scan events both individually and holistically, we found that several high-impact scan events are not user friendly.

7 of 43

Methodology

8 of 43

UX Content Scorecard template (from Strategic Writing for UX by Torrey Podmajersky)

UX CONTENT SCORECARD

5 criteria for evaluating content: 

Accessible

Purposeful

Concise

Conversational

Clear 

9 of 43

  • Available in the languages the people using it are proficient in.
  • Reading level is below 7th grade (general) or 10th grade (professional).

Accessible
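As a side note, the "below 7th grade" bar can be sanity-checked mechanically. Here's a minimal sketch using the Flesch-Kincaid grade formula with a naive vowel-group syllable counter (real readability tools use pronunciation dictionaries and better heuristics, so treat the score as a rough signal only):

```python
import re

def count_syllables(word):
    # Naive heuristic: each run of consecutive vowels counts as one syllable.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fk_grade(text):
    # Flesch-Kincaid grade = 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * (len(words) / sentences) + 11.8 * (syllables / len(words)) - 15.59

# Hypothetical scan-event message; a score below 7.0 would pass the "general" bar.
msg = "Your package is at a local facility and is scheduled for delivery tomorrow."
print(round(fk_grade(msg), 1))  # → 13.1 with this naive counter
```

Shorter sentences and shorter words both lower the grade, which is one reason concise scan events tend to score well on this criterion.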

10 of 43

  • What the person should or can do to meet their goals is clear.

Purposeful

11 of 43

  • Information presented is relevant at this moment in the experience.

Concise

12 of 43

  • The words, phrases, and ideas are familiar to the people using it.
  • Directions are presented in useful steps, in a logical order.

Conversational

13 of 43

  • Actions have unambiguous results.
  • How-to and policy info is easy to find.
  • Error messages help the person move forward or make it clear they can’t.
  • The same term means the same concept, every time it’s used.

Clear

14 of 43

Content Testing

15 of 43

Track Delivery Scan Events

12 users provided scores based on the 5 usability criteria. 

Sample user score for one of the events. 

Scan events across various shipping companies. 

CONTENT TESTING

16 of 43

Insights

17 of 43

Individual Scan Events

18 of 43

CONTENT TESTING – TRACK DELIVERY SCAN EVENT

Canada Post

Low average score: 5.7

19 of 43

The user isn't sure what's happening, and that's a common scan event. We're basically still waiting for the package; it's not in our hands at this point. But the scan event is unclear, and users call us for updates when they see this.

CONTENT TESTING – TRACK DELIVERY SCAN EVENT

Canada Post

Low average score: 5.7

20 of 43

Canada Post

Low average score: 5.7

CONTENT TESTING – TRACK DELIVERY SCAN EVENT

USPS

High average score: 8

21 of 43

Canada Post

Low average score: 5.7

CONTENT TESTING – TRACK DELIVERY SCAN EVENT

USPS

High average score: 8

Simple content is good.

"USPS Awaiting Item" sounds friendly; it shows some personality.

In general, I like the idea of the name of the carrier being included in the message, especially if you are dealing with many carriers. 

"Shipping Label Created, USPS Awaiting Item" is clear and describes next steps.

22 of 43

  • Numerous "item processed" events with different meanings (e.g., processed at a sorting facility vs. a destination facility)
  • Users may think the events say the same thing (if they don't notice the location), which could lead to customer service calls 

CONTENT TESTING – TRACK DELIVERY SCAN EVENT

Canada Post

Low average score: 3

23 of 43

  • Numerous "item processed" events with different meanings (e.g., processed at a sorting facility vs. a destination facility)
  • Users may think the events say the same thing (if they don't notice the location), which could lead to customer service calls 

CONTENT TESTING – TRACK DELIVERY SCAN EVENT

Canada Post

Low average score: 3

"Item processed" doesn't tell me much.

"Item processed" is concise, but doesn't really say much.

24 of 43

CONTENT TESTING – TRACK DELIVERY SCAN EVENT

Canada Post

Low average score: 3

FedEx

High average score: 7.4, 7

25 of 43

CONTENT TESTING – TRACK DELIVERY SCAN EVENT

Canada Post

Low average score: 3

FedEx

High average score: 7.4, 7

It's always good to mention whose facility the item has arrived at, since sometimes there are multiple carriers, especially for international items.

FedEx wins for purposeful and concise.

The FedEx message uses regular, conversational language.

26 of 43

CONTENT TESTING – TRACK DELIVERY SCAN EVENT

Canada Post

Low average score: 2.1

"Item arrived" in this situation is terrible. I read that and think it must be at my door.

If I read "Item arrived," I would think it means it's delivered.

"Item arrived" leaves me with a lot of questions. Item arrived where? When? What's actually happening with the item?

27 of 43

CONTENT TESTING – TRACK DELIVERY SCAN EVENT

Canada Post

Low average score: 2.1

FedEx

High average score: 8.8

28 of 43

CONTENT TESTING – TRACK DELIVERY SCAN EVENT

Canada Post

Low average score: 2.1

FedEx

High average score: 8.8

I love how "At local FedEx facility. Scheduled for delivery next business day" mentions when the person can expect delivery.

29 of 43

Holistic View

30 of 43

Many scan events are unclear and aren't relevant to users at this moment in the experience.  

DELIVERY TRACK

31 of 43

Many scan events are unclear and aren't relevant to users at this moment in the experience.  

DELIVERY TRACK

...In transit from where? Departed where? None of these provide enough detail on where the parcel is in the shipping process; it could be leaving a sorting facility, it could be leaving a delivery truck, it could be leaving one town for another.

I get that they're likely vague to avoid having to write a lot of variables, but none of these are super useful for the user.

32 of 43

Amazon shows fewer scan events, surfacing only the most important ones.

DELIVERY TRACK

33 of 43

Unhelpful information

"Delivery delayed due to transportation delay."

  • No information about what caused the delay.
  • Not beneficial info: users can't do anything about it. 
  • This additional detail isn't comforting, since users would want to know how the delay impacts delivery. 

EMAIL

34 of 43

Suggestion

  • Provide relevant info: "package may arrive later than expected/here's your new delivery date"

  • Make next steps more prominent: "Track your package"

EMAIL

35 of 43

Missing next steps

For both examples, what the person should or can do to meet their goals isn't clear. 

In Example 2, if we don't include the location, the message itself doesn't tell users where to pick up their package.

TEXT NOTIFICATIONS

Example 1

Example 2

36 of 43

Multiple texts to compensate for lack of clarity

The scan event doesn't stand on its own in a text message, so a preceding message needs to be sent for context.

Not concise, purposeful, or clear. 

TEXT NOTIFICATIONS

37 of 43

TEXT NOTIFICATIONS

Content doesn't fit well

Here's an instance where an event description may work in the track web flow, but doesn't work as a text message. 

  • Unclear what's delivered
  • No follow-up steps to contact customer service

38 of 43

TEXT NOTIFICATIONS

Purolator example

  • Conversational and clear language
  • Users know their package has been delivered
  • Sharing excitement: "hooray!"
  • Steps to call customer service

39 of 43

Opportunities

40 of 43

Update existing scans

  • Reassess the terms we're using so they resonate with users (drawing on existing studies and competitive research), prioritizing high-impact areas 
  • Make the voice more conversational and friendly, and less corporate and operational
  • Remove scan events that are unnecessary or confusing
  • Reflect these changes and decisions in the content guidelines

UPDATE EXISTING SCANS AND GUIDELINES 

41 of 43

CUSTOMIZED SCANS

Identify areas where customization is needed 

  • Compare different channels to see where scan events can be customized for each (while keeping terms, tone, etc. consistent) 

42 of 43

REVIEW CUSTOMER SUPPORT VOLUMES 

Review customer support volumes

  • Look at existing customer support volumes related to scan events (e.g., requests about delivery status)
  • Compare ticket volumes over time to see the impact of updating existing scan events

43 of 43