KB Article Internal QB | # of failures
Total failures | 14700
Structure | 1131
1.1 Does the article have the correct intended audience, based on the type of content or any predetermined specifications? | 667
1.2 Is the article unique (not a duplicate)? (i.e. no other article with the same content exists) | 464
Relevance | 2934
2.1 Is the article straightforward, and does it logically describe the subject matter provided? (i.e. is it understandable to a potentially new user?) | 238
2.2 Does all content in the article provide value to the intended audience? (contains a clear Problem/Solution, How-To/Process, Reference topic, or Diagnostic help) | 397
2.3 Have all critical terms, acronyms, and jargon been clearly defined? | 341
2.4 Is the title appropriate to the content of the article and worded from a customer-facing perspective? (concise, clear, relevant) | 1958
Accuracy | 3094
3.1 Are the correct AQI and product-component labels used? | 2402
3.2 Are hyperlinks, images, and attachments displayed clearly, and do they all work properly? | 594
3.3 Have Information Security policies, Intellectual Property Rights, and Customer Privacy been respected? | 98
Presentation | 7541
4.1 Do the formatting and structure of the article support understanding of the content? | 2244
4.2 Is the article written in concise English, free from obvious grammatical or syntax mistakes (i.e. passes a Grammarly automated check)? | 1223
4.3 Does text decoration (bold, italics, underline) support clarity of the content without overriding or conflicting with brand portal CSS styles? | 1931
4.4 Are source code lines, queries, and scripts in a code block (with optional syntax coloring)? | 1019
4.5 Have lists been used correctly? (i.e. numbers where order matters, otherwise bullets) | 1124

Progress report | Count | % Completion
Total articles | 7716 |
Total processed (team) | 3609 | 47%
Met QB Initially | 6 | 0%
Fixed | 2429 | 67%
Pending Tech Review | 275 | 8%
Needs CSM Feedback | 2 | 0%
To be Archived | 840 | 23%
Draft | 57 | 2%
Total QA Evals | 2404 | 99%
Articles Reworked after QA | 4 |
Estimated Due Date | September 8, 2019 |
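The percentage columns in the progress report are simple ratios (each status against total processed; processed against total articles). A quick sketch using the counts above, assuming the "Met QB Initially" count is 6 (the value that makes the status counts sum to the processed total):

```python
# Status counts from the progress report. The "Met QB Initially" figure of 6
# is an assumption inferred so that the statuses sum to the processed total.
statuses = {
    "Met QB Initially": 6,
    "Fixed": 2429,
    "Pending Tech Review": 275,
    "Needs CSM Feedback": 2,
    "To be Archived": 840,
    "Draft": 57,
}
total_articles = 7716

total_processed = sum(statuses.values())
print(total_processed)                                   # 3609
print(round(100 * total_processed / total_articles))     # 47 (% completion)
print(round(100 * statuses["Fixed"] / total_processed))  # 67 (% of processed)
```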
Please utilize this form to submit entries:
https://forms.gle/XrazUUkucrR4Z58FA
GFI Specific Information
Note:
Pending Questions for Brand/PKC:
Notes:
GFI KB Rules:
- Audience should be "Everyone" for customer-facing articles.
- Feature request articles can be archived. Example here
KB Rescue Information
The process for the Tech Writers in a nutshell:
0. Optional, as required: take the level 100 training courses for the product. You must be able to accurately identify ticket types and have a basic understanding of the product and its features.
1. Once a batch of articles has been assigned, copy the URL of the first article from the Selected Ticket Data sheet.
2. In a second browser window, open the Google form. Put the form and your first browser window side by side (50/50) on your screen.
3. Paste the URL into the first entry field of the form and into a new tab in the first browser window, then hit Enter.
4. Read and review the article.
5. Check each of the points in the QB one by one; if a point passes the first time, check "Yes" for it. If it does not pass, fix the problem if it is within your control.
6. In the summary textbox, put the QB number and a short description of what you did to fix each item that was not marked "Yes". Do not change the checkboxes to "Yes" as you fix items.
7. If you were unable to fix something, you will likely need to choose one of the "not published" status options; then submit the form.
8. Click to enter another submission and start again at step 1, using the next article in your batch.
9. If one of your articles fails to meet the QB and you need to rework it, submit the form again with a comment and choose the "but I had to rework it" status.
How to handle duplicates:
1. If you find a duplicated article, determine which of the copies is the "best" based on the QB. If they are all the same, treat the oldest one as the best.
2. Submit the form once for each of the duplicate copies, using the URLs copied from the Selected Ticket Data sheet. It does not matter who the assignee is for this step.
3. Set the status to archive via the form, but do not actually archive the article on Zendesk.
4. Review and fix the remaining article per the QB, and submit it via the form as usual. Again, the assignee does not matter for this step.
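The "best copy" selection in step 1 can be sketched as follows. The record fields and URLs here are illustrative, not the real sheet schema: the copy passing the most QB points wins, and ties go to the oldest article.

```python
from datetime import date

# Hypothetical duplicate records; "qb_passes" is the number of QB points
# the copy currently satisfies, "created" its creation date.
duplicates = [
    {"url": "https://example/kb/a", "qb_passes": 9, "created": date(2018, 5, 1)},
    {"url": "https://example/kb/b", "qb_passes": 9, "created": date(2017, 3, 2)},
    {"url": "https://example/kb/c", "qb_passes": 7, "created": date(2016, 1, 9)},
]

# Best = most QB points passed; ties broken in favour of the oldest copy
# (older date -> smaller ordinal -> larger negated ordinal).
best = max(duplicates, key=lambda a: (a["qb_passes"], -a["created"].toordinal()))
print(best["url"])  # https://example/kb/b
```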
Note 1: If there are technical differences between the different articles, we will get a tech review prior to archiving. Flag it as such using the "unpublished status" question.
Note 2: Where possible, check the statuses of the duplicated articles' assignees to make sure you don't cross-submit.
This file explained by Ben:
1. Dashboard - The high-level overview, which automatically sums up data from the other sheets.
2. Selected Ticket Data - The relevant columns from the full KB export (via IMPORTRANGE). This is the sheet the Tech Writers primarily work from; they use it to navigate to the individual articles. Names are pulled from the Assignments sheet.
3. Assignments - The sheet I populate with names to assign batches of tickets to specific Tech Writers. Has a raw progress count per batch.
4. Responses - Populated by a Google form. I add comments here to address process-related issues, such as not being explicit enough with the summary and findings. I also add comments to address "CSM Feedback". The quality pillars here are used to populate the "failures" counts on the Dashboard, giving us a snapshot of the KB before we touched it.
5. QA Input - The sheet I work from to perform QA evaluations. It is populated via a query against the Responses sheet: it shows only the pertinent columns, ignores archived articles, and sorts the rest by status. I take each article that is Published and review it against the QB, then submit my results via a second Google form. Articles that have been reviewed are automatically highlighted.
6. QA Evals - The sheet populated by the second Google form. We calculate the FTAR score here simply as the count of "Yes" divided by the total. We could add weighting, but for now we keep it simple.
7. FTAR - A simple report showing the FTAR score for each Tech Writer. Populated from the QA Evals tab.
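The FTAR (first-time acceptance rate) calculation described for the QA Evals tab can be sketched as below. The function name and list format are illustrative; the real sheet simply divides the count of "Yes" answers by the total count.

```python
def ftar_score(evals):
    """FTAR: fraction of QA checkpoints marked "Yes".

    `evals` is a list of checkbox values as exported from the form,
    e.g. ["Yes", "No", "Yes"]. Unweighted, matching the sheet's
    simple count("Yes") / total approach.
    """
    if not evals:
        return 0.0
    return sum(1 for v in evals if v == "Yes") / len(evals)

print(ftar_score(["Yes", "Yes", "No", "Yes"]))  # 0.75
```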