Manage a Teammate Peer Assessment Activity using Peer Assess Pro
Reference Guide for Teachers and Students
Version 5.1 2023-06-27
Peter Mellalieu peter@peerassesspro.com
+64 21 42 0118
Patrick Dodd patrick@peerassesspro.com
+64 21 183 6315
Follow these steps to register, launch, manage, and download the final gradebook for a Peer Assess Pro peer assessment using the Xorro Survey Management system.
Download the interactive guide at www.peerassesspro.com/quickstart-guide-for-teachers
Launching Peer Assess Pro™ using Xorro-Q: Overview
Once logged in to Xorro-Q, you launch a peer assessment activity.
During the launch process, you import a Participants CSV that specifies each team member (first name, last name, login id, and email) arranged by team name.
The Participants CSV is a comma-separated values (CSV) file that must contain the column headers shown in the example below.
Peer Assess Pro emails an activity URL that enables each team member to complete the peer assessments of their team members.
Timely reminders and personalised feedback reports are communicated to the students from Peer Assess Pro using the email addresses you provided in the Participants CSV.
Active Warnings on the Teacher’s Dashboard provide the teacher with advice about at-risk teams and individuals, poor peer assessment rating behaviour, and other progress indicators.
>>> View the comprehensive online and eBook ‘Get Started with Peer Assess Pro’
id | first | last | email | team | group_code |
AMTO01 | Amanda | Tolley | Amanda.Tolley@noreply.com | Bear | BUS123.101/PMell/TutB/2020-05-28/SUM |
ANWO08 | Anna | Worth | Anna.Worth@noreply.com | Bear | BUS123.101/PMell/TutB/2020-05-28/SUM |
HOBR03 | Holly | Brown | Holly.Brown@noreply.com | Bear | BUS123.101/PMell/TutB/2020-05-28/SUM |
ALJO11 | Alice | Jones | Alice.Jones@noreply.com | Panda | BUS123.101/PMell/TutB/2020-05-28/SUM |
GRGR15 | Greta | Green | Greta.Green@noreply.com | Panda | BUS123.101/PMell/TutB/2020-05-28/SUM |
JEWA06 | Jeff | Wang | Jeff.Wang@noreply.com | Panda | BUS123.101/PMell/TutB/2020-05-28/SUM |
BOWI12 | Bob | Wilson | Bob.Wilson@noreply.com | Tiger | BUS123.101/PMell/TutB/2020-05-28/SUM |
HEJO19 | Henry | Jones | Henry.Jones@noreply.com | Tiger | BUS123.101/PMell/TutB/2020-05-28/SUM |
JOSM13 | John | Smith | John.Smith@noreply.com | Tiger | BUS123.101/PMell/TutB/2020-05-28/SUM |
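For reference, this is the same data as raw comma-separated text, exactly as it should appear when the file is opened in a plain text editor:

id,first,last,email,team,group_code
AMTO01,Amanda,Tolley,Amanda.Tolley@noreply.com,Bear,BUS123.101/PMell/TutB/2020-05-28/SUM
ANWO08,Anna,Worth,Anna.Worth@noreply.com,Bear,BUS123.101/PMell/TutB/2020-05-28/SUM
HOBR03,Holly,Brown,Holly.Brown@noreply.com,Bear,BUS123.101/PMell/TutB/2020-05-28/SUM
ALJO11,Alice,Jones,Alice.Jones@noreply.com,Panda,BUS123.101/PMell/TutB/2020-05-28/SUM
GRGR15,Greta,Green,Greta.Green@noreply.com,Panda,BUS123.101/PMell/TutB/2020-05-28/SUM
JEWA06,Jeff,Wang,Jeff.Wang@noreply.com,Panda,BUS123.101/PMell/TutB/2020-05-28/SUM
BOWI12,Bob,Wilson,Bob.Wilson@noreply.com,Tiger,BUS123.101/PMell/TutB/2020-05-28/SUM
HEJO19,Henry,Jones,Henry.Jones@noreply.com,Tiger,BUS123.101/PMell/TutB/2020-05-28/SUM
JOSM13,John,Smith,John.Smith@noreply.com,Tiger,BUS123.101/PMell/TutB/2020-05-28/SUM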
>>> Download example CSV, EXCEL, or Google Sheet
FAQ - Show me a quick video overview demonstration of the whole Peer Assess Pro system
FAQ - I’m having problems importing my participants csv
FAQ - How do I correct the Team Composition in a running peer assessment activity?
FAQ - What is the purpose of peer assessment?
FAQ - What questions are asked in the peer assessment survey?
FAQ - View the comprehensive ‘Get Started with Peer Assess Pro’ online eBook
>>> More FAQs at www.peerassesspro.com/frequently-asked-questions-2
Everyone
FAQs on the web at http://tinyurl.com/papFAQ
Ask us for help, give us feedback, and request additional features.
https://www.peerassesspro.com/contact-us/
Patrick Dodd patrick@peerassesspro.com +64 21 183 6315
Peter Mellalieu peter@peerassesspro.com +64 21 42 0118 Skype myndsurfer
Download interactive poster https://www.peerassesspro.com/infographic/
Mellalieu, P. J. (2020). How to teach using group assignments: The 7-step formula for fair and effective team assessment. Peer Assess Pro. https://www.peerassesspro.com/ebook
>>> Hyperlinked chart at http://tinyurl.com/papChart
Example Peer Assessment Participants CSV File
Example Survey Questions for a Team Member
Most Frequently Asked Questions (FAQs)
Support, Feedback and Contact
The Seven-Step Formula for Effective Peer Assessment
Teachers Process Flowchart: Overview
PEER ASSESS PRO REFERENCE GUIDE
1. Login to your Xorro HOME page
1.1 First time users: Register
Register a new Xorro Teacher’s Account as a Free Facilitator
Getting started with Xorro Q
1.2 Login from your registered Xorro Account
1.3 Orient yourself to the Xorro HOME Dashboard
1.4 Orient yourself to the Peer Assess Pro platform
Review the Peer Assess Pro facilitators dashboard
Overview of the steps required to launch a peer assessment
1.5 Peer Assess Pro system flowchart detail
2. Launch Peer Assessment activity
2.2 Create the peer assessment Participants CSV
Alternative Participants CSV templates
Instructions and column explanations for the peer assessment Participants CSV
Requirements for a peer assessment Participants CSV file
Create a CSV version of your Participants CSV file
Why won’t Xorro load my Participants CSV file?
Your spreadsheet editor will typically NOT create a CSV file, unless...
Good practice hint: Create distinctive group codes for every peer assessment activity you launch
Large, multi-cohort streams in a class
2.3 Launch and create the peer assessment activity
Select ACTIVITIES from the top menu bar
Good practice hint: Avoid using the Xorro default Due Date
The Due Date is advisory only
View the Peer Assess Pro Teacher’s dashboard
Invite team members to respond and other automated activities
2.4 Use a Teamset Group to launch a peer assessment
From the Xorro HOME page select the PARTICIPANTS page
Select ‘Import Participants’
Browse to your Team Members Group CSV file
Load, check and confirm correct team membership, then Import
Check class and team membership
3. Manage the Peer Assessment Activity
3.1 Action responses to warnings
Note: Changes to a Xorro Group have NO EFFECT on current Team Composition
3.2 Automated and manual notifications
3.4 Select the Personal Result Calculation Method
3.5 Review class, team, and individual statistics
Good practice hint: How to identify at-risk students
The Individual Personal Snapshot
Four possible views of the Individual Personal Snapshot
3.6 Publish provisional Personal Results to team members
Results hidden when insufficient responses
4. Finalise the peer assessment activity
4.2 Publish Finalised Results to students
4.3 Download Teacher’s Gradebook of Results
4.4 Finalise the Activity … irrevocably!
Launch peer assessment activity
Manage the peer assessment activity
Responding to Active Warnings
Definitions, calculations, and examples
The purpose of peer assessment
Undertaking the peer assessment
Using the results from peer assessment for better performance
How peer assessment affects personal results
FAQ: What is the purpose of peer assessment?
Determination of course personal result
Criteria for peer assessment in Peer Assess Pro™
Peer Assess Pro assesses competencies valued by employers
FAQ: When and how is the peer assessment conducted?
Formative assessment: optional but valuable
FAQ: How do I provide useful feedback to my team members?
Symptoms of an unfair assessment
Steps to address an unfair peer assessment
A note on appealing a peer assessment result
Prevention is better than cure
FAQ: How do I interpret the feedback results I've received from the peer assessment?
FAQ: What steps can I take to get a better personal result?
Use your institution’s academic support services
Raise your Peer Assessed Score
How do I address proactively the challenges of team work?
Learning constructively from mid-course peer assessment feedback
FAQ: What happens if I try to 'game' (fool? play? disrupt?) the peer assessment process?
Examples: Highly specific and individualized information
1. Low quality assessor rating
3. Outlier individual rating
4. Mismatched self-assessment
Example: Better feedback. Better teams
Which teams will raise the Active Warning: Low quality team rating?
Which teams tend to have a higher team result?
Which teams have worked most productively as a team?
Active Warnings, threshold parameters, and program logic
FAQ: Give me a quick overview of how to launch a Peer Assess Pro™ activity through Xorro
Peer Assess Pro™ is a work in progress
FAQ: How do I find the Peer Assess Pro Xorro Teacher’s dashboard?
Alternative method: ACTIVITIES: Running Activities
FAQ: How do I navigate the PARTICIPANTS page for Peer Assess Pro?
Orientation note: Select an existing Group
Inactive functions in PARTICIPANTS page
FAQ: How do I correct the team composition in a running peer assessment activity?
Adjust the team composition in a running peer assessment activity on an LMS
Take care! Here there be dragons!!
What happens with Synchronise All?
Precautions before Synchronise All!
Some survey responses might be deleted!
Team composition view prior to synchronise all
Team composition following synchronise all
Adding orphan teams to a running activity
Why was my Synchronise All action rejected?
Good practice hint when creating a peer assessment activity
LMS team arrangement facilities for Moodle
Adjust the team composition in a running peer assessment activity on Xorro
Take care! Here there be dragons!!
Correct the team composition
FAQ: How do I create a CSV file from a Google Sheet?
Sample of participants csv file opened using a text editor
FAQ: How do I view a demonstration version of Peer Assess Pro?
FAQ: How do I correct the participants (team members) in a group already uploaded to Xorro?
Update a group’s team members for future use
Correct the team members associated with an existing Xorro TeamSet Group
FAQ: Where may I view the most recent version of the user guides?
Work in progress Google DOCS development version
Frequently Asked Questions for teachers and team members
Teachers Process Flowchart
FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?
FAQ: How are peer assessment and personal results calculated and defined mathematically?
Calculation methods that exclude a team result
Calculation methods that incorporate a team result from team outputs
Automated communications to students
Alternative mode for student access to assessment and results
FAQ: How do I view and experience what the students experience?
View your student’s personal results feedback report directly from your Teacher’s Dashboard
View your students’ experience of the Peer Assess Pro™ survey
Enter your Participants’ URL into your browser
Select the activity you wish to experience
View a survey ready and waiting for responses
View a student’s published results
View the peer assessment survey for a demonstration class
Critical and catastrophic warnings!
Optional emails generated for team members
FAQ: When, why, and how do I Refresh and Update Results?
FAQ: What questions are asked in the peer assessment survey?
Example Peer Assessment Survey: Quantitative
Example Peer Assessment Survey: Qualitative
FAQ: Why FINALISE a survey?
Best practice before FINALISE SURVEY
Download Teacher’s Gradebook of Results
Finalise the Survey … irrevocably!
FAQ: How is the Peer Assessed (PA) Score calculated?
The self-assessment is excluded from calculating PA Score
Mathematical definition of Peer Assessed Score, PA Score
Example calculations of Peer Assessed Score
Alternative mathematical formulations of PA Score
Calculation from Average Rating
Calculation from Average Team and Leadership Contributions
FAQ: How is the self-assessment used to calculate Peer Assessed Score?
Spider chart of individual and averaged team peer ratings
Index of Realistic Self-Assessment (IRSA)
FAQ: How is the Peer Assessed Index (PA Index) calculated?
Mathematical definition of Peer Assessed Index
Example calculations of Peer Assessed Index
FAQ: How is the Indexed Personal Result (IPR) calculated?
Mathematical definition of Indexed Personal Result
Example calculations of Indexed Personal Result
FAQ: How is the Normalised Personal Result (NPR) calculated?
Mathematical definition of Normalised Personal Result
Example calculations of Normalised Personal Result
Impact of adjusting the Spread Factor on Normalised Personal Result
FAQ: How is the Rank-Based Personal Result (RPR) calculated?
Mathematical definition of Rank-Based Personal Result
Example calculations of Rank-Based Personal Result
Example calculation with tied ranks
Adjusting the range using a spread factor
Example calculation with spread factor
FAQ: How is the Rank-Based Personal Result (RPR) calculated (Pre-2022)?
Mathematical definition of Rank-Based Personal Result
Example calculations of Rank-Based Personal Result
Example calculation with tied ranks
FAQ: How is Standard Peer Assessed Score (SPAS) calculated?
Design features of Standard Peer Assessed Score
Example calculations of Standard Peer Assessed Score
Example charts for Standard Peer Assessed Score
Assumptions about Standard Peer Assessed Score
The impact of gaming peer assessment
FAQ: What is Employability? How is it calculated?
Mathematical calculation of Employability
Conditioning transformations to de-emphasise unsubstantiated precision
Example calculations of Employability
FAQ: How is the Index of Realistic Self Assessment (IRSA) calculated?
Mathematical definition of the Index of Realistic Self Assessment
Example calculations of the Index of Realistic Self Assessment
Why an IRSA of 100 is not a perfect score!
FAQ: How do I interpret measures of realistic self-assessment?
Interpreting the Index of Realistic Self Assessment (IRSA)
Developing an exceptionally realistic self-image, ERSA
What are the benefits of having an Exceptionally Realistic Self Assessment?
What can get in the way of having an Exceptionally Realistic Self-Image?
How do I develop my Exceptionally Realistic Self-Image, ERSI?
FAQ: How is insignificant intra-team agreement identified? WARNING 0041
Accounting for tied ratings and low-quality assessments
A team where all team members rate everyone the same
Award the same personal result
‘Award the same personal result’ status is deactivated automatically
‘Exclude from calculations’ overrides actions that address insignificant team agreement
Interpreting the concordance value, W
Exclusion of self-assessment ratings
Where everyone is equally average
Significance of the concordance statistic
Requirement for calculating concordance
Team discrimination table shows class concordance statistics
Example A. High rating agreement amongst the team members
Ex A. Calculation of Concordance, W, from ranks
Example B. Low agreement amongst the team members
Example C. Teammates show low discernment by submitting tied ranks
A note on computational efficiency
FAQ: How is an outlier peer assessment rating identified? WARNING 0042
Failure to agree across the whole team
Threshold for warning of outlier individual peer rating
Alternative mathematical calculation of Assessor Impact
Alternative example calculations
FAQ: What is a mismatched self-assessment (IRSA)? WARNING 0040
Threshold for warning of mismatched self-assessment
Recommended action for facilitator
FAQ: What is an adjusted team arrangement request? WARNING 0006
Reassign a new participant to an existing or new team
FAQ: What is an inactive team member? WARNING 0048
Good practice for addressing an alleged inactive student
Reassigning a student to a new ‘team of one’
Exclude from calculations overrides actions that address insignificant team agreement
Example of exclusion from calculations
Impact of exclusion from calculations
FAQ: What is a low-quality team rating? WARNING 0050
Threshold for warning of low-quality team rating
Recommended action for facilitator
Active Warning 0050 Under development
Threshold for warning of low-quality team rating
Recommended action for facilitator
FAQ: What is a low-quality assessor rating? WARNING 0300
Threshold for warning of low-quality assessor rating
Recommended action for facilitator
FAQ: What is a valid assessed team? WARNING 0022
Results not displayed to members of non-valid assessed teams
How many valid and invalid teams do I have?
Recommended action for facilitator
FAQ: What is an ‘at-risk’ team member? WARNING 0036
Recommended action for facilitator
Threshold for warning of ‘at-risk’ team member
Alternative approaches to identifying at-risk students
FAQ: What is a team with low psychological safety? WARNING 0034
Example calculation for Team Safety
Recommended action for facilitator
FAQ: What is an unsafe team member? WARNING 0035
Recommended action for facilitator
FAQ - What emails have been sent by the platform?
Survey notifications history
Track-and-trace of emails to participants
FAQ: What is the content of emails sent by Peer Assess Pro to Participants?
Preview email from Active Warnings
Preview all emails available for sending
Table of email subjects sent to participants
Table of email body text sent to participants, listed by Email ID and Subject
FAQ: What is the content of emails sent by Peer Assess Pro to Facilitators?
Table of email subjects sent to facilitators
Table of email body text sent to facilitators, listed by Email ID and Subject
FAQ: How do I login to my peer assessment Activity URL?
Successful login through Activity URL
FAQ: I am unable to login. My login failed
Investigation and remedies for login failure
1. You entered your ID incorrectly.
2. Your teacher or facilitator has entered your ID incorrectly
3. The Xorro Activity related to the Activity URL has not yet reached its Start Date
4. The Xorro Activity related to the Activity URL has been Finalised and Finished.
5. The Xorro Activity related to the Activity URL has been Abandoned
FAQ: Can I adjust the Start Date or Due Date for a running activity?
The good news: The Due Date is advisory only
Advise students of your extended deadline
Worst case scenario: Abandon the peer assessment
FAQ - I’m having problems importing my participants csv
What are the common problems when importing a participants file?
Is my Participants CSV in the correct format for launching a peer assessment activity?
Sample of participants csv file opened using a text editor
Sorted Participants CSV viewed in Google Sheets
Error notifications upon upload of a participants CSV to a peer assessment
Examples of error notifications upon upload of a participants CSV to a peer assessment
Comprehensive list of potential errors when attempting to import participants csv
Potential errors in a participants csv
About Comma-separated values (CSV)
FAQ - Problems editing and creating participants CSV files
Trouble opening csv files with Excel due to regional settings
Solution 1: Use a simple text editor to find and replace semicolons with commas
Illustration using Mac TextEdit
Solution 2: Adjust Language and Region to use point as decimal marker
Illustration using Mac OS on an Apple computer
FAQ - How do I fix an invalid, missing or failed email delivery? WARNING 0026
Corrective action: avoid these emails
Corrective action: use these emails
FAQ - How do I resolve an unsynchronised team arrangement? ACTIVE WARNING 0021
What is a team arrangement?
Synchronise all team arrangements
Alert to unsynchronised team arrangement
Unsynchronised Team Composition
Successful synchronisation of the corrected team arrangement
FAQ - What is the benefit of a standardized peer assessment rubric?
FAQ - What if a student mistakenly advises they are in an incorrect team?
Team membership confirmation by student
Mistaken team membership notification
Sign up as a Free Facilitator to trial Peer Assess Pro through the Xorro-Q interface:
https://www.xorro.com/free_accounts/pap/new
For related information about registering as a new facilitator:
For further details contact Patrick Dodd at the offices of Peer Assess Pro.
After you log in, your Xorro HOME Dashboard page will display, as shown in Section 1.3 Orient yourself to the Xorro HOME Dashboard
Now proceed to follow the steps in the Quickstart Guide, or the detailed explanations in Section 2. Launch Peer Assessment Activity
Quick links and related information
Section 1.3 Orient yourself to the Xorro HOME Dashboard
FAQ: How do I find the Peer Assess Pro Teacher’s dashboard?
Your Xorro HOME Dashboard page shows
Quick links and related information
FAQ: How do I find the Peer Assess Pro Teacher’s dashboard?
Peer Assess Pro system flowchart detail
Peer Assess Pro system flowchart detail http://tinyurl.com/papChart
Each process box in the PDF version of the flowchart links directly to the specific page in this Reference Guide that explains that step in the process.
View this short video illustrating many of the features, benefits, and processes involved in using the Peer Assess Pro platform (6 minutes).
http://tinyurl.com/digitalFlyBy
These are the key features of the Facilitator’s Dashboard, accessed through the Xorro ACTIVITIES tab once you have launched a peer assessment activity.
PDF with hyperlinks at Xorro Peer Assess Pro™ Teachers Process Flowchart http://tinyurl.com/papChart
Create a file containing your class list that shows every team member organised into their teams. The required file format is Comma-Separated Values (CSV). This is your Participants CSV file. A sample of the file format is shown in Section 2.2 Create the peer assessment Participants CSV file
Use any of the following templates to adapt and create your Participants CSV file using your preferred editor.
After editing the template, remember to create a CSV file type using SAVE AS CSV, DOWNLOAD AS CSV or EXPORT AS CSV, depending on your spreadsheet editor.
If you are a registered Xorro user, use this link to launch a new peer assessment activity. You will be presented with an option to import your Participants CSV directly.
https://qf.xorro.com/pap/launches/new
If your CSV refuses to load, or the activity fails to create, review the detailed steps in the next sections to ensure your CSV is specified correctly.
FAQ - I’m having problems importing my participants csv
Check carefully that the specifications detailed in the INSTRUCTIONS and COLUMN EXPLANATIONS presented within the sample.csv template are followed strictly.
Use a spreadsheet editor, such as Google Sheets, Excel or Numbers, to produce a file that contains columns of data with these column headers: id, first, last, email, team, and group_code. Precise INSTRUCTIONS and COLUMN EXPLANATIONS for each of these columns are detailed below.
Use any of these templates to adapt and create your Participants CSV (comma-separated values) file using your preferred editor. The templates contain the example data and instructions shown below.
In the sample files, only the group BUS123.101/PMell/TutB/2020-05-28/SUM is a valid teamset suitable for processing by Peer Assess Pro. This is the only group that specifies membership of teams by the students in the class, the teams being Panda, Bear and Tiger.
Sample peer assessment Participants CSV
id | first | last | email | team | group_code |
ANWO08 | Anna | Worth | | | ARTS123.204/WShak/2021-02-28 |
GRGR15 | Greta | Green | | | ARTS123.204/WShak/2021-02-28 |
AMTO01 | Amanda | Tolley | Amanda.Tolley@noreply.com | Bear | BUS123.101/PMell/TutB/2020-05-28/SUM |
ANWO08 | Anna | Worth | Anna.Worth@noreply.com | Bear | BUS123.101/PMell/TutB/2020-05-28/SUM |
HOBR03 | Holly | Brown | Holly.Brown@noreply.com | Bear | BUS123.101/PMell/TutB/2020-05-28/SUM |
ALJO11 | Alice | Jones | Alice.Jones@noreply.com | Panda | BUS123.101/PMell/TutB/2020-05-28/SUM |
GRGR15 | Greta | Green | Greta.Green@noreply.com | Panda | BUS123.101/PMell/TutB/2020-05-28/SUM |
JEWA06 | Jeff | Wang | Jeff.Wang@noreply.com | Panda | BUS123.101/PMell/TutB/2020-05-28/SUM |
BOWI12 | Bob | Wilson | Bob.Wilson@noreply.com | Tiger | BUS123.101/PMell/TutB/2020-05-28/SUM |
HEJO19 | Henry | Jones | Henry.Jones@noreply.com | Tiger | BUS123.101/PMell/TutB/2020-05-28/SUM |
JOSM13 | John | Smith | John.Smith@noreply.com | Tiger | BUS123.101/PMell/TutB/2020-05-28/SUM |
THWI18 | Thomas | Windsor | Thomas.Windsor@noreply.com | Tiger | BUS123.101/PMell/TutB/2020-05-28/SUM |
ANWO08 | Anna | Worth | Anna.Worth@noreply.com | | COMP123.201/PDod/TutA/2020-10-01 |
HOBR03 | Holly | Brown | Holly.Brown@noreply.com | | COMP123.201/PDod/TutA/2020-10-01 |
JOSM13 | John | Smith | John.Smith@noreply.com | | COMP123.201/PDod/TutA/2020-10-01 |
INSTRUCTIONS
1. Organise your participants data into the columns corresponding to those shown in columns A to F, the first 6 columns headed 'id' through 'group_code'. You might find it helpful to paste your data from row 17, below the sample data provided in rows 2 through 16. The sample data provided demonstrates ten unique individuals (ids), organised into three different groups. One group contains a further three teams. A group might comprise all members of a class, or subdivisions such as streams, cohorts, sections, or tutorial groups. In the group called BUS123.101/PMell/TutB/2020-05-28/SUM the participants are subdivided further into three different teams, Bear, Panda and Tiger. Only group BUS123.101/PMell/TutB/2020-05-28/SUM is a Xorro teamset suitable for a peer assessment activity. A group is not a team. A group (such as a class) may contain several teams, in which case that's a Xorro teamset.
2. If you are preparing a separate file, ensure you use exactly the same column headers for your list as shown in row 1. That is, 'id', 'first', 'last', 'email', 'team', 'group_code'. These headers are not case sensitive. The sequence of column headings is NOT IMPORTANT. You may optionally include additional headers and columns of data; this data will be ignored by Xorro. Data may be sorted by any of the columns.
3. Read carefully the COLUMN EXPLANATIONS, below, for each type of data. Some data is optional, and can be skipped, as shown for group_code ARTS123.204/WShak/2021-02-28
4. Delete the sample data, immediately below the header row. That is, delete everything between row 2 and row 16. CRITICAL: CHECK you do not have duplicate ids in the same group_code. CHECK you do have all the ids in your class allocated to a group, and, optionally, a team.
5. If you have used this page as your template, you may DELETE this 'instructions' column. That is, delete anything not part of your data. Keep the column headers. The headers must be on row 1 of your file.
6. Save (Download, Export, Save As) the file as a CSV, giving it an appropriate filename.
7. From Xorro-Q, browse to PARTICIPANTS, then upload the CSV file. Alternatively, when you Launch a Peer Assessment Activity, you can IMPORT the CSV directly to create or update the activity. From this sample file, upon upload three groups would be created in Xorro: ARTS..., BUS... and COMP.... Only one of the groups is a teamset containing the three teams Bear, Panda, and Tiger.
COLUMN EXPLANATIONS
id - Compulsory field. Identifier for this participant; must be unique for the entire institution. For a peer assessment activity, this is the participant's login id. No blanks or characters such as #@$%&*()+
first - Compulsory field. Participant's first name
last - Compulsory field. Participant's last name
email - Optional field generally, but required for a peer assessment activity when you want autogenerated warnings and notifications from Peer Assess Pro.
team - Optional field generally, but required for a peer assessment activity. The name of the team in which the participant is a member. The participant can be a member of NO MORE than one team within the same group. A participant may belong to different teams in different groups.
group_code - Optional field generally, but required for a peer assessment activity. The code for the group (i.e. course, class, stream, cohort) into which the participant is being enrolled. If the participant is in multiple groups, supply a separate line for each group in which the participant is a member. Good practice: append to your root code, such as BUS123.101, abbreviations that indicate the teacher, activity date (start or due), subdivision (stream, cohort), and summative or formative purpose. Note that Anna Worth is enrolled in three groups and in one team within group BUS123.101/PMell/TutB/2020-05-28/SUM
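You can automate the critical checks above before uploading. The following Python sketch is illustrative only (it is not part of Xorro or Peer Assess Pro, and the filename and function name are invented for this example): it verifies the required column headers, rejects ids containing blanks or forbidden characters, and flags duplicate ids within the same group_code.

import csv
from collections import defaultdict

REQUIRED_HEADERS = {"id", "first", "last", "email", "team", "group_code"}
FORBIDDEN_ID_CHARS = set("#@$%&*()+ ")  # ids must contain no blanks or special characters

def check_participants_csv(path):
    """Return a list of problems found in a Participants CSV (illustrative helper)."""
    problems = []
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.DictReader(f)
        # Headers are not case sensitive in Xorro, so compare in lower case.
        headers = {h.strip().lower() for h in (reader.fieldnames or [])}
        missing = REQUIRED_HEADERS - headers
        if missing:
            return [f"Missing column header(s): {', '.join(sorted(missing))}"]
        seen_ids = defaultdict(set)  # group_code -> ids already seen in that group
        for row_number, row in enumerate(reader, start=2):
            row = {k.strip().lower(): (v or "").strip() for k, v in row.items() if k}
            pid, group = row["id"], row["group_code"]
            if any(ch in FORBIDDEN_ID_CHARS for ch in pid):
                problems.append(f"Row {row_number}: id '{pid}' contains a blank or forbidden character")
            if pid in seen_ids[group]:
                problems.append(f"Row {row_number}: duplicate id '{pid}' in group '{group}'")
            seen_ids[group].add(pid)
    return problems

for problem in check_participants_csv("participants.csv"):
    print(problem)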
After editing the template, remember to create a CSV version of your file. Depending on your editor, the appropriate command is:
FILE… SAVE AS … TEXT CSV
FILE… DOWNLOAD AS … Comma-separated values (.csv)
FILE… EXPORT AS CSV
FILE… EXPORT TO… CSV
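If your class list already lives in a program, for example exported from your LMS, you can also write the required format directly. Here is a minimal sketch using pandas with illustrative data; pandas is an assumption of this example, not a requirement of the platform.

import pandas as pd

roster = pd.DataFrame(
    [["AMTO01", "Amanda", "Tolley", "Amanda.Tolley@noreply.com", "Bear",
      "BUS123.101/PMell/TutB/2020-05-28/SUM"]],
    columns=["id", "first", "last", "email", "team", "group_code"],
)
# index=False keeps the row index out of the file, leaving only the six required columns.
roster.to_csv("participants.csv", index=False)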
First, see FAQ - I’m having problems importing my participants csv
Frequently, using the FILE… SAVE command in your spreadsheet editor will produce a file in an incorrect file format, such as .xls, .sheet, or .numbers.
Xorro will reject those file formats. Xorro accepts and loads only .csv files.
Follow this advice
FAQ: How do I create a CSV file from a Google Sheet
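If Xorro still rejects a file that looks right, a common culprit is a 'CSV' saved with semicolons rather than commas because of regional settings (see the FAQ on problems editing and creating participants CSV files). The following Python sketch detects and, if necessary, rewrites the delimiter; the filenames are illustrative.

import csv

with open("participants.csv", newline="", encoding="utf-8") as f:
    sample = f.read(2048)
delimiter = csv.Sniffer().sniff(sample, delimiters=",;").delimiter
print("Delimiter detected:", repr(delimiter))

if delimiter == ";":
    # Re-read with semicolons, then rewrite with the commas Xorro expects.
    with open("participants.csv", newline="", encoding="utf-8") as f:
        rows = list(csv.reader(f, delimiter=";"))
    with open("participants_fixed.csv", "w", newline="", encoding="utf-8") as f:
        csv.writer(f).writerows(rows)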
We advise creating a new, unique group_code for each Xorro Activity you create, even for repeat peer assessments within the same class term or semester.
Use a group_code like this
BT123.101/PJM/2020-03-28/FORM
We suggest your group_code include these elements as per the example above:
We recommend that your resulting group_code uniquely distinguish this semester’s mid-semester formative peer assessment(s) from last semester’s end-of-class summative assessment where, perhaps, the same institutional class code could have a different set of student names.
The group_code is specified in the Participants CSV file you import prior to launching a Peer Assess Pro™ Activity.
In the general case, a very large class could comprise several cohorts, streams or tutorial sets, each subclass containing several teams conducting one or more peer assessment activities. Consequently, your group_code should help distinguish these separate peer assessment activities. For example,
BT123.101/PJM/TutB/2020-05-28/SUM
Consider two teachers at the same institution teaching the same course but with different tutorial groups. If they use the same group_code, such as BT101, they will load their own teamsets into the same Xorro Participants’ Group, additively, thereby causing mutual confusion and dismay. Similarly, a teacher using the same group_code from term to term, semester to semester, and year to year will experience similar grief.
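One way to keep your group_codes distinctive and consistent is to assemble them from the same elements every time. A minimal sketch follows; the helper function and its parameters are illustrative, not part of the platform.

def make_group_code(root, teacher, date, mode, subdivision=""):
    """Join the suggested elements: institutional class code, teacher,
    optional subdivision (stream, cohort, tutorial), activity date,
    and SUM (summative) or FORM (formative)."""
    parts = [root, teacher, subdivision, date, mode]
    return "/".join(p for p in parts if p)  # skip any empty elements

print(make_group_code("BT123.101", "PJM", "2020-03-28", "FORM"))
# -> BT123.101/PJM/2020-03-28/FORM
print(make_group_code("BT123.101", "PJM", "2020-05-28", "SUM", subdivision="TutB"))
# -> BT123.101/PJM/TutB/2020-05-28/SUM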
Quick links and related information
FAQ - I’m having problems importing my participants csv
FAQ: How do I correct the Team Composition in a running peer assessment activity?
FAQ: How do I correct the participants (team members) in a group I uploaded?
FAQ: How do I create a CSV file from a Google Sheet
In summary
Enter the following details, in this sequence
Set a realistic Due Date that is your target for when you expect and want most students to have completed the peer assessment. In practice, typical Due Dates are set four to seven days after the Start Date.
The Due Date is used by Peer Assess Pro to generate automatically:
If you accept the Xorro default Due Date, which is currently NOW (the Start Date), you will not receive the benefits of the automated processes that Peer Assess Pro triggers from a practical Due Date.
The Due Date is advisory only. Students can CONTINUE to submit responses beyond the Due Date UNTIL the teacher Finalises the activity. After the Finalisation Date, the students will have no more than two weeks to review their results.
FAQ: How do I adjust the Due Date or deadline?
The short answer is ‘You can’t adjust the Due Date!’ You don’t need to!
After setting the Start At and Due Dates, select Create Activity.
Double check your Start Date and Due Date carefully!
Once you Create Activity you cannot adjust the Start Date. The peer assessment Survey and the Email notifications to students requesting their response are created when the Start Date is reached. Furthermore, the email advises the students of the Start Date and Due Date.
Therefore, adjusting the Start Date would confuse students: the Participant Activity URLs have already been announced to them, and those URLs could become unavailable if the dates were adjustable.
For a similar reason, you cannot adjust the Due Date. However, the Due Date is advisory only: students can CONTINUE to submit responses beyond the Due Date UNTIL the teacher Finalises the activity.
FAQ: Can I adjust the Start Date or Due Date for a running activity?
In short, No!
In a ‘worst case scenario’ you can abandon the activity and launch a new activity. Review the foregoing FAQ for details on how to Abandon a running Peer Assess Pro activity.
Peer Assess Pro Teacher’s Dashboard
When the Start Date occurs, Peer Assess Pro automates several activities:
A unique peer assessment survey is created for every team and team member
Quick links and related information
FAQ - I’m having problems importing my participants csv
FAQ: How do I correct the Team Composition in a running peer assessment activity?
FAQ: Can I adjust the Start Date or Due Date for a running activity?
FAQ: How do I view a list of the participants (team members) in the group I uploaded?
FAQ: How do I view and experience what the students experience?
FAQ: How do I find the Peer Assess Pro Teacher’s dashboard?
This is an alternative approach to launching a peer assessment activity: a two-stage process in which you first import your participants as a Teamset Group, then launch the activity using that Group.
(Image to come)
This uploads your Participants CSV within which you have classified your students into teams, as detailed in Section 2.2 Create the peer assessment Participants CSV file
Note that multiple teamset groups may be created using this import process. This is potentially useful for managing peer assessment in large, multi-stream classes.
You should see a list of all the students belonging to the class for whom you wish to run the peer assessment activity.
Note: The message ‘Exists’ or ‘Conflict’ means that the id (identification) code has already been registered within your institution, or in a previous Group you have uploaded. Carry on!
At this point you are unable to confirm the team membership of your class. You must first launch a peer assessment activity, selecting (one of) the Group Codes that existed within the original Participants CSV.
Quick links and related information
FAQ - I’m having problems importing my participants csv
FAQ: How do I view a list of the participants (team members) in the group I uploaded?
FAQ: How do I view or change the participants (team members) in a group I uploaded?
FAQ: How do I correct the Team Composition in a running peer assessment activity?
FAQ: Can I adjust the Start Date or Due Date for a running activity?
In short, No! Please check your Start Date and Due Date carefully before you Create Activity.
Active Warnings show when you need to take action to remedy an issue during execution of the peer assessment activity.
In the following example, one member of Team Brazilia has completed the assessment of their four team members. Consequently, a warning is generated for Team Brazilia that the number of responses from the team is insufficient for presenting valid results. In contrast, all four team members of Team Kublas have completed the assessment.
The warnings displayed in this case are
Click through the warning to gain advice on how to remedy the situation. For example, you can remind the students to complete the survey. Emails are automatically generated and sent on your behalf to all or selected students.
Quick links and related information
FAQ: What is the content of emails sent by Peer Assess Pro?
FAQ - What emails have been sent by the platform?
Upon commencing the peer assessment survey, team members are first asked to confirm that the team members identified for their team are correct. If not, the student initiates a request notification to the teacher to readjust their team’s membership.
Once the peer assessment activity has been launched, you can modify the team composition as per the following FAQ.
FAQ: How do I correct the Team Composition in a running peer assessment activity?
Changes to a Xorro Group will have NO EFFECT on a currently running activity, unless you Finalise then Abandon the activity and re-launch a new activity with the revised Group. This is an extreme response, and should not generally be required if you follow the previous FAQ.
Quick links and related information
Students who have NOT completed the survey are sent an email reminder 72 hours, 24 hours and 12 hours before the Due Date.
Similarly, if a student is required to resubmit a response because a team has been reconstituted, an automatic reminder is sent.
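For planning, the reminder schedule described above is straightforward to compute from your Due Date. A sketch assuming the 72, 24, and 12 hour offsets quoted above; the Due Date value is illustrative.

from datetime import datetime, timedelta

due_date = datetime(2020, 5, 28, 17, 0)  # illustrative Due Date
for hours_before in (72, 24, 12):        # reminders sent to non-responders
    print(f"Reminder {hours_before:>2} h before due:", due_date - timedelta(hours=hours_before))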
Quick links and related information
FAQ: What is the content of emails sent by Peer Assess Pro?
FAQ - What emails have been sent by the platform?
You must enter a Team Result for each team if you intend to select any of these methods to calculate the Personal Result.
After you have entered or revised your Team Results, communicate the Personal Results to your class using the Publish or Update button.
Team Results are not used to calculate:
The Personal Result Calculation Method calculates the Personal Result you will award to each team member.
When you first enter Team Results, the Peer Assess Pro platform automatically selects the Normalised Personal Result (NPR) method for calculating participants’ Personal Result, with a Scale Factor of 1.0.
To adjust the Personal Result Calculation Method and/or adjust the Scale Factor
Quick links and related information
FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?
You can explore progress and final results at the class, team, and individual level.
In the Class Results, select a Bucket Range to identify the specific students lying within the range of a histogram bar chart.
Before reviewing results, see:
FAQ: When, why, and how do I ‘Recalculate Results’?
Example class statistics
In any of the tables, you may
The Individual Personal Snapshot enables you to view all data related to one student. The Student View version of the Personal Snapshot shows exactly the report the student will receive when the teacher Publishes the results of the current Peer Assess Pro activity.
However, the teacher may wish to view how the results will appear to students BEFORE they are Published. Consequently, there are four possible views of an Individual Personal Snapshot. They are variations on the following example. The four views are explained later.
Example Individual Personal Snapshot (1 of 3)
Example Individual Personal Snapshot (2 of 3)
Example Individual Personal Snapshot (3 of 3)
Note there are four possible views of an Individual Personal Snapshot.
If the view is not yet Published, the student will see this remark.
Results unpublished
The same message will also be displayed if the team is not a valid assessed team, even if the results have been Published to the class as a whole.
Select an individual team to probe the results of its team members. Sort by Peer Assessed Score or Index of Realistic Self Assessment. Then you can quickly review the Individual Personal Snapshot of each team member as part of your diagnosis to identify ‘star performers’, ‘at-risk’ team members, and those with outlier degrees of overconfidence or underconfidence.
Example Team Statistics
(To come)
(To come)
There are many advanced statistics and charts you can view. Furthermore, from ‘Available Actions’ you can Download Full Statistics to conduct more detailed investigations beyond the scope of what we have conceived.
Quick links and related information
FAQ: How is the Index of Realistic Self Assessment (IRSA) calculated?
Results of the peer assessment are hidden from team members until you initiate Publish Survey on the Peer Assess Pro Teacher’s dashboard.
Before Publishing, see:
FAQ: When, why, and how do I ‘Recalculate Results’?
The foregoing ‘Refresh and Recalculate’ steps provide you with the opportunity to review the quality of results before publishing and republishing personal results and qualitative peer feedback comments. In short, as the peer assessment activity progresses towards the due date, results are NOT automatically updated and made available for viewing by the students.
Take Care! Once an activity is Published, the results can never be unpublished. However, you may re-publish results if new responses are submitted and/or you make adjustments to Team Results, Team Composition, etc. To reiterate, even if interim results have been published to students, as the peer assessment activity continues to progress towards the due date, results ARE NOT automatically updated and made available for viewing by the students.
Results will be hidden from the teacher and ALL team members in teams where fewer than half of the team members have submitted the peer assessment, since peer assessment results may not yet be valid and representative at that stage of the survey activity. For small teams, at least 3 team members must have submitted a response. That is, team sizes of 3, 4, 5 and 6 team members require at least three team members to have peer assessed each other. A team of 7 or 8 requires a minimum of 4 responses. Team members who have already submitted a response will ALSO be advised that their results are hidden until more of their team members have submitted responses.
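The hiding rule above can be summarised as 'at least half the team, and never fewer than three responses'. The following sketch shows one consistent reading of that rule; the helper function is illustrative, not the platform's actual code.

import math

def min_responses_to_show_results(team_size):
    """Minimum submissions before results are shown:
    at least half the team, with an absolute floor of three."""
    return max(3, math.ceil(team_size / 2))

for size in range(3, 9):
    print(f"Team of {size}: results shown after {min_responses_to_show_results(size)} responses")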
Quick links and related information
FAQ: How do I view and experience what the students experience?
Survey responses from Team Members are received and available for incorporation into the Peer Assessment activity UNTIL you explicitly Finalise the Survey. Even responses submitted after the Due Date announced to students at the launch of the Activity will be available for incorporation until the Teacher deliberately Finalises the survey. Until Finalisation, you can request a student to reconsider and, optionally, resubmit their responses.
From the Peer Assess Pro Teacher’s Dashboard, select either
Example Gradebook Summary Statistics
Example Gradebook Full Statistics
Quick links and related information
Everyone
FAQs on the web at http://tinyurl.com/papFAQ
Launch peer assessment activity
Manage the peer assessment activity
Definitions, calculations, and examples
FAQ - What is the benefit of a standardized peer assessment rubric?
FAQ - When and how is the peer assessment conducted?
FAQ - What is the purpose of peer assessment?
FAQ - What questions are asked in the peer assessment survey?
FAQ - How are peer assessment and personal results calculated and defined mathematically?
FAQ - Is the self-assessment used to calculate Peer Assessed Score?
FAQ - Give me a quick overview of how to launch a Peer Assess Pro™ activity through Xorro
FAQ - How do I navigate the PARTICIPANTS page for Peer Assess Pro?
FAQ - How do I view and experience what the students experience?
FAQ - I’m having problems importing my participants csv
FAQ - How do I create a CSV file from a Google Sheet?
FAQ - How do I correct the participants (team members) in a group already uploaded to Xorro?
FAQ - Can I adjust the Start Date or Due Date for a running activity?
FAQ - What if a student mistakenly advises they are in an incorrect team?
FAQ - Can I adjust the Start Date or Due Date for a running activity?
FAQ - How do I correct the Team Composition in a running peer assessment activity?
FAQ - How do I resolve an unsynchronised team arrangement? (ACTIVE WARNING 0021)
FAQ - What if a student mistakenly advises they are in an incorrect team?
FAQ - What is the content of emails sent by Peer Assess Pro to Participants?
FAQ - What is a valid assessed team?
FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?
FAQ - When, why, and how do I ‘Update and Recalculate Results’?
FAQ - How do I advise a student who feels they have been unfairly treated?
FAQ - What emails have been sent by the platform? (Notifications History)
FAQ: What is an adjusted teamset request? WARNING 0006
FAQ - How do I resolve an unsynchronised team arrangement? ACTIVE WARNING 0021
FAQ - What is the content of emails sent by Peer Assess Pro to Participants?
FAQ - What is a valid assessed team? WARNING 0022
FAQ - How do I fix an invalid, missing or failed email delivery? WARNING 0026
FAQ - What is a team with low psychological safety? WARNING 0034
FAQ - What is an unsafe team member? WARNING 0035
FAQ - What is an ‘at risk’ team member? WARNING 0036
FAQ - What is a mismatched self-assessment (IRSA)? WARNING 0040
FAQ - How is insignificant intra-team agreement identified? WARNING 0041
FAQ - How is an outlier peer assessment rating identified? WARNING 0042
FAQ - What is an inactive team member? WARNING 0048
FAQ - What is a low-quality team rating? WARNING 0050
FAQ - What is a low quality assessor rating? WARNING 0300
FAQ - How are peer assessment and personal results calculated and defined mathematically?
FAQ - What is the benefit of a standardized peer assessment rubric?
FAQ - How is the Peer Assessed (PA) Score calculated?
FAQ - Is the self-assessment used to calculate Peer Assessed Score?
FAQ - How is the Peer Assessed Index (PA Index) calculated?
FAQ - How is the Indexed Personal Result (IPR) calculated?
FAQ - How is the Normalised Personal Result (NPR) calculated?
FAQ - How is the Rank Based Personal Result (RPR) calculated?
FAQ - How is Standard Peer Assessed Score (SPAS) calculated?
FAQ - What is Employability? How is it calculated?
FAQ - How is the Index of Realistic Self Assessment (IRSA) calculated?
FAQ - How do I decide which Personal Result method to apply in my peer assessment activity?
FAQ - I’m having problems importing my participants csv
FAQ - How do I contact people at Peer Assess Pro?
FAQ - Where may I view the most recent version of the user guides?
FAQ - What are the design objectives, key features, and benefits of the Peer Assess Pro development?
The purpose of peer assessment
Undertaking the peer assessment
Using peer assessment results for better performance
How peer assessment affects personal results
FAQ - What is the purpose of peer assessment?
FAQ - How are peer assessment and personal results calculated and defined mathematically?
FAQ - What questions are asked in the peer assessment survey?
FAQ - I am unable to login. My login failed
FAQ - How do I login to my peer assessment Activity URL
FAQ - What if I mistakenly advise in the survey that I am in an incorrect team?
FAQ - When and how is the peer assessment conducted?
FAQ - How do I provide useful feedback to my team members?
FAQ - How do I view and experience what the students experience?
FAQ - Is the self-assessment used to calculate Peer Assessed Score?
FAQ - What happens if I try to 'game' (fool? play? disrupt?) the peer assessment process?
FAQ - How do I interpret the feedback results I've received from the peer assessment?
FAQ - How do I interpret measures of realistic self-assessment?
FAQ - What steps can I take to get a better personal result?
FAQ - Is the self-assessment used to calculate Peer Assessed Score?
FAQ - I don’t understand what my teammates are trying to tell me. How do I ask for better feedback?
FAQ - What is a team with low psychological safety?
FAQ - What is an unsafe team member?
FAQ - What is an ‘at risk’ team member?
FAQ - What is Employability? How is it calculated?
FAQ - How are peer assessment and personal results calculated and defined mathematically?
FAQ - Is the self-assessment used to calculate Peer Assessed Score?
FAQ - What steps can I take to get a better personal result?
FAQ - What happens if I try to 'game' (fool? play? disrupt?) the peer assessment process?
Peer assessment is an educational activity in which students judge the performance of their peers, typically their teammates. Peer assessment takes several forms including
The ability to give and receive constructive feedback is an essential skill for team members, leaders, and managers.
Consequently, your teacher has chosen to use Peer Assess Pro™ to help you provide developmental feedback to your team members, for formative and/or summative purposes.
The goal of developmental feedback is to highlight both positive aspects of performance and areas for performance improvement. The result is increased individual and team performance (Carr, Herman, Keldsen, Miller, & Wakefield, 2005).
Additionally, your teacher may use the quantitative results calculated by Peer Assess Pro™ to determine your Personal Result for the team work conducted by your team. Your Personal Result may contribute to the final (summative) assessment grade you gain for the course in which Peer Assess Pro™ is applied.
In general, your Personal Result is calculated from two factors:
There are many possible criteria for assessing your contribution to your team’s work. Peer Assess Pro places equal weight on two groups of factors, Task Accomplishment and Contribution to Leadership and Team Processes, based on a well-established instrument devised by Deacon Carr, Herman, Keldsen, Miller, & Wakefield (2005):
The selection of criteria used in Peer Assess Pro is reinforced by the results of a recent survey that asked employers to rate the importance of several competencies they expect to see in new graduates from higher education. The figure shows that teamwork, collaboration, professionalism, and oral communication rate amongst the most highly needed Career Readiness Competencies (CRCs) sought by employers. All these Career Readiness Competencies rate at least as ‘Essential’, with Teamwork and Collaboration rating almost ‘Absolutely Essential’ (National Association of Colleges and Employers, 2018).
Employers rate their essential need for Career Readiness Competencies
Source: National Association of Colleges and Employers (NACE). (2018). Figure 42, p. 33.
Quick links and related information
FAQ: What questions are asked in the peer assessment survey?
FAQ: How are peer assessment and personal results calculated and defined mathematically?
FAQ: How is the Peer Assessed (PA) Score calculated?
Mellalieu, P. J. (2021, June 9). Why Peer Assessment? The key to improved group assignments. Better Feedback. Better Teams. https://www.peerassesspro.com/why-peer-assessment/
References
Deacon Carr, S., Herman, E. D., Keldsen, S. Z., Miller, J. G., & Wakefield, P. A. (2005). Peer feedback. In The Team Learning Assistant Workbook. New York: McGraw Hill Irwin.
National Association of Colleges and Employers (NACE). (2018). Job Outlook 2019. Bethlehem, PA. https://www.naceweb.org/
The best practice for conducting peer assessment in an academic course follows several stages.
The midpoint formative peer assessment is an optional element of peer assessment within the classroom. As a minimum, the formative peer assessment gives the team members experience of the Peer Assess Pro™ mechanism including the questions that will be used to conduct the final, summative peer assessment.
More importantly, the midpoint formative assessment helps ensure that team members have the opportunity to respond proactively to the peer feedback they receive as soon as the peer assessment activity concludes. Through undertaking appropriate corrective action mid-way through the course, team members have the opportunity to raise their peer assessment rating, their team’s results, and, therefore, their end-of-course personal results.
The intention of formative assessment is that, ideally, a team member should face no surprises when they receive their final personal result and peer assessment feedback at the conclusion of the course. For instance, a free-rider should receive clear feedback that the rest of their team observes they are free-riding. Consequently, the free-rider should learn in a timely manner that they will be penalised at the concluding summative assessment unless they remediate their behaviour. It is equally important that an overachieving student who does most of the work is given timely feedback that they need to learn to involve and engage the other team members in the team’s planning and execution of tasks. The Peer Assess Pro™ survey specifically targets these aspects of leadership and team process contributions. This particular style of overachieving student should be identified through the peer assessment ratings they receive.
To minimise the risk of surprises, it is important, therefore, that the peer assessment you provide to your team members at the midpoint of a team activity is
Quick links and related information
FAQ: What questions are asked in the peer assessment survey?
FAQ: How do I provide useful feedback to my team members?
FAQ: How do I view and experience what the students experience?
FAQ: How do I interpret the feedback results I've received from the peer assessment?
FAQ: How are peer assessment and personal results calculated and defined mathematically?
FAQ: Is the self-assessment used to calculate Peer Assessed Score?
It is essential that the peer assessment a team member provides to their teammates is:
Ohland et al. (2012) provide a table of Behaviorally Anchored Ratings covering high and low contributions to team effectiveness. The table provides some guidance to team members about how they might give accurate, effective, and productive feedback to their team members through peer assessment.
Examples of high and low contributions to team effectiveness
[Table: behaviourally anchored examples of HIGH and LOW contributions to team effectiveness, for each of CONTRIBUTION, INTERACTION, KEEPING FOCUS, and CAPABLE. Source: Ohland et al. (2012)]
Adapted by Mellalieu (2017) from Ohland, M. W., Loughry, M. L., Woehr, D. J., Bullard, L. G., Felder, R. M., Finelli, C. J., … Schmucker, D. G. (2012). Appendix B: Behaviorally Anchored Rating Scale (BARS) version, from Comprehensive Assessment of Team Member Effectiveness. Academy of Management Learning & Education, 11(4), 609–630. http://amle.aom.org/content/11/4/609.short
Quick links and related information
For teachers: How do I advise a student who feels they have been unfairly treated?
Here are some symptoms that you may have been treated unfairly by one or more teammates in their peer assessment of you:
If you believe you may have been unfairly treated, these are the steps you should pursue, in this order of action
An appeal against a peer assessment result is likely to fail if one or more of the following circumstances have prevailed:
Take these steps to avoid a mismatch between the peer assessment result you expect, and the result you receive.
Quick links and related information
How peer assessment affects personal results
FAQ: How are peer assessment and personal results calculated and defined mathematically?
FAQ: Is the self-assessment used to calculate Peer Assessed Score?
FAQ: What happens if I try to 'game' (fool? play? disrupt?) the peer assessment process?
Using the results from peer assessment for better performance
FAQ: How do I interpret the feedback results I've received from the peer assessment?
FAQ: I don’t understand what my teammates are trying to tell me. How do I ask for better feedback?
FAQ: How do I interpret measures of realistic self-assessment?
FAQ: What steps can I take to get a better personal result?
(To be published)
Quick links and related information
FAQ: How do I interpret measures of realistic self-assessment?
Begin by viewing this video. Watch especially for the question that is introduced soon after minute 15 by Harvard University professor Sheila Heen.
Heen, S. (2015). How to use others’ feedback to learn and grow. TEDx. Retrieved from https://www.youtube.com/watch?v=FQNbaKkYk_Q
As Heen and Stone observe
“Feedback is less likely to set off your emotional triggers if you request it and direct it. So donʼt wait until your annual performance review. Find opportunities to get bite-size pieces of coaching from a variety of people throughout the year. Donʼt invite criticism with a big, unfocused question like “Do you have any feedback for me?” Make the process more manageable by asking a colleague, a boss, or a direct report,
“Whatʼs one thing you see me doing (or failing to do) that holds me back?”
That person may name the first behavior that comes to mind or the most important one on his or her list. Either way, youʼll get concrete information and can tease out more specifics at your own pace.” (Heen & Stone, 2014)
Quick links and related information
Heen, S., & Stone, D. (2014). Find the Coaching in Criticism. Harvard Business Review, 9. Retrieved from https://medschool.duke.edu/sites/medschool.duke.edu/files/field/attachments/find-the-coaching-in-criticism.pdf
Your Personal Result is determined from a combination of your Team Result and your Peer Assessed Score. Consequently, to raise your Personal Result you need to apply balanced effort to raising both these contributing factors.
Typically, your Team Result is earned from its assignment outputs, such as a report, and/or a presentation. Consequently, the grade for the Team Result is determined by the teacher, based on the rubric (marking guideline) they apply to assess your team’s outputs. Ensure you understand the assignment elements and how each will be assessed. Seek out exemplars of good practice. Pursue the guidance found in:
Mellalieu, P. (2013, March 15). Creating The A Plus Assignment: A Project Management Approach (Audio). Innovation & chaos ... in search of optimality website: http://pogus.tumblr.com/post/45403052813/this-audio-tutorial-helps-you-plan-out-the-time
In addition to your teacher and their assistant tutors, your academic institution will offer personal and group coaching to guide you on the specific success factors related to the type of assignment you are pursuing. Schedule appointments to make use of these support facilities early in your project. Locate the online resources these coaching support services have curated for your guidance.
Group and team projects present special challenges of coordination, motivation, communication, and leadership. These challenges are normal! Furthermore, an essential part of your job as a team member is to overcome these challenges proactively as part of your academic learning journey.
As you overcome these challenges you will achieve several benefits directly instrumental in raising your Personal Result:
You will also develop teamwork and leadership competencies that will raise both your future employability and your effectiveness in future teamwork, as discussed in:
FAQ: What is the purpose of peer assessment?
Whilst there are many resources to help address the challenges of teamwork in academic settings, we suggest you familiarise yourself with these resources early in your team project. Since “Any fool can learn from their own mistakes. It takes genius to learn from the mistakes of others” (Einstein), be proactive rather than foolish in learning effective teamworking skills from:
Turner, K., Ireland, L., Krenus, B., & Pointon, L. (2011). Collaborative learning: Working in groups. In Essential Academic Skills (2nd ed., pp. 193–217).
Carr, S. D., Herman, E. D., Keldsen, S. Z., Miller, J. G., & Wakefield, P. A. (2005). The Team Learning Assistant Workbook.
Good practice peer assessment management by your teacher will provide you with two opportunities for peer assessment and peer feedback through your course: formative and summative.
Your first, mid-course, formative assessment provides you with early advice about your strengths and opportunities for development as perceived by your team members. Make use of this formative feedback at the earliest opportunity as you proceed towards the conclusion of your teamwork and your final, summative peer assessment. Usually, the final, summative assessment is where the Personal Result earned from the Peer Assessed Score awarded by your team members contributes significantly to your course grade.
Consequently, take proactive action following the mid-course formative assessment through referring to:
FAQ: How do I interpret the feedback results I've received from the peer assessment?
Maybe you don’t understand or don’t agree with the feedback your teammates are providing. In that case, refer to
FAQ: I don’t understand what my teammates are trying to tell me. How do I ask for better feedback?
Quick links and related information
The purpose of peer assessment
FAQ: What is the purpose of peer assessment?
Undertaking the peer assessment
Using peer assessment results for better performance
How peer assessment affects personal results
FAQ: How are peer assessment and personal results calculated and defined mathematically?
FAQ: What steps can I take to get a better personal result?
FAQ: What happens if I try to 'game' (fool? play? disrupt?) the peer assessment process?
FAQ: Is the self-assessment used to calculate Peer Assessed Score?
What happens if a team member attempts to 'game' the peer assessment process?
The designers of Peer Assess Pro have many decades’ experience working with students. We know the tricks that students attempt to play with peer assessment. We have anticipated the tricks, so Peer Assess Pro warns the teacher when a trick may be being played. Furthermore, the teacher receives highly specific, individualised information about each incident and student. The teacher may then undertake overt or covert action to address the issue to which they have been alerted. For example, the trick-playing student or team may receive a request to reconsider and resubmit their peer assessment. In more extreme incidents, the student or team may receive an invitation to visit the teacher for a counselling consultation.
Here follow a few of the ‘tricks’ that Peer Assess Pro identifies and warns the teacher about during the survey process. Examples follow later.
Here are some examples of the highly specific and individualized Active Warnings a teacher receives about each incident.
Madison may have engaged unconstructively with peer assessment. High Peer Assessment Scores awarded. Average 100 and low range 3. Team Alpha
Ben may have engaged unconstructively with peer assessment. High Peer Assessment Scores awarded. Average 86 and low range 5. Team Bravo
This message warns the teacher that the team member has given everyone a near perfect Peer Assessed Score or a similar score (narrow range). Practically, from the student’s point of view they are ‘wasting their votes’. If everyone is scored with the same or similar score then students who have contributed substantially to the team’s result will not be adequately recompensed. Furthermore, if EVERY team member pursued this same approach, then every team member would be awarded the Team Result. In this case, the team member just looks stupid in the eyes of the teacher. Furthermore, the team member fails to gain practice at being a leader where giving accurate assessments of team members’ contributions is a valued management competency.
Team Bravo may have engaged unconstructively with peer assessment. High Peer Assessment Scores awarded. Average 98 and low range 8.
Team Echo may have engaged unconstructively with peer assessment. High Peer Assessment Scores awarded. Average 94 and low range 10.
These messages warn the teacher that the team collectively may have arranged to give everyone a near perfect Peer Assessed Score or the same score. Practically, from the students’ point of view, this trick is a waste of time. If everyone is scored with the same score, or a perfect Peer Assessed Score of 100, then every team member will be awarded the Team Result … which is usually not 100. The team members just look stupid in the eyes of the teacher. Furthermore, they may not receive useful qualitative feedback and ratings that help guide focussed development of their future productivity in team assignments and their future professional work in teams.
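The following is a minimal sketch, in Python, of the kind of check that can raise such warnings. The threshold values and function name here are illustrative assumptions, not Peer Assess Pro’s actual parameters, which are documented with each Active Warning.

def unconstructive_rating_warning(subscores, high_mean=85, narrow_range=10):
    # subscores: the PA SubScores one assessor awarded to their teammates.
    mean = sum(subscores) / len(subscores)
    spread = max(subscores) - min(subscores)
    # A high average combined with a narrow range suggests the assessor
    # rated everyone near-perfect or near-identical.
    if mean >= high_mean and spread <= narrow_range:
        return (f"High Peer Assessment Scores awarded. "
                f"Average {mean:.0f} and low range {spread:.0f}.")
    return None

print(unconstructive_rating_warning([100, 98, 97]))
# High Peer Assessment Scores awarded. Average 98 and low range 3.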
The warning highlights a situation where the team members appear to be inconsistent in rating high, medium and low contributors to the team’s process and results.
This example is a symptom that there may be some disruptive team dynamics or bullying within the team.
Harrison Ford assessed Steven Spielberg, awarding a PA Subscore of 38. Compared with the average rating of 70 by the other team members, this subscore DEPRESSED the PA Score to 64 by 7 PA Score Units. Team Alpha
This message warns the teacher that there may be some favouritism between friends or allies.
Donald Trump assessed Vladimir Putin, awarding a PA Subscore of 90. Compared with the average rating of 57 by the other team members, this subscore RAISED the PA Score to 64 by 7 PA Score Units. Team Charlie
Anna’s self-assessment of 68 is OVERCONFIDENT compared with the peer assessment of 34 given by others in team Charlie. IRSA = 51
This message warns the teacher that the team member has a very much higher opinion of their performance than is evidenced by the rating provided by their peers. The teacher may request an interview with the student to explore the reasons for this divergence, and how the student can develop a more realistic self-assessment.
Alternatively, the team member may be being scapegoated by the remainder of the team, and that possibility can be discussed with the team member for whom this warning is raised.
Anna has been rated amongst the lowest in class. Low Recommendation 2.3 and/or low Peer Assessment Score 34. Team Alpha
This message warns the teacher that the team member is rated very poorly when compared with most of the class. It’s often a symptom of little or no attendance or contribution by the team member, which the teacher can verify by examining the qualitative feedback provided by the team members. Again, the teacher may request an interview with the student to explore the reasons for the poor rating and how the student can improve their contribution.
Examine the following teacher’s dashboard graphic revealing a real class that undertook a peer assessment.
Teachers dashboard: visible identification of teams with low quality team rating
Teams 1, 14, 13, 2, 7, 5, and 11: over half of the teams in the class!
Observation: This class was poorly briefed on how to make the best use of peer assessment and feedback. With a better briefing, fewer than 10% of teams typically raise this warning.
The lower Team Results are associated with teams that had a low quality team rating. Apart from Team 15, all teams with an adequate quality team rating had a Team Result equal to or greater than the Class median Team Result of 73.3. For example, Team 10 (Team Result 90) through to Team 4 (Team Result 76.7) according to the sort by Range in the foregoing table.
Team 10, with a Team Result of 90 is clearly a high performing team. The moderately low Range of PA Scores (10) across the team suggests IN THIS CASE that everyone contributed relatively equally and effortfully towards a great Team Result. Reminder: The Team Result is awarded by the teacher: it is independent of the Peer Assessed Scores of the team.
However, Team 3 is also a good candidate for being a fair and productive team. They engaged honestly with peer assessment, awarding a high spread of Peer Assessed Scores (Range 18.8) and a team average PA Score (78.3). This team average was not outrageously high, in contrast to teams 1 (100!), 14 (100), 13, 2, 7, and 5. Furthermore, Team 3 earned the class median Team Result of 73.3, which then appears to have been allocated according to the peer assessed contribution of the team members. This fair distribution is illustrated in the following graph and table. Team Member Charlie earned the highest Personal Result of 81, whilst Able earned 65.3. Similar reasoning applies to Team 6, to a slightly lesser degree, since its Range is not so wide.
Note from the following graph how teams 14, 5, 13, 1, 2 and 7 are again glaringly identified in the Teachers Dashboard as outlier teams poorly engaged with the peer assessment process: the low vertical spread in the graph. This low vertical spread in the Personal Result (NPR in this case) derives from the low range of Peer Assessed Scores across each team.
With this admittedly small sample, we advance the proposition that ‘better feedback leads to better teams’. And/or ‘better teams give better feedback!’ In conclusion, let’s say: Better feedback. Better teams.
Teachers dashboard: a fairly productive team
Quick links and related information
FAQ: What is the purpose of peer assessment?
FAQ: How do I provide useful feedback to my team members?
FAQ: How do I interpret the feedback results I've received from the peer assessment?
FAQ: I don’t understand what my teammates are trying to tell me. How do I ask for better feedback?
The following section explains how the teacher should respond to the Active Warnings displayed on their dashboard. The threshold parameters and program logic for raising the warnings are also provided.
If this is your first time using Peer Assess Pro, we strongly recommend that you glance briefly through our Frequently Asked Questions so you are prepared to answer your own and your students' concerns - https://www.peerassesspro.com/frequently-asked-questions-2/
Download a pdf of the Quickstart Guide and this Reference Guide here - http://tinyurl.com/papRefPdf
View a quick video overview demonstration of the whole Peer Assess Pro system
Contact
Patrick Dodd - https://www.peerassesspro.com/contact/
Quick links and related information
View the web Quickstart Guide at tinyurl.com/pdfQuickWeb
View our comprehensive online and eBook introduction Get Started with Peer Assess Pro
FAQ: How do I contact people at Peer Assess Pro?
FAQ: Where may I view the most recent version of the User Guide?
FAQ: What are the design objectives, key features, and benefits of the Peer Assess Pro development?
Our overall objectives for Peer Assess Pro™ are
We appreciate your participation in this pre-market release of our substantially revised Peer Assess Pro™ in conjunction with the Xorro advanced quiz and survey platform.
As we proceed through this pre-market refinement phase we respond almost daily to your suggestions for improving both the software applications and user documentation. These improvements are implemented at any time during our Beta Development phase. We anticipate that our implementations are robust enough to prevent loss of your data and wasting of your time. We crave your forgiveness if we have been over-optimistic in keeping Murphy’s Law at a distance.
You need not take any action to use the latest versions of the Peer Assess Pro™ Xorro Teacher’s Dashboard. Those updates happen in the background and will automatically use any data and activities you have initiated. However, if you use the PDF version of this user guide, you will need to update regularly to the latest version here.
Quick links and related information
FAQ: Where may I view the most recent version of the Reference Guide?
If you quit your browser then wish to return to the Teachers Dashboard
From HOME Tab
From Activities Tab: Running Activities
Quick links and related information
Note the list of ‘All participants’ currently known to Xorro in your institution.
Note a list of all other Groups uploaded by other Teachers in your institution. A group is a list of participants, such as students in a class. The minimum requirement for a Group is id, first name, last name.
However, for a peer assessment activity a Group must include team membership for all team members. This team membership is not required for most other Xorro activities. Accordingly, Groups set up by or for other teachers will rarely contain the correct team membership data required for your Peer Assess Pro™ activity.
Select Group ClassAM101.6. This group selection displays a list of about 25 students in the class titled AM101.6
Quick links and related information
FAQ: How do I correct the participants (team members) in a group I uploaded?
FAQ: How do I correct the Team Composition in a running peer assessment activity?
In a launched, running peer assessment activity, you often need to make these adjustments:
Select the context you require
LMS (Moodle, Canvas) Adjust the team composition in a running peer assessment activity on an LMS
Xorro Adjust the team composition in a running peer assessment activity on Xorro
Context: Peer Assess Pro running on an LMS (Moodle, Canvas, Blackboard)
Ensure you read ALL of this FAQ before proceeding.
A mistake in this process may lead to unrecoverable loss of the survey responses received to date.
In the LMS version of Peer Assess Pro, the facilitator is alerted to requests to adjust the team composition in the running activity from several Active Warnings. These Active Warnings and their associated available actions streamline the facilitator’s workflow for managing adjustments to the team arrangement.
For example, new participants may have been enrolled on the LMS since the peer assessment activity was launched. Additionally, a participant may have alerted the facilitator that team participant(s) (or themself) have been
The relevant Active Warnings are described in these FAQs
FAQ - What is an adjusted team arrangement request? WARNING 0006
FAQ - What is an inactive team member? WARNING 0048
FAQ - How do I resolve an unsynchronised team arrangement? ACTIVE WARNING 0021
The facilitator responds to the suggested actions presented on the Teachers Dashboard for the relevant Active Warnings to first produce a proposed team arrangement on the LMS side. The facilitator will
The facilitator now must
Before the facilitator initiates Synchronise All, the refreshed Team Composition view for the Peer Assess Pro activity shows the updated team arrangement (Step 5 above). The updated view combines the facilitator’s just-stated intentions on the LMS (the proposed team arrangement) with the current status of the team arrangement within the running activity. The symbols (+, -) signal mismatches, that is, the lack of synchronisation between the LMS and the current Peer Assess Pro activity.
Prior to the facilitator committing to synchronisation, the facilitator’s proposed changes are clearly indicated, as the example illustrates. Crucially, this Team Composition view provides a ‘last chance’ opportunity for the facilitator to review carefully the changes they intend.
In the Team Composition view, the symbol (-) signifies that, following synchronisation, a participant would be dropped from a team, or a team deleted from the current peer assessment activity. The symbol (+) indicates that a participant or team will be added.
For example, in the following example, once the facilitator has initiated Synchronise All, Kamryn MILLER will be reassigned from team Black Robins to Team Brown Kiwis. The two teams Black Robins and Brown Kiwis will now include the new class participants Jill ROBERTSON and Jason SMITH. The team Wax Eyes will be added, along with its team members, Kael BRIDGES, Jonathan CHANG, and Kyleigh COHEN.
Note that any survey responses already generated by Kamryn Miller for team Black Robins will be irretrievably deleted following synchronisation. Similarly, Estrella Hawkins’ responses will be irretrievably deleted.
Proposed team arrangement before Synchronise All:
Black Robins
Kamryn MILLER (-), Alexander SAMPSON, Mikaela RAY, Ramon MCKNIGHT, Jill ROBERTSON (+)
Brown Kiwis
Estrella HAWKINS (-), August DAUGHERTY, Nehemiah MCCONNELL, Kamryn MILLER (+), Jason SMITH (+)
Grey Warblers
Joslyn HOOVER, Alyvia YANG, Mariyah POLLARD, Arianna SCHROEDER
Pukekos
Dorian SULLIVAN, Elisha NUNEZ, Muhammad HOLT, Skylar MCCLURE
Red Rooks < Red Ruru
Alberto UNDERWOOD, Annika KLINE, June MCKINNEY, Jaylee MURRAY
Waxeyes (+)
Kael BRIDGES (+), Jonathan CHANG (+), Kyleigh COHEN (+)
Team arrangement after Synchronise All:
Black Robins
Alexander SAMPSON, Mikaela RAY, Ramon MCKNIGHT, Jill ROBERTSON
Brown Kiwis
August DAUGHERTY, Nehemiah MCCONNELL, Kamryn MILLER, Jason SMITH
Grey Warblers
Joslyn HOOVER, Alyvia YANG, Mariyah POLLARD, Arianna SCHROEDER
Pukekos
Dorian SULLIVAN, Elisha NUNEZ, Muhammad HOLT, Skylar MCCLURE
Red Rooks
Alberto UNDERWOOD, Annika KLINE, June MCKINNEY, Jaylee MURRAY
Waxeyes
Kael BRIDGES, Jonathan CHANG, Kyleigh COHEN
The founding members of teams Black Robins and Brown Kiwis will be alerted by an automated notification from Peer Assess Pro that they must submit an updated survey response for their reconfigured teams, as will the relocated Kamryn MILLER. Notification 0013: RESUBMIT peer assessment due to TEAM CHANGE.
The newly enrolled class members in the new team Waxeyes (Kael BRIDGES, Jonathan CHANG, and Kyleigh COHEN) will be advised to submit the peer assessment survey. Notification 0011: Request to COMPLETE peer assessment.
Note that if a founding participant has not yet submitted, they will receive an updated notification to complete the peer assessment, updated to include the new team arrangement, Notification 0011: Request to COMPLETE peer assessment.
There will be no impact on the survey responses for the participants in teams Grey Warblers and Pukekos.
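For readers who want to see the mechanics, the following is a minimal sketch, in Python, of the set comparison that produces the (+) and (-) markers shown above. The data is a fragment of the example; this is an illustration, not Peer Assess Pro’s actual implementation.

current = {
    "Black Robins": {"Kamryn MILLER", "Alexander SAMPSON",
                     "Mikaela RAY", "Ramon MCKNIGHT"},
    "Brown Kiwis": {"Estrella HAWKINS", "August DAUGHERTY",
                    "Nehemiah MCCONNELL"},
}
proposed = {
    "Black Robins": {"Alexander SAMPSON", "Mikaela RAY",
                     "Ramon MCKNIGHT", "Jill ROBERTSON"},
    "Brown Kiwis": {"August DAUGHERTY", "Nehemiah MCCONNELL",
                    "Kamryn MILLER", "Jason SMITH"},
    "Waxeyes": {"Kael BRIDGES", "Jonathan CHANG", "Kyleigh COHEN"},
}

for team in sorted(set(current) | set(proposed)):
    cur, new = current.get(team, set()), proposed.get(team, set())
    # Mark the team itself as added when it is new to the activity.
    print(team + (" (+)" if team not in current else ""))
    members = (sorted(cur & new)                            # unchanged
               + [p + " (+)" for p in sorted(new - cur)]    # to be added
               + [p + " (-)" for p in sorted(cur - new)])   # to be dropped
    print(", ".join(members))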
An orphan team is a team that has not been assigned to a teamset (grouping, group set).
Teams that are to be added to the activity must be added to the same teamset (grouping, group set) that was used to create and launch the peer assessment activity. Furthermore, ALL the teams available in the teamset must have been selected for the initial launch.
If the Peer Assess Pro activity was launched using orphan teams, then no additional teams can be added. In this case, participants can only be added to or deleted from teams used to create the initial activity.
A Synchronise All action will be rejected for several reasons when the team arrangement is not feasible.
Before launching a Peer Assess Pro activity always
Quick links and related information
FAQ - What is an adjusted team arrangement request? WARNING 0006
FAQ - What is an inactive team member? WARNING 0048
FAQ - How do I resolve an unsynchronised team arrangement? ACTIVE WARNING 0021
Groups—MoodleDocs. https://docs.moodle.org/400/en/Groups
Grouping users—MoodleDocs. https://docs.moodle.org/400/en/Grouping_users
Groupings—MoodleDocs. https://docs.moodle.org/400/en/Groupings
Groups versus groupings—MoodleDocs. https://docs.moodle.org/19/en/Groups_versus_groupings
Context: Peer Assess Pro running on Xorro
Ensure you read ALL of this FAQ before proceeding.
If you make a mistake in this process the consequence may be unrecoverable loss of the survey responses received to date.
Select the ‘Team Composition’ button for the running Peer Assessment Activity for which you wish to adjust the team composition.
During the re-import, the changes to the teamset will be presented to you so that you can check and confirm the adjustment process. Take care!
Upon completion of the re-import process, the running Peer Assess Pro Activity will continue.
All students in teams affected by a change in composition are now required to resubmit their peer assessment responses. Reason: They now have different team members to rate. The remaining teams of the class will be unaffected. Their responses remain submitted and evident within Peer Assess Pro.
Affected team members will be notified of their need to re-submit by an automatically generated email from Peer Assess Pro.
You cannot change the participants in the Xorro Group used to create the running activity, as explained in the FAQ:
FAQ: How do I correct the participants (team members) in a group already uploaded to Xorro?
Reason: whenever a Xorro activity is created a snapshot is taken of the Group used to create the activity. From that moment this snapshot, known as a Xorro Teamset, is inextricably connected with the activity. That activity-specific teamset can be updated only during a running activity through the FAQ detailed above, through the Team Composition section of the Peer Assess Pro dashboard.
In the image above, the Group used to create the peer assessment activity is BT101. Any changes made to that Group WILL NOT affect the running activity. The teamset created from the Group BT101 is denoted 2019-02-24 BT101 by Beta Beta. The name indicates the date the teamset was created, the Group from which it was created, and by whom.
You can add, swap, or delete team members at any time before launching the activity, and at any time before the peer assessment activity is finalised.
Good Practice Hint. Get your team composition list absolutely correct before the activity is launched and made available for response by your students. Reason: All students in teams affected by a change in composition will be required to resubmit their peer assessment responses. The students now have different team members to rate. However, the remaining teams of the class will be unaffected.
Quick links and related information
FAQ: How do I correct the Team Composition in a running peer assessment activity?
Confirm that the Participants CSV file you created is in the correct format. Open the Participants CSV using a text editor (Apple Textedit, or Microsoft Windows Notepad).
The format should appear as illustrated below. Note that
id,first,last,group_code,team,email,
BOWI12,Bob,Wilson,123.101,Tiger,Bob.Wilson@xorroinstitution.com,
ALJO11,Alice,Jones,123.101,Panda,Alice.Jones@xorroinstitution.com,
JOSM13,John,Smith,123.101,Tiger,John.Smith@xorroinstitution.com,
JOSM13,John,Smith,123.202,,John.Smith@xorroinstitution.com,
GRGR15,Greta,Green,123.101,Panda,Greta.Green@xorroinstitution.com,
GRGR15,Greta,Green,123.204,,,
HEJO19,Henry,Jones,123.101,Tiger,Henry.Jones@xorroinstitution.com,
AMTO01,Amanda,Tolley,123.101,Bear,Amanda.Tolley@xorroinstitution.com,
JEWA06,Jeff,Wang,123.101,Panda,Jeff.Wang@xorroinstitution.com,
HOBR03,Holly,Brown,123.101,Bear,Holly.Brown@xorroinstitution.com,
HOBR03,Holly,Brown,123.202,,Holly.Brown@xorroinstitution.com,
THWI18,Thomas,Windsor,123.101,Tiger,Thomas.Windsor@xorroinstitution.com,
ANWO08,Anna,Worth,123.101,Bear,Anna.Worth@xorroinstitution.com,
ANWO08,Anna,Worth,123.202,,Anna.Worth@xorroinstitution.com,
ANWO08,Anna,Worth,123.204,,,
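Before importing, it can save time to check the file programmatically. The following is a minimal sketch using only the Python 3 standard library; the required headers come from the example above, while the filename and the specific checks are illustrative assumptions.

import csv

REQUIRED = {"id", "first", "last", "group_code", "team", "email"}

def check_participants_csv(path):
    with open(path, newline="") as f:
        reader = csv.DictReader(f)
        headers = {h.strip() for h in (reader.fieldnames or [])}
        missing = REQUIRED - headers
        if missing:
            raise ValueError(f"Missing column headers: {sorted(missing)}")
        for n, row in enumerate(reader, start=2):
            if not all((row.get(c) or "").strip() for c in ("id", "first", "last")):
                print(f"Row {n}: id, first, and last are required")
            # A peer assessment activity needs a team (and an email) for every
            # member; other Xorro activities may legitimately leave these blank.
            if not (row.get("team") or "").strip():
                print(f"Row {n}: no team for {row.get('id')} in group {row.get('group_code')}")

check_participants_csv("participants.csv")  # illustrative filename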
Quick links and related information
FAQ - I’m having problems importing my participants csv
A Beta Test demonstration site has been established with these credentials:
Browse to: https://qf.staging.xorro.com/
Enter: Username BetaTest, Password Secret
This Beta Test User is established for you to view. But don’t touch too hard!
View
WARNING! HERE THERE BE DRAGONS!!
If a peer assessment activity is launched and running then you cannot update team membership details in that running activity using the procedure described here! Instead, apply the procedure described in FAQ: How do I correct the Team Composition in a running peer assessment activity?
Reason: whenever a Xorro activity is created a snapshot is taken of the Group used when creating the activity. From that moment this snapshot, known as a Xorro Teamset, is inextricably connected with the activity. That activity-specific teamset can only be updated during a running activity through the following FAQ.
Quick links and related information
FAQ: How do I correct the Team Composition in a running peer assessment activity?
Quickstart Guide for Peer Assess Pro: Xorro. (2019, March 6). Peer Assess Pro. http://tinyurl.com/pdfQuickWeb
Pdf version: http://tinyurl.com/pdfQuick
Login and orientation. (2019). Auckland: Peer Assess Pro.
Launch a Peer Assess Pro Activity. (2019). Auckland: Peer Assess Pro.
Student survey experience. (2019). Auckland: Peer Assess Pro.
Peer Assess Pro. (2019, March 5). Manage a Peer Assessment Activity using Xorro: Reference Guide for Teachers. Auckland: Peer Assess Pro
Web version http://tinyurl.com/papRefWeb2
Pdf version http://tinyurl.com/papRefPdf
Feel welcome to make suggestions or ask questions using the Comment feature of the Google Docs development version. Shows work in progress improvements.
Frequently Asked Questions (FAQs) (2019). In Manage a Peer Assessment Activity using Xorro: Reference Guide for Teachers [web]. Auckland, New Zealand: Peer Assess Pro. http://tinyurl.com/papFAQ
Peer Assess Pro. (2019). Xorro Peer Assess Pro™ Teachers Process Flowchart: Overview and Detail. http://tinyurl.com/papChart
Quick links and related information
The choice of calculation method for determining a team member’s personal result is determined by the teacher's preference for compensating more strongly team members who have contributed significantly to their teams, and under-rewarding team members who are peer assessed as weak contributors. The figure illustrates the statistical features, such as team average, range, and standard deviation, associated with each method.
Alternative calculation methods for Personal Result (PR) illustrating effect on team average and spread for a given Team Result
The teacher can select either the Peer Assessed Score (PA Score) or Peer Assessed Index (PA Index) if they wish to exclude a team result in calculating the Personal Result (PR).
More usually, the Peer Assessed Score and Team Result (TR) are combined mathematically to produce a Personal Result. There are three alternative methods. As the figure illustrates, the Indexed Personal Result (IPR) is the least discriminating method, whilst the Rank-Based Personal Result (RPR) is the most discriminating in terms of favouring significant team contributors and penalising weak contributors. Most teachers select the Normalised Personal Result, often with a spread factor of 1.5 to 2.0.
In contrast to the graphical illustration earlier, the following table summarises the example calculations presented through a series of FAQs that give the mathematical definition and example calculations for each method.
Comparison of Personal Results calculated by several methods in a team of four members
MEASURE | Bridget | Julian | Lydia | Nigella | Mean | Range |
Rank Reversed | 1 | 2 | 4 | 3 | ||
Peer Assessed Score, PA Score | 54 | 74 | 82 | 78 | 72 | 28 |
Peer Assessed Index, PA Index | 66 | 90 | 100 | 95 | 88 | 34 |
Team Result, TR | 50 | 50 | 50 | 50 | 50 | 0 |
Indexed Personal Result, IPR | 33 | 45 | 50 | 48 | 44 | 17 |
Normalised Personal Result, NPR (SpreadFactor = 1) | 39 | 51 | 56 | 54 | 50 | 17 |
Normalised Personal Result, NPR (Spreadfactor = 2) | 28 | 52 | 62 | 58 | 50 | 34 |
Rank-Based Personal Result, RPR | 20 | 40 | 80 | 60 | 50 | 60 |
Source: FAQ: How are peer assessment and personal results calculated and defined mathematically?
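The guide does not restate the NPR and RPR formulas beside this table, so the following Python sketch infers them from the worked example: it assumes NPR = TR + SpreadFactor × (IPR − mean IPR) and RPR = TR × reversed rank / mean reversed rank. It reproduces the table’s values to within rounding of the intermediate PA Index figures; treat it as an illustration, not the definitive algorithm.

TR = 50                                                   # Team Result
pa_index = {"Bridget": 66, "Julian": 90, "Lydia": 100, "Nigella": 95}
rank_reversed = {"Bridget": 1, "Julian": 2, "Lydia": 4, "Nigella": 3}

ipr = {s: TR * i / 100 for s, i in pa_index.items()}      # Indexed Personal Result
mean_ipr = sum(ipr.values()) / len(ipr)

def npr(spread_factor):                                   # Normalised Personal Result
    return {s: TR + spread_factor * (v - mean_ipr) for s, v in ipr.items()}

mean_rank = sum(rank_reversed.values()) / len(rank_reversed)
rpr = {s: TR * r / mean_rank for s, r in rank_reversed.items()}

for label, values in [("IPR", ipr), ("NPR, SF=1", npr(1.0)),
                      ("NPR, SF=2", npr(2.0)), ("RPR", rpr)]:
    print(label, {s: round(v) for s, v in values.items()})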
Definitions and features of calculation methods used in Peer Assess Pro
Attribute | Abbreviation | Definition |
Peer Assessed Score | PA Score | A relative measure of the degree to which a team member has contributed to their team's overall achievement, team processes, and leadership. The Peer Assessed Score (PA Score) is calculated for each team member directly from their Average Team Contribution (ATC) and Average Leadership Contribution (ALC). That is, from the ten components of Team and Leadership contribution survey in the peer assessment. A Peer Assessed score is generally used to compare the relative contribution of students WITHIN the same team, rather than BETWEEN teams. The Team Result has NO impact on the value of the Peer Assessed Score. Values for the PA Score range from zero through 100. |
Peer Assessed Index | PA Index | The Peer Assessed Score (PA Score) is indexed upwards so that the person in the team with the highest Peer Assessed Score is awarded a Peer Assessed Index of 100. All other team members receive a proportionally lower PA Index in the ratio PA Score / max(PA Score). The Team Result has NO impact on the value of the Peer Assessed Index. |
Team Result | TR | The result awarded to the team for the outputs of their work. The teacher typically derives the Team Result (TR) from grades for team reports, presentations, and results of Team Readiness Assurance Tests. The teacher may select to combine a student's Peer Assessed Index (PA Index) with their team's Team Result (TR) to calculate a Personal Result (PR) for each student, reflecting their relative contribution to the Team Result as assessed by their peer team members. Peer Assess Pro enables the teacher to select from several methods to combine the Team Result and Peer Assessed Index (PA Index) to produce a Personal Result: the Indexed Personal Result (IPR), the Normalised Personal Result (NPR), and the Rank Based Personal Result (RPR). |
Measures of a student's personal result | ||
Personal Result | PR | A student's personal result gained from combining their Peer Assessed Index (PA Index) and, optionally, their Team Result (TR). The teacher selects from one of several Calculation Methods to calculate the Personal Result that incorporates the Team Result. These methods are Indexed Personal Result (IPR), Normalised Personal Result (NPR), and Rank-Based Personal Result (RPR). The choice of method is determined by the teacher's preference for compensating more strongly students who have contributed significantly to their teams, and under-rewarding students who are peer assessed as weak contributors. Figure 1 illustrates the statistical features, such as team average, range, and standard deviation, associated with each method. The IPR is the least discriminating method, whilst the RPR is the most discriminating in terms of favouring significant team contributors and penalising weak contributors, as the figure illustrates. |
Indexed Personal Result | IPR | The Indexed Personal Result is calculated from the Team Result (TR) combined with the student's specific Peer Assessed Index (PA Index). The Indexed Personal Result method awards the Team Result to the TOP RATED student in the team, since, by definition, their Peer Assessed Index is 100. All remaining students in the same team earn the Team Result downwards, directly proportional to their PA Index. The Indexed Personal Result calculation means that NO team member can earn an Indexed Personal Result greater than the Team Result. That is, values for the Indexed Personal Result range from zero up to the Team Result. |
Normalised Personal Result | NPR | The Normalised Personal Result is calculated from the Team Result combined with the student's specific Indexed Personal Result (IPR). However, in contrast to the IPR method, the Normalised Personal Result method awards the AVERAGE student in the team the Team Result (TR). All remaining students are awarded a Personal Result ABOVE or BELOW the Team Result depending on whether their IPR is above or below that team's average. Features of the Normalised Personal Result are that (a) In contrast to the IPR method, the Normalised Personal Result method calculates a Personal Result ABOVE the Team Result for the above-average peer-rated students in the team (b) The average of the team's Normalised Personal Results matches the Team Result (c) The spread of the team's Normalised Personal Results matches the spread of the Indexed Personal Results (IPR) calculated for that team, where spread is measured by the standard deviation statistic. Optional feature: To enhance the effect of rewarding high contributors and penalising weak contributors the tutor can increase the Spread Factor (SF) from the default value of 1.0. Increasing the Spread Factor increases the spread of the results centred around the Team Result. However, an increase in the Spread Factor will maintain a team average NPR that matches that team's Team Result. A Spread Factor of 1.5 to 2.0 is recommended, especially in classes where team members are reluctant to penalise weak contributors and/or reward the highest contributors through their peer assessment rating responses. Values for the NPR range from zero to 100. Calculations that exceed this range are clipped to fit within zero to 100. |
Rank Based Personal Result | RPR | The Rank Based Personal Result is calculated from the Team Result combined with the student's specific Rank Within Team based on that student's Peer Assessed Score. Like the Normalised Personal Result, the RPR method awards the AVERAGE student in the team the Team Result. All remaining students are awarded a personal result above or below the Team Result depending on whether their Rank Within Team is above or below that team's middle-ranked student. Features of the Rank Based Personal Result (RPR) calculation method are that (a) A team's RPR values are spread over a MUCH WIDER range than the NPR and IPR methods. Small differences in PA Scores within a team are amplified significantly by this method (b) In contrast to the IPR method, the RPR method calculates a Personal Result significantly ABOVE the Team Result for the top-ranked student in the team (c) Like the NPR method, the average of the team's RPR values matches the Team Result. Values for the Rank Based Personal Result range from zero to 100. Calculations that exceed this range are clipped to fit within the range zero to 100. |
Note that in the Xorro version of Peer Assess Pro, we have renamed the following terms from those used in the Google Docs version of Peer Assess Pro.
Renaming of terms for Peer Assess Pro
Peer Assess Pro | Abbreviation | Google Peer Assess Pro | Abbreviation |
Peer Assessed Score | PA Score | Team Based Learning Score | TBL Score |
Peer Assessed Index | PA Index | Team Based Learning Index | TBL Index |
Quick links and related information
FAQ: How are peer assessment and personal results calculated and defined mathematically?
A teacher has several alternative calculation methods to determine a personal result from a team member’s Peer Assess Pro assessment. The teacher will usually advise team members about the method they have chosen.
The teacher’s choice of calculation method for a personal result is determined by the teacher's preference for
These choices are illustrated in this figure.
A student’s Personal Result emerges from the Teacher’s choice of Calculation Method, relative Peer Assessed Score, and Team Result
The teacher should select either the Peer Assessed Score (PA Score) or Peer Assessed Index (PA Index) if they wish to exclude the team result in calculating the personal result. Alternatively, set the team result, TR, equal for all teams and use either the IPR, NPR, or RPR methods.
Quick links and related information
FAQ: How is the Peer Assessed (PA) Score calculated?
FAQ: How is the Peer Assessed Index (PA Index) calculated?
More usually, the Peer Assessed Score (PA Score) and team result are combined through one of three methods. The following methods are listed in order of increasing impact for compensating more strongly students who have contributed significantly to their teams, and under-rewarding students who are peer assessed as weak contributors
FAQ: How is the Indexed Personal Result (IPR) calculated?
FAQ: How is the Normalised Personal Result (NPR) calculated?
FAQ: How is the Rank Based Personal Result (RPR) calculated?
Quick links and related information
FAQ: What factors are measured in the peer assessment survey?
FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?
The Teachers Process Flowchart: Detail illustrates the points throughout the peer assessment process where emails are sent to students to advise them
In most cases, the emails are generated automatically by the Peer Assess Pro system. In the case of warnings, the teacher has the option of initiating an email request to a student, or ignoring that warning.
Copies of all emails are sent to the teacher whose Xorro account was used to launch the activity
When you create and launch a Peer Assess Pro™ Peer Assessment activity in Xorro AND the Start Date has been reached:
Alternatively, the teacher can direct students to the Participant URL shown at the top left of the Xorro HOME page. The student must then select from a list the correct peer assessment activity for their response. The teacher may deliver other Xorro-based test activities from which the student must select the correct Peer Assess Pro™ activity distinguished by the Activity Title specified by the teacher.
FAQ: How do I view and experience what the students experience?
FAQ: What questions are asked in the peer assessment survey?
Click on the name of a student, and you will view the feedback report available for the student.
There are four possible views.
The student’s views are anonymised.
The student will see this view when all of the following are TRUE:
Note that students can continue to submit responses AFTER the Due Date UNTIL the teacher has Finalised the activity.
The student will be able to see their Personal Results when all the following are true:
A student with a Xorro Plus account may view their results any time after the Activity is Finalised by the Teacher.
The student views
Example results for a student
Xxx TO DO xxx
Quick links and related information
FAQ: What questions are asked in the peer assessment survey?
FAQ: How is the Peer Assessed (PA) Score calculated?
The following terms used in the Google version of Peer Assess Pro have been renamed in the Xorro version of Peer Assess Pro.
Renaming of terms for Peer Assess Pro
Peer Assess Pro | Abbreviation | Google Peer Assess Pro | Abbreviation |
Peer Assessed Score | PA Score | Team Based Learning Score | TBL Score |
Peer Assessed Index | PA Index | Team Based Learning Index | TBL Index |
Quick links and related information
FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?
In general, see the Sections
Warning messages are under constant development and refinement as we respond to facilitators’ and team members’ experience of Peer Assess Pro.
These warnings must be resolved, otherwise utterly invalid results will arise, and students’ time will be wasted completing incorrect surveys.
Example: The composition of a team needs adjusting. See Adjusting team composition.
Peer Assess Pro will not be able to present results for all teams unless these warnings are resolved.
Example: Insufficient responses from a team are received
See Results hidden from team members and teacher
Example: Enter Team Results:
Advisory warnings do not critically affect the operation of Peer Assess Pro. However, the teacher would be prudent to review the details to ensure that peer assessments have been conducted fairly and honestly.
Example: Overgenerous or parsimonious ratings by a team member.
FAQ: How is an outlier peer assessment rating identified? WARNING 0042
Several warnings give the facilitator the option to despatch an email to students advising them of exceptional conditions and requesting their action. For example
The criteria used to generate these warnings, and the recommended response by the facilitator, are detailed in this section:
For example, in the case of a Mismatched self-assessment, the team member is invited to meet with the teacher to explore the reasons for the mismatch, and develop approaches to narrow the gap.
Quick links and related information
FAQ: What is the content of emails sent by Peer Assess Pro?
You select Refresh and Update results when
The most important reason is that you as a teacher MUST be able to review results BEFORE displaying (publishing) results to students. After examining the results to date, you might publish an interim snapshot of the results for view by students.
Students may review the interim results and raise issues such as a questionable peer assessment rating, such as scapegoating. Alternatively, you may need to adjust a Team Result, or experiment with another method of Personal Result Calculation.
In this situation, we have presumed you do not want new responses, nor adjustments to be immediately viewable by students. In particular, you need the opportunity to review the effect of adjustments before explicitly publishing revised results to students.
Quick links and related information
The Peer Assess Pro survey measures one overall assessment, Recommendation, followed by ten quantitative ratings, then several qualitative questions.
The ten quantitative ratings are used to calculate the Peer Assessment Score (PA Score). The ten ratings are categorized into two classes: Contribution to Task, and Contribution to Leadership and Teamwork, as shown in the example survey below.
In addition, two qualitative questions are asked that request examples of behaviours supporting the quantitative ratings in relation to Contribution to Task, and Contribution to Leadership and Teamwork. Finally, the assessor is asked to provide Development Feedback. That is, advice that would help the team member improve their future contribution to the team.
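As a concrete illustration of the survey’s structure, one assessor’s submission for one teammate could be represented as below. This Python sketch is purely illustrative; the field names are assumptions, not Peer Assess Pro’s actual schema.

from dataclasses import dataclass
from typing import Dict

TASK = ["Initiative", "Attendance", "Contribution", "Professionalism",
        "Ideas and learning"]
LEADERSHIP = ["Focus and task allocation", "Encourages contribution",
              "Listens and welcomes", "Conflict management and harmony",
              "Chairmanship"]

@dataclass
class PeerAssessment:
    assessor: str
    assessee: str
    recommendation: int             # overall: 1 (highly unlikely) .. 5 (extremely likely)
    ratings: Dict[str, int]         # the ten quantitative ratings, each 1..5
    task_examples: str = ""         # qualitative: Contribution to Task
    leadership_examples: str = ""   # qualitative: Leadership and Team Processes
    development_feedback: str = ""  # qualitative: advice for improvement

    def validate(self):
        assert set(self.ratings) == set(TASK + LEADERSHIP)
        assert all(1 <= v <= 5 for v in self.ratings.values())
        assert 1 <= self.recommendation <= 5

response = PeerAssessment("Julian", "Bridget", recommendation=4,
                          ratings={**{c: 5 for c in TASK},
                                   **{c: 4 for c in LEADERSHIP}})
response.validate()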
Quick links and related information
FAQ - What is the benefit of a standardized peer assessment rubric?
FAQ: How is the Peer Assessed (PA) Score calculated?
FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?
The ten questions used as the basis for calculating the Peer Assessment Score are adapted from:
Deacon Carr, S., Herman, E. D., Keldsen, S. Z., Miller, J. G., & Wakefield, P. A. (2005). Peer feedback. In The Team Learning Assistant Workbook. New York: McGraw Hill Irwin.
My name is: | I am rating my team member: |
My Team name is: | Team Member A | Team Member B | Team Member C | Self |
Recommendation | How likely is it that you would recommend this team member to a friend, colleague or employee? 1 = Highly unlikely, 5 = Extremely likely | ||||
Contribution to Task Accomplishment | |||||
Rate the team member on a 5-point scale. Rating scale: 1 = Almost never, 2 = Seldom, 3 = Average, 4 = Better than most, 5 = Outstanding Rate your typical or average team member a mid-level rating of 3. | |||||
Initiative | Shows initiative by doing research and analysis. Takes on relevant tasks with little prompting or suggestion. | ||||
Attendance | Prepares for, and attends scheduled team and class meetings. | ||||
Contribution | Makes positive contributions to meetings. Helps the team achieve its objectives. | ||||
Professionalism | Reliably fulfils assigned tasks. Work is of professional quality. | ||||
Ideas and learning | Contributes ideas to the team's analysis. Helps my learning of course and team project concepts. | ||||
Contribution to Leadership and Team Processes | |||||
Focus and task allocation | Keeps team focused on priorities. Facilitates goal setting, problem solving, and task allocation to team members. | ||||
Encourages contribution | Supports, coaches, or encourages all team members to contribute productively. | ||||
Listens and welcomes | Listens carefully and welcomes the contributions of others. | ||||
Conflict management and harmony | Manages conflict effectively. Helps the team work in a harmonious manner. | ||||
Chairmanship | Demonstrates effective leadership for the team. Chairs meetings productively. |
Peer Assessment Survey: Feedback to the team member. Submit one copy of this form for each team member. |
My name is: |
I am a member of Team Number and Name: |
I am assessing (student’s name): |
Contribution to Task Accomplishment For the team member you have assessed, provide specific examples of productive or ineffective behaviours related to your ratings of Contribution to Task Accomplishment. For example, shows initiative; attends meetings; makes positive contributions; helps team achieve objectives; is reliable; contributes quality work; contributes to learning of course concepts. Further examples here http://tinyurl.com/BARSOhland |
Contribution to Leadership and Team Processes For the team member you have assessed, provide specific examples of productive or ineffective behaviours related to your ratings of Contribution to Leadership and Team Processes. For example: keeps team focused on priorities; supports, coaches and encourages team members; listens carefully; manages conflict effectively; demonstrates effective leadership. |
Development feedback What specific behaviours or attitudes would help your team member contribute more effectively towards your team's accomplishments, leadership, and processes? Please provide specific positive or constructive feedback that could enable the team member to improve their behaviour productively. Considering your team member's strengths, how could that person coach other team members to acquire similar strengths for Task Accomplishment, Team Processes, and Leadership? |
Source: Peer Assess Pro (2019). |
Survey responses from Team Members are received and available for incorporation into the Peer Assessment activity UNTIL you explicitly Finalise the Survey. Specifically, responses submitted by students after the Due Date announced at the launch of the Activity will be available for incorporation until the survey is deliberately Finalised by the Teacher. Until you Finalise, you can request a student to reconsider their responses. Students can optionally resubmit their responses until you Finalise.
When you FINALISE, the current state of the Live View of Gradebook results will be Published to students for their view.
Recommended steps prior to FINALISE: LMS platform
Recommended steps prior to FINALISE: Xorro platform
From the Peer Assess Pro Teacher’s Dashboard, select either
Example Gradebook Summary Statistics
Example Gradebook Full Statistics
Quick links and related information
FAQ - What is an inactive team member? WARNING 0048
The Peer Assessed Score, PA Score, is a relative measure of the degree to which a team member has contributed to their team's overall achievement, team processes, and leadership.
A Peer Assessed Score is generally used to compare the relative contribution of students WITHIN the same team, rather than BETWEEN teams. The Team Result has NO impact on the value of the Peer Assessed Score.
The PA Score is calculated for each team member directly from summing the ten ratings of Team and Leadership Contribution surveyed in the peer assessment. The sum of ratings is adjusted by scale factors to give values for the PA Score that range from zero through 100.
The Peer Assessed Score is an essential factor used as the basis for calculating several alternative measures of Personal Result including the Peer Assessed Index (PA Index), Indexed Personal Result (IPR), Normalised Personal Result (NPR), and Rank Based Personal Result (RPR).
The self-assessment conducted by a team member is EXCLUDED from the calculation of their Peer Assessed Score. The self-assessment, PA (self), is used to enable the student to compare their self-perception with that of their team members, and the class as a whole. One method of comparison, the IRSA, is based on the ratio 100 × PA Score / PA (self), as detailed in the FAQ:
FAQ: How is the Index of Realistic Self Assessment (IRSA) calculated?
There are ten Peer Rating components awarded by each Assessor, a, to each Assessee, s, in the team of t members. The mathematical task is to combine all these ratings into one Peer Assessed Score for each team member.
The Peer Assessed SubScore is defined as the peer assessment score awarded by Assessor a to Assessee s:

$\mathrm{PASubScore}_{a,s} = 2.5 \sum_{r=1}^{10} R_{a,s,r} - 25$

where

$R_{a,s,r}$ = the Peer Rating for each of the ten peer assessment components, r, submitted by the Assessor a for the assessed team member, the Assessee, s. The student’s self-assessment is excluded from the calculation of the PA Score, as is the Recommendation rating.
To ensure the PA Score ranges from zero through 100, the following features are required in the above formula:
The Peer Assessed Score for team member s is the mean of the PA SubScores awarded by the other (t − 1) team members to team member s:

$\mathrm{PAScore}_{s} = \dfrac{1}{t-1} \sum_{a \neq s} \mathrm{PASubScore}_{a,s}$

where

t = the number of team members in the team in which s is a team member
$\mathrm{PASubScore}_{a,s}$ = the peer assessment score awarded by Assessor a to Assessee s, mathematically defined earlier.
Note that Peer Assessed Score takes NO account of the team’s Team Result. The Team Result is accounted for in the Indexed Personal Result (IPR), Normalised Personal Result (NPR) and Rank-Based Personal Result (RPR) methods discussed elsewhere.
An example calculation is shown below. In the first table, the team member Bridget (ASSESSEE) is rated by her three team members (ASSESSORS), plus her own self-rating. The subsequent tables show the calculation of the Peer Assessment Score for all four team members based on all team members’ assessment ratings. The long-form calculations show the arithmetic in detail.
Quick links and related information
FAQ: What questions are asked in the peer assessment survey?
Alternative but equivalent methods for calculating the Peer Assessed Score are detailed below in the section:
Alternative mathematical formulations of PA Score
Example table of assessments for assessed team member Bridget
ASSESSEE: Bridget | Team Name: Kubla |
ASSESSOR: Ratings by team member: | Bridget (Self) | Julian | Lydia | Nigella | Mean Rating |
Contribution to Task Accomplishment | ||||||
Rating scale: 1 = Almost never, 2 = Seldom, 3 = Average, 4 = Better than most, 5 = Outstanding | ||||||
Initiative | Shows initiative by doing research and analysis. Takes on relevant tasks with little prompting or suggestion. | 5 | 5 | 3 | 1 | 9/3 |
Attendance | Prepares for, and attends scheduled team and class meetings. | 4 | 4 | 4 | 1 | 9/3 |
Contribution | Makes positive contributions to meetings. Helps the team achieve its objectives. | 4 | 5 | 5 | 1 | 11/3 |
Professionalism | Reliably fulfils assigned tasks. Work is of professional quality. | 4 | 3 | 4 | 1 | 8/3 |
Ideas and learning | Contributes ideas to the team's analysis. Helps my learning of course and team project concepts. | 5 | 5 | 5 | 1 | 11/3 |
Contribution to Leadership and Team Processes | ||||||
Focus and task allocation | Keeps team focused on priorities. Facilitates goal setting, problem solving, and task allocation to team members. | 5 | 5 | 3 | 1 | 9/3 |
Encourages contribution | Supports, coaches, or encourages all team members to contribute productively. | 4 | 4 | 4 | 1 | 9/3 |
Listens and welcomes | Listens carefully and welcomes the contributions of others. | 5 | 5 | 3 | 1 | 9/3 |
Conflict management and harmony | Manages conflict effectively. Helps the team work in a harmonious manner. | 4 | 4 | 4 | 1 | 9/3 |
Chairmanship | Demonstrates effective leadership for the team. Chairs meetings productively. | 5 | 5 | 5 | 1 | 11/3 |
SubTotal | SubTotal = Task + Leadership | 45 | 45 | 40 | 10 | # 95/30 (≈ 3.167) |
Peer Assessed Score | PA Score = (2.5 x SubTotal ) - 25 | * 87.5 | 87.5 | 75 | 0 | 54.2 |
* The self-assessment ratings are excluded from calculation of the PA Score. So, 54.2 = (87.5 + 75 + 0) / 3 # Alternatively, PA Score = (25 x Mean Rating) - 25. So, 54.2 = 25 x 95/30 - 25 = (25 x 3.167) - 25 |
Suppose that the Peer Assessed Scores determined from all four team members rating each other appear as follows. Bridget’s PA Scores are copied from the previous table, forming the second vertical column here.
Since $\mathrm{PASubScore}_{a,s} = 2.5\,\mathrm{SubTotal}_{a,s} - 25$, each assessor’s SubTotal of ratings converts directly to a PA SubScore.
Now consider the assessment by Lydia of Bridget: $\mathrm{PASubScore} = 2.5 \times 40 - 25 = 75$.
In the previous table, note how Nigella rated Bridget with the minimum possible rating of one for all ten components. By definition, that gives a PA Score of zero. Similarly, if an assessor had rated a team member the maximum rating of five across all ten components, then a PA Score of 100 would have resulted.
Peer Assessed Sub-Scores for a team of four members
ASSESSEE | ||||
ASSESSOR | Bridget | Julian | Lydia | Nigella |
Bridget | 87.5 | 62.5 | 75 | 72.5 |
Julian | 87.5 | 92.5 | 87.5 | 82.5 |
Lydia | 75 | 82.5 | 77.5 | 80 |
Nigella | 0 | 77.5 | 82.5 | 82.5 |
Now the PA Score for each ASSESSEE team member is calculated from the mean of the PA SubScores provided by the other ASSESSORS in their team, as shown in the following table. The self-assessments of each ASSESSOR are excluded from the calculation. For example, the PA Score for Nigella is determined as follows from the ratings by her three teammates Bridget, Julian and Lydia:
Since

$\mathrm{PAScore}_{s} = \dfrac{1}{t-1} \sum_{a \neq s} \mathrm{PASubScore}_{a,s}$

then for Nigella:

$\mathrm{PAScore} = (72.5 + 82.5 + 80) / 3 = 78.3$
Calculation of Peer Assessed (PA) Scores for a team of four members
ASSESSEE | ||||
ASSESSOR | Bridget | Julian | Lydia | Nigella |
Bridget | - | 62.5 | 75 | 72.5 |
Julian | 87.5 | - | 87.5 | 82.5 |
Lydia | 75 | 82.5 | - | 80 |
Nigella | 0 | 77.5 | 82.5 | - |
Peer Assessed Score | 54.2 | 74.2 | 81.7 | 78.3 |
Note how Nigella’s rating of Bridget (PA SubScore = 0) seems an outlier when compared with the much higher ratings given by Julian and Lydia (87.5 and 75). Peer Assess Pro warns the teacher when outlier ratings like this occur.
This outlier issue is discussed in
FAQ: How is an outlier peer assessment rating identified? WARNING 0042
The following equations provide identical mathematical results for the calculation of the PA Score.

$\mathrm{PAScore}_{s} = \dfrac{100}{4}\left(\mathrm{AverageRating}_{s} - 1\right)$

where:

$\mathrm{AverageRating}_{s}$ is the rating of an assessed student s averaged over all ten components of the rating for that student, across their team members. The Average Rating lies between 1 and 5.
The factor (−1) adjusts the Average Rating value to zero through 4. The scale factor 100/4 (= 25) adjusts the PA Score to lie between zero and 100.
Notice from the first table showing ratings of Bridget that the average rating across all ten components contributing to her Peer Assessment Score, given by her three team members, was shown as 95/30 (≈ 3.167).
Therefore, the PA Score is calculated directly from the average rating: $\mathrm{PAScore} = 25 \times 3.167 - 25 = 54.2$.
Finally,
Where:
ATC and ALC are the average ratings for the five components that comprise the Task and Leadership contributions, respectively.
Mathematically:
ATC and ALC range over the values 1 through 5. The factor (-1) adjusts those values from zero through 4. The scale factor 50/4 (= 12.5) ensures that the PA Score achieves a range from zero to 100.
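To make the arithmetic concrete, here is a minimal Python sketch of the PA Score calculation described above. The function names are illustrative only, not Peer Assess Pro source code:

# PA SubScore from one assessor's ten component ratings, each 1..5.
def pa_subscore(ratings):
    assert len(ratings) == 10 and all(1 <= r <= 5 for r in ratings)
    return 2.5 * sum(ratings) - 25        # equivalently, 25 x mean(ratings) - 25

# PA Score for an assessee: mean of the SubScores from the OTHER assessors.
def pa_score(subscores, assessee):
    others = [s for assessor, s in subscores.items() if assessor != assessee]
    return sum(others) / len(others)

# Bridget's worked example from the tables above:
bridget = {"Bridget": 87.5, "Julian": 87.5, "Lydia": 75.0, "Nigella": 0.0}
print(round(pa_score(bridget, "Bridget"), 1))    # 54.2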
Quick links and related information
FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?
FAQ: How are peer assessment and personal results calculated and defined mathematically?
The self-assessment conducted by a team member when they rate their team members is EXCLUDED from the calculation of that team member’s Peer Assessed Score. Instead, their self-assessment, PA (self), is used to enable the team member to compare their self-perception with that of their team members, and the class as a whole. This comparison is provided to the team member through a Spider Chart and the calculation of their Index of Realistic Self Assessment (IRSA).
The Spider Chart shows each of the eleven ratings the team member provided for themself, compared with the average of the ratings provided to them by their peer team members. The class average ratings for each of the eleven factors are also provided. In this example, the team member has significantly UNDERRATED themself on nearly all factors (innermost plots), when compared with the ratings provided by their team members (orange).
Spider Chart comparison of self and other team members’ contribution ratings
Another method of comparison, the IRSA, is based on the ratio IRSA = 100 x PA Score / PA (self), as detailed in the FAQ:
FAQ: How is the Index of Realistic Self Assessment (IRSA) calculated?
For the team member illustrated in the foregoing Spider Chart, their Peer Assessed Score, PA Score, is 92 and their self-assessed Score, PA (self), is 75. The ratio yields an Index of Realistic Self Assessment (IRSA) of 122 ≈ 100 x 92 / 75.
An IRSA between 75 and 95 is typical of about 2/3 of team members in a class. About ⅙ of team members achieve an IRSA below 75. Such people appear excessively OVERCONFIDENT in their abilities when compared with the assessment made by their team members. In contrast, an IRSA above 95 suggests the team member has a tendency to UNDERESTIMATE their team contribution when contrasted with the assessment perceived by their team members.
Quick links and related information
FAQ: How are peer assessment and personal results calculated and defined mathematically?
The Peer Assessed Index is defined such that the team member with the maximum PA Score for each team is assigned a PA Index of 100. All other team members in the same team are scaled in relation to the maximum PA Score for that group.
In a gradebook of results, the PA Index is useful for identifying the team members most highly rated by their peers, as they have PA Indexes in the 90 to 100 range. In combination with the Team Result, the PA Index is used to calculate the Indexed Personal Result (IPR), Normalised Personal Result (NPR), and Rank-Based Personal Result (RPR).
PA Index (s, t) = 100 x PA Score (s, t) / PA Score (max, t)

Where

PA Score (s, t) = the Peer Assessed Score for a team member s in team t, as defined in: FAQ: How is the Peer Assessed (PA) Score calculated?

PA Score (max, t) = the maximum value of PA Score found across all members in team t.
Consider a team of four team members, whose PA Scores are shown in the following table. Lydia has a PA Score of 82, the highest for the team. Therefore, Lydia’s PA Index is 100, by definition.
Calculation of Peer Assessed Index (PA Index) for a team of four members
ASSESSEE | ||||
ASSESSOR | Bridget | Julian | Lydia | Nigella |
Bridget | - | 62.5 | 75 | 72.5 |
Julian | 87.5 | - | 87.5 | 82.5 |
Lydia | 75 | 82.5 | - | 80 |
Nigella | 0 | 77.5 | 82.5 | - |
Peer Assessed Score | 54 | 74 | 82 | 78 |
Peer Assessed Index | 66 | 90 | 100 | 95 |
Bridget has a PA Score of 54, the lowest for the team. Therefore, since PA Index = 100 x PA Score / PA Score (max), her PA Index is 66 = 100 x 54 / 82.
Note that, as expected, Lydia’s PA Index is 100 = 100 x 82 / 82.
The data for the previous table is drawn from
FAQ: How is the Peer Assessed (PA) Score calculated?
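The PA Index calculation is small enough to express as a Python sketch (illustrative names only, not the platform’s code):

# PA Index: each team member's PA Score scaled against the team maximum.
def pa_index(pa_scores):
    top = max(pa_scores.values())
    return {member: round(100 * score / top) for member, score in pa_scores.items()}

print(pa_index({"Bridget": 54, "Julian": 74, "Lydia": 82, "Nigella": 78}))
# {'Bridget': 66, 'Julian': 90, 'Lydia': 100, 'Nigella': 95}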
Quick links and related information
FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?
FAQ: How is the Peer Assessed (PA) Score calculated?
The Indexed Personal Result (IPR) is calculated from the Team Result (TR) combined with the team member’s specific Peer Assessed Index (PA Index). The Indexed Personal Result method awards the Team Result to the TOP RATED team member in the team, since, by definition, their Peer Assessed Index is 100. All remaining team members in the same team earn the Team Result downwards, directly proportional to their PA Index.
The definition of Indexed Personal Result means that NO team member can earn an Indexed Personal Result greater than the Team Result. That is, values for the Indexed Personal Result range from zero up to the Team Result. Consequently, the IPR disadvantages team members who have been rated unfavourably by their peers, yet awards no bonus above the Team Result to the team member(s) rated as contributing most. In contrast, the Normalised Personal Result and Rank-Based Personal Result do award a Personal Result above the Team Result for those team members who contribute above average to the team’s outputs, as assessed by their peers.
For each team member s, in their team t:

IPR (s, t) = TR (t) x PA Index (s, t) / 100

Where

TR (t) = the team result awarded by the teacher for the outputs of team t

PA Index (s, t) = the Peer Assessed Index for the team member s, as defined in

FAQ: How is the Peer Assessed Index (PA Index) calculated?
Suppose that the following team has a Team Result, TR, of 50 and Peer Assessed Indexes previously calculated as follows. The example data is taken from:
FAQ: How is the Peer Assessed Index (PA Index) calculated?
Calculation of Indexed Personal Result in a team of four members
ASSESSEE | ||||
ASSESSOR | Bridget | Julian | Lydia | Nigella |
Peer Assessed Score, PA Score | 54 | 74 | 82 | 78 |
Peer Assessed Index, PA Index | 66 | 90 | 100 | 95 |
Team Result, TR | 50 | 50 | 50 | 50 |
Indexed Personal Result, IPR | 33 | 45 | 50 | 47.5 |
Bridget has a PA Index of 66, the lowest for the team. Therefore, since IPR = TR x PA Index / 100, her IPR is 33 = 50 x 66 / 100.
In contrast, Lydia has the highest PA Score in the team, and hence a PA Index of 100. Therefore her IPR is 50 = 50 x 100 / 100.
The IPR for Lydia is equivalent to the Team Result, 50, as defined.
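A one-line Python sketch reproduces the worked example (illustrative only):

# IPR: scale the Team Result by each member's PA Index.
def indexed_personal_result(team_result, pa_indexes):
    return {member: team_result * index / 100 for member, index in pa_indexes.items()}

print(indexed_personal_result(50, {"Bridget": 66, "Julian": 90, "Lydia": 100, "Nigella": 95}))
# {'Bridget': 33.0, 'Julian': 45.0, 'Lydia': 50.0, 'Nigella': 47.5}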
Quick links and related information
FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?
FAQ: How is the Peer Assessed Index (PA Index) calculated?
FAQ: How is the Peer Assessed (PA) Score calculated?
The Normalised Personal Result, NPR, is calculated from the Team Result combined with the team member’s specific Indexed Personal Result (IPR). The Normalised Personal Result method awards the average student in the team the Team Result (TR). All remaining students are awarded a Personal Result above or below the Team Result depending on whether their IPR is above or below that team's average IPR.
Features of the Normalised Personal Result method are that
Use the Normalised Personal Result method with a high Spread Factor if you
For each team member s, in their team t:

NPR (s, t) = TR (t) + SpreadFactor x (IPR (s, t) - IPR (mean, t))

Where

TR (t) = the team result awarded by the teacher for the outputs of team t

IPR (mean, t) = the mean value of the IPR values found for team t, containing n team members.

SpreadFactor = a factor chosen optionally by the teacher that will STRETCH each team’s intrinsic spread of NPRs, as measured by the team’s standard deviation of NPR results. The default Spread Factor is 1.0. However, a Spread Factor of between 1.5 and 2.0 is recommended.

Values of NPR are trimmed to lie within the range zero to 100.
Suppose that the following team has a Team Result, TR, of 50 and Indexed Personal Result previously calculated as follows. This first example illustrates a Spread Factor of 2.0. The example data is taken from:
FAQ: How is the Indexed Personal Result (IPR) calculated?
Calculation of Normalised Personal Result in a team of four members
Spreadfactor = 2.0
ASSESSEE | |||||
ASSESSOR | Bridget | Julian | Lydia | Nigella | Mean |
Peer Assessed Score, PA Score | 54 | 74 | 82 | 78 | |
Peer Assessed Index, PA Index | 66 | 90 | 100 | 95 | |
Team Result, TR | 50 | 50 | 50 | 50 | |
Indexed Personal Result, IPR | 33 | 45 | 50 | 48 | 44 |
Correction Factor (Spreadfactor = 2) | -22 | +2 | +12 | +8 | 0 |
Normalised Personal Result, NPR (Spreadfactor = 2) | 28 | 52 | 62 | 58 | 50 |
Bridget has a PA Index of 66, the lowest for the team, and hence an IPR of 33.
The mean IPR for the four-member team is 44, calculated from ¼ x (33 + 45 + 50 + 48).
Since NPR = TR + SpreadFactor x (IPR - mean IPR)
Then, for Bridget: NPR = 28 = 50 + 2 x (33 - 44).
In contrast, the Normalised Personal Result for Lydia, with her IPR of 50, is calculated as follows: NPR = 62 = 50 + 2 x (50 - 44).
Note how Lydia’s NPR of 62 is above the team Result of 50. Note also how the mean of the NPR values across the team is 50 = (28 + 52 + 62 + 58)/4, identical to the Team Result of 50.
The previous example showed calculations of NPR using a Spread Factor of 2.0. The following table shows the results of calculating the Normalised Personal Result for the team using a more modest Spread Factor of 1.0.
Note the following:
The default Spread Factor is 1.0. However, a Spread Factor of between 1.5 and 2.0 is recommended.
Calculation of Normalised Personal Result in a team of four members
SpreadFactor = 1.0
ASSESSEE | |||||
ASSESSOR | Bridget | Julian | Lydia | Nigella | Mean |
Peer Assessed Score, PA Score | 54 | 74 | 82 | 78 | |
Peer Assessed Index, PA Index | 66 | 90 | 100 | 95 | |
Team Result, TR | 50 | 50 | 50 | 50 | 50 |
Indexed Personal Result, IPR | 33 | 45 | 50 | 48 | 44 |
Correction Factor (SpreadFactor = 1) | -11 | +1 | +6 | +4 | 0 |
Normalised Personal Result, NPR (SpreadFactor = 1) | 39 | 51 | 56 | 54 | 50 |
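The following Python sketch reproduces both tables, assuming the formula NPR = TR + SpreadFactor x (IPR - mean IPR) inferred from the worked examples (illustrative only):

# NPR: shift each member's IPR away from the team mean IPR, scaled by the
# Spread Factor, centred on the Team Result, then trimmed to 0..100.
def normalised_personal_result(team_result, iprs, spread_factor=1.0):
    mean_ipr = sum(iprs.values()) / len(iprs)
    return {member: min(100, max(0, team_result + spread_factor * (ipr - mean_ipr)))
            for member, ipr in iprs.items()}

iprs = {"Bridget": 33, "Julian": 45, "Lydia": 50, "Nigella": 48}
print(normalised_personal_result(50, iprs, spread_factor=2.0))
# {'Bridget': 28.0, 'Julian': 52.0, 'Lydia': 62.0, 'Nigella': 58.0}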
Quick links and related information
FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?
FAQ: How is the Peer Assessed Index (PA Index) calculated?
FAQ: How is the Peer Assessed (PA) Score calculated?
FAQ: How are peer assessment and personal results calculated and defined mathematically?
The Rank-Based Personal Result is calculated from the Team Result combined with the student's specific Rank Within Team, based on that student's Peer Assessed Score. Like the Normalised Personal Result, the RPR method awards the AVERAGE student in the team the Team Result. All remaining students are awarded a personal result above or below the Team Result depending on whether their Rank Within Team is above or below that team's middle-ranked student.
The Extended version of the RPR method, due for implementation in 2023, incorporates the use of a Spread Factor.
Features of the Rank Based Personal Result (RPR) calculation method are that
For student s in their team t with n team members:

RPR (s, t) = TR (t) x n x ShareFraction (s, t)

Where

TR (t) = the team result awarded by the teacher for the outputs of team t

ShareFraction (s, t) = RankAverage (s, t) / Σ RankAverage = the fraction of the ‘pieces of cake’ allocated to team member s. The number of pieces allocated is an integer number, except in the case of tied ranks.

RankAverage (s, t) = the rank average of the team member s in team t containing n members, where the team member with the lowest Peer Assessed Score in that team is ranked as 1, calculated using the rank.average method. Equal ranks are permitted and are calculated as the average of the rankings they would take.

n = the number of members in team t

Values of RPR are trimmed to lie within the range zero to 100.
Alternative mathematical formulation
The formula defined above can be simplified for calculation purposes by substituting the formula for ShareFraction, noting that the ranks sum to n x (n + 1) / 2:

RPR (s, t) = 2 x TR (t) x RankAverage (s, t) / (n + 1)
Suppose that the following team with n = 4 team members has a Team Result, TR, of 50 and Peer Assessed Scores previously calculated. The example data is taken from:
FAQ: How is the Peer Assessed (PA) Score calculated?
Calculation of Rank-Based Personal Result in a team of four members
ASSESSEE | |||||
ASSESSOR | Bridget | Julian | Lydia | Nigella | Mean |
Peer Assessed Score, PA Score | 54 | 74 | 82 | 78 | |
Rank Average | 1 | 2 | 4 | 3 | |
Share Fraction | 1/10 | 2/10 | 4/10 | 3/10 | |
Team Result, TR | 50 | 50 | 50 | 50 | 50 |
Rank-Based Personal Result, RPR | 20 | 40 | 80 | 60 | 50
Observe how there are ten pieces of cake to be allocated, calculated from the sum of the n = 4 ranks: 10 = 1 + 2 + 3 + 4 = n x (n + 1) / 2.
Thus, the poorest ranking team member, Bridget, gets one piece, a 1/10 share fraction. In contrast, the best-ranked team member, Lydia, gains four pieces, a 4/10 share fraction.
Bridget has a PA Score of 54, the lowest for the team. Her rank in the team is therefore 1, so RPR = 20 = 50 x 4 x 1/10.
Note how Julian, ranked 2nd from the lowest, receives double the ShareFraction and, consequently, double the RPR of Bridget. Applying the alternative formula, RPR = 40 = 2 x 50 x 2 / (4 + 1).
Lydia, the top-ranked team member, with RankAverage = 4, receives four times the RPR that Bridget received: RPR = 80 = 2 x 50 x 4 / (4 + 1).
Observe how the mean of the RPR values matches the Team Result of 50 for team t: 50 = (20 + 40 + 80 + 60) / 4.
Observe that, by definition, the sum of the ShareFractions across the team is exactly 10/10 = 100 %. All the ten pieces of cake are allocated across the team, and each person gets one more piece than the next poorer-ranked student (except in the case of ties).
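The ShareFraction arithmetic translates directly into a short Python sketch (illustrative only; scipy’s rankdata implements the rank.average method with the lowest score ranked 1):

from scipy.stats import rankdata

# RPR = TR x n x ShareFraction, where ShareFraction = RankAverage / sum of ranks.
def rank_based_personal_result(team_result, pa_scores):
    members = list(pa_scores)
    ranks = rankdata([pa_scores[m] for m in members])   # average ranks; ties allowed
    total = ranks.sum()                                 # n(n+1)/2 'pieces of cake'
    n = len(members)
    return {m: team_result * n * r / total for m, r in zip(members, ranks)}

print(rank_based_personal_result(50, {"Bridget": 54, "Julian": 74, "Lydia": 82, "Nigella": 78}))
# {'Bridget': 20.0, 'Julian': 40.0, 'Lydia': 80.0, 'Nigella': 60.0}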
The following example shows a case where two team members, Julian and Nigella, have the same Peer Assessed Score of 74, and hence a tied rank of 2.5 = (2 + 3) / 2. The Google Sheets RANK.AVG function with the is_ascending flag set to TRUE delivers this ranking behaviour.
Calculation of Rank-Based Personal Result with tied scores
ASSESSEE | |||||
ASSESSOR | Bridget | Julian | Lydia | Nigella | Mean |
Peer Assessed Score, PA Score | 54 | 74 = | 82 | 74 = | |
Rank Average | 1 | 2.5 = | 4 | 2.5 = |
Share Fraction | 1/10 | 2.5/10 | 4/10 | 2.5/10 | |
Team Result, TR | 50 | 50 | 50 | 50 | 50 |
Rank-Based Personal Result, RPR | 20 | 50 | 80 | 50 | 50
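The sketch above handles this tied case unchanged, because rankdata averages tied ranks:

print(rank_based_personal_result(50, {"Bridget": 54, "Julian": 74, "Lydia": 82, "Nigella": 74}))
# {'Bridget': 20.0, 'Julian': 50.0, 'Lydia': 80.0, 'Nigella': 50.0}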
In a manner similar to the use of the Spread Factor for the Normalised Personal Result method, NPR, a Spread Factor can be applied in the calculation of the Rank-Based Personal Result.
A SpreadFactor of 2.0 provides the same ‘natural’ values of RPR as determined by the earlier formula. The default Spread Factor is 1.0, which reduces the ‘natural’ spread given by the earlier definition, and brings the spread of RPR values into comparable alignment with the values typically found using the Normalised Personal Result method with its default SpreadFactor of 1. A Spread Factor of between 1.5 and 2.0 is recommended. The following example shows the calculations for RPR with Spread Factors of 2.0, 1.0, and 1.5.
Calculation of Rank-Based Personal Result with several spreadfactors
ASSESSEE | ||||||
ASSESSOR | Bridget | Julian | Lydia | Nigella | Mean | Range |
Peer Assessed Score, PA Score | 54 | 74 = | 82 | 74 = | 28 | |
Rank Average | 1 | 2.5 = | 4 | 2.5 = | ||
Share Fraction | 1/10 | 2.5/10 | 4/10 | 2.5/10 | 2.5/10 |
Team Result, TR | 50 | 50 | 50 | 50 | 50 | |
Rank-Based Personal Result, RPR (Spreadfactor = 2) | 20 | 50 | 80 | 50 | 50 | 60
Rank-Based Personal Result, RPR (Spreadfactor = 1.0) | 35 | 50 | 65 | 50 | 50 | 30
Rank-Based Personal Result, RPR (Spreadfactor = 1.5) | 27.5 | 50 | 72.5 | 50 | 50 | 45
Observe from the table
Consider Lydia, ranked best in team, with a SpreadFactor of 1.5: RPR = 72.5 = 50 + (1.5 / 2) x (80 - 50).
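A hedged Python sketch of the spread-adjusted calculation, assuming the formula RPR(SF) = TR + (SF / 2) x (natural RPR - TR) inferred from the table above:

# Spread-adjusted RPR: shrink or stretch each 'natural' RPR about the Team Result.
def rpr_with_spread(team_result, natural_rpr, spread_factor=1.0):
    return {member: team_result + (spread_factor / 2) * (rpr - team_result)
            for member, rpr in natural_rpr.items()}

natural = {"Bridget": 20, "Julian": 50, "Lydia": 80, "Nigella": 50}
print(rpr_with_spread(50, natural, spread_factor=1.5))
# {'Bridget': 27.5, 'Julian': 50.0, 'Lydia': 72.5, 'Nigella': 50.0}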
Quick links and related information
FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?
FAQ: How is the Peer Assessed Index (PA Index) calculated?
FAQ: How is the Normalised Personal Result (NPR) calculated?
FAQ: How are peer assessment and personal results calculated and defined mathematically?
From late 2022, this method of calculation is superseded by
FAQ: How is the Rank Based Personal Result (RPR) calculated?
The Rank Based Personal Result is calculated from the Team Result combined with the student's specific Rank Within Team, based on that student's Peer Assessed Score. Like the Normalised Personal Result, the RPR method awards the AVERAGE student in the team the Team Result. All remaining students are awarded a personal result above or below the Team Result depending on whether their Rank Within Team is above or below that team's middle-ranked student.
Features of the Rank Based Personal Result (RPR) calculation method are that
For student s in their team t with n team members:

RPR (s, t) = TR (t) x n x ShareFraction (s, t), where ShareFraction (s, t) = Rank (s, t) / Σ Rank

Where

TR (t) = the team result awarded by the teacher for the outputs of team t

Rank (s, t) = the reversed rank of the team member s in team t, where the team member with the lowest Peer Assessed Score in that team is defined as 1. Equal ranks are permitted.

n = the number of members in team t

Values of RPR are trimmed to lie within the range zero to 100.
Suppose that the following team has a Team Result, TR, of 50 and Peer Assessed Scores previously calculated as follows. The example data is taken from:
FAQ: How is the Peer Assessed (PA) Score calculated?
Calculation of Rank-Based Personal Result in a team of four members
ASSESSEE | |||||
ASSESSOR | Bridget | Julian | Lydia | Nigella | Mean |
Peer Assessed Score, PA Score | 54 | 74 | 82 | 78 | |
Rank (Reversed) | 1 | 2 | 4 | 3 | |
Share Fraction | 1/10 | 2/10 | 4/10 | 3/10 | |
Team Result, TR | 50 | 50 | 50 | 50 | 50 |
Rank-Based Personal Result, RPR | 20 | 40 | 80 | 60 | 50
First calculate the sum of ranks for the team of four members, n = 4. This number is the denominator for calculating the ShareFraction for each team member: 10 = 1 + 2 + 3 + 4.
Consequently, there are 10 ‘pieces of cake’ to be shared amongst the 4 team members, in proportion to their reversed rank.
Bridget has a PA Score of 54, the lowest for the team. Her rank in the team is therefore 1, so RPR = 20 = 50 x 4 x 1/10.
Note how the second-ranked student, Julian, receives double the ShareFraction and, consequently, double the RPR of Bridget: RPR = 40 = 50 x 4 x 2/10.
Lydia, the top-ranked student, with reversed rank 4, receives four times the RPR that Bridget received: RPR = 80 = 50 x 4 x 4/10.
Note how the mean of the RPR values matches the Team Result of 50 for team t: 50 = (20 + 40 + 80 + 60) / 4.
Note that, by definition, the sum of the ShareFractions across the team is exactly 100 %.
The following example shows a case where two team members have the same Peer Assessed Score of 74. Note how Lydia has a reverse rank of 4, not 3. The Google RANK function, for example, with the optional is_ascending flag set to 1 demonstrates this ranking behaviour.
Calculation of Rank-Based Personal Result with tied scores
ASSESSEE | |||||
ASSESSOR | Bridget | Julian | Lydia | Nigella | Mean |
Peer Assessed Score, PA Score | 54 | 74 | 82 | 74 | |
Rank (Reversed) | 1 | 2= | 4 | 2= | |
Share Fraction | 1/9 | 2/9 | 4/9 | 2/9 | |
Team Result, TR | 50 | 50 | 50 | 50 | 50 |
Rank-Based Personal Result, RPR | 22 | 44 | 89 | 44 | 50
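For comparison, a Python sketch of this superseded method (illustrative only). The 'min' tie rule of scipy’s rankdata matches the Google Sheets RANK behaviour described above:

from scipy.stats import rankdata

def legacy_rpr(team_result, pa_scores):
    members = list(pa_scores)
    ranks = rankdata([pa_scores[m] for m in members], method='min')  # reversed ranks
    total = ranks.sum()        # 9 here: the tie at rank 2 means rank 3 is skipped
    n = len(members)
    return {m: round(team_result * n * r / total) for m, r in zip(members, ranks)}

print(legacy_rpr(50, {"Bridget": 54, "Julian": 74, "Lydia": 82, "Nigella": 74}))
# {'Bridget': 22, 'Julian': 44, 'Lydia': 89, 'Nigella': 44}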
Quick links and related information
From mid-2022, to be superseded by
FAQ: How is the Rank Based Personal Result (RPR) calculated?
FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?
FAQ: How is the Peer Assessed Index (PA Index) calculated?
FAQ: How are peer assessment and personal results calculated and defined mathematically?
How do we compare students within a class, and between classes based on their Peer Assessed Scores?
The short answer is: We can use the Peer Assessed Score to compare students ONLY within their team. A PA Score above that specific team’s average PA Score suggests that team member has contributed more than a team member with a lower PA score.
A Peer Assessed Score of 90 indicates that a student in the same team has contributed clearly more to their team’s outcomes than a student in the same team with a Peer Assessed Score of 30. However, a Peer Assessed Score achieved by a student in one team does not meaningfully compare with the Peer Assessed Score of a student in another team. A Peer Assessed Score of 60 in Team t1 is no better nor worse than a PA Score of 90 achieved by a student in another team t2.
We cannot conclude from comparing Peer Assessed Scores which is the better student in terms of team contribution and/or leadership when the students are from different teams. Why? Some students and teams diligently commit to rating each other so that the average student in their team is rated ⅗ on each of the ten items in the peer assessment survey, as intended. Meanwhile, other teams believe they are all above average, having come from their local equivalent of Lake Wobegon. By chance and/or good team functioning, some teams achieve that desired state where all members work productively and effectively together: the Holy Grail of the Dream Team. Other teams comprising high performers can conversely fall into the desolation of dismal performance characterised by the Apollo Syndrome (Belbin).
The long answer is that through applying appropriate data analytics, we can develop three related numbers that enable comparison of peer assessed team members both within and between classes, and over time. These measures are Standard Peer Assessed Score (SPAS), Employability, and Net Employability. In essence, the data analytic processes can be likened to a forensic photoanalyst attempting to read an automobile’s number plate. Imagine the original photo has been taken through smog, on a dark night, from a far distance, at a low resolution setting, using a poor quality lens and a poor imaging sensor. But through advanced algorithms that remove background noise, amplify relevant signals, and enhance clarity, a readable, useful image can be discerned, as illustrated in the example from Acclaim Software.
Source: Acclaim Software. (2015). Forensics - Recovering the Most Detail from Your Image - Focus Magic. http://www.focusmagic.com/forensics-tutorial.htm
The Standard Peer Assessed Score (SPAS) is our first measure designed to enable a more realistic relative comparison of peer assessment ratings between members of a whole class. The Standard Peer Assessed Score combines normalised values of the Recommendation and Peer Assessed Score for each team member. The normalisation applies several data analytic processes to correct for the biases introduced by some students and teams in their rating. The SPAS approach is not perfect, but it’s a start. Furthermore, the determination of Standard Peer Assessed Score is a necessary precursor to the calculation of Employability and Net Employability, discussed elsewhere.
The whole-of-class values of Standard Peer Assessed Score for a particular class response dataset are targeted to have these features:
Mean: 50
Standard Deviation: 20
Maximum possible range: from 0 to 100
By virtue of the definition of the Standard Peer Assessed Score, the following effects occur by design:
One half of the class values of Standard Peer Assessed Score will fall in the range zero to 50 (below the target average). Naturally, the remaining one half of values will fall in the range 50 to +100 (above average).
Approximately ⅔ of Standard Peer Assessed Scores in the class will lie between 30 and 70. That is, within one standard deviation of the mean value of 50. More accurately, if SPAS were normally distributed, then 68.3 percent of the class dataset values of SPAS would lie within plus and minus one standard deviation of the mean.
Approximately ⅙ of students in the class will receive a Standard Peer Assessed Score value of either greater than 70, or less than 30. More precisely, 15.9 percent of values will lie in each of these ranges.
Finally, given the wonders of the normal distribution, 95% of all class members will lie in the range of SPAS 10 through 90. That implies that a student with a SPAS above 90 is in the top 2.5 % of members of the class. Conversely a student with SPAS less than 10 is in the bottom 2.5% of the class. This knowledge allows the teacher to more reliably identify their star students, and students at risk, rather than relying simply on Peer Assessed Score.
The general approach to creating the Standard Peer Assessed Score is to apply z-score normalisation to a student’s (raw) Recommendation, R, and Peer Assessed Score, PAS. The two z-scores (zR and zPAS) are added, then re-scaled to achieve, for the class dataset as a whole, the target mean, M(SPAS), of 50 and target standard deviation, SD(SPAS), of 20 required for the SPAS statistic. Note that the result of z-score normalisation for any data set is such that the normalised data has a mean of zero and standard deviation of 1.0, as detailed later.
The Standard Peer Assessed Score for student s is defined as:

SPAS(s) = M(SPAS) + k x SD(SPAS) x z(s), where z(s) = (zR(s) + zPAS(s)) / 2

Where

M(SPAS) = target mean for the SPAS statistic, by definition a constant of 50

SD(SPAS) = target standard deviation for the SPAS statistic, by definition a constant of 20

k = a correction factor to ensure the standardisation process achieves the target standard deviation, SD(SPAS). The factor is required because in practice the distributions of the raw data are not normally distributed, but tend to have a strong negative skew, due to such factors as the Lake Wobegon effect mentioned earlier. A factor of 1.2 has been found appropriate in practice.

zR(s) = (R(s) - M(R)) / SD(R) = the z-score normalisation of the Recommendation rating for student s by their team members.

zPAS(s) = (PAS(s) - M(PAS, t)) / SD(PAS, t) = the z-score normalisation of the Peer Assessed Score rating for student s by their team members.

R(s) = Recommendation rating awarded to student s by their team members in team t

PAS(s) = Peer Assessed Score awarded to student s by their team members in team t

M(R) = estimate of the class mean Recommendation rating derived over all valid assessed teams in the class responses dataset

SD(R) = estimate of the class standard deviation of the Recommendation ratings derived over all valid assessed teams in the class responses dataset

M(PAS, t) = the population mean of the Peer Assessed Scores derived over all team members in the team t in which student s is a member.

SD(PAS, t) = the population standard deviation of the Peer Assessed Scores derived over all team members in the team t in which student s is a member.
Notes
Divisor of 2. The sum of the two z-score normalised variables, each with unit standard deviation, gives a resulting distribution with a standard deviation of up to 2.0 (exactly 2.0 when the two z-scores are perfectly correlated, as they approximately are in practice). Consequently, the divisor of 2 is required in the calculation of SPAS so that the combined z-score, z(s), has a mean of zero and a standard deviation of approximately 1.
Trimming. Values of Standard Peer Assessed Score that calculate above +100 are trimmed down to +100. Similarly, values of Standard Peer Assessed Score that calculate below 0 are trimmed up to 0.
The following table shows example calculations of Standard Peer Assessed Score for three students in two teams, A and B. Note how Michael Mass (Team A) and Lydia Loaded (Team B) have both been awarded the same Peer Assessed Score of 50 by their team members. However, because of their different team means and standard deviations, the z-score normalisations realise +1 and +2 respectively.
As part of the journey towards calculating SPAS, the intermediate calculation of the combined z-score provides the basis for calculating the percentage proportion of the entire class who would fall below that combined z-score. This can be interpreted as the percentage of the class who would recommend the specific team member to an employer, a colleague, or another team. This percentage is rounded conservatively to produce the student’s Employability rating, the methodology for which is detailed in the FAQ
FAQ: What is Employability? How is it calculated?
Example calculations for Standard Peer Assessed Score (SPAS) and Employability
Student, s | Peter Johns | Michael Mass | Lydia Loaded |
Recommendation, R(s) | 2.0 | 4.5 | 3.0 |
Mean of class Recommendation, M(R) | 3.0 | 3.0 | 3.0 |
Standard deviation of class Recommendation, SD(R) | 0.5 | 0.5 | 0.5 |
Normalised Recommendation, zR(s) | (2-3)/0.5 = -2 | (4.5-3)/0.5 = +3 | (3.0-3.0)/0.5 = 0 |
Peer Assessed Score, PAS(s) | 30 | 50 | 50 |
Team, t | A | A | B |
Mean of team Peer Assessed Score, M(PAS, t) | 40 | 40 | 20 |
Standard deviation of team Peer Assessed Score, SD(PAS, t) | 10 | 10 | 15 |
Normalised Peer Assessed Score, zPAS(s) | (30-40)/10 = -1 | (50-40)/10 = +1 | (50-20)/15 = +2 |
Combined z-scores, z(s) | (-2-1)/2 = -1.5 | (+3+1)/2 = +2 | (0+2)/2 = +1 |
Target Standard Deviation, SD(SPAS) | 20 | 20 | 20 |
Correction factor, k | 1.2 | 1.2 | 1.2 |
Target mean, M(SPAS) | 50 | 50 | 50 |
Standardised Peer Assessed Score, SPAS | 50 + 1.2 x 20 x (-1.5) = 50 - 36 = 14 | 50 + 1.2 x 20 x 2 = 50 + 48 = 98 | 50 + 1.2 x 20 x 1 = 50 + 24 = 74 |
Proportion of class below Combined z-score | 0.5 + GAUSS(-1.5) = 0.5 - 0.4332 = 6.7% | 0.5 + GAUSS(+2) = 0.5 + 0.4772 = 97% | 0.5 + GAUSS(+1) = 0.5 + 0.34 = 84% |
Employability | 10 | 95 | 80 |
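The per-student arithmetic can be sketched in Python as follows, assuming the definitions above (illustrative only):

def spas(recommendation, class_rec_mean, class_rec_sd,
         pa_score, team_pa_mean, team_pa_sd, k=1.2):
    z_rec = (recommendation - class_rec_mean) / class_rec_sd   # zR(s)
    z_pas = (pa_score - team_pa_mean) / team_pa_sd             # zPAS(s)
    z = (z_rec + z_pas) / 2                                    # combined z-score
    return min(100, max(0, 50 + k * 20 * z)), z                # trimmed SPAS, z(s)

# Peter Johns from the example table:
print(spas(2.0, 3.0, 0.5, 30, 40, 10))    # (14.0, -1.5)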
The following figures show a Standard Peer Assessed Score histogram, and the histograms for the Recommendation and Peer Assessed Score data that contribute to the Standard Peer Assessed Score chart.
Figure 1. Histogram of Recommendation
Mean = 3.7, standard deviation = 0.53
Figure 2: Histogram of Peer Assessed Score
Mean = 67, standard deviation = 11.3
Figure 3: Standard Peer Assessed Score histogram
Mean = 0, Standard deviation = 20
The calculation of Standard Peer Assessed Score assumes several conditions, described as follows.
The statistical distributions of the Recommendation and Peer Assessed Scores (PA_Score) are assumed to be normally distributed. In practice, the distributions are typically asymmetric with negative skew. See Figures 1 and 2 earlier.
The Recommendation score awarded to a student s1 in team t1 is assumed to be absolutely comparable to a similar Recommendation score awarded to another student s2 in another team t2. In other words, a Recommendation score of 3.5 awarded to student s1 in team t1 means exactly the same as a Recommendation score of 3.5 awarded to student s2 in team t2. Similarly, a difference in Recommendation ratings of 1.0 unit means the same in any team. In practice, the Recommendations made by one team may not be consistent with the Recommendation values assigned by another team. However, given that Recommendation is a ‘top of mind’ peer assessment done at the start of the Peer Assess Pro survey, we think it is a reasonable approximation. Consequently, the Recommendation values are z-score normalised using the mean and standard deviation of the entire class of responses.
In contrast, in normalising the Peer Assessed Score it is well recognised that different teams award quite different Peer Assessed Scores to students who would ordinarily achieve the same Peer Assessed Score in an ideal world of perfect raters. Consequently, it is assumed that each team possesses a uniform, random mix of student capabilities drawn from the entire class. Therefore, all things being equal, one would expect the mean and standard deviation of each team’s Peer Assessed Scores to be equivalent. However, in practice, this equivalence is rarely observed. Consequently, the need arises to z-score normalise the Peer Assessed Score for each team to achieve a set of normalised Peer Assessed Scores with mean zero and standard deviation 1 FOR EACH TEAM.
The Peer Assessed Score awarded to a student s1 in team t1 is assumed NOT to be comparable to a similar Peer Assessed Score that might be awarded to another student s2 in another team t2. Why? Some teams honestly peer assess each other, whilst others attempt to ‘game’ the peer assessment process, such as awarding everyone above average, or even the full 5/5 rating for each of the team peer assessment factors. In contrast, it is assumed that the Peer Assessed Score of the average student in team t1 should be adjusted to match the peer rating of the average student rated in another team t2, even though the arithmetic values of the (original) Peer Assessed Scores usually differ. The same reasoning applies to the spread of Peer Assessed Score values within teams, namely, that the best team member in team t1 should be rated comparably with the best team member in team t2, even if their Peer Assessed Scores differ. Consequently, the Peer Assessed Scores WITHIN a team are scaled to match the relative values within other teams through normalisation using each team’s mean and standard deviation.
A special case arises when all members of a team receive identical Peer Assessed Scores, making the team’s standard deviation zero. In that case, the z-score normalised Peer Assessed Score for every team member is set to 0.5.
A future option to consider: exclude students from the calculation of their SPAS in the case of a ‘misguided team’, identified as
In general, ‘NO’. A student is motivated differently in each of the classes they take. The luck of the draw is that they may work with a superior or inferior team, who will rate them relatively differently.
Quick links and related information
FAQ: What is Employability? How is it calculated?
FAQ: How are peer assessment and personal results calculated and defined mathematically?
For a specific Peer Assess Pro assessment, Employability is the statistical probability that team members from the class would recommend the specific team member to an employer, a colleague, or another team.
Employability is a proprietary measure defined by Peer Assess Pro™, drawn from the calculation of a student’s Standard Peer Assessed Score (SPAS). SPAS combines a student’s Peer Assessed Score and their Recommendation score, through various statistical treatments such as z-score normalisation. The resulting Employability score is a statistical probability, ranging from 5 to 95 percent. Employability is the best available estimate of the degree to which team members from the class in which the student has participated in a team project would recommend that specific team member to an employer, a colleague, or another team.
E(s) = MROUND(95 x (0.5 + GAUSS(z(s))) + 2.5, 5)

Where

E(s) = the Employability for student s, ranging over values from 5 to 95 in steps of 5.

GAUSS(z) = the Gaussian distribution function: the statistical probability that a random variable, z, drawn from a normal distribution, will lie between the mean and z standard deviations above (or below) the mean. The GAUSS function returns values between -0.5 and +0.5.

z(s) = the combined z-score resulting from combining the z-score normalisation of the Recommendation, zR(s), and Peer Assessed Score, zPAS(s), for student s, as explained in the mathematical calculations for the Standard Peer Assessed Score. Through the process of normalisation, z(s) has a mean of zero and a standard deviation of 1, which is the required input for the GAUSS function.

MROUND(x, m) = a mathematical function that rounds one number, x, to the nearest integer multiple of another, m. In the case of Employability, m = 5. For example, MROUND(82.4, 5) = 80 and MROUND(8.85, 5) = 10. The MROUND function coupled with the attenuation factor of 95 achieves a step interval of 5 units.

The constant 0.5 adds the probability that a z-score lies between minus infinity and the mean, which is, by definition, 50%.
The following transformations are applied to remove the impression of an over-precise measure of Employability, and reduce the possibilities of elation or despair in response to extreme values of Employability. Specifically, we apply a Principle of Conservatism the result of which is that Employability is conditioned to lie between 5 to 95, and rounded to increase in steps of 5, rather than the theoretically possible values of zero to 100, with apparently infinite precision!
The MROUND to the closest multiple of 5 coupled with attenuation by 95 achieves the step interval of 5 units.
The constant 2.5 is a translation factor that compensates for the shift downwards in mean values on account of the 95 attenuation factor.
The following table shows example calculations of Employability based on the most likely range of possible values for combined z-scores arising from the generation of the Standardised Peer Assessed Score, SPAS.
The subsequent graph shows the data from the calculations of Employability charted against Combined z-scores.
As an example, consider a student achieving a SPAS of zero, arising from their combined z-score of -3. According to the normal distribution, less than 1 in 1000 students would recommend this student, as indicated by the proportion of the class who would fall below a combined z-score of -3. The calculation of Employability generously raises the assessment, suggesting that 5% of the class would recommend them! The same conservatism happens at the other extreme, where a brilliantly contributing student (e.g. above a combined z-score of +2) achieves an Employability of 95%, whereas if the normal distribution were to be believed, they might expect 98% of the class to recommend them.
Example calculations of Employability from Combined z-scores
Combined z-scores, | -3 | -1.5 | -1 | -0.5 | 0 | 0.5 | 1 | 1.5 | 2 | 3 |
GAUSS(z(s)) | -0.50 | -0.43 | -0.34 | -0.19 | 0 | 0.19 | 0.34 | 0.43 | 0.48 | 0.50 |
Standardised Peer Assessed Score, SPAS | -22 | 14 | 26 | 38 | 50 | 62 | 74 | 86 | 98 | 122 |
Standardised Peer Assessed Score, SPAS (Trimmed to 0 to 100) | 0 | 14 | 26 | 38 | 50 | 62 | 74 | 86 | 98 | 100 |
Proportion of class below Combined z-score | 0.1% | 7% | 16% | 31% | 50% | 69% | 84% | 93% | 98% | 99.9% |
Employability | 5 | 10 | 20 | 30 | 50 | 70 | 80 | 90 | 95 | 95 |
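A Python sketch reproduces the table, assuming the formula E = MROUND(95 x (0.5 + GAUSS(z)) + 2.5, 5) assembled above (illustrative only):

from statistics import NormalDist

def employability(z):
    gauss = NormalDist().cdf(z) - 0.5       # GAUSS(z), between -0.5 and +0.5
    raw = 95 * (0.5 + gauss) + 2.5          # attenuate to 5..95, then shift up
    return 5 * round(raw / 5)               # MROUND to the nearest multiple of 5

for z in (-3, -1.5, 0, 1, 2):
    print(z, employability(z))              # 5, 10, 50, 80, 95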
Quick links and related information
FAQ: How is Standard Peer Assessed Score (SPAS) calculated?
FAQ: How are peer assessment and personal results calculated and defined mathematically?
Having a good sense of who you are enables you to build upon your strengths and correct your weaknesses. In turn, that can make you more successful at work and in your personal life. You are able to better understand, predict and cope with others more effectively. You can better distinguish valid and invalid informal and formal feedback from others. You are more likely to select (and achieve!) realistic personal goals. (‘ERSI: Exceptionally Realistic Self-Image’, 2012)
The Index of Realistic Self Assessment (IRSA) is a first step in providing data upon which to develop an Exceptionally Realistic Self-Image (ERSI).
The Index of Realistic Self Assessment (IRSA) is a ratio-based measure of the extent to which a team member’s SELF assessment is matched by the assessment of the OTHER members of their team.
IRSA = 100 x PA Score (others) / PA Score (self)

Where

PA Score (others) = the Peer Assessed Score assigned to that student by their team members

PA Score (self) = the Peer Assessed Score the student has assessed themself
IRSA typically lies in the range 50 to 120. However, theoretically, IRSA could lie between zero and infinity. IRSA values generally calculate as:
IRSA is calculated only when these two conditions occur:
Extreme values of IRSA are notified in the teacher’s Active Warnings, as detailed in the FAQ
FAQ: What is a mismatched self-assessment (IRSA)?
The data for the following table is drawn from
FAQ: How is the Peer Assessed (PA) Score calculated?
Calculations of the Index of Realistic Self Assessment for four team members
ASSESSEE | ||||
ASSESSOR | Bridget | Julian | Lydia | Nigella |
Bridget | 87 | 62.5 | 75 | 72.5 |
Julian | 87.5 | 93 | 87.5 | 82.5 |
Lydia | 75 | 82.5 | 78 | 80 |
Nigella | 0 | 77.5 | 82.5 | 82 |
Peer Assessed Score (others) | 54 | 74 | 82 | 78 |
Peer Assessed Score (self) | 87 | 93 | 78 | 82 |
Index of Realistic Self Assessment | 62 | 80 | 105 | 95 |
Indication | Overconfident | Typical | Underconfident | Typical (Borderline underconfident) |
Lydia has been assessed by others with a PA Score of 82. Her self-assessment has produced her PA Score (self) of 78. Therefore, since IRSA = 100 x PA Score (others) / PA Score (self), her IRSA is 105 = 100 x 82 / 78.
Lydia’s IRSA of 105 indicates that she is an outlier when compared with most team members in a typical class. Specifically, she is underconfident in terms of assessing her strengths when compared with how others perceive her.
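The ratio is simple enough to sketch in one line of Python (illustrative only):

# IRSA = 100 x PA Score (others) / PA Score (self).
def irsa(pa_others, pa_self):
    return round(100 * pa_others / pa_self)

print(irsa(82, 78))   # 105 (Lydia: underconfident)
print(irsa(54, 87))   # 62  (Bridget: overconfident)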
From our experience using Peer Assess Pro in many classes, we find most team members overrate themself when compared with how their team members rate them. This overrating results in a self-assessed Peer Assessed Score typically 7 to 10 points higher than the Peer Assessed Score awarded by the other members of that same team. This phenomenon of overrating one’s self-assessment is well established in the literature, termed self-enhancement bias (see, for instance, Loughnan et al., 2011). Informally, self-enhancement bias is also known as the Lake Wobegon Effect, a phenomenon observed in a fictional town “where all the women are strong, all the men are good looking, and all the children are above average” (‘Lake Wobegon effect’, n.d.; ‘Lake Wobegon: The Lake Wobegon Effect’, 2017).
Quick links and related information
FAQ: What is a mismatched self-assessment (IRSA)?
FAQ: What is a valid assessed team?
FAQ: How do I interpret measures of realistic self-assessment?
Lake Wobegon effect. (n.d.). Retrieved 25 July 2017, from http://psychology.wikia.com/wiki/Lake_Wobegon_effect
Lake Wobegon: The Lake Wobegon Effect. (2017). In Wikipedia. Retrieved from https://en.wikipedia.org/w/index.php?title=Lake_Wobegon&oldid=787029148#The_Lake_Wobegon_effect
Loughnan, S., Kuppens, P., Allik, J., Balazs, K., de Lemus, S., Dumont, K., … Haslam, N. (2011). Economic Inequality Is Linked to Biased Self-Perception. Psychological Science, 22(10), 1254–1258. https://doi.org/10.1177/0956797611417003
The usual tendency of team members is to apply a self-enhancement bias when rating themselves using Peer Assess Pro. Consequently, we can interpret Index of Realistic Self Assessment (IRSA) scores in one of three ways: typical team members, overestimated, and underestimated.
An IRSA score between 75 and 95 suggests the assessed team member understands realistically their team contribution when contrasted with the assessment perceived by other team members. A score between 75 and 95 is typical of about 2/3 of team members in a class.
An IRSA below 75 suggests the assessed team member OVERESTIMATES their team contribution as perceived by other team members. An index below 75 suggests the team member should take action to understand proactively their areas for development by informally soliciting further feedback and guidance from their team members. About ⅙ of team members achieve an index below 75.
An IRSA above 95 suggests the assessed team member has a tendency to UNDERESTIMATE their team contribution when contrasted with the assessment perceived by other team members. The team member should consider developing more confidence in applying and displaying their strengths. About ⅙ of team members achieve an index of above 95.
An Index of Realistic Self Assessment that is not in the ‘typical’ range of 75 to 95 suggests that the team member take active steps to
A three-step programme to develop an Exceptionally Realistic Self-Image includes
Quick links and related information
FAQ: What is a mismatched self-assessment (IRSA)?
FAQ: How is the Index of Realistic Self Assessment (IRSA) calculated?
ERSI: Exceptionally Realistic Self-Image. (2012). Orange County Human Resource Services Portal. Retrieved from http://bos.ocgov.com/hr/hrportal/docs/docs_hr_leadership_forum/minutes_2012/minutes_030812/ersi.doc
Lake Wobegon effect. (n.d.). Retrieved 25 July 2017, from http://psychology.wikia.com/wiki/Lake_Wobegon_effect
Lake Wobegon: The Lake Wobegon Effect. (2017). In Wikipedia. Retrieved from https://en.wikipedia.org/w/index.php?title=Lake_Wobegon&oldid=787029148#The_Lake_Wobegon_effect
Loughnan, S., Kuppens, P., Allik, J., Balazs, K., de Lemus, S., Dumont, K., … Haslam, N. (2011). Economic Inequality Is Linked to Biased Self-Perception. Psychological Science, 22(10), 1254–1258. https://doi.org/10.1177/0956797611417003
An important test of the validity of teammate peer assessment is the degree to which all the members of a team agree on the ratings they have awarded their teammates, termed intra-team agreement.
Peer Assess Pro conducts an advanced statistical test to verify that all the ratings provided by a team’s members are concordant in their agreement with each other. The degree of concordance is tested for statistical significance. If the ratings fail to agree to an acceptable level of significance, the following Active Warning is raised.
CRITICAL 0041 Insignificant team agreement
The team members’ ratings of each other fail to agree to an acceptable level of statistical significance. It is neither fair nor valid to award a different peer-assessed personal result to each student in the team.
The following table shows Peer Assessed Scores calculated from the four ratings provided by each teammate in a team. Self-assessments are excluded. Note how Yode is rated very low by Andy and Mike, but very high by Pat and Pete. Similarly, Pat is rated lowest in the team by Yode, Mike and Pete, but highest by Andy.
In this example, the teammates, as assessors, fail to agree in their ratings of each other. Consequently, the resulting Peer Assessed Scores, from which contribution-based personal results are calculated, are not trustworthy. Technically, they are imprecise measures of the Peer Assessed Score. For example, perhaps the Peer Assessed Score for Pat should be zero (the mode), 1.5 (the median), or 25 (the mean)?
In the discussion that follows, we’ll assert that the fair and valid approach to dealing with this team’s untrustworthy peer assessments is to consider awarding the same result to each teammate, such as the team result.
The measure of a team’s agreement in their ratings of each other is termed concordance, which ranges from zero to 1. In this example the concordance, W, is 0.333. The statistical significance of this concordance, p(W), is 50%, which does not reach the acceptable level of 10% or better we demand in Peer Assess Pro.
In conclusion, we can state ‘The teammates agree slightly at a level that is essentially insignificant’.
These statistics are explained with calculation examples later in this FAQ.
Untrustworthy intra-team agreement in a peer assessment survey | ||||||
ASSESSOR (m) | ||||||
ASSESSEE | Peer Assessed Score | Yode | Andy | Mike | Pete | Pat |
Yode | 52 | 85 | 8 | 3 | 98 | 100 |
Andy | 63 | 68 | 75 | 60 | 63 | 60 |
Mike | 57 | 65 | 53 | 100 | 50 | 58 |
Pete | 45 | 23 | 48 | 58 | 50 | 50 |
Pat | 25 | 3 | 95 | 0 | 0 | 100 |
Peer Assessed Score = MEAN(Peer Assessed Subscores) excluding Self-assessed score | ||||||
GREEN shows highest ranked ASSESSEE by the ASSESSOR RED shows lowest ranked ASSESSEE by the ASSESSOR | ||||||
Concordance, W = 0.33. Significance, 25% < p(W) < 50%. For calculations, see Example B below. |
The extended detail for the Active Warning displays one or more messages for each relevant team such as
Team Alpha's ratings of each other fail to agree with acceptable significance. Concordance (W) = 0.52. Significance p(W) = 20%.
The Insignificant team agreement Active Warning is raised when the statistical significance, p(W) of the concordance value, W, exceeds the threshold level of 10 per cent.
The degree of agreement is measured by Kendall’s Concordance statistic, W. When W=1, there is complete agreement amongst the ratings given by the team members. When W=0, there is complete disagreement; the ratings by the team are essentially random.
However, the statistical significance of the agreement, p(W) is of more practical value than the degree of agreement, W. A numerically smaller significance value, p(W), between 0 and 10 per cent, indicates a highly significant degree of agreement (concordance) amongst the team members. A highly significant degree of agreement (concordance) implies greater confidence in the validity and fairness of personal results determined from the peer assessments in that team.
In contrast, the Active Warning is raised only when the significance value, p(W), lies above 10 per cent through 100 per cent. A significance value above 10 per cent indicates unacceptable or weak degree of agreement (concordance) amongst the team members, that is, Insignificant intra-team agreement. A weakly significant concordance implies low confidence in the validity and fairness of personal results determined from the peer assessments in that team.
All other factors remaining equal, if a team member rates two or more team members with the same peer assessed score, those ratings are termed tied ratings. As the proportion of tied ratings within a team increases, the degree of agreement, W, reduces towards zero. Consequently, the numerical value of significance, p(W), increases and the intra-team agreement becomes increasingly unacceptable.
In the extreme case where all team members rate everyone the same then the concordance is W = 0, and the significance p(W) is 100%. These results signify no agreement because the assessors have made no distinction in relative ranking amongst the teammates they are assessing.
Example C shows the impact of tied ranking in the calculation of the concordance statistics when compared with similar team results without tied rankings in Example A.
When the team's ratings agree with weak significance, that is, 10% < p(W) <= 100%, then it is neither fair nor valid to award a peer-assessed personal result to each student in the team. Accordingly, the teacher is presented with several approaches to resolve the lack of agreement.
The Active Warning presents to the teacher a proforma message to send to all the team members of the selected teams explaining the available options.
The same teacher-determined Team Result will be applied to each team member if the teacher has selected the Personal Results Methods of either
If the teacher subsequently updates the Team Result, then the updated Team Result will be applied to the members of teams for whom ‘Award the same personal result’ has been activated. See Special Cases below.
A Personal Result of 50/100, the mid-range value, is applied to each team member if the teacher has selected the Personal Results Methods of either
In their Personal Feedback Report, the student will view the original Peer Assessed Score and quantitative ratings derived from the peer assessment survey. However, the Personal Result will show as either the Team Result or 50/100 depending on the Personal Result Method selected at the time the students’ personal feedback reports were published or updated.
At any time, the teacher can view in the Full Statistics csv download the original peer assessed scores and personal results calculated by each of the available personal result methods.
Once the ‘Award the same personal result’ status is activated for a team’s personal results, then the personal result awarded to each team member will update in response to subsequent adjustments made by the teacher on the platform.
Specifically, if the teacher, at any time, adjusts
In other words, the personal result presented to a student in their personal feedback report depends on the Personal Result Method selected at the time their personal feedback reports are published or updated.
In general, if the students in a team resubmit their responses, but the recalculated value of p(W) remains greater than the threshold level of 10 per cent, then the ‘Award the same personal result’ status remains in effect, since the team’s ratings of each other continue to fail to agree with acceptable significance.
However, when the ‘Award the same personal result’ status for a team is in effect, that status is deactivated when
When the ‘Award the same personal result’ status is deactivated, then the personal result (individual result) applied to each team member is the personal result calculated according to the team result, peer assessed scores, and personal result method in effect at that instant. As stated in the earlier special case, the personal result updates subsequently in response to the teacher’s adjustments to team result or result method.
When the Exclude from calculations action has been set for a student through a teacher’s response to Active Warning 0048 Inactive Team Member, this action OVERRIDES certain responses to Active Warning 0041 Insignificant Team Agreement.
The concordance value, W, and significance, p(W) is calculated ONLY for the team members who have not been excluded. In effect, the team size is reduced by the number of excluded students.
If the teacher chooses the action Award same personal result to all members of the team then only the active team members will earn the relevant personal result. The excluded student(s) will earn a missing result, as dictated by the ‘Exclude from Calculations’ action.
The quantitative degree of agreement amongst several judges evaluating several objects is termed concordance. In the case of teammate peer assessment, the teammates are both judges (assessors) and the ‘objects’ judged, (assessees).
Concordance is calculated using the non-parametric statistic Kendall’s W (Gibbons & Chakraborti, 2020; Kendall & Babington-Smith, 1939). The statistic is calculated not from the raw peer assessment ratings, but from the relative rankings of the raw ratings each teammate provides. The use of relative rankings rather than raw peer assessment ratings corrects for several assessment issues such as a judge (student) who overall rates their teammates near-Expert and/or over a narrow range.
Conceptually, the notion of concordance is analogous to the statistical notion of correlation. Correlation is a statistical measure that expresses the extent to which two variables are linearly related, meaning they change together at a constant rate (JMP Statistical Discovery, 2022). In contrast, statistical concordance applies to three or more variables, in our case, the three or more members of a team who are peer assessing each other.
For example, consider three instruments for measuring the temperature of water, a mercury thermometer, an alcohol thermometer, and a digital thermometer. We place each of these instruments in several samples of water. Perhaps one sample contains ice and salt, another is boiling, and another sample is taken directly from the kitchen faucet. We want to know ‘To what extent are the measurements observed by three instruments in agreement?’.
If the concordance statistic W is 1, then all the peer assessment survey respondents have been unanimous. Each respondent has assigned the same rank order to the list of their teammates (as derived from their raw peer assessment ratings). If W is 0, then there is no overall trend of agreement among the respondents. The team’s responses may be regarded as essentially random. Intermediate values of W indicate a greater or lesser degree of unanimity among the various responses.
Note that a correlation coefficient statistic can range from -1.0 to +1.0, with zero indicating the complete absence of a linear relationship. An inverse relationship is signalled by -1.0. In contrast, concordance, W, can lie only between +1 (complete agreement) and zero (no agreement).
In Peer Assess Pro, the calculation of the concordance statistic, W, is modified to exclude the self-assessed scores of each teammate (Willerman, 1955; Lewis & Johnson, 1971; Gibbons & Chakraborti, 2003). This modification can be applied only when all members of the team have conducted their peer assessment survey AND there are at least four team members.
When self-assessments are excluded, but all team members provide a peer assessment rating then the set of ratings is known as a Youden Square Design. Technically speaking, a Youden Square Design is a ‘balanced incomplete block design with square data matrices and zero major diagonal’. In plain language, in a team of five members, every member is rated four times, and also provides four ratings of the other team members.
The effect of ties is to reduce the value of W. However, this effect is small unless there are a large number of ties. Peer Assess Pro applies the standard statistical correction to account for ties in the calculation of W (Gibbons & Chakraborti, 2020, ch. 12; Siegel & Castellan, 1988, p. 266, and Zaiontz, 2021). This correction is demonstrated in Example C below.
When there are many tied ranks, this is a symptom of one or more low-quality individual assessments. In the extreme case of low-quality team assessment where everyone rates everyone the same, then the concordance value W will be zero. No agreement. These results signify no agreement because the assessors have made no distinction in relative ranking amongst the teammates they are assessing. If the team is truly and honestly unable to make a distinction about the relative contribution of its teammates (W=0), then by mathematical definition the team result will be awarded by Peer Assess Pro to the teammates.
Several significance tests can be used to test the concordance statistic, W, against the null hypothesis of no agreement. In other words, if the significance test p(W) is GREATER than 10 per cent we accept that there is NO statistically valid agreement amongst the teammates. More correctly, we FAIL TO REJECT the null hypothesis that there is NO AGREEMENT. Conversely, if the significance test p(W) is LESS than 10 per cent we ACCEPT that THERE EXISTS statistically valid agreement amongst the peer assessment ratings provided by the teammates to each other in their team.
Conventionally, the chi-square test is used to test concordance W for significance for large team sizes (Kendall & Gibbons, 1990). For team size n ≤ 10, alternative tests for significance are required, such as the F-test or the results of a Monte Carlo permutation test.
When self-assessments are excluded, Willerman (1955) provided a significance test based on the Beta distribution, restricted to significance values of 1 per cent and 5 per cent. To provide greater range and precision, Peer Assess Pro has implemented significance tests derived from the Monte Carlo permutation test presented in Lewis & Johnson (1971). The Lewis and Johnson test values cover significance values over the range from p(W) = 0.1 per cent (exceptionally significant) through p(W) = 99 per cent (essentially insignificant). This range enables the teacher to see more clearly the extent to which the entire class’s peer assessment ratings and agreement are significant.
The concordance statistic, W, and its significance, p(W), can only be determined practicably for teams of four or more members.
For teams of three members the concordance statistic can be calculated; however, even when W = 1 the result is only weakly significant, p(W) > 10%. In that case, the same personal result should be awarded to all team members.
A table of concordance statistics for the entire class, W and p(W), is presented in the Team Discrimination display available from the Teachers Dashboard in Advanced Statistics. You can sort the table by selecting the column header.
In general, the teacher would accept as valid the contribution-based personal results derived from peer assessment for teams Sparkling Violetear and Black Robins. In contrast, the teacher would generally choose to award the Team Result or a Peer Assessed Score of 50/100 to the teams Red Ruru and Xinjiang Ground-Jay.
Several measures of team discrimination in teammates’ peer assessment ratings | |||||
Team | Range | PA Score (Mean) | Team Result | Concordance W | Significance p(W)
Sparkling Violetear | 71.3 | 47.9 | 75.0 | 0.80 | 1.0% |
Black Robins | 55.9 | 60.6 | 95.0 | 1.00 | 2.5% |
Red Ruru | 34.1 | 77.3 | 30.0 | 0.25 | 98.0% |
Xinjiang Ground-Jay | 61.6 | 57.5 | 80.0 | 0.24 | 98.0% |
Pukekos | 8.8 | 54.5 | 85.0 | ||
Grey Warblers | 89.2 | 57.4 | 75.0 |
Five teammates in team Sparkling Violetear have rated each other on three separate occasions, A, B, C. The results of the detailed examples are summarised in the table, with the interpretation of the level of significance.
Example | Concordance W | Significance p(W) | Interpretation |
A | 0.8 | 1% | The teammates agree very strongly at a level that is exceptionally significant. |
B | 0.333 | 50% | The teammates agree slightly at a level that is essentially insignificant. |
C (with ties) | 0.76 | 1% | The teammates agree strongly at a level that is exceptionally significant. |
Five teammates in team Sparkling Violetear have rated each other with the following peer-assessed subscores calculated from their teammate peer assessment survey.
Ex. A. Peer assessed subscores: Team Sparkling Violetear | ||||||
ASSESSOR
ASSESSEE | Peer Assessed Score | Yode | Andy | Mike | Pete | Pat |
Yode | 74 | 85 | 95 | 3 | 98 | 100
Andy | 63 | 68 | 75 | 60 | 63 | 60
Mike | 56 | 65 | 53 | 100 | 50 | 58
Pete | 44 | 23 | 48 | 58 | 50 | 50
Pat | 3 | 3 | 8 | 0 | 0 | 100
The Peer Assessed Score for each Assessee excludes the Self-Assessed Score. |
Excluding the self-assessment scores yields the following table of rankings, where the top-ranked assessee is awarded a rank of 1.
Ex A. Ranked peer-assessed subscores, excluding self-assessed score | |||||
ASSESSOR | |||||
ASSESSEE | Yode | Andy | Mike | Pete | Pat |
Yode | – | 1 | 3 | 1 | 1
Andy | 1 | – | 1 | 2 | 2
Mike | 2 | 2 | – | 3 | 3
Pete | 3 | 3 | 2 | – | 4
Pat | 4 | 4 | 4 | 4 | –
Kendall’s Concordance W is calculated from the formula adapted by Willerman (1955) and Lewis & Johnson (1971):

$$W = \frac{12S}{n(n^2-1)(n-2)^2}$$

Where

$S = \sum_{i=1}^{n} (R_i - \bar{R})^2$, the sum of squared deviations of the total ranks
$n$ = number of teammates in the team
$r_{ij}$ = the rank given to Assessee i by Assessor j
$R_i = \sum_{j \neq i} r_{ij}$, the total rank awarded to Assessee i by the other Assessors j, excluding the self-assessment
$\bar{R} = n(n-1)/2$, the mean value of the total ranks
Ex A. Calculation of Concordance, W, from ranks
ASSESSEE | Total Ranks, $R_i$ | $(R_i - \bar{R})^2$ | Yode, j=1 | Andy, j=2 | Mike, j=3 | Pete, j=4 | Pat, j=5
Yode, i=1 | 6 | 16 | – | 1 | 3 | 1 | 1
Andy, i=2 | 6 | 16 | 1 | – | 1 | 2 | 2
Mike, i=3 | 10 | 0 | 2 | 2 | – | 3 | 3
Pete, i=4 | 12 | 4 | 3 | 3 | 2 | – | 4
Pat, i=5 | 16 | 36 | 4 | 4 | 4 | 4 | –
Totals | 50 | S = 72 | | | | |

$\bar{R}$ = 50/5 = 10
From the previous table of ranked peer-assessed subscores:

$n$ = number of teammates in the team = 5

The calculation of $(R_i - \bar{R})^2$ for Yode, where i = 1:

$R_1 = 1 + 3 + 1 + 1 = 6$, so $(R_1 - \bar{R})^2 = (6 - 10)^2 = 16$

Now, Kendall’s Concordance is

$$W = \frac{12S}{n(n^2-1)(n-2)^2} = \frac{12 \times 72}{5 \times 24 \times 3^2} = \frac{864}{1080} = 0.80$$
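For teachers who wish to check the arithmetic, the following Python sketch reproduces the Example A calculation. It is an illustration of the formula above, not Peer Assess Pro’s implementation; the function and variable names are ours.

```python
# Concordance W for a Youden Square (self-assessments excluded), reproducing Example A.
def concordance_w(rank_matrix):
    """rank_matrix[i][j] = rank given to Assessee i by Assessor j; None on the diagonal."""
    n = len(rank_matrix)                                                    # number of teammates
    totals = [sum(r for r in row if r is not None) for row in rank_matrix]  # R_i
    mean_total = n * (n - 1) / 2                                            # R-bar
    s = sum((r - mean_total) ** 2 for r in totals)                          # S = 72 for Example A
    return 12 * s / (n * (n ** 2 - 1) * (n - 2) ** 2)

ranks_a = [
    [None, 1, 3, 1, 1],  # Yode
    [1, None, 1, 2, 2],  # Andy
    [2, 2, None, 3, 3],  # Mike
    [3, 3, 2, None, 4],  # Pete
    [4, 4, 4, 4, None],  # Pat
]
print(round(concordance_w(ranks_a), 2))  # 0.8
```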
Significance is determined from the table Values of S Associated with a Given Probability and a Given Number of Judges (Lewis & Johnson, 1971), adapted to enable lookup by the value of W, since $S = W\,n(n^2-1)(n-2)^2/12$.
For n = 5, the critical values are W = 0.756 and W = 0.822 at the 1 per cent and 0.5 per cent levels. Since W = 0.80 is greater than the critical value of 0.756 but less than 0.822, we conservatively adopt the significance level p(W) = 1 per cent.
Given that the concordance statistic is significant at the 1 per cent level, we REJECT the null hypothesis that there is NO AGREEMENT amongst the raters. From a statistical perspective, we accept that the team has submitted a fair and valid teammate peer assessment. We could say ‘The teammates agree very strongly at a level that is exceptionally significant.’ No Active Warning is generated for this Example A.
Note how it is the rating by Assessor Mike of Teammate Yode that reduces the concordance from a perfect 1.0 down to 0.80. This outlier rating is identified in Peer Assess Pro through the Active Warning 0042 Outlier individual rating. However, in this case, the outlier rating does not have a material impact on the Peer Assessment Scores and, therefore, the contribution-based Personal Results that would be awarded.
On another occasion, the five teammates in Team Sparkling Violetear rated each other with the following peer-assessed subscores calculated from their teammate peer assessment survey. The ratings are similar to those in Example A, except that two of Andy’s ratings, of Yode and of Pat, are swapped, signalled by *.
Ex. B. Peer assessed subscores: Team Sparkling Violetear | ||||||
ASSESSOR
ASSESSEE | Peer Assessed Score | Yode | Andy | Mike | Pete | Pat
Yode | 52 | 85 | 8 * | 3 | 98 | 100 |
Andy | 63 | 68 | 75 | 60 | 63 | 60 |
Mike | 57 | 65 | 53 | 100 | 50 | 58 |
Pete | 45 | 23 | 48 | 58 | 50 | 50 |
Pat | 25 | 3 | 95 * | 0 | 0 | 100 |
Ex B. Calculation of Concordance, W, from ranks
ASSESSEE | Total Ranks, $R_i$ | $(R_i - \bar{R})^2$ | Yode, j=1 | Andy, j=2 | Mike, j=3 | Pete, j=4 | Pat, j=5
Yode, i=1 | 9 | 1 | – | 4 | 3 | 1 | 1
Andy, i=2 | 6 | 16 | 1 | – | 1 | 2 | 2
Mike, i=3 | 10 | 0 | 2 | 2 | – | 3 | 3
Pete, i=4 | 12 | 4 | 3 | 3 | 2 | – | 4
Pat, i=5 | 13 | 9 | 4 | 1 | 4 | 4 | –
Totals | 50 | S = 30 | | | | |

$\bar{R}$ = 50/5 = 10
Now, Kendall’s Concordance is

$$W = \frac{12S}{n(n^2-1)(n-2)^2} = \frac{12 \times 30}{5 \times 24 \times 3^2} = \frac{360}{1080} = 0.333$$
For Example B, the concordance, W, is calculated as 0.333, which yields a significance value between 50% and 25% (Lewis and Johnson, 1971). Conservatively, we adopt a significance value of p(W) = 50 per cent. Compared with Example A, (Concordance, W = 0.80) the concordance value here, W = 0.33, is lower and essentially insignificant. Consequently, we can state ‘The teammates agree slightly at a level that is essentially insignificant’.
For Example B, Peer Assess Pro will present the Active Warning
Team Sparkling Violetear’s ratings of each other fail to agree with acceptable significance. Significance p(W) = 50%. Concordance (W) = 0.33.
Given the poor significance, worse than the threshold of 10 per cent, we accept the hypothesis that there is no agreement amongst the raters. From a statistical perspective, the team has submitted a potentially unfair and invalid teammate peer assessment. In this example, the teacher should either award all team members the Team Result or consult with the team before accepting the contribution-based personal results.
In this case, we note that it is Assessors Andy and Mike who are the outlier assessors compared with the other three teammates, particularly in their assessment of Yode. But can we be confident about that suspicion? Perhaps we can only verify by consulting with the team.
This example revisits team Sparkling Violetear where several teammates have been less discerning in their ratings compared with Example A. The adjustments are signalled with * in the following table.
Andy retains his low rating of Yode from Example B (8), which now ties with his rating of Pat. Mike has rated two teammates the same, Pete has rated two teammates the same, and Pat has given every teammate the same low rating while reserving the top rating for himself!
Ex. C. Peer assessed subscores: Team Sparkling Violetear | ||||||
ASSESSOR
ASSESSEE | Peer Assessed Score | Yode | Andy | Mike | Pete | Pat |
Yode | 23 | 85 | 8 * | 3 | 50 * | 30 *
Andy | 55 | 68 | 75 | 60 * | 63 | 30 * |
Mike | 50 | 65 | 53 | 100 | 50 * | 30 * |
Pete | 40 | 23 | 48 | 60 * | 50 | 30 * |
Pat | 3 | 3 | 8 | 0 | 0 | 100 |
The Peer Assessed Score for each Assessee excludes the Self-Assessed Score. |
The ranking for these peer-assessed subscores is shown here. Note that tied ranks are calculated using the RANK.AVERAGE function. For example, the ranks awarded by Pete are {(2 + 3)/2, 1, (2 + 3)/2, 4} = {2.5, 1, 2.5, 4}. Andy’s two identical ratings of Yode and Pat each earn rank (3 + 4)/2 = 3.5. Similarly, the four identical ranks awarded by Pat are each (1 + 2 + 3 + 4)/4 = 10/4 = 2.5.
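The average-rank treatment of ties can be reproduced with standard statistical libraries. The sketch below, which assumes SciPy is available, mirrors the RANK.AVERAGE behaviour for Pete’s ratings; the negation converts SciPy’s ascending ranks into ‘rank 1 = highest rating’.

```python
# Average ranks for tied ratings (rank 1 = highest rating), mirroring RANK.AVERAGE.
from scipy.stats import rankdata

# Pete's ratings of his four teammates in Example C:
pete = {'Yode': 50, 'Andy': 63, 'Mike': 50, 'Pat': 0}
ranks = rankdata([-score for score in pete.values()], method='average').tolist()
print(dict(zip(pete, ranks)))  # {'Yode': 2.5, 'Andy': 1.0, 'Mike': 2.5, 'Pat': 4.0}
```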
Ex C. Calculation of Concordance, W, from ranks
ASSESSEE | Total Ranks, $R_i$ | Total Ranks Squared, $R_i^2$ | Yode, j=1 | Andy, j=2 | Mike, j=3 | Pete, j=4 | Pat, j=5
Yode, i=1 | 11.5 | 132.25 | – | 3.5 | 3 | 2.5 | 2.5
Andy, i=2 | 6 | 36 | 1 | – | 1.5 | 1 | 2.5
Mike, i=3 | 8 | 64 | 2 | 1 | – | 2.5 | 2.5
Pete, i=4 | 9 | 81 | 3 | 2 | 1.5 | – | 2.5
Pat, i=5 | 15.5 | 240.25 | 4 | 3.5 | 4 | 4 | –
Totals | 50 | 553.5 | | | | |
Now, Kendall’s Concordance, correcting for ties in a Youden Square, is

$$W = \frac{12\sum_{i=1}^{n} R_i^2 - 3n^3(n-1)^2}{n(n^2-1)(n-2)^2 - 3T}$$

Where

$R_i^2$ = the square of the total ranks awarded to Assessee i by the other Assessors j, excluding the self-assessment
$T$ = the correction factor accounting for tied ranks, in this case, $T = \sum (t^3 - t) = 78$, summed over every group of $t$ tied ranks awarded by each assessor

For this example,

$$W = \frac{12 \times 553.5 - 3 \times 5^3 \times 4^2}{1080 - 3 \times 78} = \frac{6642 - 6000}{846} = \frac{642}{846} = 0.76$$
For this revised example for team Sparkling Violetear, the concordance, W, is calculated as 0.76, which is significant at the 1 per cent level. Consequently, we can state ‘The teammates agree strongly at a level that is exceptionally significant’.
Note that compared with Example A, Andy is now ranked best in the team, indicated by his lowest sum of ranks, 6. Yode is now ranked one of the poor performers in this team with sum of ranks 11.5. Despite Pat’s attempts to manipulate the peer assessment, he still remains ranked worst in the team indicated by the highest sum of ranks 15.5, since his self-assessment is excluded from the calculation of the Peer Assessed Score and Personal Result.
The W value for Example C is corrected for the tied rankings awarded by Andy, Mike, Pete and Pat with the correction factor T = 78. The calculation of the correction factor, T, is beyond the scope of this explanation, but see Gibbons & Chakraborti (2020, ch. 12), Siegel & Castellan (1988, p. 266) and a worked example in Zaiontz (2021).
In general, the effect of ties is to reduce the value of W. In this case, the uncorrected value of W = 0.59 has a lesser significance, between 5 and 10 per cent. These values compare with the tie-corrected values of W = 0.76 and a significance of 1 per cent.
Note that the calculation of the numerator in Example C is numerically equivalent to the formula used in Examples A and B. However, the method of Example C is computationally more efficient in not requiring the calculation of the mean total rank, $\bar{R}$. In other words, in the specific case of an n-sized Youden Square,

$$\sum_{i=1}^{n} R_i = \frac{n^2(n-1)}{2} \quad \text{and} \quad \bar{R} = \frac{n(n-1)}{2}$$

Therefore,

$$12S = 12\sum_{i=1}^{n}\left(R_i - \bar{R}\right)^2 = 12\sum_{i=1}^{n} R_i^2 - 3n^3(n-1)^2$$

Where

$R_i^2$ = the square of the total ranks awarded to Assessee i by the other Assessors j, excluding the self-assessment
$n$ = number of teammates in the team
$r_{ij}$ = the rank given to Assessee i by Assessor j
$T$ = the correction factor accounting for tied ranks as per Gibbons & Chakraborti (2020, ch. 12), Siegel & Castellan (1988, p. 266), and Zaiontz (2021)
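The sketch below reproduces the tie-corrected calculation for Example C. It assumes the tie correction enters the denominator as 3T, the form that reproduces the published value W = 0.76; consult Gibbons & Chakraborti (2020) for the general derivation.

```python
# Tie-corrected concordance for a Youden Square, reproducing Example C.
def concordance_w_ties(rank_totals, n, T):
    """rank_totals = R_i values; T = tie correction (78 for Example C)."""
    sum_r_sq = sum(r ** 2 for r in rank_totals)            # sum of R_i squared = 553.5
    numerator = 12 * sum_r_sq - 3 * n ** 3 * (n - 1) ** 2  # 6642 - 6000 = 642
    denominator = n * (n ** 2 - 1) * (n - 2) ** 2 - 3 * T  # 1080 - 234 = 846
    return numerator / denominator

print(round(concordance_w_ties([11.5, 6, 8, 9, 15.5], n=5, T=78), 2))  # 0.76
```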
Quick links and related information
FAQ: How is an outlier peer assessment rating identified? WARNING 0042
Training: Practice developing and applying a peer assessment survey rubric
Guidelines for conducting a team-based courageous conversation
Durbin, J. (1951). Incomplete Blocks in Ranking Experiments. British Journal of Statistical Psychology, 4(2), 85–90. https://doi.org/10.1111/j.2044-8317.1951.tb00310.x
Gibbons, J. D., & Chakraborti, S. (2020). Nonparametric Statistical Inference (Apple Books; 6th ed.). Chapman and Hall/CRC Press.
Kendall, Maurice. G., & Babington-Smith, B. (1939). The Problem of m Rankings. The Annals of Mathematical Statistics, 10(3), 275–287. https://doi.org/10.1214/aoms/1177732186
Lewis, G. H., & Johnson, R. G. (1971). Kendall’s Coefficient of Concordance for Sociometric Rankings with Self Excluded. Sociometry, 34(4), 496–503. https://doi.org/10.2307/2786195
Mellalieu, P. J. (2021, April 28). Is peer assessment valid for determining individual grades in group work? Better Feedback. Better Teams. https://www.peerassesspro.com/is-peer-assessment-valid-for-determining-individual-grades-in-group-work/
Siegel, S., & Castellan, N. J., Jr. (1988). Nonparametric Statistics for the Behavioral Sciences (2nd ed.). McGraw-Hill.
Willerman, B. (1955). The adaptation and use of Kendall’s Coefficient of Concordance (W) to sociometric-type rankings. Psychological Bulletin, 52(2), 132–133. https://doi.org/10.1037/h0041665
Zaiontz, C. (2021). Kendall’s Concordance (W) Coefficient. Real Statistics Using Excel. https://www.real-statistics.com/reliability/interrater-reliability/kendalls-w/
If one member of a team submits a peer assessment for an assessee ‘materially different’ from the assessments given by the other team members, this difference gives rise to an Active Warning in Peer Assess Pro titled
Critical Warning 0042 Outlier individual rating
A team member has assessed another team member very differently than the other team members.
The extended detail for the Active Warning displays one or more messages such as:
Harris assessed Michael awarding a PA Subscore of 38. Compared with the average rating by the other team members of 70 this subscore DEPRESSED the PA Score to 64 by 7 PA Score Units. Team Alpha
Josef assessed Alvin awarding a PA Subscore of 100. Compared with the average rating by the other team members of 66 this subscore RAISED the PA Score to 73 by 7 PA Score Units. Team Alpha
An Outlier individual rating warning will be raised ONLY if the assessor’s rating raises or lowers the assessee’s Peer Assessed Score by more than 5 PA Score Units relative to the average rating given by the other members of the team. Note. The threshold will be adjusted to 10 PA Score Units by August 2022.
The warning will be generated only for members of a valid assessed team, as detailed in
FAQ: What is a valid assessed team?
When the team members in a particular team all fail to agree on their ratings, most of the team will receive this Active Warning. This is a symptom of poor training in peer assessment, or poor application of that training. This situation is identified through the Active Warning CRITICAL 0041 Insignificant team agreement: A team's ratings of each other fail to agree.
See FAQ: How is insignificant team agreement identified? WARNING 0041
Consider team Alpha containing 5 members where Adam has been assessed with the following Peer Assessed Subscores by the other four team members.
Impact of removing one Assessor from the calculation of Peer Assessed Score for Adam
Assessee | Assessor | PA Subscore | Team Size | PA Score | PA Score Exclusive | Assessor Impact | Impact Direction |
Adam | Edward | 53 | 5 | 73 | 80 | -7 | DEPRESSED |
Adam | Mary | 63 | 5 | 73 | 77 | -4 | |
Adam | Stephanie | 78 | 5 | 73 | 72 | 1 | |
Adam | Josef | 100 | 5 | 73 | 64 | 9 | RAISED |
Adam has a Peer Assessed Score of 73.5 (displayed rounded as 73), calculated from his four team members’ subscores as follows:
= (Edward + Mary + Stephanie + Josef) / (Team size - 1)
= (53 + 63 + 78 + 100) / (5 - 1)
= 294 / 4
= 73.5
To determine the impact of Edward’s assessment of Adam, we can calculate the Peer Assessed Score Adam would receive from just the other three members as follows:
PA Score Exclusive = (Mary + Stephanie + Josef) / (Team size - 2)
= (63 + 78 + 100) / 3
= 241 / 3
= 80.3
Therefore, the Assessor’s impact is the difference between the whole-of-team’s originally-calculated PA Score and the PA Score Exclusive
Impact = PA Score - PA Score Exclusive
= 73.5 - 80.3
= - 6.8
We observe that Edward’s relatively low assessment of Adam DEPRESSED Adam’s overall Peer Assessed Score by about 7 PA Score Units.
Peer Assess Pro presents the following detail:
Edward assessed Adam, awarding a PA Subscore of 53. Compared with the average rating by the other team members of 80 this subscore DEPRESSED the PA Score to 73 by 7 PA Score Units. Team Alpha
In contrast, we see that Josef’s rating of 100 RAISED Adam’s Peer Assessed Score. The following detailed outlier warning is presented:
Josef assessed Adam, awarding a PA Subscore of 100. Compared with the average rating by the other team members of 64 this subscore RAISED the PA Score to 73 by 9 PA Score Units. Team Alpha
Note that Adam’s self-assessed score is never used to determine a Peer Assessed Score. Therefore, there is no requirement to test for the impact of dropping his self-assessment from the calculation of the Peer Assessed Score.
The threshold for raising this Warning in Peer Assess Pro is +/- 5 PA Score units, the ThresholdOutlier constant. That is, if one assessor’s rating would affect the PA Score awarded to an assessee by more than 5 units, then the Outlier Warning will be raised.
In the previous example, the impact on Adam by assessors Mary and Stephanie is within the ThresholdOutlier constant of 5 PA Score units, so no outlier warning message is generated for these two assessors.
Note. The threshold will be adjusted to 10 PA Score Units by August 2022.
A more elegant method for calculating the Assessor Impact follows. First, calculate the Peer Assessed Score for Assessed student s, excluding the PA Subscore awarded by Assessor a. Self-assessments are excluded.

$$P_{s \setminus a} = \frac{(t-1)\,P_s - x_{as}}{t-2}$$

Where

$t$ = the number of team members in the team in which s is a team member
$P_s$ = the Peer Assessed Score for assessed student s
$x_{as}$ = the peer assessment subscore awarded by Assessor a to Assessee s

The Assessor Impact, $I_{as}$, of removing Assessor a’s assessment of Assessee s is

$$I_{as} = P_s - P_{s \setminus a}$$

No calculation is made for the case a = s, since self-assessments are excluded from the process.
Consider Edward’s assessment of Adam using the data from the table above:

$$P_{s \setminus a} = \frac{(5-1) \times 73.5 - 53}{5-2} = \frac{241}{3} = 80.3, \qquad I_{as} = 73.5 - 80.3 = -6.8$$
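The same impact calculation can be expressed in a few lines of Python. This is an illustrative sketch of the formula above, not Peer Assess Pro’s code; the function name is ours.

```python
# Assessor Impact: the effect of removing Assessor a's subscore on Assessee s's PA Score.
def assessor_impact(pa_score, subscore, t):
    """pa_score = whole-of-team PA Score for s; subscore = x_as; t = team size."""
    pa_exclusive = ((t - 1) * pa_score - subscore) / (t - 2)  # PA Score without a's rating
    return pa_score - pa_exclusive

# Edward's assessment of Adam (team of five, PA Score 73.5, subscore 53):
print(round(assessor_impact(73.5, 53, t=5), 1))  # -6.8 -> DEPRESSED
```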
Quick links and related information
FAQ: How is the Peer Assessed (PA) Score calculated?
FAQ: What is a valid assessed team?
FAQ: How is insignificant team agreement identified? WARNING 0041
If a team member submits a self-assessment that is ‘materially different’ from the assessments given by their other team members, this difference gives rise to an Active Warning in Peer Assess Pro titled
Critical Warning 0040 Mismatched self-assessment
A team member's self assessment is materially different to the peer assessment given by their team
The extended detail for the Active Warning displays one or more messages such as:
Gregor’s self-assessment of 63 is UNDERCONFIDENT compared with the peer assessment of 93 given by others in team Charlie. IRSA = 148
Daphne’s self-assessment of 68 is OVERCONFIDENT compared with the peer assessment of 34 given by others in team Alpha. IRSA = 51
The warning will be generated only for members of a valid assessed team, as detailed in
FAQ: What is a valid assessed team?
Furthermore, the warning will only be generated when a student has completed their self-assessment as part of their peer assessment submission.
The Peer Assess Pro system constant ThresholdIrsaUnderconfident is defined as 115. Values greater than or equal to ThresholdIrsaUnderconfident will raise the UNDERCONFIDENT active warning. In general, about 7% to 16% of students will be flagged with this warning.
The Peer Assess Pro system constant ThresholdIrsaOverconfident is defined as 75. Values less than or equal to ThresholdIrsaOverconfident will raise the OVERCONFIDENT active warning. In general, about 7% to 16% of students will be flagged with this warning.
The Mismatched self-assessment warning is raised from the value of the Index of Realistic Self Assessment (IRSA) that is calculated for each student.
See FAQ
FAQ: How is the Index of Realistic Self Assessment (IRSA) calculated?
The warning of UNDERCONFIDENT is raised when IRSA for a student is greater than 115, the ThresholdIrsaUnderconfident.
The warning of OVERCONFIDENT is raised when IRSA for a student is less than 75, the ThresholdIrsaOverconfident.
Sample of several Peer Assessed Scores and self-assessments
Name | PA Score | PA Self | IRSA | Confidence |
Abel | 96.7 | 100 | 96.7 | |
Baker | 100.0 | 82.5 | 121.2 | UNDERCONFIDENT |
Charlie | 82.1 | 70 | 117.3 | UNDERCONFIDENT |
Daphne | 34.2 | 67.5 | 50.6 | OVERCONFIDENT |
Edward | 95.8 | 87.5 | 109.5 |
Consider the case of Daphne: IRSA = 100 × PA Score ÷ PA Self = 100 × 34.2 ÷ 67.5 ≈ 50.6
Since 50.6 is less than 75, then Daphne’s self-assessment is regarded as OVERCONFIDENT. Consequently, the Mismatched self-assessment warning is raised.
The extended detail message is:
Daphne’s self-assessment of 68 is OVERCONFIDENT compared with the peer assessment of 34 given by others in team Alpha. IRSA = 51
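A minimal sketch of the classification logic follows, assuming IRSA = 100 × PA Score ÷ self-assessment, the formula that reproduces every row of the table above. Names and rounding are ours, not Peer Assess Pro’s.

```python
# IRSA and the Mismatched self-assessment warning (Active Warning 0040).
THRESHOLD_IRSA_UNDERCONFIDENT = 115
THRESHOLD_IRSA_OVERCONFIDENT = 75

def irsa_confidence(pa_score, pa_self):
    """Return (IRSA, warning label or None), assuming IRSA = 100 * PA Score / PA Self."""
    irsa = round(100 * pa_score / pa_self, 1)
    if irsa >= THRESHOLD_IRSA_UNDERCONFIDENT:
        return irsa, 'UNDERCONFIDENT'
    if irsa <= THRESHOLD_IRSA_OVERCONFIDENT:
        return irsa, 'OVERCONFIDENT'
    return irsa, None

print(irsa_confidence(34.2, 67.5))   # (50.7, 'OVERCONFIDENT')  -- Daphne
print(irsa_confidence(100.0, 82.5))  # (121.2, 'UNDERCONFIDENT') -- Baker
```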
For UNDERCONFIDENT and OVERCONFIDENT students, Peer Assess Pro generates an email that the teacher can send optionally. The email recommends the student arrange an appointment to meet with the teacher to explore the reasons for the variation in self and others’ peer assessment.
Quick links and related information
FAQ: How is the Index of Realistic Self Assessment (IRSA) calculated?
FAQ: How do I interpret measures of realistic self-assessment?
FAQ: What is an ‘at risk’ team member? WARNING 0036
FAQ: How is the Peer Assessed (PA) Score calculated?
FAQ: What is a valid assessed team?
Context: LMS version only
Through the Peer Assess Pro survey, one or more respondents may indicate that their team membership is incorrect. Perhaps the respondent has been assigned to an incorrect team, or the respondent observes that team members should be added to, or deleted from, their team. Peer Assess Pro streamlines the process of handling these adjustments and ensures that the LMS team arrangement matches the team arrangement in the active running peer assessment activity.
The survey response gives rise to an Active Warning in Peer Assess Pro titled
Critical Warning 0006 Adjusted teamset request
A participant has advised that the membership of a team requires urgent adjustment.
The extended detail for the Active Warning displays one or more messages such as:
Participant Jayden Williams in Team Khan advises that a team requires adjustment to its membership. Jayden Williams should be reassigned to Team Bravo.
Participant Jane Seymour in Team Bravo advises that a team requires adjustment to its membership. Micheal Brown should be added to Team Charlie.
There are two circumstances where the teacher can only respond to the Adjusted teamset request Active Warning through using the LMS team arrangement facilities.
In these special cases, the reassignment must be undertaken in the LMS, either by creating a new team or by moving the student to an existing team.
In both cases, the new or existing team must then be incorporated into the team arrangement (grouping, group set, teamset) that was used to launch the peer assessment.
Upon returning to the Teachers Dashboard in Peer Assess Pro, the updated team arrangement will be detected and will raise the Active Warning 0021 Team arrangement unsynchronised
Quick links and related information
FAQ: How do I correct the team composition in a running peer assessment activity?
FAQ - How do I resolve an unsynchronised team arrangement? ACTIVE WARNING 0021
FAQ - What is an inactive team member? WARNING 0048
Context: LMS version only
Through the Peer Assess Pro survey, one or more respondents may indicate that one or more of their team members has been inactive: essentially absent from team activities, or contributing nothing of substance to the team’s work.
The survey response gives rise to an Active Warning in Peer Assess Pro titled
Critical Warning 0048 Inactive team member
A team member has been reported as essentially absent or unproductive.
The extended detail for the Active Warning displays one or more messages such as:
Lazy Lizzie in Team Zealous has been reported as essentially inactive by 2 team member(s) Zane Gray, Zoe Green.
Absent Abe in Team High Achievers has been reported as essentially inactive by 1 team member(s) Dean Smith.
At the top of the list are presented those team members with the highest number of inactive reports: those most deserving of the teacher’s attention.
Peer Assess Pro presents the teacher with several available actions, such as excluding the inactive student from calculations or reassigning the student to another team.
Any reassignment must be undertaken by creating the new team using the LMS team arrangement facilities.
When the Exclude from calculations action has been set for a student through a teacher’s response to Active Warning 0048 Inactive Team Member this action OVERRIDES certain responses to Active Warning 0041 Insignificant Team Agreement. If the teacher chooses the action Assign same personal result to all members of the team then only the active team members will earn the relevant personal result. The inactive student will earn a missing result, as dictated by the ‘Exclude from Calculations’ action.
The following example shows how excluding the peer assessment results of team member Bridget leads towards revised Peer Assessed Scores and Personal Results for the remaining, active team members.
Consider the following Peer Assessed SubScores awarded by each team member (Assessor) to their teammates (Assessees).
Calculation of Peer Assessed Scores (PAS) for a team of four members
ASSESSEE | ||||
ASSESSOR | Bridget | Julian | Lydia | Nigella |
Bridget | - | 62.5 | 75 | 72.5 |
Julian | 87.5 | - | 87.5 | 82.5 |
Lydia | 75 | 82.5 | - | 80 |
Nigella | 0 | 77.5 | 82.5 | - |
Peer Assessed Score, † | 54 | 74 | 82 | 78 |
† Peer Assessed Score = mean of scores awarded by each assessor, excluding self-assessed score. For Lydia, PAS = (75 + 87.5 + 82.5) / 3 = 82 |
From the Peer Assessed Score for each Assessee we can now calculate a variety of Personal Results from which the teacher may select.
Calculation of Personal Results in a team of four members - original
Spreadfactor = 2.0
ASSESSEE | |||||
ASSESSOR | Bridget | Julian | Lydia | Nigella | Mean |
Peer Assessed Score, † | 54 | 74 | 82 | 78 | |
Peer Assessed Index, PA Index | 66 | 90 | 100 | 95 | |
Team Result, TR | 50 | 50 | 50 | 50 | |
Indexed Personal Result, IPR | 33 | 45 | 50 | 48 | 44 |
Normalised Personal Result, NPR (Spreadfactor = 2) | 28 | 52 | 62 | 58 | 50 ‡ |
‡ By definition for the NPR method, the mean of a team’s NPR values equals the team result. The method of calculation is specified in FAQ: How is the Normalised Personal Result (NPR) calculated? |
Suppose that Assessor Nigella advised the teacher that Assessee Bridget was essentially inactive. The teacher reviews Bridget’s Personal Feedback Report, consults with Bridget, and determines that the ratings of Bridget by Julian and Lydia were somewhat generous! It’s an end-of-course summative peer assessment. Consequently, the teacher chooses to Exclude from calculations the peer assessments relating to Bridget.
The revised calculation of Peer Assessed Scores for the team yields the following table based on the remaining, active team members. The subscores awarded by the three active assessors remain unchanged, but the Peer Assessed Scores will usually be adjusted, possibly up or down for each student due to the exclusion of the inactive student’s ratings.
Re-calculation of Peer Assessed Scores (PAS) with excluded team member
ASSESSEE | ||||
ASSESSOR | Bridget | Julian | Lydia | Nigella |
Bridget | - | - | - | - |
Julian | - | - | 87.5 * | 82.5 * |
Lydia | - | 82.5 * | - | 80 * |
Nigella | - | 77.5 * | 82.5 * | - |
Peer Assessed Score | - | 80 | 85 | 81.25 |
* Indicates no change from original table. For Lydia, PAS = (87.5 + 82.5) / 2 = 85 |
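The recalculation can be sketched in Python as follows. The nested dictionary layout and function name are ours; the figures reproduce both tables above.

```python
# Recalculating Peer Assessed Scores when an inactive member is excluded.
# subscores[assessor][assessee]; self-ratings are never included.
subscores = {
    'Bridget': {'Julian': 62.5, 'Lydia': 75.0, 'Nigella': 72.5},
    'Julian':  {'Bridget': 87.5, 'Lydia': 87.5, 'Nigella': 82.5},
    'Lydia':   {'Bridget': 75.0, 'Julian': 82.5, 'Nigella': 80.0},
    'Nigella': {'Bridget': 0.0,  'Julian': 77.5, 'Lydia': 82.5},
}

def pa_scores(subscores, excluded=()):
    """Mean subscore received from the active assessors, self excluded."""
    members = [m for m in subscores if m not in excluded]
    return {
        s: round(sum(subscores[a][s] for a in members if a != s) / (len(members) - 1), 1)
        for s in members
    }

print(pa_scores(subscores))
# {'Bridget': 54.2, 'Julian': 74.2, 'Lydia': 81.7, 'Nigella': 78.3}
print(pa_scores(subscores, excluded={'Bridget'}))
# {'Julian': 80.0, 'Lydia': 85.0, 'Nigella': 81.2}
```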
The revised Peer Assessed Scores yield these consequential adjustments to the personal results of the team.
Calculation of Personal Results with excluded team member
Spreadfactor = 2.0
ASSESSEE | |||||
ASSESSOR | Bridget | Julian | Lydia | Nigella | Mean |
Peer Assessed Score, PA Score | - | 80 | 85 | 81 | |
Peer Assessed Index, PA Index | - | 94 | 100 * | 95 | |
Team Result, TR | 50 * | 50 * | 50 * | 50 * | |
Indexed Personal Result, IPR | - | 47 | 50 * | 48 | 48 |
Normalised Personal Result, NPR (Spreadfactor = 2) | - | 48 | 53 | 49 | 50 ‡ * |
* Indicates no change from original table ‡ By definition, the mean of a team’s NPR values equals the team result. |
Comparing the two tables of personal results, observe that Bridget now receives no personal result, and the personal results of the remaining members shift modestly: for example, Julian’s Indexed Personal Result rises from 45 to 47, while his Normalised Personal Result falls from 52 to 48.
Quick links and related information
FAQ: How is the Peer Assessed (PA) Score calculated?
FAQ: How is the Normalised Personal Result (NPR) calculated?
FAQ: How are peer assessment and personal results calculated and defined mathematically?
FAQ: What is an ‘at risk’ team member? WARNING 0036
FAQ: What is an adjusted teamset request? WARNING 0006
Suppose a team collectively submits a set of peer assessments that are both HIGH on average and LOW in range (spread).
These features are an indication that the team may have engaged unconstructively with the peer assessment process. When these conditions are both fulfilled, an Active Warning in Peer Assess Pro is generated:
Critical Warning 0050 Low-quality team rating
A team may have engaged unconstructively with peer assessment
The extended detail for the Active Warning displays one or more messages such as:
Team Alpha may have engaged unconstructively with peer assessment. High Peer Assessment Scores awarded. Average 100 and low range 0.
Team Bravo may have engaged unconstructively with peer assessment. High Peer Assessment Scores awarded. Average 96 and low range 8.
Team Charlie may have engaged unconstructively with peer assessment. High Peer Assessment Scores awarded. Average 96 and low range 8.
The warning will be generated only for members of a valid assessed team, as detailed in
FAQ: What is a valid assessed team?
The Peer Assess Pro system constant ThresholdTeamAverage is defined as 90.
The Peer Assess Pro system constant ThresholdTeamRange is defined as 11.
The Active Warning Low-quality team rating 0050 is generated for a team when both conditions are true:
Team Average Peer Assessed Score ≥ ThresholdTeamAverage (90)
Team Range of Peer Assessed Scores ≤ ThresholdTeamRange (11)
Suppose Team Mike contains 6 members, whose Peer Assessed Scores are shown below. The Average Peer Assessed Score and Range of Peer Assessed Scores are calculated.
Peer Assessed Scores for members of Team Mike
Name | Peer Assessed Score |
Annie | 93.75 |
Emma | 92.55 |
Joe | 90.85 |
Freddie | 92.50 |
Tammy | 95.88 |
Tilly | 88.32 |
Team Average | 92 ≈ 553.85 / 6 |
Team Range | 8 ≈ 95.88 - 88.32 |
The Team Average Peer Assessed Score and Team Range are examined for every team. A low quality team rating is identified for those teams that breach the Threshold parameters defined above.
Identification of low quality team ratings
Team | Team Average Peer Assessed Score | Team Range | Low Quality Team Rating |
Alpha | 100 | 0 | YES |
Mike | 92 | 8 | YES |
November | 87 | 8 | NO |
Oscar | 95 | 9 | YES |
Papa | 85 | 9 | NO |
Quebec | 95 | 10 | YES |
Romeo | 92 | 12 | NO |
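A sketch of the identification test follows. The direction of each comparison (at or above the average threshold, at or below the range threshold) is inferred from the identification table above; this is an illustration, not Peer Assess Pro’s code.

```python
# Low-quality team rating test (Active Warning 0050, current version: BOTH conditions).
THRESHOLD_TEAM_AVERAGE = 90
THRESHOLD_TEAM_RANGE = 11

def low_quality_team_rating(pa_scores):
    """True when the team's PA Scores are high on average AND narrow in range."""
    average = sum(pa_scores) / len(pa_scores)
    spread = max(pa_scores) - min(pa_scores)
    return average >= THRESHOLD_TEAM_AVERAGE and spread <= THRESHOLD_TEAM_RANGE

team_mike = [93.75, 92.55, 90.85, 92.50, 95.88, 88.32]
print(low_quality_team_rating(team_mike))  # True: average ~92, range ~8
```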
In general, a team that has this warning may have engaged unconstructively with peer assessment. Most team members have not entered the spirit of the peer assessment process. They may have attempted to ‘game’ the peer assessment by giving everyone well above typical or average ratings.
Peer Assess Pro provides the facilitator with the option to send out an email to all members of the team suggesting they may wish to reconsider their ratings. Furthermore, the students are encouraged to provide qualitative evidence in support of the ratings they have provided.
In a small proportion of teams, it is possible that a high performing team will ALSO have this Active Warning generated. In a high performing team all team members contribute effectively to the results and team processes. This outcome will be evident to the teacher through the team gaining a high Team Result for their submitted work.
The following graph shows a large first year university class of 848 students who have undertaken their first, formative experience of Peer Assess Pro. Of the 84 valid teams, 24 teams are identified as potentially having a low quality team rating.
In a case like this, the teacher might consider guiding the class of students towards more constructive and discriminating peer assessment before undertaking the final, summative peer assessment. For example, remind students of the purpose of peer feedback, and how to provide useful feedback.
Quick links and related information
FAQ: What is a valid assessed team?
FAQ: What happens if I try to 'game' (fool? play? disrupt?) the peer assessment process?
FAQ: What is the purpose of peer assessment?
FAQ: How do I provide useful feedback to my team members?
FAQ: What is a low quality assessor rating?
Scheduled for deployment November 2022
Aim: Expanded scope to identify LOW RANGE of Peer Assessment Scores OR High Average Scores
Suppose a team collectively submits a set of peer assessments that is EITHER HIGH on average OR LOW in range.
These features are an indication that the team may have engaged unconstructively with the peer assessment process. When EITHER of these conditions is fulfilled, an Active Warning in Peer Assess Pro is generated:
Critical Warning 0050 Low-quality team rating
A team may have engaged unconstructively with peer assessment
The extended detail for the Active Warning displays one or more messages such as:
Team Alpha may have engaged unconstructively with peer assessment. HIGH Peer Assessment Scores awarded. Average 100. LOW range 10.
Team Mike may have engaged unconstructively with peer assessment. HIGH Peer Assessment Scores awarded. Average 96. Range 40.
Team Quebec may have engaged unconstructively with peer assessment. Peer Assessment Scores awarded. Average 50. LOW range 8.
The warning will be generated only for members of a valid assessed team, as detailed in
FAQ: What is a valid assessed team?
The Peer Assess Pro system constant ThresholdTeamAverage is defined as 90.
The Peer Assess Pro system constant ThresholdTeamRange is defined as 10.
The Active Warning is generated for a team when EITHER condition is true:
Team Average Peer Assessed Score ≥ ThresholdTeamAverage (90), OR
Team Range of Peer Assessed Scores ≤ ThresholdTeamRange (10)
Suppose Team Mike contains 5 members, whose Peer Assessed Scores are shown below. The Average Peer Assessed Score and Range of Peer Assessed Scores are calculated.
Peer Assessed Scores for members of Team Mike
Name | Peer Assessed Score |
Emma | 93 |
Joe | 90 |
Freddie | 92 |
Tammy | 96 |
Tilly | 89 |
Team Average | 92 = 460 / 5 |
Team Range | 7 = 96 - 89 |
The following Active Warning is generated for Team Mike
Team Mike may have engaged unconstructively with peer assessment. HIGH Peer Assessment Scores awarded. Average 92. LOW Range 7.
The Team Average Peer Assessed Score and Team Range are examined for every valid-assessed team. A low-quality team rating is identified for those teams that breach either of the Threshold parameters defined earlier.
Identification of low-quality team ratings
Team | Team Average Peer Assessed Score | Team Range | Low Quality Team Rating? |
Alpha | 100 HIGH | 0 LOW | YES |
Mike | 92 HIGH | 7 LOW | YES |
November | 90 HIGH | 20 | YES |
Oscar | 87 | 8 LOW | YES |
Papa | 85 | 12 | NO |
Quebec | 50 | 10 LOW | YES |
Romeo | 50 | 12 | NO |
Sierra | 20 | 20 | NO |
In general, a team that has this warning may have engaged unconstructively with peer assessment. Most team members have not entered the spirit of the peer assessment process. They may have attempted to ‘game’ the peer assessment by giving everyone well above typical or average ratings.
Peer Assess Pro provides the facilitator with the option to send out an email to all members of the team suggesting they may wish to reconsider their ratings. Furthermore, the students are encouraged to provide qualitative evidence in support of the ratings they have provided.
In a small proportion of teams, it is possible that a high-performing team will ALSO have this Active Warning generated. In a high-performing team all team members contribute effectively to the results and team processes. This outcome will be evident to the teacher through the team gaining a high Team Result for their submitted work.
The following graph shows a large first year university class of 848 students who have undertaken their first, formative experience of Peer Assess Pro. Of the 84 valid teams, 24 teams are identified as potentially having a low quality team rating.
In a case like this, the teacher might consider guiding the class of students towards more constructive and discriminating peer assessment before undertaking the final, summative peer assessment. For example, remind students of the purpose of peer feedback, and how to provide useful feedback.
Quick links and related information
FAQ: What is a valid assessed team?
FAQ: What happens if I try to 'game' (fool? play? disrupt?) the peer assessment process?
FAQ: What is the purpose of peer assessment?
FAQ: How do I provide useful feedback to my team members?
FAQ: What is a low quality assessor rating?
Suppose a team member submits a set of peer assessments that are both HIGH on average and LOW in range.
These features are an indication that the team member may have engaged unconstructively with the peer assessment process. When these conditions are both fulfilled, an Active Warning in Peer Assess Pro is generated:
Critical Warning 0300 Low-quality assessor rating
An assessor may have engaged unconstructively with peer assessment.
The extended detail for the Active Warning displays one or more messages such as:
Tony may have engaged unconstructively with peer assessment. High Peer Assessment Scores awarded. Average 100 and low range 0. Team Alpha
Kathy may have engaged unconstructively with peer assessment. High Peer Assessment Scores awarded. Average 85 and low range 8. Team Bravo
The warning will be generated only for members of a valid assessed team, as detailed in
FAQ: What is a valid assessed team?
The Peer Assess Pro system constant ThresholdAssessorAverage is defined as 85.
The Peer Assess Pro system constant ThresholdAssessorRange is defined as 9.
The Active Warning is generated for an assessor when BOTH conditions are true:
Average Peer Assessed Subscore awarded by the assessor ≥ ThresholdAssessorAverage (85)
Range of Peer Assessed Subscores awarded by the assessor ≤ ThresholdAssessorRange (9)
Suppose Kathy, a member of Team Bravo, assesses all her fellow team members as follows:
Peer Assessed Subscores assessed by Kathy in Team Bravo
Name | Peer Assessed Subscore
Garry | 90 |
Dan | 87.5 |
Sunny | 82.5 |
Freddie | 82.5 |
Robby | 82.5 |
Average | 85 = 425 / 5 |
Range | 7.5 = 90 - 82.5 |
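The equivalent test for an individual assessor can be sketched as follows, using Kathy’s awarded subscores. The comparison directions are inferred from the identification table below; this is an illustration, not Peer Assess Pro’s code.

```python
# Low-quality assessor rating test (Active Warning 0300: BOTH conditions).
THRESHOLD_ASSESSOR_AVERAGE = 85
THRESHOLD_ASSESSOR_RANGE = 9

def low_quality_assessor_rating(subscores_awarded):
    """True when one assessor's awarded subscores are high on average AND narrow in range."""
    average = sum(subscores_awarded) / len(subscores_awarded)
    spread = max(subscores_awarded) - min(subscores_awarded)
    return average >= THRESHOLD_ASSESSOR_AVERAGE and spread <= THRESHOLD_ASSESSOR_RANGE

kathy = [90, 87.5, 82.5, 82.5, 82.5]
print(low_quality_assessor_rating(kathy))  # True: average 85, range 7.5
```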
The Average Peer Assessed Score and Range are examined for every assessor. A low quality assessor rating is identified for those individuals that breach the threshold parameters defined above.
Identification of low quality individual assessor ratings
Assessor | Average PA Score (awarded) | Range (PAS Units) | Low quality assessor rating |
Tony | 100 | 0.0 | Y |
Andy | 93 | 0.0 | Y |
Jess | 75 | 0.0 | N |
Johnny | 93 | 2.5 | Y |
Chance | 82 | 2.5 | N |
Kathy | 85 | 7.5 | Y |
Zara | 83 | 7.5 | N |
Riley | 97 | 10.0 | N |
In general, an individual with this warning may (or may not!) have engaged unconstructively with peer assessment. The team member may not have entered the spirit of the peer assessment process. They may have attempted to ‘game’ the peer assessment by giving everyone well above typical or average ratings.
Peer Assess Pro provides the facilitator with the option to send out an email to the assessor suggesting they may wish to reconsider their ratings. Furthermore, the student is encouraged to provide qualitative evidence in support of the ratings they have provided.
In a small proportion of teams, it is possible that a member of a high performing team will ALSO have this Active Warning generated. In a high performing team all team members contribute effectively to the results and team processes. Consequently, it is reasonable to expect a high average Peer Assessed Score to be awarded most members, with a concurrent low range. This outcome will be evident to the teacher through the assessor’s team gaining a high Team Result for their submitted work.
Quick links and related information
FAQ: What is a valid assessed team?
FAQ: What happens if I try to 'game' (fool? play? disrupt?) the peer assessment process?
FAQ: What is the purpose of peer assessment?
FAQ: How do I provide useful feedback to my team members?
FAQ: What is a low quality team rating?
Peer Assess Pro restricts the display of results to teachers and students when too few peer assessments have been submitted from a team: fewer than three, or no more than half the team. This restriction gives rise to an Active Warning in Peer Assess Pro titled
Critical Warning 0022 Insufficient team responses
The number of responses from a team is insufficient for presenting valid results.
The following extended detail is provided
Alpha has received 2 team member responses. Minimum required 3 from team size 4
Bravo has received 3 team member responses. Minimum required 4 from team size 6
Peer Assess Pro restricts the display of results to valid assessed teams. The purpose of the valid assessed team rule is to prevent the display of results to students (and facilitators) when only a small number of peer assessments from a team have been submitted. Such a low response situation could distort the reliability and accuracy of both the team’s peer assessment and personal result calculations, and ACTIVE WARNING messages for a team. Consequently, class statistics such as mean, maximum, range, and standard deviation are calculated only for team members that are designated as part of a valid assessed team.
Students can only view results if they belong to a valid assessed team.
A facilitator may only view results from valid assessed teams.
The Teacher’s Dashboard Active Warnings and (i) Information button inform you of the number of valid teams and valid assessments throughout the progress of managing the peer assessment responses. The Active Warning enables you to ‘hunt down’ the teams that have not yet achieved valid status.
Peer Assess Pro generates an email that the teacher can send optionally to members of a non-valid team who have not yet responded. The email reminds students to respond by the Due Date.
For teams with five or fewer members, a valid assessed team must have peer ratings from at least three members of the team. For teams with six or more team members, ‘just over’ half the team members must peer assess. The required minimum number of team members who must rate within a particular team of size n members is defined as:

$$a_{min} = \max\left(3, \operatorname{int}\left(\frac{n}{2} + 1\right)\right)$$

Where

$a_{min}$ = the minimum number of team members required to rate within a particular team
$\max$ is a function that selects the maximum of the calculated values
$\operatorname{int}$ is a function that calculates the integer value of the result
For teams of size 0, 1 and 2, peer assessment results are not calculated. The default Personal Result in these circumstances is the Team Result.
Team size, $n$ | Required minimum assessors, $a_{min}$ | Proportion of whole team
3 | 3 | 100% |
4 | 3 | 75% |
5 | 3 | 60% |
6 | 4 | 66% |
7 | 4 | 57% |
8 | 5 | 62% |
9 | 5 | 56% |
10 | 6 | 60% |
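The formula can be checked with a short Python sketch that regenerates the table above; it is illustrative only, not Peer Assess Pro’s code.

```python
# Minimum assessors required for a valid assessed team (illustrative sketch).
def min_assessors(n):
    """n = team size; returns the minimum number of raters required."""
    if n < 3:
        return None  # results are not calculated for teams of size 0, 1 or 2
    return max(3, int(n / 2 + 1))

for n in range(3, 11):
    print(n, min_assessors(n))  # 3 3, 4 3, 5 3, 6 4, 7 4, 8 5, 9 5, 10 6
```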
Quick links and related information
FAQ: How is the Peer Assessed (PA) Score calculated?
FAQ: How are peer assessment and personal results calculated and defined mathematically?
An ‘at-risk’ team member has been rated amongst the bottom 10 per cent of students in the class as measured by any of their Personal Result, Recommendation, or Peer Assessed Score.
A low rating on any of these assessments gives rise to an Active Warning in Peer Assess Pro titled
Critical Warning 0036 At-risk team member
A team member has been rated at-risk.
The following extended detail is provided
Anne Smith is at-risk. Personal Result 25 LOW. Recommendation 2.3 LOW. Peer Assessment Score 90. IRSA 60 OVERCONFIDENT. Team Alpha.
The warning identifies with ‘LOW’ which of the measures presented falls in the at-risk range.
The Index of Realistic Self Appraisal, IRSA, is also identified with OVERCONFIDENT or UNDERCONFIDENT when the threshold for overconfidence or underconfidence is exceeded, as defined in Critical Warning 0040 Mismatched self-assessment.
The most at-risk students are listed first. Specifically, the list of at-risk students is sorted by ascending Personal Result.
Peer Assess Pro generates an email that the teacher can send optionally to team members with an ‘at risk’ rating. The email requests that the team member make an appointment to meet promptly with the teacher to discuss their peer assessment results so they can develop a more productive contribution to the team's future outputs, processes and leadership.
The facilitator could view the Personal Feedback Report of low rated team members examining the qualitative feedback given. You should expect that the qualitative feedback will confirm the low Recommendation or Peer Assessed Scores. It will be helpful to have reviewed these Personal Feedback Reports prior to your interviewing and counselling the at risk students who visit you.
Furthermore, a low value of IRSA, less than 75, suggests that the at-risk student is likely to be surprised or angered by the low peer assessment and/or recommendation provided by their teammates. See the WARNING that is generated by this latter condition:
FAQ: What is a mismatched self-assessment (IRSA)? WARNING 0040
The Peer Assess Pro system constants ThresholdPaScore, ThresholdRecommendation and ThresholdPersonalResult are defined to identify approximately the bottom 10 per cent of students in the class.
Where

ThresholdPersonalResult = PERCENTILE(10%) of the class’s Personal Results
ThresholdRecommendation = PERCENTILE(10%) of the class’s Recommendations
ThresholdPaScore = PERCENTILE(10%) of the class’s Peer Assessed Scores

The Active Warning is generated for a team member when ANY condition is true:
Personal Result ≤ ThresholdPersonalResult
Recommendation ≤ ThresholdRecommendation
Peer Assessed Score ≤ ThresholdPaScore
The test is conducted only when the team has sufficient assessments to qualify as a valid assessed team.
The PERCENTILE function identifies the point in the sorted array of results below which 10 per cent of the scores in the array fall. Note that the MEDIAN of a set of results is equivalent to PERCENTILE(50%,...) of those results.
Consider this dataset of results from an end-of-course group assignment comprising 16 students in four teams.
Original dataset of personal results and other peer assessment data sorted by personal results
Name | Team | Personal result | Recommendation | Peer Assessed Score | IRSA | Realism
Jesse Crane | Game Plan | 27.6 | 3.0 | 70.0 | 72 | OVERCONFIDENT |
Jakob Bradley | Game Plan | 29.3 | 4.5 | 73.8 | 87 | REALISTIC |
Brogan Madden | Game Plan | 36.1 | 4.0 | 88.8 | 178 | UNDERCONFIDENT |
Jayson Mayo | Atomic Bombs | 40.7 | 3.0 | 41.7 | 48 | OVERCONFIDENT |
Cecilia Rosales | Cupcakes | 50.6 | 4.0 | 75.0 | 97 | REALISTIC |
Alfredo Koch | Cupcakes | 62.9 | 5.0 | 90.0 | 113 | REALISTIC |
Ean Cisneros | Atomic Bombs | 64.3 | 3.7 | 63.3 | 101 | REALISTIC |
Kylan Shea | Cupcakes | 64.9 | 4.7 | 92.5 | 92 | REALISTIC |
Aubrey Jarvis | Atomic Bombs | 69.5 | 3.5 | 68.1 | MISSING | |
Jesse Hughes | Cupcakes | 69.7 | 4.7 | 98.4 | 98 | REALISTIC |
Elsie Riggs | College Dropouts | 73.3 | 3.3 | 59.2 | 64 | OVERCONFIDENT |
Dalton Vincent | College Dropouts | 75.4 | 3.7 | 60.9 | 66 | OVERCONFIDENT |
Aryan Huffman | Atomic Bombs | 83.3 | 4.0 | 80.8 | 92 | REALISTIC |
Ariel Peterson | Atomic Bombs | 84.3 | 4.7 | 81.7 | 91 | REALISTIC |
Cortez Farmer | College Dropouts | 100.0 | 5.0 | 97.5 | 126 | UNDERCONFIDENT |
Dakota Stafford | College Dropouts | 100.0 | 5.0 | 95.1 | 106 | REALISTIC |
The following Active Warnings will be presented, with the most at-risk presented first in the list.
A team member has been rated at-risk in class (4)
Active Warning 0036 At-risk team member
Jesse Crane is at-risk. Personal Result (NPR) 28 LOW. Recommendation 3.0 LOW. Peer Assessed Score (PAS) 70. Self-assessment (IRSA) 72 OVERCONFIDENT. Team Game Plan.
Jakob Bradley is at-risk. Personal Result (NPR) 29 LOW. Recommendation 4.5. Peer Assessed Score (PAS) 74. Self-assessment (IRSA) 87 REALISTIC. Team Game Plan.
Jayson Mayo is at-risk. Personal Result (NPR) 41. Recommendation 3.0 LOW. Peer Assessed Score (PAS) 42 LOW. Self-assessment (IRSA) 48 OVERCONFIDENT. Team Atomic Bombs.
Elsie Riggs is at-risk. Personal Result (NPR) 73. Recommendation 3.3. Peer Assessed Score (PAS) 59 LOW. Self-assessment (IRSA) 64 OVERCONFIDENT. Team College Dropouts.
The thresholds that determine the foregoing at-risk selections are calculated from the class results in the following table, and compared with other statistics
Calculation of 10 per cent thresholds
Selected statistics | personal result | p_rec | pa_score |
SDEV | 22.6 | 0.7 | 16.2 |
MAXIMUM | 100.0 | 5.0 | 98.4 |
MEAN | 64.5 | 4.1 | 77.3 |
MEDIAN | 67.2 | 4.0 | 77.9 |
MINIMUM | 27.6 | 3.0 | 41.7 |
THRESHOLD @ 10% PERCENTILE | 29.3 | 3.0 | 59.2
In this case of n = 16 assessed students, the 10th percentile occurs at the 2nd lowest item in the sorted results for each variable, according to the formula

$$\text{position} = \lceil p \times n \rceil = \lceil 0.10 \times 16 \rceil = 2$$

The original dataset of results is colour-coded below in which the at-risk candidates for selection are highlighted with *. Threshold values are highlighted by ¶, the second item counted upwards in the sorted column of results, since $\lceil 0.10 \times 16 \rceil = 2$ for this case.
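A sketch of the threshold calculation follows, assuming the ⌈p × n⌉-th lowest item is taken as the threshold, which matches the worked example; spreadsheet PERCENTILE functions interpolate slightly differently.

```python
# 10th-percentile thresholds for the at-risk tests (Active Warning 0036).
# Assumes the threshold is the ceil(p * n)-th lowest result, which reproduces
# the worked example: the 2nd lowest item for n = 16, p = 10%.
import math

def percentile_threshold(results, p=0.10):
    position = math.ceil(p * len(results))  # ceil(0.10 * 16) = 2
    return sorted(results)[position - 1]    # the 2nd lowest value

personal_results = [27.6, 29.3, 36.1, 40.7, 50.6, 62.9, 64.3, 64.9,
                    69.5, 69.7, 73.3, 75.4, 83.3, 84.3, 100.0, 100.0]
print(percentile_threshold(personal_results))  # 29.3
```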
Note how Brogan Madden is not selected, as he is better than the cutoff threshold on all three statistics. In contrast, Jayson Mayo is selected because of his below-threshold Recommendation and Peer Assessed Score, the lowest values in the dataset. Similarly, Elsie Riggs is selected because of her low-ranking Peer Assessed Score.
Source data sorted, identifying threshold points and at-risk selections
Name | Team | Personal result | Recommendation | Peer Assessed Score | IRSA | Realism
* Jesse Crane | Game Plan | 27.6 | 3.0 | 70.0 | 72 | OVERCONFIDENT |
* Jakob Bradley | Game Plan | ¶ 29.3 | 4.5 | 73.8 | 87 | REALISTIC |
Brogan Madden | Game Plan | 36.1 | 4.0 | 88.8 | 178 | UNDERCONFIDENT |
* Jayson Mayo | Atomic Bombs | 40.7 | ¶ 3.0 | 41.7 | 48 | OVERCONFIDENT |
Cecilia Rosales | Cupcakes | 50.6 | 4.0 | 75.0 | 97 | REALISTIC |
Alfredo Koch | Cupcakes | 62.9 | 5.0 | 90.0 | 113 | REALISTIC |
Ean Cisneros | Atomic Bombs | 64.3 | 3.7 | 63.3 | 101 | REALISTIC |
Kylan Shea | Cupcakes | 64.9 | 4.7 | 92.5 | 92 | REALISTIC |
Aubrey Jarvis | Atomic Bombs | 69.5 | 3.5 | 68.1 | MISSING | |
Jesse Hughes | Cupcakes | 69.7 | 4.7 | 98.4 | 98 | REALISTIC |
* Elsie Riggs | College Dropouts | 73.3 | 3.3 | ¶ 59.2 | 64 | OVERCONFIDENT |
Dalton Vincent | College Dropouts | 75.4 | 3.7 | 60.9 | 66 | OVERCONFIDENT |
Aryan Huffman | Atomic Bombs | 83.3 | 4.0 | 80.8 | 92 | REALISTIC |
Ariel Peterson | Atomic Bombs | 84.3 | 4.7 | 81.7 | 91 | REALISTIC |
Dakota Stafford | College Dropouts | 100.0 | 5.0 | 95.1 | 106 | REALISTIC
Cortez Farmer | College Dropouts | 100.0 | 5.0 | 97.5 | 126 | UNDERCONFIDENT
* At-risk students selected ¶ Threshold value based on 10th PERCENTILE |
The teacher has several graphical approaches to identifying the most at-risk students in their class
FAQ: What is a mismatched self-assessment (IRSA)? WARNING 0040
Identifying at-risk students from sorted table of Recommendations
Identifying at risk students from sorted table of Peer Assessed Scores with concurrent examination of Self Assessment
Quick links and related information
FAQ: What steps can I take to get a better personal result?
FAQ: What is a valid assessed team?
FAQ: How is the Peer Assessed (PA) Score calculated?
FAQ: How is the Index of Realistic Self Assessment (IRSA) calculated?
FAQ: What is a mismatched self-assessment (IRSA)? WARNING 0040
Low team psychological safety is often associated with poor teamwork processes and, therefore, the prospect of poor team results (Cauwelier et al., 2016; Edmondson, 1999; Edmondson & Lei, 2014; Google re:Work, n.d.; Kim et al., 2020).
A team is collectively designated ‘unsafe’ when the team’s median response to the Peer Assess Pro survey psychological safety statement lies at or below the threshold value of 1.5.
Critical Warning 0034 Low team psychological safety
A team has identified that several members feel psychologically unsafe in their team.
The following extended detail is provided
Team Bravo teammates have stated they feel strongly unsafe about taking risks, making mistakes, or challenging their teammates. Team Safety 1.5.
The warning details are sorted by increasing value of Team Safety. Those teams at the top of the list are more likely to require the teacher’s intervention.
The Active Warning 0034 Low team psychological safety is raised when

$$\mathrm{Safety}_t \le \mathrm{ThresholdTeamSafety}$$

The team psychological safety for team t is defined as

$$\mathrm{Safety}_t = \operatorname{median}(s_1, s_2, \ldots, s_m)$$

Where

$s_i$ = the psychological safety of team member i
$m$ = the number of survey respondents in team t
ThresholdTeamSafety = 1.5
Rather than using the mean value, the median is used because it is not skewed by a small proportion of extremely large or small values. The median provides a better representation of a ‘typical’ value for the team.
In general, psychological safety is measured using multiple statements to which team members are asked to rate their strength of agreement or disagreement (Edmondson, 1999; Google Re:Work).
In pursuit of parsimony, Peer Assess Pro uses one statement that captures the most important essence of the construct of team psychological safety.
‘In my team, I feel safe to take risks, make mistakes, speak about tough issues, or challenge my teammates.’
On a five-point Likert scale from 1 to 5, the team members respond from 1 = ‘Strongly disagree’ through 5 = ‘Strongly agree’. This is the value $s_i$ presented in the earlier definition for Team Safety.
Team Safety is calculated only for teams that meet the definition of a ‘valid assessed team’.
The team members of Team Alpha have self-reported their psychological safety as follows.
Example 1
Psychological safety self-reported from a team of five members
Survey respondent: Safety | |||||
Team | Bridget | Patrick | Julian | Lydia | Nigella |
Alpha | 1 | 5 | 2 | 1 | 5 |
Applying the definition of team psychological safety, for Team Alpha:

$$\mathrm{Safety}_{Alpha} = \operatorname{median}(1, 5, 2, 1, 5) = 2$$

Since 2 is greater than the ThresholdTeamSafety of 1.5, the Active Warning is not raised for Team Alpha.
Example 2
The team members of Team Bravo have self-reported their psychological safety as follows.
Psychological safety self-reported from a team of four members
Survey respondent: Safety | ||||
Team | Gillian | Jenny | Jules | Jollion |
Bravo | 1 | 5 | 1 | 2 |
Therefore, for Team Bravo

$$\mathrm{Safety}_{Bravo} = \operatorname{median}(1, 5, 1, 2) = \frac{1+2}{2} = 1.5$$

Since 1.5 is at the ThresholdTeamSafety of 1.5, the warning is raised.
The Active Warning 0034 Low team psychological safety will be raised for Team Bravo, but not for Team Alpha.
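Both examples can be verified with a few lines of Python using the median from the standard library; the function name and threshold constant spelling are ours.

```python
# Team psychological safety (Active Warning 0034), illustrative sketch.
from statistics import median

THRESHOLD_TEAM_SAFETY = 1.5

def low_team_safety(responses):
    """True when the median survey response is at or below the threshold."""
    return median(responses) <= THRESHOLD_TEAM_SAFETY

print(low_team_safety([1, 5, 2, 1, 5]))  # False: Team Alpha, median 2
print(low_team_safety([1, 5, 1, 2]))     # True:  Team Bravo, median 1.5
```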
Note that the related Critical Warning 0035 Unsafe team member will be raised for Bridget and Lydia in Team Alpha, and for Gillian and Jules in Team Bravo.
Peer Assess Pro generates an email that the teacher can send optionally to all members of teams that present an unsafe team psychological health. The email requests that the team members make an appointment to meet promptly with the teacher.
The email states
…several of your team members stated that they 'Strongly Disagree' with the survey question 'In my team, I feel safe to take risks, make mistakes, speak about tough issues, or challenge my teammates'. The team's response is a symptom that your team may have low psychological safety. Low team psychological safety is often associated with poor teamwork processes by your team as a whole, and, therefore, poor team results in the future. Please make an appointment to meet promptly with your teacher to discuss the team's response, so the teacher can coach your team to improve its future processes, leadership and outputs.
Consider undertaking these actions with the team.
Quick links and related information
FAQ: What is an unsafe team member? WARNING 0035
FAQ: What is an ‘at risk’ team member? WARNING 0036
Cauwelier, P., Ribière, V. M., & Bennet, A. (2016). Team Psychological Safety and Team Learning: A Cultural Perspective. The Learning Organization, 23(6), 458–468.
Edmondson, A. C. (1999). Psychological Safety and Learning Behavior in Work Teams. Administrative Science Quarterly, 44(2), 350–383. https://doi.org/10.2307/2666999
Edmondson, A. C., & Lei, Z. (2014). Psychological Safety: The History, Renaissance, and Future of an Interpersonal Construct. Annual Review of Organizational Psychology and Organizational Behavior, 1(1), 23–43. https://doi.org/10.1146/annurev-orgpsych-031413-091305
Google Re:Work. (n.d.). [Re_Work] Manager Actions for Psychological Safety. https://docs.google.com/document/d/1PsnDMS2emcPLgMLFAQCXZjO7C4j2hJ7znOq_g2Zkjgk/export?format=pdf
Google re:Work. (n.d.). Understand Team Effectiveness [Guide]. Retrieved 18 November 2019, from https://rework.withgoogle.com/print/guides/5721312655835136/
Kim, S., Lee, H., & Connerton, T. P. (2020). How Psychological Safety Affects Team Performance: Mediating Role of Efficacy and Learning Behavior. Frontiers in Psychology, 11, 1581. https://doi.org/10.3389/fpsyg.2020.01581
Mellalieu, P. J. (2020). STEP 6—Promote courageous conversations among your students. In How to teach using group assignments: The 7 step formula for fair and effective team assessment (ePub 1.0, Chapter 8). Peer Assess Pro. https://www.peerassesspro.com/encourage-courageous-conversations/
Low team psychological safety is often associated with poor teamwork processes and, therefore, the prospect of poor team results (Cauwelier et al., 2016; Edmondson, 1999; Edmondson & Lei, 2014; Google re:Work, n.d.; Kim et al., 2020).
A team member is designated ‘unsafe’ when they respond ‘Strongly disagree’ to the survey statement
‘In my team, I feel safe to take risks, make mistakes, speak about tough issues, or challenge my teammates.’
On a five-point Likert scale from 1 to 5, the student has responded 1 = ‘Strongly disagree’.
A ‘Strongly disagree’ rating gives rise to an Active Warning in Peer Assess Pro titled
Critical Warning 0035 Unsafe team member
A team member has identified they feel psychologically unsafe in their team.
The following extended detail is provided
Anna Smith has stated they feel strongly unsafe about taking risks, making mistakes, or challenging their teammates. Recommendation 1.2. Team Alpha.
The warning details are sorted by increasing value of Recommendation. Those team members at the top of the list are more likely to require the teacher’s intervention.
Peer Assess Pro generates an email that the teacher can optionally send to team members with an unsafe self-rating. The email requests that the team member make an appointment to meet promptly with the teacher. The team member is advised
‘Your response is a symptom that your team as a whole has relatively weak psychological safety. Low team psychological safety is often associated with poor teamwork processes and, therefore, poor team results in the future. Please make an appointment to meet promptly with your teacher to discuss your response, so the teacher can coach your team to improve its future processes, leadership and outputs.’
For further guidance on how to improve a team member’s psychological health, see the recommended actions for FAQ: What is a team with low psychological safety? WARNING 0034
In pursuit of parsimony, Peer Assess Pro uses one statement to capture the essence of the construct of team psychological safety. In general, psychological safety is measured using multiple statements to which team members are asked to rate their strength of agreement or disagreement (Edmondson, 1999; Google Re:Work, n.d.).
Quick links and related information
FAQ: What is a team with low psychological safety? WARNING 0034
FAQ: What is an ‘at risk’ team member? WARNING 0036
Cauwelier, P., Ribière, V. M., & Bennet, A. (2016). Team Psychological Safety and Team Learning: A Cultural Perspective. The Learning Organization, 23(6), 458–468.
Edmondson, A. C. (1999). Psychological Safety and Learning Behavior in Work Teams. Administrative Science Quarterly, 44(2), 350–383. https://doi.org/10.2307/2666999
Edmondson, A. C., & Lei, Z. (2014). Psychological Safety: The History, Renaissance, and Future of an Interpersonal Construct. Annual Review of Organizational Psychology and Organizational Behavior, 1(1), 23–43. https://doi.org/10.1146/annurev-orgpsych-031413-091305
Google Re:Work. (n.d.). [Re_Work] Manager Actions for Psychological Safety. https://docs.google.com/document/d/1PsnDMS2emcPLgMLFAQCXZjO7C4j2hJ7znOq_g2Zkjgk/export?format=pdf
Google re:Work. (n.d.). Understand Team Effectiveness [Guide]. Retrieved 18 November 2019, from https://rework.withgoogle.com/print/guides/5721312655835136/
Kim, S., Lee, H., & Connerton, T. P. (2020). How Psychological Safety Affects Team Performance: Mediating Role of Efficacy and Learning Behavior. Frontiers in Psychology, 11, 1581. https://doi.org/10.3389/fpsyg.2020.01581
Notifications History shows the email notifications that have been sent by the Peer Assess Pro platform to participants. The history also records event notifications sent by email to the Facilitator. The delivery status of the email is designated as SENT, DELIVERED or FAILED.
Notifications History shows emails that are sent automatically by the Peer Assess Pro platform, and those initiated by the facilitator in response to Active Warnings.
The Notifications History feature is presented at the very bottom of the Facilitator Dashboard. Click on the column Emails sent, then Message/View to examine the email sent to a specific recipient.
The Notifications History is helpful for audit purposes such as when a student denies receiving an email from you.
The delivery status of each message is designated as SENT, DELIVERED, or FAILED.
Confirmation of the final state of SENT emails can take minutes, or even hours. REFRESH or RELOAD your Running Activity to update the status shown in the Notifications History.
Overview of Survey Notification History feature in Peer Assess Pro
Delivered status of messages sent from Peer Assess Pro platform
Quick links and related information
FAQ: What is the content of emails sent by Peer Assess Pro to Participants?
FAQ: What is the content of emails sent by Peer Assess Pro to Facilitators?
FAQ - How do I fix an invalid, missing or failed email delivery? WARNING 0026
In the Active Warnings section of the Facilitators Dashboard, select Preview Email
Note that a record of the email sent from Active Warnings is recorded in the Survey Notifications History. See FAQ - What emails have been sent by the platform?
The first table shows the SUBJECT line of each email generated in response to various automated events and to warnings actioned by the Facilitator.
The second table shows the detailed content of each email. The Facilitator can, of course, copy this template text, modify, and send their own email.
Some emails are automatically generated by the Peer Assess Pro platform, such as 0011 Request to COMPLETE peer assessment and 0013 RESUBMIT peer assessment due to TEAM CHANGE.
Other emails are sent under the direction of the Facilitator when they respond to an Active Warning. Examples include 0103 WARNING Request to RECONSIDER peer assessment: Assessor unconstructive.
The content of the emails generated by Peer Assess Pro undergoes regular review and improvement. The emails actually sent may not match exactly the templates presented here.
Quick links and related information
FAQ: What is the content of emails sent by Peer Assess Pro to Facilitators?
FAQ - What emails have been sent by the platform?
FAQ - How do I fix an invalid, missing or failed email delivery? WARNING 0026
Email ID - Priority - Recipient - Short Descriptor | SUBJECT |
0011 CRITICAL - Participant - Request to COMPLETE peer assessment | Please complete peer assessment due by << Due Date >>. <<Activity Title>> |
0012 CRITICAL - Participant - REMINDER to complete peer assessment | REMINDER! Please complete peer assessment due by << Due Date >>. <<Activity Title>> |
0013 CRITICAL - Participant - RESUBMIT peer assessment due to TEAM CHANGE | RESUBMIT! Please complete peer assessment due by << Due Date >>. <<Activity Title>> |
0020 CRITICAL - Participant - ABANDONED Peer Assessment activity | ABANDONED peer assessment for peer assessment due by << Due Date >>. <<Activity Title>> |
0103 WARNING - Participant - Request to RECONSIDER peer assessment: Assessor unconstructive | Request to reconsider peer assessment due by << Due Date >>. <<Activity Title>> |
1001 ADVISORY - Participant - Personal results PUBLISHED and available to view | Please view your personal results for peer assessment due by << Due Date >>. <<Activity Title>> |
1002 ADVISORY - Participant - REVISED personal results published and available to view | REVISED RESULTS! Please view your personal results for peer assessment due by << Due Date >>. <<Activity Title>> |
1003 ADVISORY - Participant - FINALISED personal results published and available to view | FINALISED RESULTS! Please view your personal results for peer assessment <<Activity Title>>. Available until << finalisation date + 2 weeks >> |
1004 ADVISORY - Participant - Personal results PUBLISHED but NOT available to view | Incomplete submissions from your team for peer assessment due by << Due Date >>. <<Activity Title>> |
1005 ADVISORY - Participant - FINALISED personal results published but NOT available to view | FINALISED RESULTS: Incomplete submissions from your team for peer assessment due by << Due Date >>. <<Activity Title>> |
Email ID - Priority - Recipient - Short Descriptor - SUBJECT - Email detail |
0011 - CRITICAL - Participant - Request to COMPLETE peer assessment SUBJECT - Please complete peer assessment due by << Due Date >>. <<Activity Title>> Dear <<team member>>, Please complete the Peer Assess Pro peer assessment activity << Activity Title>> for your team before << Due Time>> << Due Date >>. To complete the activity, please visit the Activity URL << Activity Specific URL>>. The peer assessment requires a Login ID. Usually, the Login ID will be your student id, unless your teacher has advised an alternative. The Activity URL will become available for your responses from << Activity Start Time>> << Activity Start Date >>. Team membership check The following are your team members. If there is a mistake in this list please urgently advise your teacher the correct composition, using the email listed below. << Team Name>> <<List of Team members>> Further information For further information about preparing for, and using the results of the peer assessment, please visit https://www.peerassesspro.com/resources/resources-for-students/. You may find the answers to these Frequently Asked Questions helpful at this stage: FAQ: What is the purpose of peer assessment? FAQ: What questions are asked in the peer assessment survey? FAQ: How do I provide useful feedback to my team members? FAQ: I am unable to login. My login failed Do not reply to this email. Rather, contact your teacher whose email is listed below. Sent by Peer Assess Pro on behalf of << Teacher fullname >> << Teacher email >> |
0012 - CRITICAL - Participant - REMINDER to complete peer assessment SUBJECT - REMINDER! Please complete peer assessment due by << Due Date >>. <<Activity Title>> Dear <<team member>>, The peer assessment activity for << Activity Title>> will soon become unavailable for you to complete. Therefore, please complete the Peer Assess Pro peer assessment activity for your team before << Due Time>> << Due Date >>. To complete the activity, please visit the Activity URL << Activity Specific URL>>. The peer assessment requires a Login ID. Usually, the Login ID will be your student id, unless your teacher has advised an alternative. The Activity URL became available for your responses from << Activity Start Time>> << Activity Start Date >>. Further information For further information about preparing for, and using the results of the peer assessment, please visit https://www.peerassesspro.com/resources/resources-for-students/. You may find the answers to these Frequently Asked Questions helpful at this stage: FAQ: What is the purpose of peer assessment? FAQ: What questions are asked in the peer assessment survey? FAQ: How do I provide useful feedback to my team members? FAQ: I am unable to login. My login failed Do not reply to this email. Rather, contact your teacher whose email is listed below. Sent by Peer Assess Pro on behalf of << Teacher fullname >> << Teacher email >> |
0013 - CRITICAL - Participant - RESUBMIT peer assessment due to TEAM CHANGE SUBJECT - RESUBMIT! Please complete peer assessment due by << Due Date >>. <<Activity Title>> Dear <<team member>>, You may have already completed the Peer Assess Pro peer assessment for <<Activity Title>> due before << Due Time>> << Due Date >>. I regret to advise that I require you to resubmit your survey. Your response submitted to date has been deleted from the analysis. This request may be due to a change to the membership of your team, such as the deletion or addition of a team member. To complete the activity, please visit the Activity URL << Activity Specific URL>>. Please resubmit your peer assessment for <<Activity Title>> due before << Due Time>> << Due Date >>. We apologise for your inconvenience. Team membership check The following are your team members. If there is a mistake in this list please urgently advise your teacher the correct composition, using the email listed below. << Team Name>> <<List of Team members>> Further information For further information about preparing for, and using the results of the peer assessment, please visit https://www.peerassesspro.com/resources/resources-for-students/. You may find the answers to these Frequently Asked Questions helpful at this stage: FAQ: How do I login to my peer assessment Activity URL FAQ: How do I provide useful feedback to my team members? FAQ: I am unable to login. My login failed Do not reply to this email. Rather, contact your teacher whose email is listed below. Sent by Peer Assess Pro on behalf of << Teacher fullname >> << Teacher email >> |
0020 - CRITICAL - Participant - ABANDONED Peer Assessment activity. SUBJECT - ABANDONED peer assessment for peer assessment due by << Due Date >>. <<Activity Title>> Dear <<team member>>, You and your team members were invited to participate recently in the peer assessment for << Activity Title>> due << Due Date >>. The Teacher ABANDONED the activity on << Abandoned Date >> due to exceptional circumstances. Please disregard any previous interim published results. I apologise for your inconvenience. Do not reply to this email. Rather, contact your teacher whose email is listed below. Sent by Peer Assess Pro on behalf of << Teacher fullname >> << Teacher email >> |
0103 - WARNING - Participant - Request to RECONSIDER peer assessment: Assessor unconstructive SUBJECT - Request to reconsider peer assessment due by << Due Date >>. <<Activity Title>> Dear <<team member>>, You recently completed the peer assessment of << Activity Title>>. However, your teacher noted that your individual responses suggest you have not engaged constructively with the peer assessment process. Specifically, you may have: - Rated all team members over a narrow range and/or - Rated all team members overgenerously and/or - Provided qualitative comments in your feedback that failed to justify the ratings you provided. If you feel that your ratings and feedback are justified, you need take no further action. For example, such high ratings may be justified for a team with evidence of exceptionally high performance on its tasks and outputs. Alternatively, if you wish to resubmit a more accurate survey, please use the URL below to submit a replacement peer assessment survey. Please take special care to provide useful and accurate qualitative feedback that will help your team member(s) and teacher understand the ratings you have provided. To complete the activity, please visit the Activity URL << Activity Specific URL>>. Complete the revised Peer Assess Pro peer assessment activity for your team before << Due Time>> << Due Date >>. Further information For further information about preparing for, and using the results of the peer assessment, please visit https://www.peerassesspro.com/resources/resources-for-students/. You may find the answers to these Frequently Asked Questions helpful at this stage: FAQ: How do I provide useful feedback to my team members? FAQ: What happens if I try to 'game' (fool? play? disrupt?) the peer assessment process? FAQ: Is my self-assessment used to calculate my Peer Assessed Score? Do not reply to this email. Rather, contact your teacher whose email is listed below. Sent by Peer Assess Pro on behalf of << Teacher fullname >> << Teacher email >> |
1001 - ADVISORY - Participant - Personal results PUBLISHED and available to view SUBJECT - Please view your personal results for peer assessment due by << Due Date >>. <<Activity Title>> Dear <<team member>>, You recently completed the peer assessment of << Activity Title>>. You may now view your Personal Result and feedback. Please visit the Activity URL << Activity Specific URL>>. Your results will be available for you to view for a period of two weeks following the finalisation of the activity. If you have specific questions or concerns about your Personal Results please contact the teacher promptly so that a remedy can be determined. Further information For further information about using the results of the peer assessment, please visit https://www.peerassesspro.com/resources/resources-for-students/. You may find the answers to these Frequently Asked Questions helpful at this stage: FAQ: How do I interpret the feedback results I've received from the peer assessment? FAQ: What steps can I take to get a better personal result? FAQ: I don't understand what my teammates are trying to tell me. How do I ask for better feedback? FAQ: I believe I have been unfairly treated by the results of the peer assessment. How do I address my concern? Do not reply to this email. Rather, contact your teacher whose email is listed below. Sent by Peer Assess Pro on behalf of << Teacher fullname >> << Teacher email >> |
1002 - ADVISORY - Participant - REVISED personal results published and available to view SUBJECT - REVISED RESULTS! Please view your personal results for peer assessment due by << Due Date >>. <<Activity Title>> Dear <<team member>>, You recently completed the peer assessment of << Activity Title>>. You may now view your revised Personal Result and feedback. Please visit the Activity URL << Activity Specific URL>>. Your results may have been revised from those previously made available to you. Reasons for revisions include: - A change in Team Results - Late peer assessment responses - An adjustment to the method the teacher has used to calculate your personal result. Your results will be available for you to view for a period of two weeks following the finalisation of the activity. If you have specific questions or concerns about your Personal Results please contact the teacher promptly so that a remedy can be determined. Further information For further information about using the results of the peer assessment, please visit https://www.peerassesspro.com/resources/resources-for-students/. You may find the answers to these Frequently Asked Questions helpful at this stage: FAQ: How do I interpret the feedback results I've received from the peer assessment? FAQ: What steps can I take to get a better personal result? FAQ: I don't understand what my teammates are trying to tell me. How do I ask for better feedback? FAQ: I believe I have been unfairly treated by the results of the peer assessment. How do I address my concern? Do not reply to this email. Rather, contact your teacher whose email is listed below. Sent by Peer Assess Pro on behalf of << Teacher fullname >> << Teacher email >> |
1003 - ADVISORY - Participant - FINALISED personal results published and available to view SUBJECT - FINALISED RESULTS! Please view your personal results for peer assessment <<Activity Title>>. Available until << finalisation date + 2 weeks >> Dear <<team member>>, You recently completed the peer assessment of << Activity Title>>. You may now view your final Personal Result and feedback. Please visit the Activity URL << Activity Specific URL>>. Your results are available for you to view for a period of two weeks following the finalisation of the activity. That is, from now until << Finalisation date + two weeks >>. Your results may have been revised from those previously made available to you. The revisions may have been due to: - A change in Team Results - Late peer assessment responses - An adjustment to the method the teacher has used to calculate your personal result. If you have specific questions or concerns about your Personal Results please contact the teacher promptly so that a remedy can be determined. Further information For further information about using the results of the peer assessment, please visit https://www.peerassesspro.com/resources/resources-for-students/. You may find the answers to these Frequently Asked Questions helpful at this stage: FAQ: How do I interpret the feedback results I've received from the peer assessment? FAQ: What steps can I take to get a better personal result? FAQ: I don't understand what my teammates are trying to tell me. How do I ask for better feedback? FAQ: I believe I have been unfairly treated by the results of the peer assessment. How do I address my concern? Do not reply to this email. Rather, contact your teacher whose email is listed below. Sent by Peer Assess Pro on behalf of << Teacher fullname >> << Teacher email >> |
1004 - ADVISORY - Participant - Personal results PUBLISHED but NOT available to view SUBJECT - Incomplete submissions from your team for peer assessment due by << Due Date >>. <<Activity Title>> Dear <<team member>>, You recently completed the peer assessment << Activity Title>>. However, several of your team members have yet to complete their peer assessment. Consequently, you are restricted from viewing the results of the peer assessment as the results would not yet be valid. You may wish to take action by reminding your team members to complete the peer assessment. Once the remainder of your team have completed their peer assessments, you will be able to view your final Personal Result and feedback. Please visit the Activity URL << Activity Specific URL>>. Further information For further information about using the results of the peer assessment, please visit https://www.peerassesspro.com/resources/resources-for-students/. You may find the answers to these Frequently Asked Questions helpful at this stage: FAQ: What is a valid assessed team? FAQ: I believe I have been unfairly treated by the results of the peer assessment. How do I address my concern? Do not reply to this email. Rather, contact your teacher whose email is listed below. Sent by Peer Assess Pro on behalf of << Teacher fullname >> << Teacher email >> |
1005 - ADVISORY - Participant - FINALISED personal results published but NOT available to view SUBJECT - FINALISED RESULTS: Incomplete submissions from your team for peer assessment due by << Due Date >>. <<Activity Title>> Dear <<team member>>, You and your team members were invited to participate recently in the peer assessment for << Activity Title>> due << Due Date >>. The Teacher finalised the results on << Finalisation Date >>. However, several of your team members failed to complete their peer assessment. Consequently, you are restricted from viewing the results of the peer assessment as the results are not valid. Since the activity has been finalised, there is no option for further peer assessments to be submitted from your team. Further information For further information about using the results of the peer assessment, please visit https://www.peerassesspro.com/resources/resources-for-students/. You may find the answers to these Frequently Asked Questions helpful at this stage: FAQ: What is a valid assessed team? FAQ: I believe I have been unfairly treated by the results of the peer assessment. How do I address my concern? Do not reply to this email. Rather, contact your teacher whose email is listed below. Sent by Peer Assess Pro on behalf of << Teacher fullname >> << Teacher email >> |
The first table shows the SUBJECT of the emails sent TO the Facilitator in response to various automated events happening during the launch and management of a Peer Assess Pro activity.
The second table shows the detailed content of each email.
The content of the emails generated by Peer Assess Pro undergoes regular review and improvement. The emails actually sent may not match exactly the templates presented here.
Quick links and related information
FAQ: What is the content of emails sent by Peer Assess Pro to Participants?
FAQ - What emails have been sent by the platform?
FAQ - How do I fix an invalid, missing or failed email delivery? WARNING 0026
Email ID - Priority - Recipient - Short Descriptor | SUBJECT |
2001 ADVISORY - Facilitator - Launch successful | SUCCESSFUL LAUNCH: Your peer assessment <<Activity Title>>. Due by <<Due Date>> |
2002 ADVISORY - Facilitator - Manage progress | MANAGE PROGRESS: Your peer assessment <<Activity Title>>. Due by <<Due Date>> |
2006 ADVISORY - Facilitator - Due Date imminent | DUE DATE IMMINENT: Review your peer assessment <<Activity Title>>. Due by <<Due Date>> |
2008 ADVISORY - Facilitator - Due Date reached | DUE DATE REACHED: Finalise your peer assessment <<Activity Title>>. Due by <<Due Date>> |
Email ID - Priority - Recipient - Short Descriptor - SUBJECT - Email detail |
2001 - ADVISORY - Facilitator - Launch successful SUBJECT - SUCCESSFUL LAUNCH: Your peer assessment <<Activity Title>>. Due by <<Due Date>> Dear << Teacher Fullname >>, You have launched successfully the peer assessment activity << Activity Title>>. Number of participants << class size>> allocated to << Number of teams >> teams. The peer assessment is available for students from <<Start Date >> and due for completion by <<Due Date>>. Students complete the peer assessment at the survey URL <<Students Survey URL>>. To manage this activity, view your Teacher's Peer Assess Pro Dashboard here <<Teacher's Activity URL>>. Manage the peer assessment The Peer Assess Pro Quickstart Guide reminds you of the next steps you will take as you wait for students to respond. See https://www.peerassesspro.com/quickstart-guide-for-teachers/ Prepare your students for peer assessment These multi-media resources will help you prepare your students for undertaking peer assessment through giving honest, fair, and useful feedback. See https://www.peerassesspro.com/resources/introducing-students-peer-assessment/ As a backup communication to your students, we recommend that the material in the next section 'Advise your students' is emailed and/or posted to your students on your Learning Management System Messaging facility. Advise your students [Teacher, send this section by email and/or post on your LMS] +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ Dear students, The peer assessment << Activity Title>> is available for you to complete from <<Start Date >> and due for completion by <<Due Date>>. Complete the peer assessment at the survey URL <<Students Survey URL>>. You will be sent emails from @peerassesspro.com to advise you when feedback results are available and where to complete the survey. Please check your junk and spam email. Ensure you allow emails from @peerassesspro.com into your Important mailbox and add @peerassesspro.com as a Contact. You may find these Frequently Asked Questions (FAQs) relevant before you start the peer assessment, view here https://www.peerassesspro.com/frequently-asked-questions-2/. FAQ: What is the purpose of peer assessment? FAQ: How do I provide useful feedback to my team members? FAQ: What questions are asked in the peer assessment survey? FAQ: What if I am unable to login to the peer assessment survey? Students can find further information about peer assessment here https://www.peerassesspro.com/resources/resources-for-students/. Kind regards << Teacher fullname >> << Teacher email >> +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ Further information For additional advice on managing a Peer Assess Pro activity, review the Frequently Asked Questions here https://www.peerassesspro.com/frequently-asked-questions-2/ FAQ: When and how is the peer assessment conducted? FAQ: How do I correct the Team Composition in a running peer assessment activity? FAQ: How do I take action on the Active Warnings presented in the Peer Assess Pro™ Teacher’s Dashboard? What if I ignore the Warnings? FAQ: How do I decide which Personal Result method to apply in my peer assessment activity? FAQ: What is the content of emails sent by Peer Assess Pro? FAQ: Can I adjust the Start Date or Due Date for a running peer assessment activity? Kind regards, Peer Assess Pro https://www.peerassesspro.com/support/ |
2002 - ADVISORY - Facilitator - Manage progress SUBJECT - MANAGE PROGRESS: Your peer assessment <<Activity Title>>. Due by <<Due Date>> Dear << Teacher Fullname >>, Your peer assessment activity << Activity Title>> is available for students from <<Start Date >> and due for completion by <<Due Date>>. However, students can continue submitting responses beyond the Due Date until you personally FINALISE the activity on the Peer Assess Pro Dashboard. To manage this activity, view your Teacher's Peer Assess Pro Dashboard here <<Teacher's Activity URL>>. Review Active Warnings At this mid-point of the peer assessment process, we suggest you review carefully the Active Warnings on your Peer Assess Pro dashboard. In particular 1. Review the Class Statistics particularly for students rated with the lowest Peer Assessed Scores. The Personal Snapshots for these students may identify absent students or students at risk of course failure 2. Review teams that have not yet submitted sufficient responses to become assessed validly. Remind the team's members to submit 3. Review students who have rated a team member significantly differently than the other team members. This outlier rating may be a sign of dysfunction within the team 4. Review the Qualitative Feedback report, examining especially students who have been peer assessed with a low rating by other team member(s) 5. Review students with an OVERCONFIDENT or UNDERCONFIDENT Index of Realistic Self Assessment (IRSA). Suggest they meet with you to discuss their peer assessment results 6. Identify students or teams who have not engaged constructively with the peer assessment process. That is, they have rated their team members over a narrow range of scores or have rated their team members well above average. Encourage them to resubmit and justify the ratings they have provided. Enter team results At this stage you can enter your (provisional or final) Team Results. You can test the impact of the alternative methods for calculating students' Personal Results. Consider downloading the Statistics, Qualitative Feedback, and Teachers Feedback to preview the format of the final reports you will receive from Peer Assess Pro. Provisional publication of results Consider publishing provisionally the Personal Results for members of valid teams. Valid teams are those that have met the minimum required number of responses. You can preview students' Personal Snapshots in 'live' mode to see what students will see on their login dashboard before you Publish or Update the results. This is helpful for your quality management of the peer assessment process. Prepare your students for interpreting the results of their peer assessment These multi-media resources will help you prepare your students for making productive learning from their peer feedback and results. See https://www.peerassesspro.com/resources/introducing-students-peer-assessment/ Furthermore, the class may find these FAQs relevant from now. FAQ: How do I interpret the feedback results I've received from the peer assessment? FAQ: How do I provide useful feedback to my team members? FAQ: I don't understand what my teammates are trying to tell me. How do I ask for better feedback? FAQ: I believe I have been unfairly treated by the results of the peer assessment. How do I address my concern? Students can find further information about peer assessment here https://www.peerassesspro.com/resources/resources-for-students/.
Managing the peer assessment The Peer Assess Pro Quickstart Guide reminds you of the next steps you will take as you remind the remaining students to respond. See https://www.peerassesspro.com/quickstart-guide-for-teachers/ Further information For additional advice on managing a Peer Assess Pro activity, review the Frequently Asked Questions here: https://www.peerassesspro.com/frequently-asked-questions-2/ FAQ: How do I take action on the Warnings presented in the Peer Assess Pro™ Teacher’s Dashboard? What if I ignore the Warnings? FAQ: How do I decide which Personal Result method to apply in my peer assessment activity? FAQ: What is a valid assessed team? FAQ: How is an outlier peer assessment rating identified? FAQ: How is the Index of Realistic Self Assessment (IRSA) calculated? Kind regards, Peer Assess Pro https://www.peerassesspro.com/support/ |
2006 - ADVISORY - Facilitator - Due Date imminent SUBJECT - DUE DATE IMMINENT: Review your peer assessment <<Activity Title>>. Due by <<Due Date>> Dear << Teacher Fullname >>, You scheduled the peer assessment activity << Activity Title>> due for completion by <<Due Date>>. However, students can continue submitting responses beyond the Due Date until you personally FINALISE the activity. View your Teacher's Peer Assess Pro Dashboard here <<Teacher's Activity URL>>. Review Active Warnings Prior to publishing and finalising the peer assessment, we suggest you review carefully the Active Warnings on your Peer Assess Pro dashboard. In particular, 1. Review the Class Statistics particularly for students rated with the lowest Peer Assessed Scores. The Personal Snapshots for these students may identify absent students or students at risk of course failure 2. Review the Qualitative Feedback report, examining especially students who have been peer assessed with a low rating by other team member(s). There may be feedback comments upon which you wish to take proactive intervention with the assessor or assessed student 3. Review students with an OVERCONFIDENT or UNDERCONFIDENT Index of Realistic Self Assessment (IRSA). Suggest they meet with you to discuss their peer assessment results. Enter team results At this stage you may enter your Team Results. Next, confirm the method for calculating the Personal Result that will be awarded to each student. Prepare your students for interpreting the results of their peer assessment These multi-media resources will help you prepare your students for making productive learning from their peer feedback and results. See https://www.peerassesspro.com/resources/introducing-students-peer-assessment/ Furthermore, the class may find these FAQs relevant from now. FAQ: How do I interpret the feedback results I've received from the peer assessment? FAQ: I don't understand what my teammates are trying to tell me. How do I ask for better feedback? FAQ: I believe I have been unfairly treated by the results of the peer assessment. How do I address my concern? FAQ: How do I interpret measures of realistic self-assessment? Students can find further information about peer assessment here https://www.peerassesspro.com/resources/resources-for-students/. Further information For additional advice on managing and finalising a Peer Assess Pro activity, review the Frequently Asked Questions in the section 'Manage the peer assessment activity' here: https://www.peerassesspro.com/frequently-asked-questions-2/ FAQ: How do I take action on the Warnings presented in the Peer Assess Pro™ Teacher’s Dashboard? What if I ignore the Warnings? FAQ: What is a valid assessed team? FAQ: How is an outlier peer assessment rating identified? FAQ: What steps can I take to get a better personal result? Kind regards, Peer Assess Pro https://www.peerassesspro.com/support/ |
2008 - ADVISORY - Facilitator - Due Date reached SUBJECT - DUE DATE REACHED: Finalise your peer assessment <<Activity Title>>. Due by <<Due Date>> Dear << Teacher Fullname >>, You scheduled the peer assessment activity << Activity Title>> due for completion by <<Due Date>>. However, students can continue submitting responses beyond the Due Date until you personally FINALISE the activity. View your Teacher's Peer Assess Pro Dashboard here <<Teacher's Activity URL>>. Review Active Warnings Prior to publishing and finalising the peer assessment, we suggest you review carefully the Active Warnings on your Peer Assess Pro dashboard. In particular, 1. Review the Class Statistics particularly for students rated with the lowest Peer Assessed Scores. The Personal Snapshots for these students may identify absent students or students at risk of course failure 2. Review the Qualitative Feedback report, examining especially students who have been peer assessed with a low rating by other team member(s). There may be feedback comments upon which you wish to take proactive intervention with the assessor or assessed student 3. Review students with an OVERCONFIDENT or UNDERCONFIDENT Index of Realistic Self Assessment (IRSA). Suggest they meet with you to discuss their peer assessment results. Enter team results At this stage you should enter your Team Results and select the method for calculating the Personal Result that will be awarded to each student. Prepare your students for interpreting the results of their peer assessment These multi-media resources will help you prepare your students for making productive learning from their peer feedback and results. See https://www.peerassesspro.com/resources/introducing-students-peer-assessment/ Furthermore, the class may find these FAQs relevant from now. FAQ: How do I interpret the feedback results I've received from the peer assessment? FAQ: I don’t understand what my teammates are trying to tell me. How do I ask for better feedback? FAQ: I believe I have been unfairly treated by the results of the peer assessment. How do I address my concern? FAQ: How do I interpret measures of realistic self-assessment? Students can find further information about peer assessment here https://www.peerassesspro.com/resources/resources-for-students/. Further information For additional advice on managing and finalising a Peer Assess Pro activity, review the Frequently Asked Questions in the section 'Manage the peer assessment activity' here: https://www.peerassesspro.com/frequently-asked-questions-2/ FAQ: How do I take action on the Warnings presented in the Peer Assess Pro™ Teacher’s Dashboard? What if I ignore the Warnings? FAQ: What is a valid assessed team? FAQ: How is an outlier peer assessment rating identified? FAQ: What steps can I take to get a better personal result? Kind regards, Peer Assess Pro https://www.peerassesspro.com/support/ |
To access the Peer Assess Pro survey you require an activity-specific URL. In general, the format of the Activity URL is:
https://q.xorro.com/teacherid/activityid
Example
https://q.xorro.com/smup/23021
The teacherid is usually four letters, such as smup. The teacher is ALWAYS identified by the same letters in every activity they run.
The activityid is usually several digits, such as 23021.
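If you script against these URLs, this hypothetical Python helper assembles and sanity-checks an Activity URL of the shape described above. The letters-then-digits pattern is inferred from the examples, not an official validation rule.

import re

ACTIVITY_URL = re.compile(r"^https://q\.xorro\.com/([a-z]+)/(\d+)$")

def activity_url(teacher_id, activity_id):
    """Assemble an Activity URL from a teacherid and an activityid."""
    return f"https://q.xorro.com/{teacher_id}/{activity_id}"

def parse_activity_url(url):
    """Return (teacherid, activityid) if the URL matches the expected shape, else None."""
    match = ACTIVITY_URL.match(url)
    return match.groups() if match else None

print(activity_url("smup", 23021))                           # https://q.xorro.com/smup/23021
print(parse_activity_url("https://q.xorro.com/smup/23021"))  # ('smup', '23021')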
The Activity URL is provided to a student through:
The Participant URL lists ALL the activities currently running that have been started by one teacher. The format is a truncated form of the Activity URL. That is, no activityid, just the teacherid: https://q.xorro.com/teacherid. For example, https://q.xorro.com/smup
When everything is working correctly, you follow the link to the Activity URL. You should see the Login Page. Note the Activity title in the top left corner. That information should confirm you have the correct Activity URL for the peer assessment you are required to undertake.
Login to Peer Assess Pro Activity
Enter your ID. For students, this is usually your Student ID or Student Registration. Your teacher or facilitator will advise you if a different system of identification is being used.
Successful login confirms your name, and details about the institution and teacher that should be familiar to you!
Select ‘Next’ to proceed to the peer assessment.
Successful login to Peer Assess Pro Activity
Quick links and related information
FAQ: I am unable to login. My login failed
There are several reasons why a student’s login may fail. Steps to effect a remedy are detailed later.
This FAQ explains how to login correctly:
FAQ: How do I login to my Activity URL
FAQ: How do I correct the Team Composition in a running peer assessment activity?
Select the required Activity Title from the list of teacher’s activities
There is a rare exception that can prevent a student’s login. This exception occurs when there are two or more identical IDs in the Xorro institution participants database.
Search All Participants by ID to identify duplicate ID matches
8. Some other mysterious fault
If none of the previous explanations or solutions resolve the issue, contact Peer Assess Pro providing full details of the messages shown, activity, activity URL, participants CSV, and institution.
Quick links and related information
FAQ: How do I login to my Activity URL
The short answer is No, you cannot adjust either of these dates. However, there are workarounds described below.
When the Start Date and time is reached, a number of emails is despatched automatically to the students in the class, advising them of the activity and how to complete it.
Since there is NO MECHANISM to recall the despatched emails, the sole workaround to adjust the Start Date is to abandon the peer assessment, then launch a new peer assessment. The abandon process is detailed below under ‘Worst case scenario: Abandon the peer assessment’.
The teacher establishes the deadline for completing a peer assessment when the activity is first created then launched. The deadline is termed the Due Date.
You cannot extend the Due Date once the activity has been launched. However, the Due Date is advisory only. See later!
The Due Date is the date by which students are advised they should complete the peer assessment. The Due Date is advised to students through:
The Due Date is also used to prompt the teacher to conduct important administration and management activities during the activity, and prior to Finalisation.
The Due Date is advisory only. Students can CONTINUE to submit responses beyond the Due Date UNTIL the teacher Finalises or Abandons the activity. After Finalisation, students have up to two weeks to review their results.
Given that the Due Date is advisory only, we suggest you announce in class or by email that you have ‘extended’ the Due Date for peer assessment submissions until some future date you select. On that date you can then choose to Finalise the peer assessment. Check the progress results first!
Abandoning a peer assessment is a worst-case, last-ditch measure that we advise against.
However, if you insist on changing the Start Date or the Due Date, then, from the Peer Assess Pro Dashboard, abandon the activity.
The Activity URL will become invalid. The students will be advised the peer assessment has been abandoned. All survey results collected to date WILL BE nulled.
Now launch a new activity with the revised Start Date and/or Due Date. A few students may be confused by receiving both the (old, abandoned) peer assessment Activity URL and the superseding peer assessment request. However, only the new activity will be available to students. Furthermore, the URL link to the abandoned activity will direct the student to a list of all Xorro running activities initiated by that teacher. Hopefully, you have clearly identified the title of the peer assessment activity so that students can select it correctly.
Quick links and related information
The importance of correctly selecting the Start Date and Due Date is detailed in the Reference Manual:
2.3 Launch and create the peer assessment activity
About Finalisation and Abandonment
4. Finalise the peer assessment activity
“I am frustrated beyond reason when it comes to creating or editing my teamset csv file!”
When you attempt to IMPORT or UPDATE a teamset csv into Peer Assess Pro, you may be unsuccessful for a variety of reasons. This table lists the most common issues.
Problem | Reason and solution |
My specified file is rejected | The Participants CSV file specified is not a comma-separated variables (csv) file. You MUST supply a csv file. You cannot attempt to launch using an xls, tsv, pdf or other file type. Here is a sample.csv file from the Xorro site. |
I get a message like ‘missing group’ | The first row of your Participants CSV file must contain these column headers: id, first, last, email, team, group_code. The headers are REQUIRED, may appear in any order, AND must be separated with a comma or semicolon delimiter. Check your file follows the rules here Is my Participants CSV in the correct format for launching a peer assessment activity? |
I don’t understand the importance of column headers | In a Participants CSV file intended for use in a peer assessment, the first row must specify literally these column headers: id, first, last, email, team, group_code. Ensure there are NO leading or trailing spaces or hidden characters in the headers. The subsequent data, presented row by row in the csv, is defined in Xorro Help Importing Participants, Teams and Groups |
I am completely mystified about the distinction between the group_code and team columns of data. | The team column designates the allocation of all the participants in the file who belong to the same team, such as Tiger in the sample.csv illustrated later. The group_code designates the allocation of all the participants in the file who belong to a higher-order arrangement, such as the SAME class, a tutorial, an assignment. Explained exhaustively in Xorro Help Importing Participants, Teams and Groups |
I receive several error messages during the peer assessment launch process | When you upload or revise your team composition participants csv there is extensive quality assurance before you commit to launch or update your Peer Assess Pro activity. RED-coloured errors will halt the launch until you correct your csv. Other errors are warnings that allow you to ‘proceed with care’. Error notifications upon upload of a teamset CSV to a peer assessment |
Other mystifying messages | Here is a comprehensive list of all the potential errors that can occur when you attempt to import your teamset csv file, and how to correct those errors Comprehensive list of potential errors when attempting to import participants csv |
Give me confidence I’m following the correct steps | Practice downloading the sample.csv file from the Xorro site. Then launch the peer assessment using the same unchanged sample file. After building confidence that the process works, use your spreadsheet editor or text editor to make changes such as replacing the email addresses with those you know, and adding new team members. Practice saving as a csv. Confirm by opening your csv with a text editor that you have created a file with the extension and format csv. Now, launch the peer assessment using your adjusted Participants csv. Is my Participants CSV in the correct format for launching a peer assessment activity? |
When all else fails | You could contact Peer Assess Pro at https://www.peerassesspro.com/contact-us/, providing full details of the messages shown, the activity, the Activity URL, your participants csv, and your institution. |
You create or edit a Participants CSV file using a spreadsheet editor (Google Sheets, Apple Numbers, Microsoft Excel) or a text editor (Apple Textedit, or Microsoft Windows Notepad).
After you have created your arrangement of participants into teams using your editor, you must create a CSV file. For example, see FAQ: How do I create a CSV file from a Google Sheet?
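If you prefer to generate the file programmatically rather than export it from a spreadsheet, a minimal Python sketch such as this guarantees a comma-delimited file with the exact header row required. The file name and the rows are placeholders.

import csv

HEADERS = ["id", "first", "last", "email", "team", "group_code"]

participants = [
    ["AMTO01", "Amanda", "Tolley", "Amanda.Tolley@noreply.com", "Bear", "123.101"],
    ["ALJO11", "Alice", "Jones", "Alice.Jones@noreply.com", "Panda", "123.101"],
    ["BOWI12", "Bob", "Wilson", "Bob.Wilson@noreply.com", "Tiger", "123.101"],
]

with open("participants.csv", "w", newline="") as f:
    writer = csv.writer(f)   # writes comma-delimited rows by default
    writer.writerow(HEADERS)
    writer.writerows(participants)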
Download the Xorro Participants CSV sample file here https://www.xorro.com/wp-content/uploads/2020/10/Participant_CSV_Sample.csv
id,first,last,group_code,team,email,
BOWI12,Bob,Wilson,123.101,Tiger,Bob.Wilson@xorroinstitution.com,
ALJO11,Alice,Jones,123.101,Panda,Alice.Jones@xorroinstitution.com,
JOSM13,John,Smith,123.101,Tiger,John.Smith@xorroinstitution.com,
JOSM13,John,Smith,123.202,,John.Smith@xorroinstitution.com,
GRGR15,Greta,Green,123.101,Panda,Greta.Green@xorroinstitution.com,
GRGR15,Greta,Green,123.204,,,
HEJO19,Henry,Jones,123.101,Tiger,Henry.Jones@xorroinstitution.com,
AMTO01,Amanda,Tolley,123.101,Bear,Amanda.Tolley@xorroinstitution.com,
JEWA06,Jeff,Wang,123.101,Panda,Jeff.Wang@xorroinstitution.com,
HOBR03,Holly,Brown,123.101,Bear,Holly.Brown@xorroinstitution.com,
HOBR03,Holly,Brown,123.202,,Holly.Brown@xorroinstitution.com,
THWI18,Thomas,Windsor,123.101,Tiger,Thomas.Windsor@xorroinstitution.com,
ANWO08,Anna,Worth,123.101,Bear,Anna.Worth@xorroinstitution.com,
ANWO08,Anna,Worth,123.202,,Anna.Worth@xorroinstitution.com,
ANWO08,Anna,Worth,123.204,,,
For the purposes of improving the explanation that follows, the Xorro sample file has been sorted in Google Sheets by id, group_code, then team within group_code. Download the sorted Participants csv here.
id | first | last | group_code | team | email |
AMTO01 | Amanda | Tolley | 123.101 | Bear | Amanda.Tolley@xorroinstitution.com |
ANWO08 | Anna | Worth | 123.101 | Bear | Anna.Worth@xorroinstitution.com |
HOBR03 | Holly | Brown | 123.101 | Bear | Holly.Brown@xorroinstitution.com |
ALJO11 | Alice | Jones | 123.101 | Panda | Alice.Jones@xorroinstitution.com |
GRGR15 | Greta | Green | 123.101 | Panda | Greta.Green@xorroinstitution.com |
JEWA06 | Jeff | Wang | 123.101 | Panda | Jeff.Wang@xorroinstitution.com |
BOWI12 | Bob | Wilson | 123.101 | Tiger | Bob.Wilson@xorroinstitution.com |
HEJO19 | Henry | Jones | 123.101 | Tiger | Henry.Jones@xorroinstitution.com |
JOSM13 | John | Smith | 123.101 | Tiger | John.Smith@xorroinstitution.com |
THWI18 | Thomas | Windsor | 123.101 | Tiger | Thomas.Windsor@xorroinstitution.com |
ANWO08 | Anna | Worth | 123.202 | | Anna.Worth@xorroinstitution.com |
ANWO08 | Anna | Worth | 123.204 | | |
GRGR15 | Greta | Green | 123.204 | | |
HOBR03 | Holly | Brown | 123.202 | | Holly.Brown@xorroinstitution.com |
JOSM13 | John | Smith | 123.202 | | John.Smith@xorroinstitution.com |
Note these features of the Participants CSV, illustrated by the two technically identical samples:
It is permissible in the file to have a unique individual allocated to several separate group_codes (classes). The participant ANWO08 Anna Worth is allocated to all three group_codes: 123.101, 123.202, and 123.204.
When imported into Xorro during the launch of a peer assessment activity, this example file will give you the option of selecting which group_code will be used for the peer assessment: 123.101, 123.202, or 123.204. This capability enables tremendous flexibility for peer assessment management when you have, for example, large classes with many arrangements and team assignments. Discuss your requirements with members of Peer Assess Pro Ltd.
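If you maintain one master participants file, a short filter extracts the teamset for a single group_code, mirroring the selection Xorro offers at launch. A minimal Python sketch, assuming a file named participants.csv with the standard headers:

import csv

def teamset_for(path, group_code):
    """Return the participant rows belonging to one group_code (teamset)."""
    with open(path, newline="") as f:
        return [row for row in csv.DictReader(f) if row["group_code"] == group_code]

for row in teamset_for("participants.csv", "123.101"):
    print(row["id"], row["team"])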
The team column designates the allocation of all the participants in the file who belong to the same team, such as Tiger in the example above.
The group_code designates the allocation of all the participants in the file who belong to a higher-order arrangement, such as the SAME class, a tutorial, an assignment.
More detail in Xorro Help: Importing Participants, Teams and Groups
When you upload or revise your participants csv, the Xorro Import operation provides extensive quality assurance before you commit to launching or updating your peer assessment activity. For example, you’ll get RED-coloured error warnings, and the launch will not proceed, when
You also receive ORANGE-coloured advisory warnings. In these cases, the launch can proceed, but take care! For example
You receive Advisory notifications when you make a change to a student’s data, such as their id, team or email. The launch can proceed, but take care! The following example shows the different classes of Fatal, non-critical and Advisory notifications displayed to a Facilitator attempting to import a Participants CSV that updates previous data.
Confirm the notifications make sense to you before committing to the launch or update. Did you really want to change that email? Did you really want to change someone’s id or name? Did you really mean to have two or more people share the same email? (that is permitted!).
Items in red will prevent the peer assessment activity from launching. Correct the identified errors
Items in orange will allow you to proceed to launch. Proceed with care!
Items in green are not notified during the launch process. The launch will proceed. However, you might check your source csv file for these symptoms of a potential problem.
Short Descriptor | Active Warning Message | Recommended Action |
Import file must be csv | The file you supplied is not in the required format of csv, comma separated variables. | Create a comma separated variable (csv) file from your source data. Attempt re-import. |
Column headers incorrect | The csv file does not include the required data description header row. | Ensure the Participants csv header row includes id, first, last. Optionally, include email, team, group_code. |
Column headers incorrect for teamset | The csv file does not specify the required data description headers for a peer assessment teamset. | A Participants csv that includes teamset data requires the header row to include id, first, last, email, team, group_code. Remove all trailing or leading spaces or hidden characters. |
Missing values in csv | Values must always be provided for id, first and last. | Supply the required data into the csv. |
Missing values in teamset | In a teamset, values must always be provided for id, first, last, email, team and group_code. | Supply the required data into the csv. |
Id repeated within teamset | An id is restricted to one use within a teamset. | Remove or correct the duplicate ids in the csv. |
Id contains invalid characters | An id contains invalid characters. | Remove invalid characters from the id data such as blanks, tilde (~), ampersand (&). Best practice: - Begin the id with a letter - Use no spaces - Use only alphanumeric characters - Optionally, use these special characters @ $ # _ |
Email is invalid | An email address is not valid. | Correct the email address. |
Data record includes invalid characters | A data record contains invalid characters. | Remove invalid characters from the data such as tilde (~), ampersand (&). Best practice: - Use only alphanumeric characters - Optionally, use only these special characters @ $ # _ - Blanks may be used within the data |
Group_code data missing for teamset | A data record containing team data has a missing group_code. | Supply a group_code for all rows of data containing team data. |
Team size is too small | Team size is too small for effective peer assessment. A minimum of three participants per team is required. | Check to confirm you have allocated participants to the correct teams. Caution: teams of less than three participants will receive limited feedback results. |
Email data is missing | A participant who is a member of a teamset has no email specified. | Check to confirm that the participant has a valid email address. Caution: It is not good practice to proceed with missing emails in a teamset. |
Email repeated within teamset | An email has been used multiple times within a teamset. | Good practice: Ensure that each participant within a teamset has a different email. Caution: It is not good practice to proceed with duplicated emails in a teamset. |
Group_code data is missing in csv | A data record has missing group_code. | Caution: Good practice suggests that a group_code is provided for all rows of data. |
Name repeated within teamset | A first and last name pair has been used more than once within the same teamset. | Check to confirm that participants named similarly are different people. |
Redundant header descriptions | The CSV header row contains column headers and data that will be ignored by Xorro. | Check to confirm the column headers are correct. If you proceed now, the data in the additional columns will be ignored. |
Team repeated within csv | The same team name is associated with several group_codes (teamsets). | Confirm your use of the same team name within several group_codes is correct. |
Id repeated within csv | The same id is associated with several group_codes. | Check to confirm your multiple use of ids within the Participants csv. |
Name repeated within csv | The same name is associated with several group_codes. | Check to confirm your multiple use of similar names within the Participants csv. |
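For readers who pre-check their files before upload, this partial Python sketch encodes a few of the checks paraphrased from the table above: required headers, id hygiene, duplicate ids, and minimum team size. It checks the whole file rather than each teamset separately, and the rules are our paraphrase, not Xorro's source.

import csv
import re
from collections import Counter

REQUIRED = {"id", "first", "last", "email", "team", "group_code"}
ID_OK = re.compile(r"^[A-Za-z][A-Za-z0-9@$#_]*$")   # best-practice id shape from the table
MIN_TEAM_SIZE = 3

def check_participants(path):
    """Return a list of problems found in a participants csv."""
    problems = []
    with open(path, newline="") as f:
        reader = csv.DictReader(f)
        headers = {h.strip() for h in (reader.fieldnames or []) if h}
        if not REQUIRED <= headers:
            return [f"Missing headers: {sorted(REQUIRED - headers)}"]
        rows = list(reader)
    ids = Counter(row["id"] for row in rows)
    problems += [f"Id repeated: {i}" for i, n in ids.items() if n > 1]
    problems += [f"Id contains invalid characters: {i}" for i in ids if not ID_OK.match(i)]
    teams = Counter(row["team"] for row in rows if row["team"])
    problems += [f"Team too small: {t} ({n} members)" for t, n in teams.items() if n < MIN_TEAM_SIZE]
    return problems

for problem in check_participants("participants.csv"):
    print(problem)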
Quick links and related information
FAQ: How do I create a CSV file from a Google Sheet?
FAQ - Problems editing and creating participants CSV files
Reference Guide Section 2.2 Create the peer assessment Participants CSV
Comma-separated values. (2020). In Wikipedia. https://en.wikipedia.org/w/index.php?title=Comma-separated_values&oldid=940422860
decimal point—Wiktionary. (n.d.). Retrieved 25 February 2020, from https://en.wiktionary.org/wiki/decimal_point
Delimiter-separated values. (2019). In Wikipedia. https://en.wikipedia.org/w/index.php?title=Delimiter-separated_values&oldid=916302992
List of text editors. (2020). In Wikipedia. https://en.wikipedia.org/w/index.php?title=List_of_text_editors&oldid=935749166
Zobel, D. (2010, March 11). I have trouble opening CSV files with Microsoft Excel. Is there a quick way to fix this? Paessler Knowledge Base. https://kb.paessler.com/en/topic/2293-i-have-trouble-opening-csv-files-with-microsoft-excel-is-there-a-quick-way-to-fix-this
Zobel, D., & Schoch, G. (2014, November 4). Trouble With Opening CSV Files With Excel? The Comma and Semicolon Issue in Excel Due to Regional Settings for Europe. https://kb.paessler.com/en/topic/2293-i-have-trouble-opening-csv-files-with-microsoft-excel-is-there-a-quick-way-to-fix-this#reply-5193
“I am frustrated beyond reason when it comes to creating or editing my teamset csv file!”
This technical note is now largely obsolete: the Xorro teamset CSV import accepts either comma (,) or semicolon (;) delimited CSV files.
This FAQ should address most remaining issues:
FAQ - I’m having problems importing my participants csv
This technical note refers to difficulties creating, editing, and importing the Participants CSV.
These difficulties are associated with factors including the Language and Region settings of your computer.
One reason for these mysterious behaviours is that Xorro and Peer Assess Pro formerly required that a CSV file be delimited STRICTLY with the comma (,). However, in regions such as continental Europe and South Africa, the semicolon (;) is used as the delimiter in a so-called Comma-Separated Values file. It’s complicated! The delimiter your computer produces is determined by its Language and Region settings.
Peer Assess Pro has since provided a practical solution to this issue of regional language and display settings: the Xorro teamset CSV import now accepts either comma (,) or semicolon (;) delimited CSV files.
Here are two alternative workaround solutions that applied before this problem was resolved.
After you have edited and EXPORTED your CSV from Numbers or Microsoft Excel, inspect it in a plain text editor. A semicolon-delimited export looks like this:
id;first;last;group_code;team;email
BOWI12;Bob;Wilson;123.101;Tiger;Bob.Wilson@xorroinstitution.com
ALJO11;Alice;Jones;123.101;Panda;Alice.Jones@xorroinstitution.com
JOSM13;John;Smith;123.101;Tiger;John.Smith@xorroinstitution.com
JOSM13;John;Smith;123.202;;John.Smith@xorroinstitution.com
GRGR15;Greta;Green;123.101;Panda;Greta.Green@xorroinstitution.com
GRGR15;Greta;Green;123.204;;
HEJO19;Henry;Jones;123.101;Tiger;Henry.Jones@xorroinstitution.com
AMTO01;Amanda;Tolley;123.101;Bear;Amanda.Tolley@xorroinstitution.com
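If you are unsure which delimiter your exported file uses, the delimiter can be detected and normalised before upload. Below is a minimal sketch in Python using the standard library's csv.Sniffer; the file names teamset_in.csv and teamset_out.csv are illustrative.

import csv

# Detect whether the exported file is comma- or semicolon-delimited,
# then rewrite it strictly comma-delimited.
with open('teamset_in.csv', newline='', encoding='utf-8') as f:
    sample = f.read(2048)
    dialect = csv.Sniffer().sniff(sample, delimiters=',;')
    f.seek(0)
    rows = list(csv.reader(f, dialect))

with open('teamset_out.csv', 'w', newline='', encoding='utf-8') as f:
    csv.writer(f, delimiter=',').writerows(rows)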
The following three screenshots illustrate how to adjust your Apple Mac system region to the UK region. The UK (and several former British colonies such as the US, Canada, and New Zealand) uses the period, full stop, or point (.) as the decimal mark (radix or separatrix) in a number such as 12,345.678, that is, twelve thousand three hundred and forty-five point six seven eight. Consequently, the comma (,) is the system default data delimiter in a CSV file. In contrast, South Africa and European countries outside the UK, such as France and Germany, use the comma (,) as the decimal mark in a number such as 12 345,678. Consequently, the semicolon (;) is the CSV data delimiter in those regions.
Resolving regional language and display settings, such as CSV delimiters, looked a simple matter, but here there be dragons! RESOLVED: the Participants CSV import now accepts both comma (,) and semicolon (;) as the delimiter in a CSV file.
Quick links and related information
FAQ: How do I create a CSV file from a Google Sheet?
2.2 Create the peer assessment Participants CSV
About Comma-separated values (CSV)
Comma-separated values. (2020). In Wikipedia. https://en.wikipedia.org/w/index.php?title=Comma-separated_values&oldid=940422860
decimal point—Wiktionary. (n.d.). Retrieved 25 February 2020, from https://en.wiktionary.org/wiki/decimal_point
Delimiter-separated values. (2019). In Wikipedia. https://en.wikipedia.org/w/index.php?title=Delimiter-separated_values&oldid=916302992
List of text editors. (2020). In Wikipedia. https://en.wikipedia.org/w/index.php?title=List_of_text_editors&oldid=935749166
Zobel, D. (2010, March 11). I have trouble opening CSV files with Microsoft Excel. Is there a quick way to fix this? Paessler Knowledge Base. https://kb.paessler.com/en/topic/2293-i-have-trouble-opening-csv-files-with-microsoft-excel-is-there-a-quick-way-to-fix-this
Zobel, D., & Schoch, G. (2014, November 4). Trouble With Opening CSV Files With Excel? The Comma and Semicolon Issue in Excel Due to Regional Settings for Europe. https://kb.paessler.com/en/topic/2293-i-have-trouble-opening-csv-files-with-microsoft-excel-is-there-a-quick-way-to-fix-this#reply-5193
The Active Warning ‘Email rejected, missing or invalid - Notifications will FAIL to be delivered’ is generated when the email specified in the teamset used to create the peer assessment activity is either (a) rejected by the recipient’s email service, (b) missing from the teamset, or (c) invalid in some aspect.
For ANY of the above conditions, the Survey Notifications History will subsequently report the email as having FAILED to deliver to that recipient.
We have received reports that emails sent to email addresses hosted by certain Microsoft services may be rejected, and then reported by Peer Assess Pro as ‘rejected, missing, or invalid’. The message may be generated even when the email has been confirmed as correct by the student or teacher.
Importantly, when an email address is reported by a Peer Assess Pro Active Warning as ‘invalid’ through being blocked by the recipient’s email service, then emails sent from the Peer Assess Pro platform to these addresses, even if correct, WILL NOT BE DELIVERED.
Similarly, if an email to a recipient is reported in the Survey Notifications Log as having FAILED to be delivered, that is a symptom the email service of the recipient has rejected or blocked emails from the Peer Assess Pro platform.
Consequently, please avoid using email addresses at domains that include:
@yahoo.com - sometimes successful
@live.com - sometimes successful
@hotmail.com
@msn.com
@outlook.com
We recommend that, in your team composition teamset.csv, you use email addresses such as the following (a quick check for at-risk domains is sketched after this list):
Your institution’s designated email address for your students
@gmail.com
@me.com
@qq.com
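Before launching, you can scan your teamset for the at-risk domains listed above. A minimal sketch in Python, assuming a comma-delimited file named teamset.csv with an email column; the file name is illustrative and the domain list mirrors the advice above.

import csv

# Domains reported as sometimes rejecting platform emails (see list above).
RISKY_DOMAINS = {'yahoo.com', 'live.com', 'hotmail.com', 'msn.com', 'outlook.com'}

with open('teamset.csv', newline='', encoding='utf-8') as f:
    for row in csv.DictReader(f):
        email = (row.get('email') or '').strip().lower()
        domain = email.rsplit('@', 1)[-1] if '@' in email else ''
        if domain in RISKY_DOMAINS:
            print(f"Warning: {row.get('first')} {row.get('last')} uses {email}; "
                  "delivery may fail")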
Peer Assess Pro is progressing work to overcome emails being rejected by Microsoft’s email services.
In the meantime, the following table provides recommended actions when you receive the warning ‘Email rejected, missing or invalid’. The Survey Notifications History provides further detail about the status of an email’s delivery to each recipient.
Email status | Explanation | Facilitator’s action |
Sent | Peer Assess Pro has attempted to send the notification to a recipient. The email is still percolating through the internet attempting to find the recipient’s mailbox. | Wait a few minutes. Reload the dashboard then review the Survey Notification History to update the status. |
Delivered | Peer Assess Pro has successfully delivered the email notification to a recipient… but the recipient may not have seen the message yet. | No action required. |
Opened | The recipient appears to have opened the email… but you can’t be completely sure! | This advisory depends on settings enabled by the recipient. The OPENED email status is not an entirely reliable indicator. |
Failed - temporary | Peer Assess Pro has made at least one attempt to deliver an email but has not yet made a successful delivery. Further attempts to deliver will be made up to certain limits. | Wait patiently. Check again after 24 hours. |
Failed - permanent | Peer Assess Pro has made several attempts to deliver an email but has not made a successful delivery. No further attempt to deliver will be made. The email may have failed to deliver because it was (a) rejected by the recipient’s mail server, (b) invalid in some aspect, or (c) missing. | Try one of the actions as for Rejected, Missing, or Invalid below. Avoid email addresses at domains such as @yahoo.com, @hotmail.com, @msn.com, @outlook.com, and @live.com. |
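When reviewing a long Survey Notifications History, the recommended follow-up per status lends itself to a simple lookup. A minimal sketch in Python; the status labels and the notifications list are illustrative, mirroring the table above.

# Recommended follow-up per email status, mirroring the table above.
ACTIONS = {
    'sent': 'Wait a few minutes, then reload the dashboard and re-check.',
    'delivered': 'No action needed; the recipient may not have read it yet.',
    'opened': 'Likely read, but the indicator is not entirely reliable.',
    'failed-temporary': 'Wait and check again after 24 hours.',
    'failed-permanent': 'Correct or replace the address; avoid at-risk domains.',
}

notifications = [
    ('Bob.Wilson@xorroinstitution.com', 'delivered'),
    ('Alice.Jones@xorroinstitution.com', 'failed-permanent'),
]

for recipient, status in notifications:
    print(f'{recipient}: {status} -> {ACTIONS.get(status, "Unknown status")}')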
Quick links and related information
This FAQ reports the status of emails sent from the platform, such as SENT, DELIVERED, or FAILED:
FAQ - What emails have been sent by the platform?
Adjust the emails for a recipient using this FAQ:
FAQ: How do I correct the Team Composition in a running peer assessment activity?
Context: LMS implementations of Peer Assess Pro.
This Active Warning arises when the arrangement of students and teams on the learning management system fails to match the teams in the launched peer assessment. Peer Assess Pro has effective mechanisms for resolving such issues as late student enrolments, withdrawals and inactive students.
A mismatch arises because one or more of the following events has happened.
1. One or more new class members enrolled on the LMS have not yet been arranged by the teacher into a team on the LMS.
2. Since launching the peer assessment, team membership has been adjusted by the teacher on the LMS for some teams, which now fail to match the team arrangement in the peer assessment.
3. Previously-known class members have been removed from the class on the LMS, but still remain registered within the peer assessment arrangement.
4. Teams have been added, deleted, or renamed on the LMS. In general, the teacher should avoid renaming teams in a running activity, as the team’s survey responses may be deleted.
Peer Assess Pro LMS regularly checks that the team membership arrangement on the LMS matches the team arrangement (number of teams, membership of teams) in the currently active peer assessment. When there is a mismatch, Peer Assess Pro raises the following Active Warning on the Teacher’s Dashboard:
Active Warning 0021: Team arrangement unsynchronised
The arrangement of students and teams on the learning management system fails to match the teams in the launched peer assessment.
The warning also lists the specific mismatches detected, as illustrated in the example scenario later in this section.
Peer Assess Pro provides several actions to resolve the mismatches identified, as described below.
A team arrangement refers to the memberships of teams (groups) by participants (teammates, students). Depending on your Learning Management System (LMS), these terms may be used to describe a team arrangement: teamset, grouping (Moodle), or group set (Canvas).
Review team composition
The teacher can review the team arrangement as known by Peer Assess Pro LMS in the current running survey. Additionally, the teacher can selectively choose which teams should be synchronised between the LMS and the Peer Assess Pro platform.
In the Team Composition view, symbols indicate teams that fail to match the arrangement configured on the LMS. The symbol (-) signifies that a team member has been withdrawn from the team on the LMS. In general, it is safe to synchronise such teams.
Most crucially, if a student has been allocated to a team by the teacher on the LMS subsequent to the peer assessment launch, then the symbol (+) will indicate a discrepancy that must be resolved. Students on the LMS not yet arranged into teams will also be listed in a ghost team named Unassigned, again with the (+) symbol.
The design principle of Peer Assess Pro is that adjustments to team arrangements must be made on the LMS using the standard LMS participant grouping modules. Then the teacher, optionally, commits those adjustments to a running peer assessment, thereby achieving synchronisation.
The teacher should use the standard LMS participants, groups, or groupings modules to view, compare, and adjust the team membership to that which is required, such as adding members to or removing members from teams. Following the LMS adjustment, the teacher returns to complete the Synchronise All action to update the corrected team arrangement in the peer assessment. Alternatively, individual teams can be selectively synchronised using Review Team Composition.
The Synchronise All action will update the team arrangement for all teams in the currently-running peer assessment to match the corresponding arrangement of the teams specified on the LMS.
Peer Assess Pro will conserve all responses for the team members as far as possible. If a team gains additional team member(s), then the existing team members will be prompted to submit additional response(s) for the additional team member(s). Specifically, the Email CRITICAL 0013: RESUBMIT peer assessment due to TEAM CHANGE will be dispatched to relevant team members. New class members now synchronised from the LMS into Peer Assess Pro will receive Email CRITICAL 0011: Request to COMPLETE peer assessment. Class members transferred from one team to another will also receive CRITICAL 0011: Request to COMPLETE peer assessment.
If a team loses one or more team members, then the remaining team members do not receive any communication as their responses can still be applied to the revised team arrangement.
A student must not be allocated to two or more teams within one Peer Assess Pro activity. Consequently, if a student in the Peer Assess Pro Activity is allocated on the LMS into two or more teams then the exclamation symbol (!) will display against all instances when you Review Team Composition. Team synchronisation will be forbidden until the student is allocated to one team within the proposed team arrangement.
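The symbols described above can be derived by comparing the two arrangements as mappings from team name to member set. A minimal sketch in Python, using illustrative data; students on the LMS not yet arranged into a team could be supplied under a team named Unassigned, and the function name and data are assumptions for illustration only.

from collections import Counter

def diff_arrangement(current, lms):
    """Annotate each team with (+) added, (-) dropped, and (!) multi-team members."""
    # A student allocated to two or more LMS teams blocks synchronisation.
    allocations = Counter(s for members in lms.values() for s in members)
    conflicted = {s for s, n in allocations.items() if n > 1}

    for team in sorted(set(current) | set(lms)):
        before, after = current.get(team, set()), lms.get(team, set())
        marks = []
        for s in sorted(before | after):
            if s in conflicted:
                marks.append(f'{s} (!)')
            elif s not in before:
                marks.append(f'{s} (+)')   # added: will be asked to (re)submit
            elif s not in after:
                marks.append(f'{s} (-)')   # dropped: remaining responses still apply
            else:
                marks.append(s)
        print(team, '|', ', '.join(marks))

current = {'Tiger': {'Bob WILSON', 'Henry JONES', 'John SMITH'}}
lms = {'Tiger': {'Henry JONES', 'John SMITH'},
       'Panda': {'Bob WILSON', 'Alice JONES', 'Greta GREEN'}}
diff_arrangement(current, lms)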
In general, avoid renaming teams in a running activity! In the special case that a team has been renamed AND the team composition remains the same, the renamed team can be applied with no further action: previous responses can be applied to the renamed team.
If a renamed team loses one or more team members, then the remaining team members do not receive any communication as their responses can still be applied to the revised team arrangement.
In all other cases, Peer Assess Pro may regard a renamed team as a new team. Whilst some responses may be recoverable, the team members will be prompted to resubmit their responses, as per Email CRITICAL 0013: RESUBMIT peer assessment due to TEAM CHANGE.
The teacher should postpone synchronising the team arrangement until they have checked and confirmed carefully that the arrangement of students into teams on the LMS is complete and accurate. Otherwise, students and the platform may raise numerous alerts about incorrect or unsynchronised team arrangements.
If the teacher anticipates receiving notification of additional changes to the team arrangement, they might postpone committing to synchronising the updates. However, note that Peer Assess Pro LMS will continue to receive survey responses from members of teams as currently arranged on the survey platform and shown in the Team Composition view.
This example scenario presents the case of a Peer Assess Pro activity launched early in a course. A week later, new students have been enrolled in the class, whilst others have withdrawn. To accommodate these changes, the teacher creates a new team, reassigns some existing team members, and allocates new team members to existing teams. The example shows how the mismatch between the LMS team arrangement is reported to the teacher through Peer Assess Pro’s Active Warning system. Finally, the teacher is supported to make an accurate update to the team arrangement required for their peer assessment activity delivered through the Peer Assess Pro platform.
A peer assessment has been launched with 19 students in 5 teams. The Team Composition status for Peer Assess Pro LMS at the point of launch shows:
Black Robins
Kamryn MILLER, Alexander SAMPSON, Mikaela RAY, Ramon MCKNIGHT
Brown Kiwis
Estrella HAWKINS, August DAUGHERTY, Nehemiah MCCONNELL
Grey Warblers
Joslyn HOOVER, Alyvia YANG, Mariyah POLLARD, Arianna SCHROEDER
Pukekos
Dorian SULLIVAN, Elisha NUNEZ, Muhammad HOLT, Skylar MCCLURE
Red Ruru
Alberto UNDERWOOD, Annika KLINE, June MCKINNEY, Jaylee MURRAY
One week later, Estrella HAWKINS has withdrawn from the class (-) and has been removed automatically from the LMS participants’ schedule. The teacher also chose to reallocate Kamryn MILLER from Team Black Robins into Team Brown Kiwis on the LMS.
Two new students have been enrolled into the course LMS, but remain unassigned to teams. Furthermore, the teacher has been alerted by a member of Team Pukekos that their team member Dorian SULLIVAN is inactive (?).
The teacher has renamed Team Red Ruru to Red Rooks, and added a new team, Waxeyes, comprising new class members Kael BRIDGES, Jonathan CHANG and Kyleigh COHEN.
Immediately upon recognising the discrepancy, Peer Assess Pro will alert the teacher to the mismatch between the LMS version of the team arrangement and the team arrangement underway on the Peer Assess Pro platform. The alert will be made through the Message Notification feature of their LMS, and through this Active Warning presented on the Teachers Dashboard.
Active Warning 0021:
Team arrangement unsynchronised
The arrangement of students and teams on the learning management system fails to match the teams in the launched peer assessment.
New class members enrolled on the LMS have not yet been arranged into a team on the LMS. Jill ROBERTSON, Jason SMITH.
Team membership has been adjusted on the LMS for some teams that no longer match the peer assessment arrangement. Teams Black Robins, Brown Kiwis, Red Rooks (renamed from Red Ruru)
New team(s) created on LMS: Teams Waxeyes
New class members enrolled on the LMS into new teams. Kael BRIDGES, Jonathan CHANG, Kyleigh COHEN into team Waxeyes.
Previously-known class members have been removed from the class on the LMS: Estrella HAWKINS in Team Brown Kiwis.
In support of the Active Warning, Peer Assess Pro reports the following Team Composition view, highlighting the lack of synchronisation between the LMS and the initial launch. The symbol (-) signifies that, following synchronisation, a student would be dropped from a team. The symbol (+) indicates that a student will be added. Adjustments, contingent upon synchronisation, are highlighted in red.
The (?) symbol denotes a team member or team for which either the Active Warning CRITICAL 0048: Inactive team member or CRITICAL 0006: Adjusted teamset request has been raised.
Black Robins
Kamryn MILLER (-), Alexander SAMPSON, Mikaela RAY, Ramon MCKNIGHT
Brown Kiwis
Estrella HAWKINS (-), August DAUGHERTY, Nehemiah MCCONNELL, Kamryn MILLER (+)
Grey Warblers
Joslyn HOOVER, Alyvia YANG, Mariyah POLLARD, Arianna SCHROEDER
Pukekos
Dorian SULLIVAN (?), Elisha NUNEZ, Muhammad HOLT, Skylar MCCLURE
Red Rooks (renamed from Red Ruru)
Alberto UNDERWOOD, Annika KLINE, June MCKINNEY, Jaylee MURRAY
Waxeyes (+)
Kael BRIDGES (+), Jonathan CHANG (+), Kyleigh COHEN (+)
Unassigned (?)
Jill ROBERTSON (+), Jason SMITH (+)
Having examined the previous Team Composition view, the teacher, on the LMS, assigns Jill ROBERTSON to Team Black Robins and Jason SMITH to Team Brown Kiwis. The teacher leaves Dorian SULLIVAN in Team Pukekos pending further investigation. In response to the teacher’s actions, Peer Assess Pro will show an updated view of the Team Composition and mismatches.
Before the teacher initiates synchronisation, the refreshed Team Composition view for the Peer Assess Pro activity will now show the updated team arrangement, combining the intentions newly-stated by the teacher on the LMS, and the current status of the peer assessment.
Black Robins
Kamryn MILLER (-), Alexander SAMPSON, Mikaela RAY, Ramon MCKNIGHT, Jill ROBERTSON (+)
Brown Kiwis
Estrella HAWKINS (-), August DAUGHERTY, Nehemiah MCCONNELL, Kamryn MILLER (+), Jason SMITH (+)
Grey Warblers
Joslyn HOOVER, Alyvia YANG, Mariyah POLLARD, Arianna SCHROEDER
Pukekos
Dorian SULLIVAN (?), Elisha NUNEZ, Muhammad HOLT, Skylar MCCLURE
Red Rooks (renamed from Red Ruru)
Alberto UNDERWOOD, Annika KLINE, June MCKINNEY, Jaylee MURRAY
Waxeyes (+)
Kael BRIDGES (+), Jonathan CHANG (+), Kyleigh COHEN (+)
The teacher views the foregoing Team Composition and is satisfied with the proposed team arrangement. The teacher commits to Synchronise All teams.
After the synchronisation completes successfully, the resulting refreshed Team Composition is revealed to the teacher.
Black Robins
Alexander SAMPSON, Mikaela RAY, Ramon MCKNIGHT, Jill ROBERTSON
Brown Kiwis
August DAUGHERTY, Nehemiah MCCONNELL, Kamryn MILLER, Jason SMITH
Grey Warblers
Joslyn HOOVER, Alyvia YANG, Mariyah POLLARD, Arianna SCHROEDER
Pukekos
Dorian SULLIVAN (?), Elisha NUNEZ, Muhammad HOLT, Skylar MCCLURE
Red Rooks
Alberto UNDERWOOD, Annika KLINE, June MCKINNEY, Jaylee MURRAY
Waxeyes
Kael BRIDGES, Jonathan CHANG, Kyleigh COHEN
The two teams Black Robins and Brown Kiwis now include the new classmates Jill ROBERTSON and Jason SMITH, and the relocated Kamryn MILLER. The founding members of teams Black Robins and Brown Kiwis will be alerted to the requirement to submit an updated survey response for their reconfigured teams (Email CRITICAL 0013: RESUBMIT peer assessment due to TEAM CHANGE), as will the relocated Kamryn MILLER.
The newly-enrolled class members Kael BRIDGES, Jonathan CHANG and Kyleigh COHEN will be advised to submit the peer assessment (Email CRITICAL 0011: Request to COMPLETE peer assessment).
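The dispatch rules in this example follow the general rules stated earlier: new or transferred class members receive CRITICAL 0011, while existing members of a team that gained members receive CRITICAL 0013. A minimal sketch in Python of that decision, using illustrative data from this scenario; the function name and parameters are assumptions for illustration only.

def notification_for(student, old_team_members, new_team_members, known_before):
    """Decide which email a student receives after synchronisation.

    old/new_team_members: membership of the student's (new) team before/after sync.
    known_before: True if the student was already in the activity before sync.
    """
    if not known_before or student not in old_team_members:
        # New to the activity, or transferred between teams.
        return 'CRITICAL 0011: Request to COMPLETE peer assessment'
    if new_team_members - old_team_members:
        # Team gained members: existing members must resubmit.
        return 'CRITICAL 0013: RESUBMIT peer assessment due to TEAM CHANGE'
    return None   # Team only lost members: existing responses still apply.

old = {'August DAUGHERTY', 'Nehemiah MCCONNELL', 'Estrella HAWKINS'}
new = {'August DAUGHERTY', 'Nehemiah MCCONNELL', 'Kamryn MILLER', 'Jason SMITH'}
for s in sorted(new):
    print(s, '->', notification_for(s, old, new, known_before=(s != 'Jason SMITH')))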
The Teacher will continue their investigations to determine what to do with Dorian SULLIVAN, denoted by (?) in Team Pukekos.
Quick links and related information
FAQ - What is an adjusted team arrangement request? WARNING 0006
FAQ - What is an inactive team member? WARNING 0048
FAQ - What emails have been sent by the platform?
Peer Assess Pro restricts the Teacher to using a standard rubric that yields several benefits.
Feature | Benefit |
Authoritative | The questions used in the survey are based on long-established research about the teamwork capabilities required (a) for effective teamwork by students and (b) by employers. |
In-class progress | Using the same rubric within a class for both formative and summative teammate peer assessment enables progress within the class to be measured. |
Calibration | A standard rubric enables at-risk students and teams to be readily identified, as the rubric provides for comparison of peer assessment results, especially the Peer Assessed Score, against calibrated benchmarks. |
Time-saving | Reduces the time needed to make decisions about what questions to deploy. |
Capacity development | Self-directed learning resources for students and teachers are developed more efficiently when a standardised set of teamwork and leadership capabilities are surveyed. |
Institutional progress | Results from one class can be compared with the results of another class or institution, according to a standard basis of measurement. |
Validation | A standardised, authoritative rubric supports claims that a course and/or academic programme delivers teamwork and leadership learning outcomes sought by accreditation agencies, such as the Washington Accord Graduate Profile. |
Scholarship | Insights drawn from scholarly research using a standardised assessment rubric can inform creative development of teaching and learning practices in several institutions. |
The ten questions used in the Peer Assess Pro survey, which form the basis for calculating the Peer Assessed Score, are adapted from:
Deacon Carr, S., Herman, E. D., Keldsen, S. Z., Miller, J. G., & Wakefield, P. A. (2005). Peer feedback. In The Team Learning Assistant Workbook. New York: McGraw Hill Irwin.
Quick links and related information
FAQ: What questions are asked in the peer assessment survey?
FAQ: How are peer assessment and personal results calculated and defined mathematically?
Context: Xorro-based survey
At the start of the Peer Assess Pro survey, a student is given the option to confirm that the team presented to them is correct, or to advise that they have been placed in an incorrect team.
In the second case, a notification email is immediately despatched to the teacher stating
In the activity <<Activity Name>>, <<Student Name>> claims that he/she is in an incorrect team. Please check and update teamset.
Example
In the activity "Ornithologists 101 Formative", Peter MELLALOO claims that he/she is in an incorrect team. Please check and update teamset.
In the normal workflow, the teacher will adjust the team composition according to
FAQ: How do I correct the Team Composition in a running peer assessment activity?
However, a student may mistakenly advise they are in an incorrect team. In this case, the student simply resumes the survey and selects ‘Actually they are correct (Proceed)’.
Quick links and related information
FAQ: How do I correct the Team Composition in a running peer assessment activity?
Please do not hesitate to ask us for help.
Patrick Dodd patrick@peerassesspro.com +64 21 183 6315
Peter Mellalieu peter@peerassesspro.com +64 21 42 0118 Skype myndSurfer
We especially welcome your advice on how the app and Reference Guide could be improved. We’d also like to know which features you expect to value highly in your teaching and use of Peer Assess Pro™.
Thank you for your participation.
Frequently Asked Questions
FAQs on the web at https://www.peerassesspro.com/frequently-asked-questions-2/
Quickstart Guide for Peer Assess Pro
https://www.peerassesspro.com/quickstart-guide-for-teachers/
Home/Table of Contents for Reference Manual
Website
https://www.peerassesspro.com/
Quick links and related information
[1] Hyperlinked to Xorro site. Teacher must be a registered Xorro user.
[2] Hyperlinked to the web version of the Reference Guide.
[3] Internal links within the pdf version of Reference Guide.
[4] Conditions of use apply to a free Xorro Account. See Discover Xorro-Q