Manage a Peer Assessment Activity using Xorro
Reference Guide for Teachers and Students
Version 2.9.1 2020-10-27
Peter Mellalieu peter@peerassesspro.com
+64 21 42 0118 Skype myndSurfer
Patrick Dodd patrick@peerassesspro.com
+64 21 183 6315
Follow these steps to register, launch, manage, and download the final gradebook for a Peer Assess Pro peer assessment using the Xorro Survey Management system.
Download hyperlinked at www.peerassesspro.com/quickstart-guide-for-teachers
Launching Peer Assess Pro™ using Xorro-Q
>>> Reference Guide at http://tinyurl.com/papRefPdf
Once logged in to Xorro-Q, you launch a peer assessment activity. During the launch process, you Import a Participants CSV that lists each team member with their team name, login id, and email. The Participants CSV is a comma-separated values (CSV) file, illustrated below, that must contain the column headers shown in the example. Peer Assess Pro emails an activity URL through which each team’s members complete their peer assessments. Timely reminders and personalised feedback reports are communicated to the students from Peer Assess Pro using the email addresses you provided in the Participants CSV.
id | first | last | email | team | group_code
AMTO01 | Amanda | Tolley | Amanda.Tolley@noreply.com | Bear | BUS123.101/PMell/TutB/2020-05-28/SUM
ANWO08 | Anna | Worth | Anna.Worth@noreply.com | Bear | BUS123.101/PMell/TutB/2020-05-28/SUM
HOBR03 | Holly | Brown | Holly.Brown@noreply.com | Bear | BUS123.101/PMell/TutB/2020-05-28/SUM
ALJO11 | Alice | Jones | Alice.Jones@noreply.com | Panda | BUS123.101/PMell/TutB/2020-05-28/SUM
GRGR15 | Greta | Green | Greta.Green@noreply.com | Panda | BUS123.101/PMell/TutB/2020-05-28/SUM
JEWA06 | Jeff | Wang | Jeff.Wang@noreply.com | Panda | BUS123.101/PMell/TutB/2020-05-28/SUM
BOWI12 | Bob | Wilson | Bob.Wilson@noreply.com | Tiger | BUS123.101/PMell/TutB/2020-05-28/SUM
HEJO19 | Henry | Jones | Henry.Jones@noreply.com | Tiger | BUS123.101/PMell/TutB/2020-05-28/SUM
JOSM13 | John | Smith | John.Smith@noreply.com | Tiger | BUS123.101/PMell/TutB/2020-05-28/SUM
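Opened in a plain text editor, the saved file for this example should contain exactly these comma-delimited lines, with the headers on the first line:

```
id,first,last,email,team,group_code
AMTO01,Amanda,Tolley,Amanda.Tolley@noreply.com,Bear,BUS123.101/PMell/TutB/2020-05-28/SUM
ANWO08,Anna,Worth,Anna.Worth@noreply.com,Bear,BUS123.101/PMell/TutB/2020-05-28/SUM
HOBR03,Holly,Brown,Holly.Brown@noreply.com,Bear,BUS123.101/PMell/TutB/2020-05-28/SUM
ALJO11,Alice,Jones,Alice.Jones@noreply.com,Panda,BUS123.101/PMell/TutB/2020-05-28/SUM
GRGR15,Greta,Green,Greta.Green@noreply.com,Panda,BUS123.101/PMell/TutB/2020-05-28/SUM
JEWA06,Jeff,Wang,Jeff.Wang@noreply.com,Panda,BUS123.101/PMell/TutB/2020-05-28/SUM
BOWI12,Bob,Wilson,Bob.Wilson@noreply.com,Tiger,BUS123.101/PMell/TutB/2020-05-28/SUM
HEJO19,Henry,Jones,Henry.Jones@noreply.com,Tiger,BUS123.101/PMell/TutB/2020-05-28/SUM
JOSM13,John,Smith,John.Smith@noreply.com,Tiger,BUS123.101/PMell/TutB/2020-05-28/SUM
```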
>>> Download example CSV, EXCEL, or Google Sheet
>>> More FAQs at www.peerassesspro.com/frequently-asked-questions-2
FAQ - Show me a quick video overview of the whole Peer Assess Pro system
FAQ - I’m having problems importing my participants csv
FAQ - How do I correct the Team Composition in a running peer assessment activity?
FAQ - What is the purpose of peer assessment?
FAQ - What questions are asked in the peer assessment survey?
Everyone
FAQs on the web at http://tinyurl.com/papFAQ
Download poster https://www.peerassesspro.com/infographic/
>>> Hyperlinked chart at http://tinyurl.com/papChart
Ask us for help, give us feedback, and request additional features.
https://www.peerassesspro.com/contact-us/
Patrick Dodd patrick@peerassesspro.com +64 21 183 6315
Peter Mellalieu peter@peerassesspro.com +64 21 42 0118 Skype myndsurfer
Mellalieu, P. J. (2020). How to teach using group assignments: The 7 step formula for fair and effective team assessment. Peer Assess Pro. https://www.peerassesspro.com/ebook
Example Peer Assessment Participants CSV File
Example Survey Questions for a Team Member
Most Frequently Asked Questions (FAQs)
Teammate peer assessment contributes to fair and effective team assignments
Teachers Process Flowchart: Overview
Support, Feedback and Contact
PEER ASSESS PRO REFERENCE GUIDE
1. Login to your Xorro HOME page
1.1 First time users: Register
Register a new Xorro Teacher’s Account as a Free Facilitator
Getting started with Xorro-Q
Extended free trial for New Zealand higher education institutions
1.2 Login from your registered Xorro Account
1.3 Orient yourself to the Xorro HOME Dashboard
1.4 Orient yourself to the Peer Assess Pro platform
Review the Peer Assess Pro facilitators dashboard
Overview of the steps required to launch a peer assessment
1.5 Peer Assess Pro system flowchart detail
2. Launch Peer Assessment activity
2.2 Create the peer assessment Participants CSV
Alternative Participants CSV templates
Instructions and column explanations for the peer assessment Participants CSV
Requirements for a peer assessment Participants CSV file
Create a CSV version of your Participants CSV file
Why won’t Xorro load my Participants CSV file?
Your spreadsheet editor will typically NOT create a CSV file, unless...
Good practice hint: Create distinctive group codes for every peer assessment activity you launch
Large, multi-cohort streams in a class
2.3 Launch and create the peer assessment activity
Select ACTIVITIES from the top menu bar
Good practice hint: Avoid using the Xorro default Due Date
The Due Date is advisory only
View the Peer Assess Pro Teacher’s dashboard
Invite team members to respond and other automated activities
2.4 Use a Teamset Group to launch a peer assessment
From the Xorro HOME page select the PARTICIPANTS page
Select ‘Import Participants’
Browse to your Team Members Group CSV file
Load, check and confirm correct team membership, then Import
Check class and team membership
3. Manage the Peer Assessment Activity
3.1 Action responses to warnings
Note: Changes to a Xorro Group have NO EFFECT on current Team Composition
3.2 Automated and manual notifications
3.4 Select the Personal Result Calculation Method
3.5 Review class, team, and individual statistics
Good practice hint: How to identify at risk students
The Individual Personal Snapshot
Four possible views of the Individual Personal Snapshot
3.6 Publish provisional Personal Results to team members
Results hidden when insufficient responses
4. Finalise the peer assessment activity
4.2 Publish Finalised Results to students
4.3 Download Teacher’s Gradebook of Results
4.4 Finalise the Activity … irrevocably!
Launch peer assessment activity
Manage the peer assessment activity
Responding to Active Warnings
Definitions, calculations, and examples
The purpose of peer assessment
Undertaking the peer assessment
Using the results from peer assessment for better performance
How peer assessment affects personal results
FAQ: What is the purpose of peer assessment?
Determination of course personal result
Criteria for peer assessment in Peer Assess Pro™
Peer Assess Pro assesses competencies valued by employers
FAQ: When and how is the peer assessment conducted?
Formative assessment: optional but valuable
FAQ: How do I provide useful feedback to my team members?
FAQ: I believe I have been unfairly treated by the results of the peer assessment. How do I address my concern?
Symptoms of an unfair assessment
Steps to address an unfair peer assessment
A note on appealing a peer assessment result
Prevention is better than cure
FAQ: How do I interpret the feedback results I've received from the peer assessment?
FAQ: I don’t understand what my teammates are trying to tell me. How do I ask for better feedback?
FAQ: What steps can I take to get a better personal result?
Use your institution’s academic support services
Raise your Peer Assessed Score
How do I address proactively the challenges of team work?
Learning constructively from mid-course peer assessment feedback
FAQ: What happens if I try to 'game' (fool? play? disrupt?) the peer assessment process?
Examples: Highly specific and individualized information
1. Low quality assessor rating
3. Outlier individual rating
4. Mismatched self-assessment
Example: Better feedback. Better teams
Which teams will raise the Active Warning: Low quality team rating?
Which teams tend to have a higher team result?
Which teams have worked most productively as a team?
Active Warnings, threshold parameters, and program logic
FAQ: Give me a quick overview of how to launch a Peer Assess Pro™ activity through Xorro
FAQ: What are the design objectives, key features, and benefits of the Peer Assess Pro development?
Peer Assess Pro™ is a work in progress
FAQ: How do I find the Peer Assess Pro Xorro Teacher’s dashboard?
Alternative method: ACTIVITIES: Running Activities
FAQ: How do I navigate the PARTICIPANTS page for Peer Assess Pro?
Orientation note: Select an existing Group
Inactive functions in PARTICIPANTS page
FAQ: How do I correct the Team Composition in a running peer assessment activity?
Take care! Here there be dragons!!
Correct the team composition
FAQ: Can I create a peer assessment activity without having all my teams correctly identified by team name and/or team membership?
FAQ: How do I create a CSV file from a Google Sheet?
Sample of participants csv file opened using a text editor
FAQ: How do I view a demonstration version of Peer Assess Pro?
FAQ: How do I correct the participants (team members) in a group already uploaded to Xorro?
Update a group’s team members for future use
Correct the team members associated with an existing Xorro TeamSet Group
FAQ: Where may I view the most recent version of the user guides?
Work in progress Google DOCS development version
Frequently Asked Questions for teachers and team members
Teachers Process Flowchart
FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?
FAQ: How are peer assessment and personal results calculated and defined mathematically?
Calculation methods that exclude a team result
Calculation methods that incorporate a team result from team outputs
FAQ: How do students know where and when to complete the peer assessment activity then review their results?
Automated communications to students
Alternative mode for student access to assessment and results
FAQ: How do I view and experience what the students experience?
View your student’s personal results directly from your Teacher’s Dashboard
View your students’ experience of the Peer Assess Pro™ survey
Enter your Participants’ URL into your browser
Select the activity you wish to experience
Log in using the Identification (id) of a student in the Team List Group used to create the activity
View a survey ready and waiting for responses
View a student’s published results
View the peer assessment survey for a demonstration class
FAQ: Why are different terms used to display peer assessment results in the Xorro and previous Google versions of Peer Assess Pro™?
FAQ: How do I take action on the Warnings presented in the Peer Assess Pro™ Teacher’s Dashboard? What if I ignore the Warnings?
Critical and catastrophic warnings!
Optional emails generated for team members
FAQ: When, why, and how do I Refresh and Update Results?
FAQ: What questions are asked in the peer assessment survey?
Example Peer Assessment Survey: Quantitative
Example Peer Assessment Survey: Qualitative
FAQ: How is the Peer Assessed (PA) Score calculated?
The self-assessment is excluded from calculating PA Score
Mathematical definition of Peer Assessed Score, PA Score
Example calculations of Peer Assessed Score
Alternative mathematical formulations of PA Score
Calculation from Average Rating
Calculation from Average Team and Leadership Contributions
FAQ: How is the self-assessment used to calculate Peer Assessed Score?
Spider chart of individual and averaged team peer ratings
Index of Realistic Self-Assessment (IRSA)
FAQ: How is the Peer Assessed Index (PA Index) calculated?
Mathematical definition of Peer Assessed Index
Example calculations of Peer Assessed Index
FAQ: How is the Indexed Personal Result (IPR) calculated?
Mathematical definition of Indexed Personal Result
Example calculations of Indexed Personal Result
FAQ: How is the Normalised Personal Result (NPR) calculated?
Mathematical definition of Normalised Personal Result
Example calculations of Normalised Personal Result
Impact of adjusting the Spread Factor on Normalised Personal Result
FAQ: How is the Rank-Based Personal Result (RPR) calculated?
Mathematical definition of Rank-Based Personal Result
Example calculations of Rank-Based Personal Result
Example calculation with tied ranks
FAQ: How is Standard Peer Assessed Score (SPAS) calculated?
Design features of Standard Peer Assessed Score
Example calculations of Standard Peer Assessed Score
Example charts for Standard Peer Assessed Score
Assumptions about Standard Peer Assessed Score
The impact of gaming peer assessment
FAQ: What is the influence on Standard Peer Assessed Score (SPAS) if a team rates ALL its members with a Peer Assessed Score of 100?
FAQ: Would a student receive the same Standard Peer Assessed Score (SPAS) if rated in another class?
FAQ: What is Employability? How is it calculated?
Mathematical calculation of Employability
Conditioning transformations to de-emphasise unsubstantiated precision
Example calculations of Employability
FAQ: How is the Index of Realistic Self Assessment (IRSA) calculated?
Mathematical definition of the Index of Realistic Self Assessment
Example calculations of the Index of Realistic Self Assessment
Why an IRSA of 100 is not a perfect score!
FAQ: How do I interpret measures of realistic self-assessment?
Interpreting the Index of Realistic Self Assessment (IRSA)
Developing an exceptionally realistic self-image, ERSI
What are the benefits of having an Exceptionally Realistic Self-Image?
What can get in the way of having an Exceptionally Realistic Self-Image?
How do I develop my Exceptionally Realistic Self-Image, ERSI?
FAQ: How is an outlier peer assessment rating identified? WARNING 0042
Threshold for warning of outlier individual peer rating
Alternative mathematical calculation of Assessor Impact
Alternative example calculations
FAQ: What is a mismatched self-assessment (IRSA)? WARNING 0040
Threshold for warning of mismatched self-assessment
Recommended action for facilitator
FAQ: What is a low quality team rating? WARNING 0050
Threshold for warning of low quality team rating
Recommended action for facilitator
FAQ: What is a low quality assessor rating? WARNING 0300
Threshold for warning of low quality assessor rating
Recommended action for facilitator
FAQ: What is a valid assessed team? WARNING 0022
Results not displayed to members of non-valid assessed teams
How many valid and invalid teams do I have?
Recommended action for facilitator
FAQ: What is an ‘at risk’ team member? WARNING 0044
Recommended action for facilitator
Threshold for warning of ‘at risk’ team member
Alternative approaches to identifying at risk students
FAQ - What emails have been sent by the platform?
Survey notifications history
Track-and-trace of emails to participants
FAQ: What is the content of emails sent by Peer Assess Pro to Participants?
Preview email from Active Warnings
Preview all emails available for sending
Table of email subjects sent to participants
Table of email body text sent to participants, listed by Email ID and Subject
FAQ: What is the content of emails sent by Peer Assess Pro to Facilitators?
Table of email subjects sent to facilitators
Table of email body text sent to facilitators, listed by Email ID and Subject
FAQ: How do I login to my peer assessment Activity URL?
Successful login through Activity URL
FAQ: I am unable to login. My login failed
Investigation and remedies for login failure
You entered your ID incorrectly
Your teacher or facilitator has entered your ID incorrectly
The Xorro Activity related to the Activity URL has not yet reached its Start Date
The Xorro Activity related to the Activity URL has been Finalised and Finished
The Xorro Activity related to the Activity URL has been Abandoned
The institution manager for Xorro has not maintained payment of the subscription to use Xorro and/or Peer Assess Pro
An exceptional system fault has occurred with the Xorro participants database entry for your ID: duplicate identical ids
FAQ: Can I adjust the Start Date or Due Date for a running activity?
The good news: The Due Date is advisory only
Advise students of your extended deadline
Worst case scenario: Abandon the peer assessment
FAQ - I’m having problems importing my participants csv
What are the common problems when importing a participants file?
Is my Participants CSV in the correct format for launching a peer assessment activity?
Sample of participants csv file opened using a text editor
Sorted Participants CSV viewed in Google Sheets
Error notifications upon upload of a participants CSV to a peer assessment
Examples of error notifications upon upload of a participants CSV to a peer assessment
Comprehensive list of potential errors when attempting to import participants csv
Potential errors in a participants csv
About Comma-separated values (CSV)
FAQ - Problems editing and creating participants CSV files
Trouble opening csv files with Excel due to regional settings
Solution 1: Use a simple text editor to find and replace semicolons with commas
Illustration using Mac TextEdit
Solution 2: Adjust Language and Region to use point as decimal marker
Illustration using Mac OS on an Apple computer
FAQ - How do I fix an invalid, missing or failed email delivery? WARNING 0026
Corrective action: avoid these emails
Corrective action: use these emails
Sign up as a Free Facilitator to trial the use of Peer Assess Pro using the Xorro-Q interface:
https://www.xorro.com/free_accounts/pap/new
For related information relevant to registering as a new facilitator:
New Zealand higher education institutions have an extended free trial period for the use of Peer Assess Pro. This free access is under an arrangement with Te Ako Aotearoa.
For further details contact Patrick Dodd at the offices of Peer Assess Pro.
After you log in, your Xorro HOME Dashboard page will display, as shown in Section 1.3 Orient yourself to the Xorro HOME Dashboard.
Now proceed to follow the steps in the Quickstart Guide, or the detailed explanations in Section 2. Launch Peer Assessment Activity
Quick links and related information
Section 1.3 Orient yourself to the Xorro HOME Dashboard
FAQ: How do I find the Peer Assess Pro Teacher’s dashboard?
Your Xorro HOME Dashboard page shows:
Quick links and related information
FAQ: How do I find the Peer Assess Pro Teacher’s dashboard?
Peer Assess Pro system flowchart detail
Peer Assess Pro system flowchart detail http://tinyurl.com/papChart
Each process box in the PDF version of the flowchart links directly to the specific page in this Reference Guide that explains that step in the process.
View this short video illustrating many of the features, benefits, and processes involved in using the Peer Assess Pro platform (6 minutes).
http://tinyurl.com/digitalFlyBy
These are the key features of the Facilitators Dashboard accessed through the Xorro ACTIVITIES tab when you have launched a peer assessment activity.
PDF with hyperlinks at Xorro Peer Assess Pro™ Teachers Process Flowchart http://tinyurl.com/papChart
Create a file containing your class list that shows every team member organised into their teams. The required file format is comma-separated values (CSV). This is your Participants CSV file. A sample of the file format is shown in Section 2.2 Create the peer assessment Participants CSV file.
Use any of the following templates to adapt and create your Participants CSV file using your preferred editor.
After editing the template, remember to create a CSV file type using SAVE AS CSV, DOWNLOAD AS CSV or EXPORT AS CSV, depending on your spreadsheet editor.
For a registered Xorro user, use this link to launch a new peer assessment activity. You will be presented with an option to import your Participants CSV directly.
https://qf.xorro.com/pap/launches/new
If your CSV refuses to load, or the activity fails to create, review the detailed steps in the next sections to ensure your CSV is specified correctly.
FAQ - I’m having problems importing my participants csv
Check carefully that the specifications detailed in the INSTRUCTIONS and COLUMN EXPLANATIONS presented within the sample.csv template are followed strictly.
Use a spreadsheet editor, such as Google Sheets, Excel, or Numbers, to produce a file that contains columns of data with these column headers: id, first, last, email, team, and group_code. Precise INSTRUCTIONS and COLUMN EXPLANATIONS for each of these data are detailed below.
Use any of these templates to adapt and create your Participants CSV (comma-separated values file) using your preferred editor. The templates contain the example data and instructions shown below.
In the sample files, only the group BUS123.101/PMell/TutB/2020-05-28/SUM is a valid teamset suitable for processing by Peer Assess Pro. This is the only group that specifies membership of teams by the students in the class, the teams being Panda, Bear and Tiger.
Sample peer assessment Participants CSV
id | first | last | email | team | group_code
ANWO08 | Anna | Worth | | | ARTS123.204/WShak/2021-02-28
GRGR15 | Greta | Green | | | ARTS123.204/WShak/2021-02-28
AMTO01 | Amanda | Tolley | Amanda.Tolley@noreply.com | Bear | BUS123.101/PMell/TutB/2020-05-28/SUM
ANWO08 | Anna | Worth | Anna.Worth@noreply.com | Bear | BUS123.101/PMell/TutB/2020-05-28/SUM
HOBR03 | Holly | Brown | Holly.Brown@noreply.com | Bear | BUS123.101/PMell/TutB/2020-05-28/SUM
ALJO11 | Alice | Jones | Alice.Jones@noreply.com | Panda | BUS123.101/PMell/TutB/2020-05-28/SUM
GRGR15 | Greta | Green | Greta.Green@noreply.com | Panda | BUS123.101/PMell/TutB/2020-05-28/SUM
JEWA06 | Jeff | Wang | Jeff.Wang@noreply.com | Panda | BUS123.101/PMell/TutB/2020-05-28/SUM
BOWI12 | Bob | Wilson | Bob.Wilson@noreply.com | Tiger | BUS123.101/PMell/TutB/2020-05-28/SUM
HEJO19 | Henry | Jones | Henry.Jones@noreply.com | Tiger | BUS123.101/PMell/TutB/2020-05-28/SUM
JOSM13 | John | Smith | John.Smith@noreply.com | Tiger | BUS123.101/PMell/TutB/2020-05-28/SUM
THWI18 | Thomas | Windsor | Thomas.Windsor@noreply.com | Tiger | BUS123.101/PMell/TutB/2020-05-28/SUM
ANWO08 | Anna | Worth | Anna.Worth@noreply.com | | COMP123.201/PDod/TutA/2020-10-01
HOBR03 | Holly | Brown | Holly.Brown@noreply.com | | COMP123.201/PDod/TutA/2020-10-01
JOSM13 | John | Smith | John.Smith@noreply.com | | COMP123.201/PDod/TutA/2020-10-01
INSTRUCTIONS
1. Organise your participants data into the columns corresponding to those shown in columns A to F, the first 6 columns headed 'id' through 'group_code'. You might find it helpful to paste your data from row 17, below the sample data provided in rows 2 through 16. The sample data demonstrates ten unique individuals (ids), organised into three different groups. One group contains a further three teams. A group might comprise all members of a class, or subdivisions such as streams, cohorts, sections, or tutorial groups. In the group called BUS123.101/PMell/TutB/2020-05-28/SUM the participants are subdivided further into three different teams, Bear, Panda, and Tiger. Only group BUS123.101/PMell/TutB/2020-05-28/SUM is a Xorro teamset suitable for a peer assessment activity. A group is not a team: a group (such as a class) may contain several teams, in which case it is a Xorro teamset.
2. If you are preparing a separate file, ensure you use exactly the same column headers for your list as shown in row 1. That is: 'id', 'first', 'last', 'email', 'team', 'group_code'. These headers are not case sensitive. The sequence of column headings is NOT IMPORTANT. You may optionally include additional headers and columns of data; this data will be ignored by Xorro. Data may be sorted by any of the columns.
3. Read carefully the COLUMN EXPLANATIONS, below, for each type of data. Some data is optional and can be skipped, as shown for group_code ARTS123.204/WShak/2021-02-28.
4. Delete the sample data immediately below the header row. That is, delete everything between row 2 and row 16. CRITICAL: CHECK you do not have duplicate ids in the same group_code. CHECK you do have all the ids in your class allocated to a group and, optionally, a team.
5. If you have used this page as your template, you may DELETE this 'instructions' column. That is, delete anything not part of your data. Keep the column headers. The headers must be on row 1 of your file.
6. Save (Download, Export, Save As) the file as a CSV, giving it an appropriate filename.
7. From Xorro-Q, browse to PARTICIPANTS, then upload the CSV file. Alternatively, when you Launch a Peer Assessment Activity, you can IMPORT the CSV directly to create or update the activity. From this sample file, upon upload three groups would be created in Xorro: ARTS..., BUS..., and COMP.... Only one of the groups is a teamset containing the three teams Bear, Panda, and Tiger.
COLUMN EXPLANATIONS
id - Compulsory field. Identifier for this participant; must be unique for the entire institution. For a peer assessment activity, this is the participant's login id. No blanks or characters such as #@$%&*()+
first - Compulsory field. Participant's first name.
last - Compulsory field. Participant's last name.
email - Optional field. The participant's email. Ideally provided for a peer assessment activity; required when you want autogenerated warnings and notifications from Peer Assess Pro.
team - Optional field in general, but required for a peer assessment activity. The name of the team in which the participant is a member. The participant can be a member of NO MORE than one team within the same group. A participant may belong to different teams in different groups.
group_code - Optional field in general, but required for a peer assessment activity. The code for the group (i.e. course, class, stream, cohort) into which the participant is being enrolled. If the participant is in multiple groups, supply a separate line for each group in which the participant is a member. Good practice: append to your root code, such as BUS123.101, abbreviations that indicate the teacher, activity date (start or due), subdivision (stream, cohort), and summative or formative purpose. Note that Anna Worth is enrolled in three groups and in one team within group BUS123.101/PMell/TutB/2020-05-28/SUM.
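Before uploading, you may find it convenient to automate the CRITICAL checks in step 4. The following is a minimal Python sketch, assuming the lowercase headers used in the template; Xorro's own validation rules may differ.

```python
import csv
from collections import Counter

REQUIRED_HEADERS = {"id", "first", "last"}   # email, team, group_code are optional
FORBIDDEN_ID_CHARS = set("#@$%&*()+ ")       # blanks and symbols barred from ids above

def check_participants_csv(path):
    """Run the CRITICAL checks from step 4: required headers present, no
    forbidden characters in ids, no duplicate ids within one group_code."""
    problems = []
    counts = Counter()
    with open(path, newline="") as f:
        reader = csv.DictReader(f)
        headers = {h.strip().lower() for h in (reader.fieldnames or [])}
        missing = REQUIRED_HEADERS - headers
        if missing:
            return ["missing required column(s): " + ", ".join(sorted(missing))]
        for row in reader:
            pid = (row.get("id") or "").strip()
            if not pid or set(pid) & FORBIDDEN_ID_CHARS:
                problems.append(f"invalid id: {pid!r}")
            counts[(pid, (row.get("group_code") or "").strip())] += 1
    problems += [f"duplicate id {pid!r} in group {grp!r}"
                 for (pid, grp), n in counts.items() if n > 1]
    return problems

print(check_participants_csv("participants.csv"))  # [] means no problems found
```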
After editing the template, remember to create a CSV version of your file. Depending on your editor, the appropriate command is:
FILE… SAVE AS … TEXT CSV
FILE… DOWNLOAD AS … Comma-separated values (.csv)
FILE… EXPORT AS CSV
FILE… EXPORT TO… CSV
First, see FAQ - I’m having problems importing my participants csv
In many cases, using the FILE… SAVE command in your spreadsheet editor will produce a file in an incorrect file format, such as .xls, .sheet, or .numbers.
Xorro will reject those file formats. Xorro accepts and loads only .csv.
Follow this advice:
FAQ: How do I create a CSV file from a Google Sheet
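If you are unsure whether an exported file is genuinely comma separated, a quick check of the first line can help. This is an illustrative Python sketch, not part of Xorro: the 'id' header rule comes from the template above, and the semicolon case reflects the regional-settings problem covered later in this guide.

```python
def looks_like_comma_csv(path):
    """Heuristic check that an exported file is plain text, comma delimited,
    and carries the 'id' header required by the template."""
    try:
        with open(path, newline="", encoding="utf-8") as f:
            first_line = f.readline()
    except UnicodeDecodeError:
        return False  # probably still .xls/.numbers or another binary format
    if ";" in first_line and "," not in first_line:
        return False  # semicolon delimited: a regional-settings export problem
    headers = [h.strip().lower() for h in first_line.split(",")]
    return "id" in headers

print(looks_like_comma_csv("participants.csv"))
```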
We advise creating a new, unique group_code for each Xorro Activity you create, even for repeat peer assessments within the same class term or semester.
Use a group_code like this:
BT123.101/PJM/2020-03-28/FORM
We suggest your group_code include these elements, as per the example above: the institutional class code root (BT123.101), the teacher’s initials (PJM), the activity date (2020-03-28), and a flag for a formative or summative assessment (FORM or SUM).
We recommend that your resulting group_code uniquely distinguish this semester’s mid-semester formative peer assessment(s) from last semester’s end-of-class summative assessment where, perhaps, the same institutional class code could cover a different set of student names.
The group_code is specified in the Participants CSV file you import prior to launching a Peer Assess Pro™ Activity.
In the general case, a very large class could comprise several cohorts, streams, or tutorial sets, each subclass containing several teams conducting one or more peer assessment activities. Consequently, your group_code should help distinguish these separate peer assessment activities. For example:
BT123.101/PJM/TutB/2020-05-28/SUM
Consider two teachers at the same institution teaching the same course but with different tutorial groups. If they use the same group_code, such as BT101, they will load their own team sets into the same Xorro Participants’ Group, additively, thereby causing mutual confusion and dismay. Similarly, a teacher using the same group_code from term to term, semester to semester, and year to year will experience similar grief.
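To make the convention concrete, here is a hypothetical helper (the function and its parameters are illustrative, not part of Xorro or Peer Assess Pro) that assembles the suggested elements into a distinctive group_code:

```python
def make_group_code(root, teacher, subdivision, date, purpose):
    """Join the suggested elements into a distinctive group_code.
    purpose is 'FORM' (formative) or 'SUM' (summative)."""
    return "/".join(part for part in (root, teacher, subdivision, date, purpose) if part)

print(make_group_code("BT123.101", "PJM", "TutB", "2020-05-28", "SUM"))
# BT123.101/PJM/TutB/2020-05-28/SUM
```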
Quick links and related information
FAQ - I’m having problems importing my participants csv
FAQ: How do I correct the Team Composition in a running peer assessment activity?
FAQ: How do I correct the participants (team members) in a group I uploaded?
FAQ: How do I create a CSV file from a Google Sheet
In summary
Enter the following details, in this sequence:
Set a realistic Due Date: your target for when you expect and want most students to have completed the peer assessment. In practice, typical Due Dates are set four to seven days after the Start Date.
The Due Date is used by Peer Assess Pro to generate automated activities, such as the email reminders sent to students who have not yet responded at 72, 24, and 12 hours before the Due Date (see Section 3.2 Automated and manual notifications).
If you use the Xorro default Due Date, which currently is NOW (that is, the Start Date), you will not receive the benefits of the automated processes conducted by Peer Assess Pro that are triggered by a practical Due Date.
The Due Date is advisory only. Students can CONTINUE to submit responses beyond the Due Date UNTIL the teacher Finalises the activity. After the Finalisation Date, the students will have no more than two weeks to review their results.
FAQ: How do I adjust the Due Date or deadline?
The short answer is ‘You can’t adjust the Due Date!’ You don’t need to!
After setting the Start At and Due Dates, select Create Activity.
Double check your Start Date and Due Date carefully!
Once you Create Activity you cannot adjust the Start Date. The peer assessment Survey and the email notifications to students requesting their response are created when the Start Date is reached. Furthermore, the email advises the students of the Start Date and Due Date.
Therefore, an adjustment to the Start Date would confuse students, because the Participant Activity URLs have already been announced to them; those Activity URLs could become unavailable to the students if the dates were adjusted.
For a similar reason, you cannot adjust the Due Date. However, the Due Date is advisory only: students can CONTINUE to submit responses beyond the Due Date UNTIL the teacher Finalises the activity.
FAQ: Can I adjust the Start Date or Due Date for a running activity?
In short, No! But in a ‘worst case scenario’ you can abandon the activity and launch a new activity. Review the foregoing FAQ for details on how to Abandon a running Peer Assess Pro activity.
Peer Assess Pro Teacher’s Dashboard
When the Start Date occurs, Peer Assess Pro automates several activities:
A unique peer assessment survey is created for every team and team member
Quick links and related information
FAQ - I’m having problems importing my participants csv
FAQ: How do I correct the Team Composition in a running peer assessment activity?
FAQ: Can I adjust the Start Date or Due Date for a running activity?
FAQ: How do I view a list of the participants (team members) in the group I uploaded?
FAQ: How do I view and experience what the students experience?
FAQ: How do I find the Peer Assess Pro Teacher’s dashboard?
This is an alternative approach to launching a peer assessment activity. It is a two-stage process in which you can:
(Image to come)
This uploads your Participants CSV within which you have classified your students into teams, as detailed in Section 2.2 Create the peer assessment Participants CSV file
Note that multiple teamset groups may be created using this import process. This is potentially useful for managing peer assessment in large, multi-stream classes.
You should see a list of all the students belonging to the class for whom you wish to run the peer assessment activity.
Note: The message ‘Exists’ or ‘Conflict’ means that the id (identification) code already exists within your institution or within a previous Group you have uploaded. Carry on!
At this point you are unable to confirm the team membership of your class. You must first launch a peer assessment activity, selecting one of the Group Codes that existed within the original Participants CSV.
Quick links and related information
FAQ - I’m having problems importing my participants csv
FAQ: How do I view a list of the participants (team members) in the group I uploaded?
FAQ: How do I view or change the participants (team members) in a group I uploaded?
FAQ: How do I correct the Team Composition in a running peer assessment activity?
FAQ: Can I adjust the Start Date or Due Date for a running activity?
In short, No! Please check your Start Date and Due Date carefully before you Create Activity.
Active Warnings show when you need to take action to remedy an issue during execution of the peer assessment activity.
In the following example, one member of Team Brazilia has completed the assessment of their four team members. Consequently, a warning is generated for Team Brazilia that the number of responses from the team is insufficient for presenting valid results. In contrast, all four team members of Team Kublas have completed the assessment.
The warnings displayed in this case are:
Click through the warning to gain advice on how to remedy the situation. For example, you can remind the students to complete the survey. Emails are automatically generated and sent on your behalf to all or selected students.
Quick links and related information
FAQ: What is the content of emails sent by Peer Assess Pro?
FAQ - What emails have been sent by the platform?
Upon commencing the peer assessment survey, team members are first asked to confirm that the team members identified for their team are correct. If not, the student initiates a request notification to the teacher to readjust their team’s membership.
Once the peer assessment activity has been launched, you can modify the team composition as per the following FAQ.
FAQ: How do I correct the Team Composition in a running peer assessment activity?
Changes to a Xorro Group will have NO EFFECT on a currently running activity, unless you Finalise, then Abandon, the activity and re-launch a new activity with the revised Group. This is an extreme response and should not generally be required if you follow the previous FAQ.
Quick links and related information
Students who have NOT completed the survey are sent an email reminder 72 hours, 24 hours and 12 hours before the Due Date.
Similarly, if a student is required to resubmit a response because a team has been reconstituted, an automatic reminder is sent.
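As a sketch of that schedule, assuming the 72/24/12-hour offsets stated above, the reminder times could be computed like this (illustrative Python, not platform code):

```python
from datetime import datetime, timedelta

# The reminder schedule stated above: 72, 24, and 12 hours before the Due Date.
REMINDER_OFFSETS_HOURS = (72, 24, 12)

def reminder_times(due_date):
    """Times at which students who have NOT completed the survey are emailed."""
    return [due_date - timedelta(hours=h) for h in REMINDER_OFFSETS_HOURS]

for t in reminder_times(datetime(2020, 5, 28, 17, 0)):
    print(t)  # 2020-05-25 17:00, 2020-05-27 17:00, 2020-05-28 05:00
```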
Quick links and related information
FAQ: What is the content of emails sent by Peer Assess Pro?
FAQ - What emails have been sent by the platform?
You must enter a Team Result for each team if you intend to select any of the Personal Result calculation methods that incorporate a Team Result.
After you have entered or revised your Team Results, communicate the Personal Results to your class using the Publish or Update button.
Team Results are not used to calculate:
The Personal Result Calculation Method determines the Personal Result you will award to each team member.
When you first enter Team Results, the Peer Assess Pro platform automatically selects the Normalised Personal Result (NPR) method for calculating participants’ Personal Result, with a Scale Factor of 1.0.
To adjust the Personal Result Calculation Method and/or the Scale Factor:
Quick links and related information
FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?
You can explore progress and final results at the class, team, and individual level.
In the Class Results, select a Bucket Range to identify the specific students lying within the range of a histogram bar chart.
Before reviewing results, see:
FAQ: When, why, and how do I ‘Recalculate Results’?
Example class statistics
In any of the tables, you may
The Individual Personal Snapshot enables you to view all data related to one student. The Student View version of the Personal Snapshot shows exactly the report the student will receive when the teacher Publishes the results of the current Peer Assess Pro activity.
However, the teacher may wish to view how the results will appear to students BEFORE they are Published. Consequently, there are four possible views of an Individual Personal Snapshot. They are variations on the following example. The four views are explained later.
Example Individual Personal Snapshot (1 of 3)
Example Individual Personal Snapshot (2 of 3)
Example Individual Personal Snapshot (3 of 3)
Note there are four possible views of an Individual Personal Snapshot.
If the view is not yet Published, the student will see this remark.
Results unpublished
The same message will also be displayed if the team is not a valid assessed team, even if the results have been Published to the class as a whole.
Select an individual team to probe the results of its team members. Sort by Peer Assessed Score or Index of Realistic Self Assessment. Then you can quickly review the Individual Personal Snapshot of each team member as part of your diagnosis to identify ‘star performers’, ‘at risk’ team members, and those with outlier degrees of overconfidence or underconfidence.
Example Team Statistics
(To come)
(To come)
There are many advanced statistics and charts you can view. Furthermore, from ‘Available Actions’ you can Download Full Statistics to conduct more detailed investigations beyond the scope of what we have conceived.
Quick links and related information
FAQ: How is the Index of Realistic Self Assessment (IRSA) calculated?
Results of the peer assessment are hidden from team members until you initiate Publish Survey on the Peer Assess Pro Teacher’s dashboard.
Before Publishing, see:
FAQ: When, why, and how do I ‘Recalculate Results’?
The foregoing ‘Refresh and Recalculate’ steps give you the opportunity to review the quality of results before publishing and republishing personal results and qualitative peer feedback comments. In short, as the peer assessment activity progresses towards the due date, results ARE NOT automatically updated and made available for viewing by the students.
Take Care! Once an activity is Published, the results can never be unpublished. However, you may re-publish results if new responses are submitted and/or you make adjustments to Team Results, Team Composition, etc. To reiterate, even if interim results have been published to students, as the peer assessment activity continues to progress towards the due date, results ARE NOT automatically updated and made available for viewing by the students.
Results will be hidden from the teacher and ALL team members in teams where fewer than one-half of team members have submitted the peer assessment, because results at that stage of the survey activity may not be valid and representative. For small teams, at least 3 team members must have submitted a response. That is, team sizes of 3, 4, 5, and 6 team members require at least three team members to have peer assessed each other. A team of 7 or 8 requires a minimum of 4 responses. Team members who have already submitted a response will ALSO be advised that their results are hidden until more of their team members have submitted responses.
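The stated rule can be summarised as: at least half the team must respond, and never fewer than three responses. A minimal Python sketch, assuming that reading of the rule:

```python
import math

def min_responses_required(team_size):
    """Assumed rule from the text: at least half the team must respond,
    and never fewer than three responses."""
    return max(3, math.ceil(team_size / 2))

def results_hidden(team_size, responses_received):
    """True while results stay hidden from the teacher and ALL team members."""
    return responses_received < min_responses_required(team_size)

# Matches the examples above: teams of 3-6 need 3 responses; 7 or 8 need 4.
for n in (3, 4, 5, 6, 7, 8):
    print(n, min_responses_required(n))
```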
Quick links and related information
FAQ: How do I view and experience what the students experience?
Survey responses from team members are received and available for incorporation into the peer assessment activity UNTIL you explicitly Finalise the Survey. Even responses submitted after the Due Date announced to students at the launch of the Activity will be incorporated, UNTIL the survey is deliberately Finalised by the teacher. Until Finalisation, you can ask a student to reconsider and, optionally, resubmit their responses.
From the Peer Assess Pro Teacher’s Dashboard, select either
Example Gradebook Summary Statistics
Example Gradebook Full Statistics
Quick links and related information
Everyone
FAQs on the web at http://tinyurl.com/papFAQ
Launch peer assessment activity
Manage the peer assessment activity
Definitions, calculations, and examples
FAQ: When and how is the peer assessment conducted?
FAQ: What is the purpose of peer assessment?
FAQ: What questions are asked in the peer assessment survey?
FAQ: How are peer assessment and personal results calculated and defined mathematically?
FAQ: Is the self-assessment used to calculate Peer Assessed Score?
FAQ: Give me a quick overview of how to launch a Peer Assess Pro™ activity through Xorro
FAQ: How do I navigate the PARTICIPANTS page for Peer Assess Pro?
FAQ: How do I view and experience what the students experience?
FAQ - I’m having problems importing my participants csv
FAQ: How do I create a CSV file from a Google Sheet?
FAQ: How do I correct the participants (team members) in a group already uploaded to Xorro?
FAQ: Can I adjust the Start Date or Due Date for a running activity?
FAQ: How do I correct the Team Composition in a running peer assessment activity?
FAQ: What is the content of emails sent by Peer Assess Pro to Participants?
FAQ: What is a valid assessed team?
FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?
FAQ: When, why, and how do I ‘Update and Recalculate Results’?
FAQ: What happens if [a student tries] to 'game' (fool? play? disrupt?) the peer assessment process?
FAQ: How do I advise a student who feels they have been unfairly treated?
FAQ - What emails have been sent by the platform? (Notifications History)
FAQ: What is the content of emails sent by Peer Assess Pro to Participants?
FAQ: What is a valid assessed team? WARNING 0022
FAQ - How do I fix an invalid, missing or failed email delivery? WARNING 0026
FAQ: What is a mismatched self-assessment (IRSA)? WARNING 0040
FAQ: How is an outlier peer assessment rating identified? WARNING 0042
FAQ: What is an ‘at risk’ team member? WARNING 0044
FAQ: What is a low quality team rating? WARNING 0050
FAQ: What is a low quality assessor rating? WARNING 0300
FAQ: How are peer assessment and personal results calculated and defined mathematically?
FAQ: How is the Peer Assessed (PA) Score calculated?
FAQ: Is the self-assessment used to calculate Peer Assessed Score?
FAQ: How is the Peer Assessed Index (PA Index) calculated?
FAQ: How is the Indexed Personal Result (IPR) calculated?
FAQ: How is the Normalised Personal Result (NPR) calculated?
FAQ: How is the Rank Based Personal Result (RPR) calculated?
FAQ: How is Standard Peer Assessed Score (SPAS) calculated?
FAQ: What is Employability? How is it calculated?
FAQ: How is the Index of Realistic Self Assessment (IRSA) calculated?
FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?
FAQ - I’m having problems importing my participants csv
FAQ: How do I contact people at Peer Assess Pro?
FAQ: Where may I view the most recent version of the user guides?
FAQ: What are the design objectives, key features, and benefits of the Peer Assess Pro development?
The purpose of peer assessment
Undertaking the peer assessment
Using peer assessment results for better performance
How peer assessment affects personal results
FAQ: What is the purpose of peer assessment?
FAQ: How are peer assessment and personal results calculated and defined mathematically?
FAQ: What questions are asked in the peer assessment survey?
FAQ: I am unable to login. My login failed
FAQ: How do I login to my peer assessment Activity URL
FAQ: When and how is the peer assessment conducted?
FAQ: How do I provide useful feedback to my team members?
FAQ: How do I view and experience what the students experience?
FAQ: Is the self-assessment used to calculate Peer Assessed Score?
FAQ: What happens if I try to 'game' (fool? play? disrupt?) the peer assessment process?
FAQ: How do I interpret the feedback results I've received from the peer assessment?
FAQ: How do I interpret measures of realistic self-assessment?
FAQ: What steps can I take to get a better personal result?
FAQ: Is the self-assessment used to calculate Peer Assessed Score?
FAQ: I don’t understand what my teammates are trying to tell me. How do I ask for better feedback?
FAQ: What is Employability? How is it calculated?
FAQ: How are peer assessment and personal results calculated and defined mathematically?
FAQ: Is the self-assessment used to calculate Peer Assessed Score?
FAQ: What steps can I take to get a better personal result?
FAQ: What happens if I try to 'game' (fool? play? disrupt?) the peer assessment process?
Peer assessment is an educational activity in which students judge the performance of their peers, typically their teammates. Peer assessment takes several forms including
The ability to give and receive constructive feedback is an essential skill for team members, leaders, and managers.
Consequently, your teacher has chosen to use Peer Assess Pro™ to help you provide developmental feedback to your team members, for formative and/or summative purposes.
The goal of developmental feedback is to highlight both positive aspects of performance plus areas for performance improvement. The result of feedback is to increase both individual and team performance (Carr, Herman, Keldsen, Miller, & Wakefield, 2005).
Additionally, your teacher may use the quantitative results calculated by Peer Assess Pro™ to determine your Personal Result for the team work conducted by your team. Your Personal Result may contribute to the final (summative) assessment grade you gain for the course in which Peer Assess Pro™ is applied.
In general, your Personal Result is calculated from two factors: the Team Result your team earns for its assignment outputs, and the Peer Assessed Score awarded to you by your team members.
There are many possible criteria for assessing your contribution to your team’s work. Peer Assess Pro places equal weight on two groups of factors, Task Accomplishment and Contribution to Leadership and Team Processes, based on a well-established instrument devised by Deacon Carr, Herman, Keldsen, Miller, & Wakefield (2005).
The selection of the criteria used in Peer Assess Pro is reinforced by the results of a recent survey that asked employers to rate the importance of several competencies they expected to see in new graduates from higher education. The figure shows that teamwork, collaboration, professionalism, and oral communications rate amongst the most highly needed Career Readiness Competencies (CRCs) sought by employers. All these Career Readiness Competencies rate at least as ‘Essential’, with Teamwork and Collaboration rating almost ‘Absolutely Essential’ (National Association of Colleges and Employers, 2018).
Employers rate their essential need for Career Readiness Competencies
Source: National Association of Colleges and Employers (NACE). (2018). Figure 42, p. 33.
Quick links and related information
FAQ: What questions are asked in the peer assessment survey?
FAQ: How are peer assessment and personal results calculated and defined mathematically?
FAQ: How is the Peer Assessed (PA) Score calculated?
References
Deacon Carr, S., Herman, E. D., Keldsen, S. Z., Miller, J. G., & Wakefield, P. A. (2005). Peer feedback. In The Team Learning Assistant Workbook. New York: McGraw Hill Irwin.
National Association of Colleges and Employers (NACE). (2018). Job Outlook 2019. Bethlehem, PA. https://www.naceweb.org/
The best practice for conducting peer assessment in an academic course follows several stages.
The midpoint formative peer assessment is an optional element of peer assessment within the classroom. As a minimum, the formative peer assessment gives the team members experience of the Peer Assess Pro™ mechanism including the questions that will be used to conduct the final, summative peer assessment.
More importantly, the midpoint formative assessment helps ensure that team members have the opportunity to respond proactively to the peer feedback they receive immediately after the peer assessment activity concludes. Through undertaking appropriate corrective action midway through the course, team members have the opportunity to raise their peer assessment rating, their team’s results, and, therefore, their end-of-course personal results.
The intention of formative assessment is that, ideally, a team member should face no surprises when they receive their final personal result and peer assessment feedback at the conclusion of the course. For instance, a free-rider should receive clear feedback that the rest of their team observes they are free-riding. Consequently, the free-rider should learn in a timely manner that they will be penalised at the concluding summative assessment unless they remediate their behaviour. It is equally important that an overachieving student who does most of the work is given timely feedback that they need to learn to involve and engage the other team members in the team’s planning and execution of tasks. The Peer Assess Pro™ survey specifically targets these aspects of leadership and team process contributions. This particular style of overachieving student should be identified through the peer assessment ratings they receive.
To minimise the risk of surprises, it is important, therefore, that the peer assessment you provide to your team members at the midpoint of a team activity is:
Quick links and related information
FAQ: What questions are asked in the peer assessment survey?
FAQ: How do I provide useful feedback to my team members?
FAQ: How do I view and experience what the students experience?
FAQ: How do I interpret the feedback results I've received from the peer assessment?
FAQ: How are peer assessment and personal results calculated and defined mathematically?
FAQ: Is the self-assessment used to calculate Peer Assessed Score?
It is essential that the peer assessment a team member provides to their team members through peer assessment is:
Ohland et al. (2012) provide a table of Behaviorally Anchored Ratings covering high and low contributions to team effectiveness. The table offers guidance to team members about how they might give accurate, effective, and productive feedback to their team members through peer assessment.
Examples of high and low contributions to team effectiveness
HIGH | CONTRIBUTION | LOW
(Behaviourally anchored examples of HIGH and LOW contributions for each category, including INTERACTION, KEEPING FOCUS, and CAPABLE.)
Source: Ohland et al. (2012)
Adapted by Mellalieu (2017) from Ohland, M. W., Loughry, M. L., Woehr, D. J., Bullard, L. G., Felder, R. M., Finelli, C. J., … Schmucker, D. G. (2012). Appendix B: Behaviorally Anchored Rating Scale (BARS) version, from Comprehensive Assessment of Team Member Effectiveness. Academy of Management Learning & Education, 11(4), 609–630. Retrieved from http://amle.aom.org/content/11/4/609.short
For teachers: How do I advise a student who feels they have been unfairly treated?
Here are some symptoms that you may have been treated unfairly by one or more team mates in their peer assessment of you:
If you believe you may have been unfairly treated, these are the steps you should pursue, in this order of action
An appeal against a peer assessment result is likely to fail if one or more of the following circumstances have prevailed:
Take these steps to avoid a mismatch between the peer assessment result you expect, and the result you receive.
Quick links and related information
How peer assessment affects personal results
FAQ: How are peer assessment and personal results calculated and defined mathematically?
FAQ: Is the self-assessment used to calculate Peer Assessed Score?
FAQ: What happens if I try to 'game' (fool? play? disrupt?) the peer assessment process?
Using the results from peer assessment for better performance
FAQ: How do I interpret the feedback results I've received from the peer assessment?
FAQ: I don’t understand what my teammates are trying to tell me. How do I ask for better feedback?
FAQ: How do I interpret measures of realistic self-assessment?
FAQ: What steps can I take to get a better personal result?
(To be published)
Quick links and related information
FAQ: How do I interpret measures of realistic self-assessment?
Begin by viewing this video. Watch especially for the question that is introduced soon after minute 15 by Harvard University professor Sheila Heen.
Heen, S. (2015). How to use others’ feedback to learn and grow. TEDx. Retrieved from https://www.youtube.com/watch?v=FQNbaKkYk_Q
As Heen and Stone observe:
“Feedback is less likely to set off your emotional triggers if you request it and direct it. So don’t wait until your annual performance review. Find opportunities to get bite-size pieces of coaching from a variety of people throughout the year. Don’t invite criticism with a big, unfocused question like “Do you have any feedback for me?” Make the process more manageable by asking a colleague, a boss, or a direct report,
“What’s one thing you see me doing (or failing to do) that holds me back?”
That person may name the first behavior that comes to mind or the most important one on his or her list. Either way, you’ll get concrete information and can tease out more specifics at your own pace.” (Heen & Stone, 2014)
Quick links and related information
Heen, S., & Stone, D. (2014). Find the Coaching in Criticism. Harvard Business Review, 9. Retrieved from https://medschool.duke.edu/sites/medschool.duke.edu/files/field/attachments/find-the-coaching-in-criticism.pdf
Your Personal Result is determined from a combination of your Team Result and your Peer Assessed Score. Consequently, to raise your Personal Result you need to apply balanced effort to raising both these contributing factors.
Typically, your Team Result is earned from your team’s assignment outputs, such as a report and/or a presentation. Consequently, the grade for the Team Result is determined by the teacher, based on the rubric (marking guideline) they apply to assess your team’s outputs. Ensure you understand the assignment elements and how each will be assessed. Seek out exemplars of good practice. Pursue the guidance found in:
Mellalieu, P. (2013, March 15). Creating The A Plus Assignment: A Project Management Approach (Audio). Innovation & chaos ... in search of optimality website: http://pogus.tumblr.com/post/45403052813/this-audio-tutorial-helps-you-plan-out-the-time
In addition to your teacher and their assistant tutors, your academic institution will offer personal and group coaching to guide you on the specific success factors related to the type of assignment you are pursuing. Schedule appointments to make use of these support facilities early in your project. Locate the online resources these coaching support services have curated for your guidance.
Group and team projects present special challenges of coordination, motivation, communication, and leadership. These challenges are normal! Furthermore, an essential part of your job as a team member is to proactively overcome these challenges as part of your academic learning journey.
As you overcome these challenges you will achieve several benefits directly instrumental in raising your Personal Result:
You will also develop team work and leadership competencies that will both raise your future employability, and your effectiveness in future teamwork, as discussed in:
FAQ: What is the purpose of peer assessment?
Whilst there are many resources to help address the challenges of team work in academic settings, we suggest you familiarise yourself with these resources early in your team project. Since “Any fool can learn from their own mistakes. It takes genius to learn from the mistakes of others” (Einstein), be proactive rather than foolish in learning effective team working skills from:
Turner, K., Ireland, L., Krenus, B., & Pointon, L. (2011). Collaborative learning: Working in groups. In Essential Academic Skills (2nd ed., pp. 193–217).
Carr, S. D., Herman, E. D., Keldsen, S. Z., Miller, J. G., & Wakefield, P. A. (2005). The Team Learning Assistant Workbook.
Good practice peer assessment management by your teacher will provide you with two opportunities for peer assessment and peer feedback through your course, formative and summative.
Your first, mid-course, formative assessment provides you with early advice about your strengths and opportunities for development, as perceived by your team members. Make use of this formative feedback at the earliest opportunity as you proceed towards the conclusion of your team work and your final, summative peer assessment. Usually, the final, summative assessment is where your Peer Assessed Score, awarded by your team members, makes its most significant contribution to your course grade through your Personal Result.
Consequently, take proactive action following the mid-course formative assessment through referring to:
FAQ: How do I interpret the feedback results I've received from the peer assessment?
Maybe you don’t understand or don’t agree with the feedback your teammates are providing. In that case, refer to
FAQ: I don’t understand what my teammates are trying to tell me. How do I ask for better feedback?
Quick links and related information
The purpose of peer assessment
FAQ: What is the purpose of peer assessment?
Undertaking the peer assessment
Using peer assessment results for better performance
How peer assessment affects personal results
FAQ: How are peer assessment and personal results calculated and defined mathematically?
FAQ: What steps can I take to get a better personal result?
FAQ: What happens if I try to 'game' (fool? play? disrupt?) the peer assessment process?
FAQ: Is the self-assessment used to calculate Peer Assessed Score?
What happens if a team member attempts to 'game' the peer assessment process?
The designers of Peer Assess Pro have many decades’ experience working with students. We know the tricks that students attempt to play with peer assessment. We have anticipated these tricks, so Peer Assess Pro warns the teacher that a trick may be in play. Furthermore, the teacher receives highly specific, individualised information about each incident. The teacher may then undertake overt or covert action to address the issue to which they have been alerted. For example, the trick-playing student or team may receive a request to reconsider and resubmit their peer assessment. In more extreme incidents, the student or team may receive an invitation to visit the teacher for a counselling consultation.
Here follow a few of the ‘tricks’ that Peer Assess Pro identifies and warns the teacher about during the survey process. Examples follow later.
Here are some examples of the highly specific and individualized Active Warnings a teacher receives about each incident.
Madison may have engaged unconstructively with peer assessment. High Peer Assessment Scores awarded. Average 100 and low range 3. Team Alpha
Ben may have engaged unconstructively with peer assessment. High Peer Assessment Scores awarded. Average 86 and low range 5. Team Bravo
This message warns the teacher that the team member has given everyone a near perfect Peer Assessed Score or a similar score (narrow range). Practically, from the student’s point of view they are ‘wasting their votes’. If everyone is scored with the same or similar score then students who have contributed substantially to the team’s result will not be adequately recompensed. Furthermore, if EVERY team member pursued this same approach, then every team member would be awarded the Team Result. In this case, the team member just looks stupid in the eyes of the teacher. Furthermore, the team member fails to gain practice at being a leader where giving accurate assessments of team members’ contributions is a valued management competency.
Team Bravo may have engaged unconstructively with peer assessment. High Peer Assessment Scores awarded. Average 98 and low range 8.
Team Echo may have engaged unconstructively with peer assessment. High Peer Assessment Scores awarded. Average 94 and low range 10.
These messages warn the teacher that the team collectively may have arranged to give everyone a near perfect Peer Assessed Score or the same score. Practically, from the students’ point of view, this trick is a waste of time. If everyone is scored with the same score, or a perfect Peer Assessed Score of 100, then every team member will be awarded the Team Result … which is usually not 100. The team members just look stupid in the eyes of the teacher. Furthermore, they may not receive useful qualitative feedback and ratings that help guide focussed development of their future productivity in team assignments and their future professional work in teams.
The warning highlights a situation where the team members appear to be inconsistent in rating high, medium and low contributors to the team’s process and results.
This example is a symptom that there may be disruptive team dynamics or bullying within the team.
Harrison Ford assessed Steven Spielberg awarding a PA Subscore of 38. Compared with the average rating by the other team members of 70 this subscore DEPRESSED the PA Score to 64 by 7 PA Score Units. Team Alpha
This message warns the teacher that there may be favouritism between friends or allies.
Donald Trump assessed Vladimir Putin awarding a PA Subscore of 90. Compared with the average rating by the other team members of 57 this subscore RAISED the PA Score to 64 by 7 PA Score Units. Team Charlie
Anna self-assessment of 68 is OVERCONFIDENT compared with the peer assessment of 34 given by others in team Charlie. IRSA = 51
This message warns the teacher that the team member has a very much higher opinion of their performance than is evidenced by the rating provided by their peers. The teacher may request an interview with the student to explore the reasons for this divergence, and how the student can develop a more realistic self-assessment.
Alternatively, the team member may be being scapegoated by the remainder of the team; that possibility will be discussed with the team member for whom this warning is raised.
Anna has been rated amongst the lowest in class. Low Recommendation 2.3 and/or low Peer Assessment Score 34. Team Alpha
This message warns the teacher that the team member is rated very poorly when compared with most of the class. It’s often a symptom of little or no attendance or contribution by the team member, which the teacher will verify through examining the qualitative feedback provided by the team members. Again, the teacher may request an interview with the student to explore the reasons for the low rating, and how the student can improve their contribution.
Examine the following teacher’s dashboard graphic revealing a real class that undertook a peer assessment.
Teachers dashboard: visible identification of teams with low quality team rating
Teams 1, 14, 13, 2, 7, 5, and 11 raise this warning: over half of the teams in the class!
Observation: This class was poorly briefed on how to make the best use of peer assessment and feedback. With a better briefing, less than 10% of teams will raise this warning.
The lower Team Results are associated with teams that had a low quality team rating. Apart from Team 15, all teams with an adequate quality team rating had a Team Result equal to or greater than the Class median Team Result of 73.3. For example, Team 10 (Team Result 90) through to Team 4 (Team Result 76.7) according to the sort by Range in the foregoing table.
Team 10, with a Team Result of 90 is clearly a high performing team. The moderately low Range of PA Scores (10) across the team suggests IN THIS CASE that everyone contributed relatively equally and effortfully towards a great Team Result. Reminder: The Team Result is awarded by the teacher: it is independent of the Peer Assessed Scores of the team.
However, Team 3 is also a good candidate for being a fair and productive team. They engaged honestly with peer assessment, awarding a high spread of Peer Assessed Scores (Range 18.8) and a team average PA Score (78.3). This team average was not outrageously high, in contrast to teams 1 (100!), 14 (100), 13, 2, 7, 5. Furthermore, Team 3 earned the class median Team Result of 73.3, which then appears to have been allocated according to the peer assessed contribution of the team members. This fair distribution is illustrated in the following graph and table. Team Member Charlie earned the highest Personal Result of 81, whilst Able earned 65.3. Similar reasoning applies to Team 6, to a slightly lesser degree, since the Range is not so wide.
Note from the following graph how teams 14, 5, 13, 1, 2 and 7 are again glaringly identified in the Teachers Dashboard as outlier teams poorly engaged with the peer assessment process: the low vertical spread in the graph. This low vertical spread in the Personal Result (NPR in this case) derives from the low range of Peer Assessed Scores across each team.
With this admittedly small sample size, we advance the proposition that ‘Better feedback leads to better teams’. And/or ‘Better teams give better feedback!’. In conclusion, let’s say: Better feedback. Better teams.
Teachers dashboard: a fairly productive team
Quick links and related information
FAQ: What is the purpose of peer assessment?
FAQ: How do I provide useful feedback to my team members?
FAQ: How do I interpret the feedback results I've received from the peer assessment?
FAQ: I don’t understand what my teammates are trying to tell me. How do I ask for better feedback?
The following section explains how the teacher should respond to the Active Warnings displayed on their dashboard. The threshold parameters and program logic for raising the warnings are also provided.
If this is your first time using Peer Assess Pro, we strongly recommend that you glance briefly through our Frequently Asked Questions so you are prepared to answer your own and your students' concerns - https://www.peerassesspro.com/frequently-asked-questions-2/
Download a pdf of the Quickstart Guide and this Reference Guide here - http://tinyurl.com/papRefPdf
Contact
Patrick Dodd - https://www.peerassesspro.com/contact/
Quick links and related information
View the web Quickstart Guide at tinyurl.com/pdfQuickWeb
FAQ: How do I contact people at Peer Assess Pro?
FAQ: Where may I view the most recent version of the User Guide?
FAQ: What are the design objectives, key features, and benefits of the Peer Assess Pro development?
Our overall objectives for Peer Assess Pro™ are
We appreciate your participation in this pre-market release of our substantially revised Peer Assess Pro™ in conjunction with the Xorro advanced quiz and survey platform.
As we proceed through this pre-market refinement phase we respond almost daily to your suggestions for improving both the software applications and user documentation. These improvements are implemented at any time during our Beta Development phase. We anticipate that our implementations are robust enough to prevent loss of your data and wasting of your time. We crave your forgiveness if we have been over-optimistic in keeping Murphy’s Law at a distance.
You need not take any action to use the latest versions of the Peer Assess Pro™ Xorro Teacher’s Dashboard. Those updates happen in the background and will automatically use any data and activities you have initiated. However, if you use the PDF version of this user guide, you will need to update regularly to the latest version here.
Quick links and related information
FAQ: where may I view the most recent version of the Reference Guide?
If you quit your browser then wish to return to the Teachers Dashboard
From HOME Tab
From Activities Tab: Running Activities
Quick links and related information
Note the list of ‘All participants’ currently known to Xorro in your institution.
Note a list of all other Groups uploaded by other Teachers in your institution. A group is a list of participants, such as students in a class. The minimum requirement for a Group is id, first name, last name.
However, for a peer assessment activity a Group must include team membership for all team members. This team membership is not required for most other Xorro activities. Accordingly, Groups set up by other teachers, or for other purposes, will rarely contain the correct team membership data required for your Peer Assess Pro™ activity.
Select Group ClassAM101.6. This group selection displays a list of about 25 students in the class titled AM101.6
Quick links and related information
FAQ: How do I correct the participants (team members) in a group I uploaded?
FAQ: How do I correct the Team Composition in a running peer assessment activity?
Ensure you read ALL of this FAQ before proceeding.
If you make a mistake in this process, the consequence may be the unrecoverable, complete loss of all responses received to date.
Apply this process when, for a launched, running peer assessment activity, you need to make these adjustments:
Select the ‘Team Composition’ button for the running Peer Assessment Activity for which you wish to adjust the team composition.
During the re-import, the changes to the teamset will be presented to you so that you can check and confirm the adjustment process. Take care!
Upon completion of the re-import process, the running Peer Assess Pro Activity will continue.
All students in teams affected by a change in composition are now required to resubmit their peer assessment responses. Reason: they now have different team members to rate. The remaining teams of the class will be unaffected. Their responses remain submitted and evident within Peer Assess Pro.
Team members will be notified of their need to re-submit by an automatically generated email from Peer Assess Pro.
You cannot change the participants in the Xorro Group used to create the running activity, as explained in the FAQ:
FAQ: How do I correct the participants (team members) in a group already uploaded to Xorro?
Reason: whenever a Xorro activity is created a snapshot is taken of the Group used to create the activity. From that moment this snapshot, known as a Xorro Teamset, is inextricably connected with the activity. That activity-specific teamset can be updated only during a running activity through the FAQ detailed above, through the Team Composition section of the Peer Assess Pro dashboard.
In the image above, the Group used to create the peer assessment activity is BT101. Any changes made to that group WILL NOT affect the running activity. The teamset created from the Group BT101 is denoted 2019-02-24 BT101 by Beta Beta. That name indicates what date the teamset was created, from which Group, and by whom.
You can add, swap, or delete team members at any time before the activity is launched, and at any time before the peer assessment activity is finalised.
Good Practice Hint. Get your team composition list absolutely correct before the activity is launched and made available for response by your students. Reason: All students in teams affected by a change in composition will be required to resubmit their peer assessment responses. The students now have different team members to rate. However, the remaining teams of the class will be unaffected.
Quick links and related information
FAQ: How do I correct the Team Composition in a running peer assessment activity?
Confirm that the Participants CSV file you created is in the correct format. Open the Participants CSV using a text editor (Apple Textedit, or Microsoft Windows Notepad).
The format should appear as illustrated below. Note that the first line contains the column headers, each row ends with a trailing comma, and the same participant may appear under more than one group_code.
id,first,last,group_code,team,email,
BOWI12,Bob,Wilson,123.101,Tiger,Bob.Wilson@xorroinstitution.com,
ALJO11,Alice,Jones,123.101,Panda,Alice.Jones@xorroinstitution.com,
JOSM13,John,Smith,123.101,Tiger,John.Smith@xorroinstitution.com,
JOSM13,John,Smith,123.202,,John.Smith@xorroinstitution.com,
GRGR15,Greta,Green,123.101,Panda,Greta.Green@xorroinstitution.com,
GRGR15,Greta,Green,123.204,,,
HEJO19,Henry,Jones,123.101,Tiger,Henry.Jones@xorroinstitution.com,
AMTO01,Amanda,Tolley,123.101,Bear,Amanda.Tolley@xorroinstitution.com,
JEWA06,Jeff,Wang,123.101,Panda,Jeff.Wang@xorroinstitution.com,
HOBR03,Holly,Brown,123.101,Bear,Holly.Brown@xorroinstitution.com,
HOBR03,Holly,Brown,123.202,,Holly.Brown@xorroinstitution.com,
THWI18,Thomas,Windsor,123.101,Tiger,Thomas.Windsor@xorroinstitution.com,
ANWO08,Anna,Worth,123.101,Bear,Anna.Worth@xorroinstitution.com,
ANWO08,Anna,Worth,123.202,,Anna.Worth@xorroinstitution.com,
ANWO08,Anna,Worth,123.204,,,
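If you prefer to check the file programmatically before importing, the following Python sketch illustrates the idea. It is offered as an illustration only: the function name and the specific checks are ours, not part of Xorro or Peer Assess Pro, and the file name participants.csv is a placeholder. It assumes the header row shown above.

import csv

REQUIRED_HEADERS = {"id", "first", "last", "team", "group_code", "email"}

def check_participants_csv(path):
    # Illustrative pre-import check: verify the required column headers exist,
    # and flag rows with a blank id, email, or team assignment.
    with open(path, newline="") as f:
        reader = csv.DictReader(f)
        headers = {h.strip() for h in (reader.fieldnames or []) if h}
        missing = REQUIRED_HEADERS - headers
        if missing:
            print("Missing column headers:", ", ".join(sorted(missing)))
        for line_number, row in enumerate(reader, start=2):
            for column in ("id", "email", "team"):
                if not (row.get(column) or "").strip():
                    print(f"Line {line_number}: blank {column} for participant {row.get('id') or '?'}")

check_participants_csv("participants.csv")

Remember that a blank team is acceptable for other Xorro activities, but for a Peer Assess Pro activity every participant in the peer assessment group must carry a team name.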
Quick links and related information
FAQ - I’m having problems importing my participants csv
A Beta Test demonstration site has been established with these credentials:
Browse to: https://qf.staging.xorro.com/
Enter: Username BetaTest, Password Secret
This Beta Test User is established for you to view. But don’t touch too hard!
View
WARNING! HERE THERE BE DRAGONS!!
If a peer assessment activity is launched and running then you cannot update team membership details in that running activity using the procedure described here! Instead, apply the procedure described here: FAQ: How do I correct the Team Composition in a running peer assessment activity?
Reason: whenever a Xorro activity is created a snapshot is taken of the Group used when creating the activity. From that moment this snapshot, known as a Xorro Teamset, is inextricably connected with the activity. That activity-specific teamset can only be updated during a running activity through the following FAQ.
Quick links and related information
FAQ: How do I correct the Team Composition in a running peer assessment activity?
Quickstart Guide for Peer Assess Pro: Xorro. (2019, March 6). Peer Assess Pro. http://tinyurl.com/pdfQuickWeb
Pdf version: http://tinyurl.com/pdfQuick
Login and orientation. (2019). Auckland: Peer Assess Pro.
Launch a Peer Assess Pro Activity. (2019). Auckland: Peer Assess Pro.
Student survey experience. (2019). Auckland: Peer Assess Pro.
Peer Assess Pro. (2019, March 5). Manage a Peer Assessment Activity using Xorro: Reference Guide for Teachers. Auckland: Peer Assess Pro
Web version http://tinyurl.com/papRefWeb2
Pdf version http://tinyurl.com/papRefPdf
Feel welcome to make suggestions or ask questions using the Comment feature of the Google Docs development version, which shows work-in-progress improvements.
Frequently Asked Questions (FAQs) (2019). In Manage a Peer Assessment Activity using Xorro: Reference Guide for Teachers [web]. Auckland, New Zealand: Peer Assess Pro. http://tinyurl.com/papFAQ
Peer Assess Pro. (2019). Xorro Peer Assess Pro™ Teachers Process Flowchart: Overview and Detail. http://tinyurl.com/papChart
Quick links and related information
The choice of calculation method for determining a team member’s personal result is determined by the teacher's preference for compensating more strongly team members who have contributed significantly to their teams, and under-rewarding team members who are peer assessed as weak contributors. The figure illustrates the statistical features, such as team average, range, and standard deviation, associated with each method.
Alternative calculation methods for Personal Result (PR) illustrating effect on team average and spread for a given Team Result
The teacher can select either the Peer Assessed Score (PA Score) or Peer Assessed Index (PA Index) if they wish to exclude the Team Result from the calculation of the Personal Result (PR).
More usually, the Peer Assessed Score and Team Result (TR) are combined mathematically to produce a Personal Result. There are three alternative methods. As the figure illustrates, the Indexed Personal Result (IPR) is the least discriminating method, whilst the Rank-Based Personal Result (RPR) is the most discriminating in terms of favouring significant team contributors and penalising weak contributors. Most teachers select the Normalised Personal Result, often with a spread factor of 1.5 to 2.0.
In contrast to the graphical illustration earlier, the following table summarises the example calculations presented through a series of FAQs that present the mathematical definition and example calculations for each method.
Comparison of Personal Results calculated by several methods in a team of four members
ASSESSEE | ||||||
ASSESSOR | Bridget | Julian | Lydia | Nigella | Mean | Range |
Rank Reversed | 1 | 2 | 4 | 3 | ||
Peer Assessed Score, PA Score | 54 | 74 | 82 | 78 | 75 | 28 |
Peer Assessed Index, PA Index | 66 | 90 | 100 | 95 | 88 | 34 |
Team Result, TR | 50 | 50 | 50 | 50 | 50 | 0 |
Indexed Personal Result, IPR | 33 | 45 | 50 | 48 | 44 | 17 |
Normalised Personal Result, NPR (SpreadFactor = 1) | 39 | 51 | 56 | 54 | 50 | 17 |
Normalised Personal Result, NPR (Spreadfactor = 2) | 28 | 52 | 62 | 58 | 50 | 34 |
Rank-Based Personal Result, RPR | 20 | 40 | 80 | 60 | 50 | 60 |
Source: FAQ: How are peer assessment and personal results calculated and defined mathematically?
Definitions and features of calculation methods used in Peer Assess Pro
Attribute | Abbreviation | Definition |
Peer Assessed Score | PA Score | A relative measure of the degree to which a team member has contributed to their team's overall achievement, team processes, and leadership. The Peer Assessed Score (PA Score) is calculated for each team member directly from their Average Team Contribution (ATC) and Average Leadership Contribution (ALC). That is, from the ten components of Team and Leadership contribution survey in the peer assessment. A Peer Assessed score is generally used to compare the relative contribution of students WITHIN the same team, rather than BETWEEN teams. The Team Result has NO impact on the value of the Peer Assessed Score. Values for the PA Score range from zero through 100. |
Peer Assessed Index | PA Index | The Peer Assessed Score (PA Score) is indexed upwards so that the person in the team with the highest Peer Assessed Score is awarded a Peer Assessed Index of 100. All other team members receive a proportionally lower PA Index in the ratio PA Score / max(PA Score). The Team Result has NO impact on the value of the Peer Assessed Index. |
Team Result | TR | The result awarded to the team for the outputs of their work. The teacher typically derives the Team Result (TR) from grades for team reports, presentations, and results of Team Readiness Assurance Tests. The teacher may select to combine a student's Peer Assessed Index (PA Index) with their team's Team Result (TR) to calculate a Personal Result (PR) for each student, reflecting their relative contribution to the Team Result as assessed by their peer team members. Peer Assess Pro enables the teacher to select from several methods to combine the Team Result and Peer Assessed Index (PA Index) to produce a Personal Result: the Indexed Personal Result (IPR), the Normalised Personal Result (NPR), and the Rank Based Personal Result (RPR). |
Measures of a student's personal result | ||
Personal Result | PR | A student's personal result gained from combining their Peer Assessed Index (PA Index) and, optionally, their Team Result (TR). The teacher selects from one of several Calculation Methods to calculate the Personal Result that incorporates the Team Result. These methods are Indexed Personal Result (IPR), Normalised Personal Result (NPR), and Rank-Based Personal Result (RPR). The choice of method is determined by the teacher's preference for compensating more strongly students who have contributed significantly to their teams, and under-rewarding students who are peer assessed as weak contributors. Figure 1 illustrates the statistical features, such as team average, range, and standard deviation, associated with each method. The IPR is the least discriminating method, whilst the RPR is the most discriminating in terms of favouring significant team contributors and penalising weak contributors, as the figure illustrates. |
Indexed Personal Result | IPR | The Indexed Personal Result is calculated from the Team Result (TR) combined with the student's specific Peer Assessed Index (PA Index). The Indexed Personal Result method awards the Team Result to the TOP RATED student in the team, since, by definition, their Peer Assessed Index is 100. All remaining students in the same team earn the Team Result downwards, directly proportional to their PA Index. The Indexed Personal Result calculation means that NO team member can earn an Indexed Personal Result greater than the Team Result. That is, values for the Indexed Personal Result range from zero up to the Team Result. |
Normalised Personal Result | NPR | The Normalised Personal Result is calculated from the Team Result combined with the student's specific Indexed Personal Result (IPR). However, in contrast to the IPR method, the Normalised Personal Result method awards the AVERAGE student in the team the Team Result (TR). All remaining students are awarded a Personal Result ABOVE or BELOW the Team Result depending on whether their IPR is above or below that team's average. Features of the Normalised Personal Result are that (a) in contrast to the IPR method, the Normalised Personal Result method calculates a Personal Result ABOVE the Team Result for the above-average peer rated students in the team (b) the average of the team's Normalised Personal Results matches the Team Result (c) the spread of the team's Normalised Personal Results matches the spread of the Indexed Personal Results (IPR) that is calculated for that team. Spread is measured by the standard deviation statistic. Optional feature: to enhance the effect of rewarding high contributors and penalising weak contributors the tutor can increase the Spread Factor (SF) from the default value of 1.0. Increasing the Spread Factor increases the spread of the results centred around the Team Result. However, an increase in the Spread Factor will maintain a team average NPR that matches that team's Team Result. A Spread Factor of 1.5 to 2.0 is recommended, especially in classes where team members are reluctant to penalise weak contributors and/or reward the highest contributors through their peer assessment rating responses. Values for the NPR range from zero to 100. Calculations that exceed these ranges are clipped to fit within zero to 100. |
Rank Based Personal Result | RPR | The Rank Based Personal Result is calculated from the Team Result combined with the student's specific Rank Within Team based on that student's Peer Assessed Score. Like the Normalised Personal Result, the RPR method awards the AVERAGE student in the team the Team Result. All remaining students are awarded a personal result above or below the Team Result depending on whether their Rank Within Team is above or below that team's middle-ranked student. Features of the Rank Based Personal Result (RPR) calculation method are that (a) a team's RPR values are spread over a MUCH WIDER range than under the NPR and IPR methods. Small differences in PA Scores within a team are amplified significantly by this method (b) in contrast to the IPR method, the RPR method calculates a Personal Result significantly ABOVE the Team Result for the top ranked student in the team (c) like the NPR method, the average of the team's RPR values matches the Team Result. Values for the Rank Based Personal Result range from zero to 100. Calculations that exceed these ranges are clipped to fit within the range zero to 100. |
Note that in the Xorro version of Peer Assess Pro, we have renamed the following Personal Result Methods from those used in the Google Docs version of Peer Assess Pro.
Renaming of terms for Peer Assess Pro
Peer Assess Pro | Abbreviation | Google Peer Assess Pro | Abbreviation |
Peer Assessed Score | PA Score | Team Based Learning Score | TBL Score |
Peer Assessed Index | PA Index | Team Based Learning Index | TBL Index |
Quick links and related information
FAQ: How are peer assessment and personal results calculated and defined mathematically?
A teacher has several alternative calculation methods to determine a personal result from a team member’s Peer Assess Pro assessment. The teacher will usually advise team members about the method they have chosen.
The teacher’s choice of calculation method for a personal result is determined by the teacher's preference for compensating more strongly team members who have contributed significantly to their teams, and under-rewarding team members who are peer assessed as weak contributors.
These choices are illustrated in this figure.
A student’s Personal Result emerges from the Teacher’s choice of Calculation Method, relative Peer Assessed Score, and Team Result
The teacher can select either the Peer Assessed Score (PA Score) or Peer Assessed Index (PA Index) if they wish to exclude the Team Result from the calculation of the Personal Result.
Quick links and related information
FAQ: How is the Peer Assessed (PA) Score calculated?
FAQ: How is the Peer Assessed Index (PA Index) calculated?
More usually, the Peer Assessed Score (PA Score) and Team Result are combined through one of three methods. The following methods are listed in order of increasing impact for compensating more strongly students who have contributed significantly to their teams, and under-rewarding students who are peer assessed as weak contributors:
FAQ: How is the Indexed Personal Result (IPR) calculated?
FAQ: How is the Normalised Personal Result (NPR) calculated?
FAQ: How is the Rank Based Personal Result (RPR) calculated?
Quick links and related information
FAQ: What factors are measured in the peer assessment survey?
FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?
The Teachers Process Flowchart: Detail illustrates the points throughout the peer assessment process where emails are sent to students to advise them
In most cases, the emails are generated automatically by the Peer Assess Pro system. In the case of warnings, the teacher has the option of initiating an email request to a student, or ignoring that warning.
Copies of all emails are sent to the teacher whose Xorro account was used to launch the activity
When you create and launch a Peer Assess Pro™ Peer Assessment activity in Xorro AND the Start Date has been reached:
Alternatively, the teacher can direct students to the Participant URL shown at the top left of the Xorro HOME page. The student must then select from a list the correct peer assessment activity for their response. The teacher may deliver other Xorro-based test activities from which the student must select the correct Peer Assess Pro™ activity distinguished by the Activity Title specified by the teacher.
FAQ: How do I view and experience what the students experience?
FAQ: What questions are asked in the peer assessment survey?
xxx UNDER DEVELOPMENT xxx
The student will see this view when all of the following are TRUE:
Note that students can continue to submit responses AFTER the Due Date UNTIL the teacher has Finalised the activity.
The student will be able to see their Personal Results when all the following are true:
A student with a Xorro Plus account may view their results any time after the Activity is Finalised by the Teacher.
The student views
Example results for a student
Xxx TO DO xxx
Quick links and related information
FAQ: What questions are asked in the peer assessment survey?
FAQ: How is the Peer Assessed (PA) Score calculated?
The following terms have been renamed from those used in the Google Docs version of Peer Assess Pro for the Xorro version of Peer Assess Pro.
Renaming of terms for Peer Assess Pro
Peer Assess Pro | Abbreviation | Google Peer Assess Pro | Abbreviation |
Peer Assessed Score | PA Score | Team Based Learning Score | TBL Score |
Peer Assessed Index | PA Index | Team Based Learning Index | TBL Index |
Quick links and related information
FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?
In general, see the Sections
Warning messages are under constant development and refinement as we respond to facilitators’ and team members’ experience of Peer Assess Pro.
These warnings must be resolved, otherwise utterly invalid results will arise, and students’ time will be wasted completing incorrect surveys.
Example: The composition of a team needs adjusting. See Adjusting team composition.
Peer Assess Pro will not be able to present results for all teams unless these warnings are resolved.
Example: Insufficient responses from a team are received
See Results hidden from team members and teacher
Example: Team Results need to be entered; see Enter Team Results.
Advisory warnings do not critically affect the operation of Peer Assess Pro. However, the teacher would be prudent to review the details to ensure that peer assessments have been conducted fairly and honestly.
Example: Overgenerous or parsimonious ratings by a team member.
FAQ: How is an outlier peer assessment rating identified?
Several warnings give the facilitator the option to despatch an email to students advising them of exceptional conditions and requesting their action. For example
The criteria used to generate these warnings, and the recommended response by the facilitator, are detailed in this section:
For example, in the case of a Mismatched self-assessment, the team member is invited to meet with the teacher to explore the reasons for the mismatch, and develop approaches to narrow the gap.
Quick links and related information
FAQ: What is the content of emails sent by Peer Assess Pro?
You select Refresh and Update results when
The most important reason is that you as a teacher MUST be able to review results BEFORE displaying (publishing) results to students. After examining the results to date, you might publish an interim snapshot of the results for view by students.
Students may review the interim results and raise issues such as a questionable peer assessment rating, such as scapegoating. Alternatively, you may need to adjust a Team Result, or experiment with another method of Personal Result Calculation.
In this situation, we have presumed you do not want new responses, nor adjustments to be immediately viewable by students. In particular, you need the opportunity to review the effect of adjustments before explicitly publishing revised results to students.
Quick links and related information
The Peer Assess Pro survey measures one overall assessment, Recommendation, followed by ten quantitative ratings, then several qualitative questions.
The ten quantitative ratings are used to calculate the Peer Assessed Score (PA Score). The ten ratings are categorised into two classes: Contribution to Task, and Contribution to Leadership and Teamwork, as shown in the example survey below.
In addition, two qualitative questions are asked that request examples of behaviours supporting the quantitative ratings in relation to Contribution to Task, and Contribution to Leadership and Teamwork. Finally, the assessor is asked to provide Development Feedback. That is, advice that would help the team member improve their future contribution to the team.
Quick links and related information
FAQ: How is the Peer Assessed (PA) Score calculated?
FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?
The ten questions used as the basis for calculating the Peer Assessed Score are adapted from:
Deacon Carr, S., Herman, E. D., Keldsen, S. Z., Miller, J. G., & Wakefield, P. A. (2005). Peer feedback. In The Team Learning Assistant Workbook. New York: McGraw Hill Irwin.
My name is: | I am rating my team member: | ||||
My Team name is: | Team Member A | ||||
Team Member B | |||||
Team Member C | |||||
Self | |||||
Recommendation | How likely is it that you would recommend this team member to a friend, colleague or employer? 1 = Highly unlikely, 5 = Extremely likely | | | | |
Contribution to Task Accomplishment | |||||
Rate the team member on a 5-point scale. Rating scale: 1 = Almost never, 2 = Seldom, 3 = Average, 4 = Better than most, 5 = Outstanding Rate your typical or average team member a mid-level rating of 3. | |||||
Initiative | Shows initiative by doing research and analysis. Takes on relevant tasks with little prompting or suggestion. | ||||
Attendance | Prepares for, and attends scheduled team and class meetings. | ||||
Contribution | Makes positive contributions to meetings. Helps the team achieve its objectives. | ||||
Professionalism | Reliably fulfils assigned tasks. Work is of professional quality. | ||||
Ideas and learning | Contributes ideas to the team's analysis. Helps my learning of course and team project concepts. | ||||
Contribution to Leadership and Team Processes | |||||
Focus and task allocation | Keeps team focused on priorities. Facilitates goal setting, problem solving, and task allocation to team members. | ||||
Encourages contribution | Supports, coaches, or encourages all team members to contribute productively. | ||||
Listens and welcomes | Listens carefully and welcomes the contributions of others. | ||||
Conflict management and harmony | Manages conflict effectively. Helps the team work in a harmonious manner. | ||||
Chairmanship | Demonstrates effective leadership for the team. Chairs meetings productively. |
Peer Assessment Survey: Feedback to the team member Submit one copy of this form for each team member |
My name is: |
I am a member of Team Number and Name: |
I am assessing (student’s name): |
Contribution to Task Accomplishment For the team member you have assessed, provide specific examples of productive or ineffective behaviours related to your ratings of Contribution to Task Accomplishment. For example, shows initiative; attends meetings; makes positive contributions; helps team achieve objectives; is reliable; contributes quality work; contributes to learning of course concepts. Further examples here http://tinyurl.com/BARSOhland |
Contribution to Leadership and Team Processes For the team member you have assessed, provide specific examples of productive or ineffective behaviours related to your ratings of Contribution to Leadership and Team Processes. For example: keeps team focused on priorities; supports, coaches and encourages team members; listens carefully; manages conflict effectively; demonstrates effective leadership. |
Development feedback What specific behaviours or attitudes would help your team member contribute more effectively towards your team's accomplishments, leadership, and processes? Please provide specific positive or constructive feedback that could enable the team member to improve their behaviour productively. Considering your team member's strengths, how could that person coach other team members to acquire similar strengths for Task Accomplishment, Team Processes, and Leadership? |
Source: Peer Assess Pro (2019). |
The Peer Assessed Score, PA Score, is a relative measure of the degree to which a team member has contributed to their team's overall achievement, team processes, and leadership.
A Peer Assessed Score is generally used to compare the relative contribution of students WITHIN the same team, rather than BETWEEN teams. The Team Result has NO impact on the value of the Peer Assessed Score.
The PA Score is calculated for each team member directly from summing the ten ratings of Team and Leadership Contribution surveyed in the peer assessment. The sum of ratings is adjusted by scale factors to give values for the PA Score that range from zero through 100.
The Peer Assessed Score is an essential factor used as the basis for calculating several alternative measures of Personal Result including the Peer Assessed Index (PA Index), Indexed Personal Result (IPR), Normalised Personal Result (NPR), and Rank Based Personal Result (RPR).
The self-assessment conducted by a team member is EXCLUDED from the calculation of their Peer Assessed Score. The self-assessment, PA (self), is used to enable the student to compare their self-perception with that of their team members, and the class as a whole. One method of comparison, the IRSA, is based on the ratio $\mathrm{IRSA} = 100 \times \mathrm{PAScore} / \mathrm{PA(self)}$, as detailed in the FAQ:
FAQ: How is the Index of Realistic Self Assessment (IRSA) calculated?
There are ten Peer Rating components awarded by each Assessor, a, to each Assessee, s, in the team of t members. The mathematical task is to combine all these ratings into one Peer Assessed Score for each team member.
The Peer Assessed SubScore is defined as the peer assessment score awarded by Assessor a to Assessee s:

$\mathrm{PASubScore}_{a,s} = 2.5 \times \sum_{r=1}^{10} \mathrm{Rating}_{a,s,r} - 25$

Where

$\mathrm{Rating}_{a,s,r}$ = the Peer Rating for each of the ten peer assessment components, r, submitted by the Assessor a for the assessed team member, the Assessee, s. The student’s self-assessment is excluded from the calculation of the PA Score. The Recommendation rating is also excluded from the calculation of the PA Score.

To ensure the PA Score ranges from zero through 100, the formula relies on these features: each Rating lies between 1 and 5, so the sum of the ten Ratings lies between 10 and 50; the scale factor 2.5 and the offset 25 then map a sum of 10 to a PA SubScore of zero, and a sum of 50 to a PA SubScore of 100.
The Peer Assessed Score for team member s is the mean of the PA SubScores awarded by the other (t − 1) team members to the team member s:

$\mathrm{PAScore}_{s} = \dfrac{1}{t-1} \sum_{a \neq s} \mathrm{PASubScore}_{a,s}$

Where

t = the number of team members in the team in which s is a team member.

$\mathrm{PASubScore}_{a,s}$ = the peer assessment score awarded by Assessor a to Assessee s, mathematically defined earlier.
Note that Peer Assessed Score takes NO account of the team’s Team Result. The Team Result is accounted for in the Indexed Personal Result (IPR), Normalised Personal Result (NPR) and Rank-Based Personal Result (RPR) methods discussed elsewhere.
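The two formulas above translate directly into a few lines of code. The following Python sketch is illustrative only; the function names are ours, not part of Peer Assess Pro, and the caller is assumed to have already excluded the assessee’s self-assessment and the Recommendation rating.

def pa_subscore(ratings):
    # One assessor's ten 1-to-5 component ratings of one assessee.
    # Ten ratings of 1 yield 0; ten ratings of 5 yield 100.
    assert len(ratings) == 10 and all(1 <= r <= 5 for r in ratings)
    return 2.5 * sum(ratings) - 25

def pa_score(subscores_from_others):
    # Mean of the PA SubScores awarded by the other (t - 1) team members.
    return sum(subscores_from_others) / len(subscores_from_others)

# Bridget, from the worked example below: SubScores 87.5, 75 and 0
# awarded by her three teammates.
print(round(pa_score([87.5, 75, 0]), 1))   # 54.2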
An example calculation is shown below. In the first table, the team member Bridget (ASSESSEE) is rated by her three team members (ASSESSORS), plus her own self-rating. The subsequent tables show the calculation of the Peer Assessment Score for all four team members based on all team members’ assessment ratings. The long-form calculations show in detail the arithmetic calculations.
Quick links and related information
FAQ: What questions are asked in the peer assessment survey?
Alternative but equivalent methods for calculating the Peer Assessed Score are detailed below in the section:
Alternative mathematical formulations of PA Score
Example table of assessments for assessed team member Bridget
ASSESSEE: Bridget | ASSESSOR: Ratings by team member: | |||||
Team Name: Kubla | Bridget (Self) | |||||
Julian | ||||||
Lydia | ||||||
Nigella | ||||||
Mean Rating | ||||||
Contribution to Task Accomplishment | ||||||
Rating scale: 1 = Almost never, 2 = Seldom, 3 = Average, 4 = Better than most, 5 = Outstanding | ||||||
Initiative | Shows initiative by doing research and analysis. Takes on relevant tasks with little prompting or suggestion. | 5 | 5 | 3 | 1 | 9/3 |
Attendance | Prepares for, and attends scheduled team and class meetings. | 4 | 4 | 4 | 1 | 9/3 |
Contribution | Makes positive contributions to meetings. Helps the team achieve its objectives. | 4 | 5 | 5 | 1 | 11/3 |
Professionalism | Reliably fulfils assigned tasks. Work is of professional quality. | 4 | 3 | 4 | 1 | 8/3 |
Ideas and learning | Contributes ideas to the team's analysis. Helps my learning of course and team project concepts. | 5 | 5 | 5 | 1 | 11/3 |
Contribution to Leadership and Team Processes | ||||||
Focus and task allocation | Keeps team focused on priorities. Facilitates goal setting, problem solving, and task allocation to team members. | 5 | 5 | 3 | 1 | 9/3 |
Encourages contribution | Supports, coaches, or encourages all team members to contribute productively. | 4 | 4 | 4 | 1 | 9/3 |
Listens and welcomes | Listens carefully and welcomes the contributions of others. | 5 | 5 | 3 | 1 | 9/3 |
Conflict management and harmony | Manages conflict effectively. Helps the team work in a harmonious manner. | 4 | 4 | 4 | 1 | 9/3 |
Chairmanship | Demonstrates effective leadership for the team. Chairs meetings productively. | 5 | 5 | 5 | 1 | 11/3 |
SubTotal | SubTotal = Task + Leadership | 45 | 45 | 40 | 10 | # 95/30 ( 3.167) |
Peer Assessed Score | PA Score = (2.5 x SubTotal ) - 25 | * 87.5 | 87.5 | 75 | 0 | 54.2 |
* The self-assessment ratings are excluded from calculation of the PA Score. So, 54.2 = (87.5 + 75 + 0) / 3 # Alternatively, PA Score = (25 x Mean Rating) - 25. So, 54.2 = 25 x 95/30 - 25 = (25 x 3.167) - 25 |
Suppose that the Peer Assessed Scores determined from all four team members rating each other appear as follows. Bridget’s PA Scores are copied from the previous table, forming the second vertical column here.
Since $\mathrm{PASubScore}_{a,s} = 2.5 \times \mathrm{SubTotal}_{a,s} - 25$, consider the assessment by Lydia of Bridget. Lydia’s ten ratings of Bridget sum to a SubTotal of 40, so $\mathrm{PASubScore}_{\mathrm{Lydia,Bridget}} = 2.5 \times 40 - 25 = 75$.
In the previous table, note how Nigella rated Bridget with the minimum possible rating of one for all ten components. By definition, that gives a PA Score of zero. Similarly, if an assessor had rated a team member the maximum rating of five across all ten components, then a PA Score of 100 would have resulted.
Peer Assessed Sub-Scores for a team of four members
ASSESSEE | ||||
ASSESSOR | Bridget | Julian | Lydia | Nigella |
Bridget | 87.5 | 62.5 | 75 | 72.5 |
Julian | 87.5 | 92.5 | 87.5 | 82.5 |
Lydia | 75 | 82.5 | 77.5 | 80 |
Nigella | 0 | 77.5 | 82.5 | 82.5 |
Now the PA Score for each ASSESSEE team member is calculated from the mean of the PA SubScores provided by the other ASSESSORS in their team, as shown in the following table. The self-assessments of each ASSESSOR are excluded from the calculation. For example, the PA Score for Nigella is determined as follows from the ratings by her three teammates Bridget, Julian and Lydia:
Since $\mathrm{PAScore}_{s} = \dfrac{1}{t-1} \sum_{a \neq s} \mathrm{PASubScore}_{a,s}$, then for Nigella:

$\mathrm{PAScore}_{\mathrm{Nigella}} = (72.5 + 82.5 + 80) / 3 = 78.3$
Calculation of Peer Assessed (PA) Scores for a team of four members
ASSESSEE | ||||
ASSESSOR | Bridget | Julian | Lydia | Nigella |
Bridget | - | 62.5 | 75 | 72.5 |
Julian | 87.5 | - | 87.5 | 82.5 |
Lydia | 75 | 82.5 | - | 80 |
Nigella | 0 | 77.5 | 82.5 | - |
Peer Assessed Score | 54.2 | 74.2 | 81.7 | 78.3 |
Note how Nigella’s rating of Bridget (PA SubScore = 0) seems an outlier when compared with the much higher ratings given by Julian and Lydia (87.5 and 75). Peer Assess Pro warns the teacher when outlier ratings like this occur.
This outlier issue is discussed in
FAQ: How is an outlier peer assessment rating identified?
The following equations provide the identical mathematical result for the calculation of PA Score:

$\mathrm{PAScore}_{s} = \dfrac{100}{4} \times (\mathrm{AverageRating}_{s} - 1)$

Where:

$\mathrm{AverageRating}_{s}$ is the average rating of an assessed student s, averaged over all ten components of the rating for that student by their team members. The Average Rating lies between 1 and 5.

The factor (−1) adjusts the Average Rating value to the range zero through 4. The scale factor 100/4 (= 25) adjusts the PA Score to lie between zero and 100.
Notice from the first table showing ratings of Bridget that the average rating across all ten components contributing to her Peer Assessed Score, given by her three team members, was shown as $\mathrm{AverageRating} = 95/30 = 3.167$.

Therefore, the PA Score is calculated directly from the average rating: $\mathrm{PAScore} = 25 \times (3.167 - 1) = 54.2$.

Finally, the PA Score may be expressed equivalently in terms of the Average Task Contribution (ATC) and Average Leadership Contribution (ALC):

$\mathrm{PAScore}_{s} = \dfrac{50}{4} \times \left[ (\mathrm{ATC}_{s} - 1) + (\mathrm{ALC}_{s} - 1) \right]$
Where:
ATC and ALC are the average ratings for the five components that comprise the Task and Leadership contributions, respectively.
Mathematically, $\mathrm{ATC}_{s} = \dfrac{1}{5} \sum_{r=1}^{5} \mathrm{AverageRating}_{s,r}$ and $\mathrm{ALC}_{s} = \dfrac{1}{5} \sum_{r=6}^{10} \mathrm{AverageRating}_{s,r}$, where components r = 1 to 5 are the Task items and r = 6 to 10 are the Leadership and Team Processes items.
ATC and ALC range over the values 1 through 5. The factor (-1) adjusts those values from zero through 4. The scale factor 50/4 (= 12.5) ensures that the PA Score achieves a range from zero to 100.
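A quick numerical check, using Julian’s ratings of Bridget from the worked example above, confirms that the three formulations agree. This Python fragment is illustrative only; the variable names are ours:

import math

ratings = [5, 4, 5, 3, 5, 5, 4, 5, 4, 5]    # Julian's ten ratings of Bridget

subtotal_form = 2.5 * sum(ratings) - 25               # (2.5 x SubTotal) - 25
average_form = (100 / 4) * (sum(ratings) / 10 - 1)    # 25 x (Average Rating - 1)
atc = sum(ratings[:5]) / 5                            # Average Task Contribution
alc = sum(ratings[5:]) / 5                            # Average Leadership Contribution
component_form = (50 / 4) * ((atc - 1) + (alc - 1))   # 12.5 x [(ATC - 1) + (ALC - 1)]

assert math.isclose(subtotal_form, average_form)
assert math.isclose(subtotal_form, component_form)
print(subtotal_form)                                  # 87.5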
Quick links and related information
FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?
FAQ: How are peer assessment and personal results calculated and defined mathematically?
The self-assessment conducted by a team member when they rate their team members is EXCLUDED from the calculation of that team member’s Peer Assessed Score. Instead, their self-assessment, PA (self), is used to enable the team member to compare their self-perception with that of their team members, and the class as a whole. This comparison is provided to the team member through a Spider Chart and the calculation of their Index of Realistic Self Assessment (IRSA).
The Spider Chart shows each of the eleven ratings the team member provided for themself, compared with the average of the ratings provided to them by their peer team members. The class average ratings for each of the eleven factors are also provided. In this example, the team member has significantly UNDERRATED themself on nearly all factors (innermost plots), when compared with the ratings provided by their team members (orange).
Spider Chart comparison of self and other team members’ contribution ratings
Another method of comparison, the IRSA, is based on the ratio $\mathrm{IRSA} = 100 \times \mathrm{PAScore} / \mathrm{PA(self)}$, as detailed in the FAQ:
FAQ: How is the Index of Realistic Self Assessment (IRSA) calculated?
For the team member illustrated in the foregoing Spider Chart, their Peer Assessed Score, PA Score, is 92 and their self-assessed score, PA (self), is 75. The ratio yields an Index of Realistic Self Assessment (IRSA) of 122 (= 100 × 92 / 75, rounded).
An IRSA between 75 and 95 is typical of about two-thirds of team members in a class. About one-sixth of team members achieve an IRSA below 75. Such people appear to be excessively OVERCONFIDENT in their abilities when compared with their team members’ assessments. In contrast, an IRSA above 95 suggests the team member has a tendency to UNDERESTIMATE their team contribution when contrasted with the assessment perceived by their team members.
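As a minimal sketch (the function name is ours, not a Peer Assess Pro API), the IRSA calculation and the bands described above look like this in Python:

def irsa(pa_score, pa_self):
    # Index of Realistic Self Assessment: the peers' view of a team member
    # relative to that member's own self-assessment.
    return 100 * pa_score / pa_self

value = irsa(92, 75)   # the Spider Chart example; reported as 122 above
if value < 75:
    print(int(value), "- appears OVERCONFIDENT relative to peers")
elif value > 95:
    print(int(value), "- tends to UNDERESTIMATE their own contribution")
else:
    print(int(value), "- broadly realistic self-assessment")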
Quick links and related information
FAQ: How are peer assessment and personal results calculated and defined mathematically?
The Peer Assessed Index is defined such that the team member with the maximum PA Score for each team is assigned a PA Index of 100. All other team members in the same team are scaled in relation to the maximum PA Score for that group.
In a gradebook of results, the PA Index is useful for identifying the team members most highly rated by their peers, as they have PA Indexes in the 90 to 100 range. In combination with the Team Result, the PA Index is used to calculate the Indexed Personal Result, (IPR), Normalised Personal Result, (NPR) and Rank-Based Personal Result (RPR).
$\mathrm{PAIndex}_{s,t} = 100 \times \dfrac{\mathrm{PAScore}_{s,t}}{\max(\mathrm{PAScore}_{t})}$

Where

$\mathrm{PAScore}_{s,t}$ = the Peer Assessed Score for a team member s in team t, as defined in: FAQ: How is the Peer Assessed (PA) Score calculated?

$\max(\mathrm{PAScore}_{t})$ = the maximum value of PA Score found across all members in team t.
Consider a team of four team members, whose PA Scores are shown in the following table. Lydia has a PA Score of 82, the highest for the team. Therefore, Lydia’s PA Index is 100, by definition.
Calculation of Peer Assessed Index (PA Index) for a team of four members
ASSESSEE | ||||
ASSESSOR | Bridget | Julian | Lydia | Nigella |
Bridget | - | 62.5 | 75 | 72.5 |
Julian | 87.5 | - | 87.5 | 82.5 |
Lydia | 75 | 82.5 | - | 80 |
Nigella | 0 | 77.5 | 82.5 | - |
Peer Assessed Score | 54 | 74 | 82 | 78 |
Peer Assessed Index | 66 | 90 | 100 | 95 |
Bridget has a PA Score of 54, the lowest for the team. Therefore, since $\mathrm{PAIndex} = 100 \times \mathrm{PAScore} / \max(\mathrm{PAScore})$, Bridget’s PA Index is $100 \times 54 / 82 = 66$.

Note that, as expected, Lydia, the team member with the highest PA Score, earns a PA Index of exactly 100.
The data for the previous table is drawn from
FAQ: How is the Peer Assessed (PA) Score calculated?
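Computationally, the PA Index is a one-line rescaling of each team’s PA Scores. This Python sketch (the function name is ours, offered as an illustration only) reproduces the table above:

def pa_index(pa_scores):
    # Rescale each member's PA Score so the team's top score maps to 100.
    top = max(pa_scores.values())
    return {name: round(100 * score / top) for name, score in pa_scores.items()}

print(pa_index({"Bridget": 54, "Julian": 74, "Lydia": 82, "Nigella": 78}))
# {'Bridget': 66, 'Julian': 90, 'Lydia': 100, 'Nigella': 95}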
Quick links and related information
FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?
FAQ: How is the Peer Assessed (PA) Score calculated?
The Indexed Personal Result (IPR) is calculated from the Team Result (TR) combined with the team member’s specific Peer Assessed Index (PA Index). The Indexed Personal Result method awards the Team Result to the TOP RATED team member in the team, since, by definition, their Peer Assessed Index is 100. All remaining team members in the same team earn the Team Result downwards, directly proportional to their PA Index.
The definition of Indexed Personal Result means that NO team member can earn an Indexed Personal Result greater than the Team Result. That is, values for the Indexed Personal Result range from zero up to the Team Result. Consequently, the IPR disadvantages team members who have been rated unfavourably by their peers, yet awards no premium to the team member(s) rated as the strongest contributors. In contrast, the Normalised Personal Result and Rank-Based Personal Result do award a Personal Result above the Team Result for those team members who contribute above average to the team’s outputs, as assessed by their peers.
For each team member s, in their team, t:

$\mathrm{IPR}_{s,t} = \mathrm{TR}_{t} \times \dfrac{\mathrm{PAIndex}_{s,t}}{100}$

Where

$\mathrm{TR}_{t}$ = the team result awarded by the teacher for the outputs of team t

$\mathrm{PAIndex}_{s,t}$ = the Peer Assessed Index for the team member s, as defined in
FAQ: How is the Peer Assessed Index (PA Index) calculated?
Suppose that the following team has a Team Result, TR, of 50 and Peer Assessed Indexes previously calculated as follows. The example data is taken from:
FAQ: How is the Peer Assessed Index (PA Index) calculated?
Calculation of Indexed Personal Result in a team of four members
ASSESSEE | ||||
ASSESSOR | Bridget | Julian | Lydia | Nigella |
Peer Assessed Score, PA Score | 54 | 74 | 82 | 78 |
Peer Assessed Index, PA Index | 66 | 90 | 100 | 95 |
Team Result, TR | 50 | 50 | 50 | 50 |
Indexed Personal Result, IPR | 33 | 45 | 50 | 47.5 |
Bridget has a PA Index of 66, the lowest for the team. Therefore, since $\mathrm{IPR} = \mathrm{TR} \times \mathrm{PAIndex} / 100$, Bridget’s IPR is $50 \times 66 / 100 = 33$.

In contrast, Lydia has the highest PA Score in the team, and hence a PA Index of 100. Therefore $\mathrm{IPR}_{\mathrm{Lydia}} = 50 \times 100 / 100 = 50$.

The IPR for Lydia is equivalent to the Team Result, 50, as defined.
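The IPR calculation is equally direct. This Python sketch (illustrative only; the function name is ours) reproduces the table above:

def ipr(team_result, pa_indexes):
    # Scale the Team Result in proportion to each member's PA Index;
    # the top-rated member (PA Index 100) earns exactly the Team Result.
    return {name: team_result * index / 100 for name, index in pa_indexes.items()}

print(ipr(50, {"Bridget": 66, "Julian": 90, "Lydia": 100, "Nigella": 95}))
# {'Bridget': 33.0, 'Julian': 45.0, 'Lydia': 50.0, 'Nigella': 47.5}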
Quick links and related information
FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?
FAQ: How is the Peer Assessed Index (PA Index) calculated?
FAQ: How is the Peer Assessed (PA) Score calculated?
The Normalised Personal Result, NPR, is calculated from the Team Result combined with the team member’s specific Indexed Personal Result (IPR). The Normalised Personal Result method awards the average student in the team the Team Result (TR). All remaining students are awarded a Personal Result above or below the Team Result depending on whether their IPR is above or below that team's average IPR.
Features of the Normalised Personal Result method are that (a) in contrast to the IPR method, the NPR method calculates a Personal Result ABOVE the Team Result for the above-average peer-rated students in the team, (b) the average of the team’s Normalised Personal Results matches the Team Result, and (c) at the default Spread Factor of 1.0, the spread of the team’s Normalised Personal Results matches the spread of the Indexed Personal Results (IPR) calculated for that team, as measured by the standard deviation statistic.
Use the Normalised Personal Result method with a high Spread Factor if you wish to reward high contributors and penalise weak contributors more strongly, especially in classes where team members are reluctant to spread their peer assessment ratings.
For each team member s, in their team, t:

$\mathrm{NPR}_{s,t} = \mathrm{TR}_{t} + \mathrm{SF} \times (\mathrm{IPR}_{s,t} - \overline{\mathrm{IPR}}_{t})$

Where

$\mathrm{TR}_{t}$ = the team result awarded by the teacher for the outputs of team t

$\overline{\mathrm{IPR}}_{t} = \dfrac{1}{n} \sum_{s=1}^{n} \mathrm{IPR}_{s,t}$. That is, the mean value of the IPR values found for team t, containing n team members.

SF = a Spread Factor chosen optionally by the teacher that will S T R E T C H each team’s intrinsic spread of NPRs, as measured by the team’s standard deviation of NPR results. The default Spread Factor is 1.0. However, a Spread Factor of between 1.5 and 2.0 is recommended.
Values of NPR are trimmed to be within the range zero to 100.
Suppose that the following team has a Team Result, TR, of 50 and Indexed Personal Result previously calculated as follows. This first example illustrates a Spread Factor of 2.0. The example data is taken from:
FAQ: How is the Indexed Personal Result (IPR) calculated?
Calculation of Normalised Personal Result in a team of four members
Spreadfactor = 2.0
ASSESSEE | |||||
ASSESSOR | Bridget | Julian | Lydia | Nigella | Mean |
Peer Assessed Score, PA Score | 54 | 74 | 82 | 78 | |
Peer Assessed Index, PA Index | 66 | 90 | 100 | 95 | |
Team Result, TR | 50 | 50 | 50 | 50 | |
Indexed Personal Result, IPR | 33 | 45 | 50 | 48 | 44 |
Correction Factor (Spreadfactor = 2) | -22 | +2 | +12 | +8 | 0 |
Normalised Personal Result, NPR (Spreadfactor = 2) | 28 | 52 | 62 | 58 | 50 |
Bridget has a PA Index of 66, the lowest for the team.
The mean IPR, $\overline{\mathrm{IPR}}$, for the four-member team is 44, calculated from ¼ × (33 + 45 + 50 + 48).

Since $\mathrm{NPR} = \mathrm{TR} + \mathrm{SF} \times (\mathrm{IPR} - \overline{\mathrm{IPR}})$,

then $\mathrm{NPR}_{\mathrm{Bridget}} = 50 + 2 \times (33 - 44) = 50 - 22 = 28$.

In contrast, the Normalised Personal Result for Lydia, with her IPR of 50, is calculated as follows: $\mathrm{NPR}_{\mathrm{Lydia}} = 50 + 2 \times (50 - 44) = 62$.
Note how Lydia’s NPR of 62 is above the Team Result of 50. Note also how the mean of the NPR values across the team is 50 = (28 + 52 + 62 + 58)/4, identical to the Team Result of 50.
The previous example showed calculations of NPR using a Spread Factor of 2.0. The following table shows the results of calculating the Normalised Personal Result for the team using a more modest Spread Factor of 1.0.
Note the following:
The default Spread Factor is 1.0. However, a Spread Factor of between 1.5 and 2.0 is recommended.
Calculation of Normalised Personal Result in a team of four members
SpreadFactor = 1.0
ASSESSEE | |||||
ASSESSOR | Bridget | Julian | Lydia | Nigella | Mean |
Peer Assessed Score, PA Score | 54 | 74 | 82 | 78 | |
Peer Assessed Index, PA Index | 66 | 90 | 100 | 95 | |
Team Result, TR | 50 | 50 | 50 | 50 | 50 |
Indexed Personal Result, IPR | 33 | 45 | 50 | 48 | 44 |
Correction Factor (SpreadFactor = 1) | -11 | +1 | +6 | +4 | 0 |
Normalised Personal Result, NPR (SpreadFactor = 1) | 39 | 51 | 56 | 54 | 50 |
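The NPR calculation, including the Spread Factor and the clipping to the range zero through 100, can be sketched in a few lines of Python (illustrative only; the function name is ours):

def npr(team_result, iprs, spread_factor=1.0):
    # Centre each member's IPR deviation from the team mean on the Team Result,
    # stretch it by the Spread Factor, then clip to the range 0 through 100.
    mean_ipr = sum(iprs.values()) / len(iprs)
    return {name: min(100, max(0, team_result + spread_factor * (v - mean_ipr)))
            for name, v in iprs.items()}

iprs = {"Bridget": 33, "Julian": 45, "Lydia": 50, "Nigella": 48}
print(npr(50, iprs, spread_factor=2.0))
# {'Bridget': 28.0, 'Julian': 52.0, 'Lydia': 62.0, 'Nigella': 58.0}
print(npr(50, iprs))
# Spread Factor 1.0: {'Bridget': 39.0, 'Julian': 51.0, 'Lydia': 56.0, 'Nigella': 54.0}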
Quick links and related information
FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?
FAQ: How is the Peer Assessed Index (PA Index) calculated?
FAQ: How is the Peer Assessed (PA) Score calculated?
FAQ: How are peer assessment and personal results calculated and defined mathematically?
The Rank Based Personal Result is calculated from the Team Result combined with the student's specific Rank Within Team based on that student's Peer Assessed Score. Like the Normalised Personal Result, the RPR method awards the AVERAGE student in the team the Team Result. All remaining students are awarded a personal result above or below the Team Result depending on whether their Rank Within Team is above or below that team's middle-ranked student.
Features of the Rank Based Personal Result (RPR) calculation method are that (a) a team’s RPR values are spread over a MUCH WIDER range than under the NPR and IPR methods, so small differences in PA Scores within a team are amplified significantly, (b) in contrast to the IPR method, the RPR method calculates a Personal Result significantly ABOVE the Team Result for the top-ranked student in the team, and (c) like the NPR method, the average of the team’s RPR values matches the Team Result.
For student s in their team t with n team members:

$\mathrm{RPR}_{s,t} = \mathrm{TR}_{t} \times n \times \dfrac{\mathrm{RankRev}_{s,t}}{\sum_{s'=1}^{n} \mathrm{RankRev}_{s',t}}$

Where

$\mathrm{TR}_{t}$ = the team result awarded by the teacher for the outputs of team t

$\mathrm{RankRev}_{s,t}$ = the reversed rank of the team member s in team t, where the team member with the lowest Peer Assessed Score in that team is defined as 1. Equal ranks are permitted.

n = the number of members in team t
Values of RPR are trimmed to be within the range zero to 100.
Suppose that the following team has a Team Result, TR, of 50 and Peer Assessed Scores previously calculated as follows. The example data is taken from:
FAQ: How is the Peer Assessed (PA) Score calculated?
Calculation of Rank-Based Personal Result in a team of four members
ASSESSEE | |||||
ASSESSOR | Bridget | Julian | Lydia | Nigella | Mean |
Peer Assessed Score, PA Score | 54 | 74 | 82 | 78 | |
Rank (Reversed) | 1 | 2 | 4 | 3 | |
Share Fraction | 1/10 | 2/10 | 4/10 | 3/10 | |
Team Result, TR | 50 | 50 | 50 | 50 | 50 |
Rank-Based Personal Result, RPR | 20 | 40 | 80 | 60 | 50 |
First calculate the sum of ranks for the team of four members, n = 4: SumRank = 1 + 2 + 3 + 4 = 10. This number is the denominator for calculating the ShareFraction for each team member.
Consequently, there are 10 ‘pieces of cake’ to be shared amongst the 4 team members, in proportion to their reversed rank.
Bridget has a PA Score of 54, the lowest for the team. Her rank in the team is therefore 1, her ShareFraction is 1/10, and her RPR = 4 x 50 x 1/10 = 20.
Note how the second-ranked student, Julian, receives double the ShareFraction and, consequently, double the RPR of Bridget: RPR = 4 x 50 x 2/10 = 40.
Lydia, the top-ranked student with Rank = 4, receives four times the RPR that Bridget received: RPR = 4 x 50 x 4/10 = 80.
Note how the mean of the RPR values matches the Team Result for team t of 50: (20 + 40 + 80 + 60) / 4 = 50.
Note that, by definition, the sum of the ShareFractions across the team is exactly 100 %.
The following example shows a case where two team members have the same Peer Assessed Score of 74. Note how Lydia has a reverse rank of 4, not 3. The Google Sheets RANK function, with the optional is_ascending flag set to 1, demonstrates this ranking behaviour.
Calculation of Rank-Based Personal Result with tied scores
ASSESSEE | |||||
ASSESSOR | Bridget | Julian | Lydia | Nigella | Mean |
Peer Assessed Score, PA Score | 54 | 74 | 82 | 74 | |
Rank (Reversed) | 1 | 2= | 4 | 2= | |
Share Fraction | 1/9 | 2/9 | 4/9 | 2/9 | |
Team Result, TR | 50 | 50 | 50 | 50 | 50 |
Rank-Based Personal Result, RPR | 22 | 44 | 89 | 44 | 50 |
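The rank-based calculation, including the tie handling just described, can be sketched in Python as follows. This is an illustration under stated assumptions (the function name is hypothetical; trimming to 0..100 follows the note earlier), not Peer Assess Pro source code.

```python
def rank_based_personal_result(pa_scores, team_result):
    """Sketch of the Rank-Based Personal Result. Reversed ranks: the
    lowest PA Score ranks 1; tied scores share the same (lower) rank."""
    n = len(pa_scores)
    # Reversed rank = 1 + count of strictly lower scores (matches the
    # Google Sheets RANK behaviour with is_ascending = 1)
    ranks = [1 + sum(other < s for other in pa_scores) for s in pa_scores]
    sum_rank = sum(ranks)  # denominator for each member's ShareFraction
    # RPR = n x TR x ShareFraction, trimmed to the range 0..100
    return [max(0, min(100, n * team_result * r / sum_rank)) for r in ranks]

print(rank_based_personal_result([54, 74, 82, 78], 50))  # [20.0, 40.0, 80.0, 60.0]
print(rank_based_personal_result([54, 74, 82, 74], 50))  # [~22.2, ~44.4, ~88.9, ~44.4]
```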
Quick links and related information
FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?
FAQ: How is the Peer Assessed Index (PA Index) calculated?
FAQ: How are peer assessment and personal results calculated and defined mathematically?
How do we compare students within a class, and between classes based on their Peer Assessed Scores?
The short answer is: We can use the Peer Assessed Score to compare students ONLY within their team. A PA Score above that specific team’s average PA Score suggests that team member has contributed more than a team member with a lower PA score.
A Peer Assessed Score of 90 indicates that a student has contributed clearly more to their team's outcomes than a student in the same team with a Peer Assessed Score of 30. However, a Peer Assessed Score achieved by a student in one team does not meaningfully compare with the Peer Assessed Score of a student in another team. A Peer Assessed Score of 60 in Team t1 is neither better nor worse than a PA Score of 90 achieved by a student in another team t2.
We cannot conclude from comparing the Peer Assessed Score which is the better student in terms of team contribution and/or leadership when the students are from different teams. Why? Some students and teams diligently commit to rating each other so the average student in their team does rate ⅗ on each of the ten items in the peer assessment survey, as intended. Meanwhile, other teams believe they are all above average, having come from their local equivalent of Lake Wobegon. By chance and/or good team functioning, some teams achieve that desired state where all members work productively and effectively together: the Holy Grail of the Dream Team. Other teams comprising high performers can conversely fall into the desolation of dismal performance characterised by the Apollo Syndrome (Belbin).
The long answer is that through applying appropriate data analytics, we can develop three related numbers that enable comparison of peer assessed team members both within and between classes, and over time. These measures are Standard Peer Assessed Score (SPAS), Employability, and Net Employability. In essence, the data analytic processes can be likened to a forensic photoanalyst attempting to read an automobile's number plate. Imagine the original image has been photographed through smog, on a dark night, from a far distance, at a low resolution setting, using a poor quality lens and a poor imaging sensor. But through advanced algorithms that remove background noise, amplify relevant signals, and enhance clarity, a readable, useful image can be discerned, as illustrated in the example from Acclaim Software.
Source: Acclaim Software. (2015). Forensics - Recovering the Most Detail from Your Image - Focus Magic. http://www.focusmagic.com/forensics-tutorial.htm
The Standard Peer Assessed Score (SPAS) is our first measure designed to enable a more realistic relative comparison of peer assessment ratings between members of a whole class. The Standard Peer Assessed Score combines normalised values of the Recommendation and Peer Assessed Score for each team member. The normalisation applies several data analytic processes to correct for the biases introduced by some students and teams in their rating. The SPAS approach is not perfect, but it’s a start. Furthermore, the determination of Standard Peer Assessed Score is a necessary precursor to the calculation of Employability and Net Employability, discussed elsewhere.
The whole-of-class values of Standard Peer Assessed Score for a particular class response dataset are targeted to have these features:
Mean: 50
Standard Deviation: 20
Maximum possible range: from 0 to 100
By virtue of the definition of the Standard Peer Assessed Score, the following effects occur by design:
One half of the class values of Standard Peer Assessed Score will fall in the range zero to 50 (below the target average). Naturally, the remaining one half of values will fall in the range 50 to 100 (above average).
Approximately ⅔ of Standard Peer Assessed Scores in the class will lie between 30 and 70, that is, within one standard deviation of the mean value of 50. More accurately, if SPAS were normally distributed, then 68.3 percent of the class dataset values of SPAS would lie within plus or minus one standard deviation of the mean.
Approximately ⅙ of students in the class will receive a Standard Peer Assessed Score value of either greater than 70, or less than 30. More precisely, 15.9 percent of values will lie in each of these ranges.
Finally, given the wonders of the normal distribution, 95% of all class members will lie in the range of SPAS 10 through 90. That implies that a student with a SPAS above 90 is in the top 2.5 % of members of the class. Conversely a student with SPAS less than 10 is in the bottom 2.5% of the class. This knowledge allows the teacher to more reliably identify their star students, and students at risk, rather than relying simply on Peer Assessed Score.
The general approach to creating the Standard Peer Assessed Score is to apply z-score normalisation to a student's (raw) Recommendation, R, and Peer Assessed Score, PAS. The two z-scores (zR and zP) are added, then re-scaled to achieve, for the class dataset as a whole, the target mean of 50 and the target standard deviation of 20 required for the SPAS statistic. Note that the result of z-score normalisation for any data set is such that the normalised data has a mean of zero and standard deviation of 1.0, detailed later.
The Standard Peer Assessed Score for student s is defined as
SPAS(s) = TargetMean + k x TargetSD x z(s)
where the combined z-score is
z(s) = ( zR(s) + zP(s) ) / 2
zR(s) = ( R(s) - MeanRec ) / SdRec
zP(s) = ( PAS(s) - MeanPas(t) ) / SdPas(t)
Where
TargetMean = target mean for the SPAS statistic, by definition a constant of 50
TargetSD = target standard deviation for the SPAS statistic, by definition a constant of 20
k = a correction factor to ensure the standardisation process achieves the target standard deviation, TargetSD. The factor is required because in practice the distributions of the raw data are not normally distributed, but tend to have strong negative skew, due to such factors as the Lake Wobegon effect mentioned earlier. A factor of 1.2 has been found appropriate in practice.
zR(s) = the z-score normalisation of the Recommendation rating for student s by their team members
zP(s) = the z-score normalisation of the Peer Assessed Score rating for student s by their team members
R(s) = Recommendation rating awarded to student s by their team members in team t
PAS(s) = Peer Assessed Score awarded to student s by their team members in team t
MeanRec = estimate of the class mean Recommendation rating derived over all valid assessed teams in the class responses dataset
SdRec = estimate of the class standard deviation of the Recommendation ratings derived over all valid assessed teams in the class responses dataset
MeanPas(t) = the population mean of the Peer Assessed Scores derived over all team members in the team t in which student s is a member
SdPas(t) = the population standard deviation of the Peer Assessed Scores derived over all team members in the team t in which student s is a member
Notes
Divisor of 2. The sum of the two z-score normalised scores, each with unit standard deviation, gives a resulting distribution with standard deviation of up to 2.0. Consequently the divisor of 2 is applied in the calculation of SPAS so that the combined z-score, z(s), has a mean of zero and standard deviation of approximately 1.
Trimming. Values of Standard Peer Assessed Score that calculate above +100 are trimmed down to +100. Similarly, values of Standard Peer Assessed Score that calculate below 0 are trimmed up to 0.
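A minimal Python sketch of the SPAS calculation, assuming the formula reconstructed above, follows. The function name and argument order are assumptions for illustration, not Peer Assess Pro source code; the printed values reproduce the three worked examples in the table below.

```python
def spas(recommendation, class_rec_mean, class_rec_sd,
         pa_score, team_pas_mean, team_pas_sd,
         k=1.2, target_mean=50, target_sd=20):
    """Sketch of the Standard Peer Assessed Score calculation."""
    z_r = (recommendation - class_rec_mean) / class_rec_sd  # class-level z-score
    z_p = (pa_score - team_pas_mean) / team_pas_sd          # team-level z-score
    z = (z_r + z_p) / 2                                     # combined z-score
    return max(0, min(100, target_mean + k * target_sd * z))  # trimmed to 0..100

print(spas(2.0, 3.0, 0.5, 30, 40, 10))  # Peter Johns  -> 14.0
print(spas(4.5, 3.0, 0.5, 50, 40, 10))  # Michael Mass -> 98.0
print(spas(3.0, 3.0, 0.5, 50, 20, 15))  # Lydia Loaded -> 74.0
```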
The following table shows example calculations of Standard Peer Assessed Score for three students in two teams, A and B. Note how Michael Mass (Team A) and Lydia Loaded (Team B) have both been awarded the same Peer Assessed Score of 50 by their team members. However, because of their different team means and standard deviations, the z-score normalisations realise +1 and +2 respectively.
As part of the journey towards calculating SPAS, the intermediate calculations of the combined z-scores provide the basis for calculating the percentage proportion of the entire class who would fall below that combined z-score. This can be interpreted as the percentage of the class who would recommend the specific team member to an employer, a colleague, or another team. This percentage is rounded conservatively to produce the student's Employability rating, the methodology for which is detailed in the FAQ
FAQ: What is Employability? How is it calculated?
Example calculations for Standard Peer Assessed Score (SPAS) and Employability
Student, s | Peter Johns | Michael Mass | Lydia Loaded |
Recommendation, R(s) | 2.0 | 4.5 | 3.0 |
Mean of class Recommendation, MeanRec | 3.0 | 3.0 | 3.0 |
Standard deviation of class Recommendation, SdRec | 0.5 | 0.5 | 0.5 |
Normalised Recommendation, zR(s) | (2-3)/0.5 = -2 | (4.5-3)/0.5 = +3 | (3.0-3.0)/0.5 = 0 |
Peer Assessed Score, PAS(s) | 30 | 50 | 50 |
Team, t | A | A | B |
Mean of team Peer Assessed Score, MeanPas(t) | 40 | 40 | 20 |
Standard deviation of team Peer Assessed Score, SdPas(t) | 10 | 10 | 15 |
Normalised Peer Assessed Score, zP(s) | (30-40)/10 = -1 | (50-40)/10 = +1 | (50-20)/15 = +2 |
Combined z-score, z(s) | (-2-1)/2 = -1.5 | (+3+1)/2 = +2 | (0+2)/2 = +1 |
Target standard deviation, TargetSD | 20 | 20 | 20 |
Correction factor, k | 1.2 | 1.2 | 1.2 |
Target mean, TargetMean | 50 | 50 | 50 |
Standard Peer Assessed Score, SPAS | 50 + 1.2 x 20 x (-1.5) = 50 - 36 = 14 | 50 + 1.2 x 20 x 2 = 50 + 48 = 98 | 50 + 1.2 x 20 x 1 = 50 + 24 = 74 |
Proportion of class below Combined z-score | 0.5 + GAUSS(-1.5) = 0.5 - 0.4332 = 6.7% | 0.5 + GAUSS(+2) = 0.5 + 0.4772 = 98% | 0.5 + GAUSS(+1) = 0.5 + 0.34 = 84% |
Employability | 10 | 95 | 80 |
The following figures show a Standard Peer Assessed Score histogram, and the histograms for the Recommendation and Peer Assessed Score data that contribute to the Standard Peer Assessed Score chart.
Figure 1. Histogram of Recommendation
Mean = 3.7, standard deviation = 0.53
Figure 2: Histogram of Peer Assessed Score
Mean = 67, standard deviation = 11.3
Figure 3: Standard Peer Assessed Score histogram
Mean = 50, standard deviation = 20
The calculation of Standard Peer Assessed Score assumes several conditions, described as follows.
The statistical distributions of the Recommendation and Peer Assessed Scores (PA_Score) are assumed to be normally distributed. In practice, the distributions are typically asymmetric with negative skew. See Figures 1 and 2 earlier.
The Recommendation score awarded to a student s1 in team t1 is assumed to be absolutely comparable with a similar Recommendation score awarded to another student s2 in another team t2. In other words, a Recommendation score of 3.5 awarded to student s1 in team t1 means exactly the same for student s2 in team t2 if they are also awarded a Recommendation score of 3.5. Similarly, a difference in Recommendation ratings of 1.0 unit means the same in any team. In practice, the Recommendations made by one team may not be consistent with the Recommendation values assigned by another team. However, given that Recommendation is a 'top of mind' peer assessment done at the start of the Peer Assess Pro survey, we think it is a reasonable approximation. Consequently, the Recommendation values are z-score normalised using the mean and standard deviation of the entire class of responses.
In contrast, in normalising the Peer Assessed Score it is well recognised that different teams award quite different Peer Assessed Scores to students who would ordinarily achieve the same Peer Assessed Score in an ideal world of perfect raters. Consequently, it is assumed that each team possesses a uniform, random mix of student capabilities drawn from the entire class. Therefore, all things being equal, one would expect that the mean and standard deviation of each team's Peer Assessed Scores would be equivalent. However, in practice, this equivalence is rarely observed. Consequently, the need arises to z-score normalise the Peer Assessed Score for each team to achieve a set of normalised Peer Assessed Scores with mean zero and standard deviation 1 FOR EACH TEAM.
The Peer Assessed Score awarded to a student s1 in team t1 is assumed NOT to be comparable with a similar Peer Assessed Score that might be awarded to another student s2 in another team t2. Why? Some teams honestly peer assess each other, whilst others attempt to 'game' the peer assessment process, such as awarding everyone above average, or even the full 5/5 rating for each of the team peer assessment factors. In contrast, it is assumed that the Peer Assessed Score of the average student in team t1 should be adjusted to match the peer rating of the average student rated in another team t2, even though the arithmetic values of the (original) Peer Assessed Scores usually differ. The same reasoning applies to the spread of Peer Assessed Score values within teams, namely, that the best team member in team t1 should be rated comparably with the best team member in team t2, even if their Peer Assessed Scores differ. Consequently, the Peer Assessed Scores WITHIN a team are scaled to match the relative values within other teams through normalisation using each team's mean and standard deviation.
A special case arises when every member of a team receives an identical Peer Assessed Score: the team's standard deviation is then zero and the z-score is undefined. In that case, the z-score normalised Peer Assessed Score for every team member is set to 0.5.
A future option to consider: exclude students from receiving a calculated SPAS in the case of a 'misguided team', identified, for example, through the Low quality team rating warning described later.
Can a student's Standard Peer Assessed Score be compared across the different classes they take? In general, 'NO'. A student is motivated differently in each of the classes they take. The luck of the draw means they may work with a superior or inferior team, whose members will rate them relatively differently.
Quick links and related information
FAQ: What is Employability? How is it calculated?
FAQ: How are peer assessment and personal results calculated and defined mathematically?
For a specific Peer Assess Pro assessment, Employability is the statistical probability that team members from the class would recommend the specific team member to an employer, a colleague, or another team.
Employability is a proprietary measure defined by Peer Assess Pro™ drawn from the calculation of a student's Standard Peer Assessed Score (SPAS). SPAS combines a student's Peer Assessed Score and their Recommendation score, through various statistical treatments such as z-score normalisation. The resulting Employability score is a statistical probability, ranging from 5 to 95 percent. Employability is the best available estimate of the degree to which team members from the class in which the student has participated in a team project would recommend that specific team member to an employer, a colleague, or another team.
Employability(s) = MROUND( 95 x ( 0.5 + GAUSS( z(s) ) ) + 2.5, 5 )
Where
Employability(s) = the employability for student s, ranging over values from 5 to 95 in steps of 5
GAUSS(z) = the Gaussian distribution function: the statistical probability that a random variable, z, drawn from a normal distribution, will lie between the mean and z standard deviations above (or below) the mean. The GAUSS function returns values between -0.5 and +0.5.
z(s) = the combined z-score resulting from combining the z-score normalisations of the Recommendation, zR(s), and Peer Assessed Score, zP(s), for student s, as explained in the mathematical calculations for the Standard Peer Assessed Score. Through the process of normalisation, z(s) has a mean of zero and standard deviation of 1, which is the required input for the GAUSS function.
MROUND(x, m) = a mathematical function that rounds one number to the nearest integer multiple of another. In the case of Employability, m = 5; for example, MROUND(12.4, 5) = 10 and MROUND(12.5, 5) = 15.
The constant 0.5 adds the probability that a z-score lies between minus infinity and the mean, which is, by definition, 50%.
The following transformations are applied to remove the impression of an over-precise measure of Employability, and to reduce the possibilities of elation or despair in response to extreme values of Employability. Specifically, we apply a Principle of Conservatism, the result of which is that Employability is conditioned to lie between 5 and 95, and rounded to increase in steps of 5, rather than spanning the theoretically possible values of zero to 100 with apparently infinite precision!
The MROUND to the closest multiple of 5 coupled with attenuation by 95 achieves the step interval of 5 units.
The constant 2.5 is a translation factor that compensates for the shift downwards in mean values on account of the 95 attenuation factor.
The following table shows example calculations of Employability based on the most likely range of possible values for combined z-scores arising from the generation of Standard Peer Assessed Score, SPAS.
The subsequent graph shows the data from the calculations of Employability charted against Combined z-scores.
As an example, consider a student achieving a SPAS of zero, arising from their combined z-score of -3. According to the normal distribution, only about 0.1% of the class would fall below a combined z-score of -3. The calculation of Employability generously raises the assessment of the student, suggesting that 5% of the class would recommend them! The same conservatism happens at the other extreme, where a brilliantly contributing student (e.g. above a combined z-score of +2) achieves an Employability of 95%, whereas if the normal distribution were to be believed, they might expect 98% of the class to recommend them.
Example calculations of Employability from Combined z-scores
Combined z-score, z(s) | -3 | -1.5 | -1 | -0.5 | 0 | 0.5 | 1 | 1.5 | 2 | 3 |
GAUSS( z(s) ) | -0.50 | -0.43 | -0.34 | -0.19 | 0 | 0.19 | 0.34 | 0.43 | 0.48 | 0.50 |
Standardised Peer Assessed Score, SPAS | -22 | 14 | 26 | 38 | 50 | 62 | 74 | 86 | 98 | 122 |
Standardised Peer Assessed Score, SPAS (Trimmed to 0 to 100) | 0 | 14 | 26 | 38 | 50 | 62 | 74 | 86 | 98 | 100 |
Proportion of class below Combined z-score | 0.1% | 7% | 16% | 31% | 50% | 69% | 84% | 93% | 98% | 99.9% |
Employability | 5 | 10 | 20 | 30 | 50 | 70 | 80 | 90 | 95 | 95 |
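The table above can be reproduced with a short Python sketch using the standard normal CDF, Phi(z) = 0.5 + GAUSS(z). This illustrates the reconstructed formula, not Peer Assess Pro source code; the helper mround is written here to mimic the spreadsheet MROUND function.

```python
import math
from statistics import NormalDist

def mround(x, m=5):
    """Round x to the nearest multiple of m (half rounds up for x >= 0),
    mimicking the spreadsheet MROUND function."""
    return m * math.floor(x / m + 0.5)

def employability(combined_z):
    """Sketch: E = MROUND(95 x Phi(z) + 2.5, 5), clamped to the range 5..95."""
    phi = NormalDist().cdf(combined_z)  # equals 0.5 + GAUSS(z)
    return min(95, max(5, mround(95 * phi + 2.5)))

for z in (-3, -1.5, -1, -0.5, 0, 0.5, 1, 1.5, 2, 3):
    print(z, employability(z))  # reproduces the Employability row above
```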
Quick links and related information
FAQ: How is Standard Peer Assessed Score (SPAS) calculated?
FAQ: How are peer assessment and personal results calculated and defined mathematically?
Having a good sense of who you are enables you to build upon your strengths and correct your weaknesses. In turn, that can make you more successful at work and in your personal life. You are able to better understand, predict and cope with others more effectively. You can better distinguish valid and invalid informal and formal feedback from others. You are more likely to select (and achieve!) realistic personal goals. (‘ERSI: Exceptionally Realistic Self-Image’, 2012)
The Index of Realistic Self Assessment (IRSA) is a first step in providing data upon which to develop an Exceptionally Realistic Self-Image (ERSI).
The Index of Realistic Self Assessment (IRSA) is a ratio-based measure of the extent to which a team member's SELF assessment is matched by the assessment of the OTHER members of their team.
IRSA(s) = 100 x PasOthers(s) / PasSelf(s)
Where
PasOthers(s) = the Peer Assessed Score assigned to that student by their team members
PasSelf(s) = the Peer Assessed Score the student has assessed themself
IRSA typically lies in the range 50 to 120. However, theoretically, IRSA could lie between zero and infinity. IRSA values generally calculate as: below 75 (overconfident), between 75 and 95 (typical), and above 95 (underconfident).
IRSA is calculated only when these two conditions occur: the student belongs to a valid assessed team, and the student has completed their self-assessment as part of their peer assessment submission.
Extreme values of IRSA are notified in the teacher’s Active Warnings, as detailed in the FAQ
FAQ: What is a mismatched self-assessment (IRSA)?
The data for the following table is drawn from
FAQ: How is the Peer Assessed (PA) Score calculated?
Calculations of the Index of Realistic Self Assessment for four team members
ASSESSEE | ||||
ASSESSOR | Bridget | Julian | Lydia | Nigella |
Bridget | 87 | 62.5 | 75 | 72.5 |
Julian | 87.5 | 93 | 87.5 | 82.5 |
Lydia | 75 | 82.5 | 78 | 80 |
Nigella | 0 | 77.5 | 82.5 | 82 |
Peer Assessed Score (others) | 54 | 74 | 82 | 78 |
Peer Assessed Score (self) | 87 | 93 | 78 | 82 |
Index of Realistic Self Assessment | 62 | 80 | 105 | 95 |
Indication | Overconfident | Typical | Underconfident | Typical (Borderline underconfident) |
Lydia has been assessed by others with a PA Score of 82. Her self-assessment has produced her PasSelf of 78. Therefore, her IRSA = 100 x 82 / 78 = 105.
Lydia’s IRSA of 105 indicates that she is an outlier when compared with most team members in a typical class. Specifically, she is underconfident in terms of assessing her strengths when compared with how others perceive her.
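The whole worked table can be reproduced with a few lines of Python. This sketch assumes only the ratio formula reconstructed above; the function name is illustrative, not Peer Assess Pro source code.

```python
def irsa(pas_others, pas_self):
    """Sketch of the Index of Realistic Self Assessment:
    IRSA = 100 x PA Score (others) / PA Score (self)."""
    return 100 * pas_others / pas_self

# Worked example from the table above: prints 62, 80, 105, 95
for name, others, self_score in [("Bridget", 54, 87), ("Julian", 74, 93),
                                 ("Lydia", 82, 78), ("Nigella", 78, 82)]:
    print(name, round(irsa(others, self_score)))
```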
From our experience using Peer Assess Pro in many classes, we find most team members overrate themself when compared with how their team members rate them. This overrating results in a self-assessed Peer Assessed Score typically 7 to 10 points higher than the Peer Assessed Score awarded by the other members of that same team. This phenomenon of overrating of one’s self assessment is well-established in the literature, termed self-enhancement bias (See, for instance, (Loughnan et al., 2011). Informally, self-enhancement bias is also known as the Lake Wobegon Effect, a phenomenon observed in a fictional town “where all the women are strong, all the men are good looking, and all the children are above average." (‘Lake Wobegon effect’, n.d.; ‘Lake Wobegon: The Lake Wobegon Effect’, 2017).
Quick links and related information
FAQ: What is a mismatched self-assessment (IRSA)?
FAQ: What is a valid assessed team?
FAQ: How do I interpret measures of realistic self-assessment?
Lake Wobegon effect. (n.d.). Retrieved 25 July 2017, from http://psychology.wikia.com/wiki/Lake_Wobegon_effect
Lake Wobegon: The Lake Wobegon Effect. (2017). In Wikipedia. Retrieved from https://en.wikipedia.org/w/index.php?title=Lake_Wobegon&oldid=787029148#The_Lake_Wobegon_effect
Loughnan, S., Kuppens, P., Allik, J., Balazs, K., de Lemus, S., Dumont, K., … Haslam, N. (2011). Economic Inequality Is Linked to Biased Self-Perception. Psychological Science, 22(10), 1254–1258. https://doi.org/10.1177/0956797611417003
The usual tendency of team members is to apply a self-enhancement bias when rating themselves using Peer Assess Pro. Consequently, we can interpret Index of Realistic Self Assessment (IRSA) scores in one of three ways: typical team members, overestimated, and underestimated.
An IRSA score between 75 and 95 suggests the assessed team member realistically understands their team contribution as perceived by the other team members. A score between 75 and 95 is typical of about ⅔ of team members in a class.
An IRSA below 75 suggests the assessed team member OVERESTIMATES their team contribution as perceived by other team members. An index below 75 suggests the team member should proactively seek to understand their areas for development by informally soliciting further feedback and guidance from their team members. About ⅙ of team members achieve an index below 75.
An IRSA above 95 suggests the assessed team member has a tendency to UNDERESTIMATE their team contribution when contrasted with the assessment perceived by other team members. The team member should consider developing more confidence in applying and displaying their strengths. About ⅙ of team members achieve an index above 95.
An Index of Realistic Self Assessment that is not in the 'typical' range of 75 to 95 suggests that the team member take active steps to develop a more realistic self-image.
A three-step programme to develop an Exceptionally Realistic Self-Image is outlined in the ERSI resource cited below (ERSI, 2012).
Quick links and related information
FAQ: What is a mismatched self-assessment (IRSA)?
FAQ: How is the Index of Realistic Self Assessment (IRSA) calculated?
ERSI: Exceptionally Realistic Self-Image. (2012). Orange County Human Resource Services Portal. Retrieved from http://bos.ocgov.com/hr/hrportal/docs/docs_hr_leadership_forum/minutes_2012/minutes_030812/ersi.doc
If one member of a team submits a peer assessment for an assessee 'materially different' from the assessments given by the other team members, this difference gives rise to an Active Warning in Peer Assess Pro titled
Critical Warning 0042 Outlier individual rating
A team member has assessed another team member very differently than the other team members.
The extended detail for the Active Warning displays one or more messages such as:
Harris assessed Michael awarding a PA Subscore of 38. Compared with the average rating by the other team members of 70 this subscore DEPRESSED the PA Score to 64 by 7 PA Score Units. Team Alpha
Josef assessed Alvin awarding a PA Subscore of 100. Compared with the average rating by the other team members of 66 this subscore RAISED the PA Score to 73 by 7 PA Score Units. Team Alpha
An Outlier individual rating warning will be raised ONLY if one assessor's rating raises or lowers the assessee's Peer Assessed Score by more than 5 PA Score units relative to the average rating given by the other members of the team.
The warning will be generated only for members of a valid assessed team, as detailed in
FAQ: What is a valid assessed team?
Consider team Alpha containing 5 members where Adam has been assessed with the following Peer Assessed Subscores by the other four team members.
Impact of removing one Assessor from the calculation of Peer Assessed Score for Adam
Assessee | Assessor | PA Subscore | Team Size | PA Score | PA Score Exclusive | Assessor Impact | Impact Direction |
Adam | Edward | 53 | 5 | 73 | 80 | -7 | DEPRESSED |
Adam | Mary | 63 | 5 | 73 | 77 | -4 | |
Adam | Stephanie | 78 | 5 | 73 | 72 | 1 | |
Adam | Josef | 100 | 5 | 73 | 64 | 9 | RAISED |
Adam has a Peer Assessed Score of 73 (more precisely, 73.5), calculated from his four team members' subscores as follows:
PA Score = (Edward + Mary + Stephanie + Josef) / (Team size - 1)
= (53 + 63 + 78 + 100) / (5 - 1)
= 294 / 4
= 73.5
To determine the impact of Edward’s assessment of Adam, we can calculate the Peer Assessed Score Adam would receive from just the other three members as follows:
PA Score Exclusive = (Mary + Stephanie + Josef) / (Team size - 2)
= (63 + 78 + 100) / 3
= 241 / 3
= 80.3
Therefore, the Assessor’s impact is the difference between the whole-of-team’s originally-calculated PA Score and the PA Score Exclusive
Impact = PA Score - PA Score Exclusive
= 73.5 - 80.3
= - 6.8
We observe that the impact of Edward’s relatively low assessment of Adam has an impact that DEPRESSED Adam’s overall Peer Assessed Score by about 7 PA Units.
Peer Assessed Pro presents the following detail:
Edward assessed Adam awarding a PA Subscore of 53. Compared with the average rating by the other team members of 80 this subscore DEPRESSED the PA Score to 73 by 7 PA Score Units. Team Alpha
In contrast, we see Josef’s rating of 100 had an impact that raised Adam’s Peer Assessment score. The following detailed outlier warning is presented:
Josef assessed Adam awarding a PA Subscore of 100. Compared with the average rating by the other team members of 64 this subscore RAISED the PA Score to 73 by 9 PA Score Units. Team Alpha
The threshold for raising this Warning in Peer Assess Pro is +/- 5 PA Score units, the ThresholdOutlier constant. That is, if one assessor’s rating would affect the PA Score awarded to an assessee by more than 5 units, then the Outlier Warning will be raised.
In the previous example, the impact on Adam by assessors Mary and Stephanie is within the ThresholdOutlier constant of 5 PA Score units, so no outlier warning message is generated for these two assessors.
A more elegant method for calculating the Assessor Impact follows. First, calculate the Peer Assessed Score for Assessed student s, excluding the PA Subscore awarded by Assessor a:
PasExclusive(s, a) = ( PAS(s) x (t - 1) - PaSubscore(a, s) ) / (t - 2)
Where
t = the number of team members in the team in which s is a team member
PAS(s) = the Peer Assessed Score for assessed student s
PaSubscore(a, s) = the peer assessment subscore awarded by Assessor a to Assessee s
The Assessor Impact of removing Assessor a's assessment of Assessee s is
Impact(a, s) = PAS(s) - PasExclusive(s, a)
Consider Edward's assessment of Adam using the data from the table above:
PasExclusive = (73.5 x 4 - 53) / 3 = 241 / 3 = 80.3
Impact = 73.5 - 80.3 = -6.8
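The same calculation can be sketched in Python. This illustration (the function and variable names are assumptions, not Peer Assess Pro source code) reproduces the Assessor Impact column of the table above for all four assessors.

```python
def assessor_impact(subscores, assessor):
    """Sketch: the change in an assessee's Peer Assessed Score attributable
    to one assessor. `subscores` maps assessor name -> PA Subscore awarded."""
    pa_score = sum(subscores.values()) / len(subscores)  # mean of all subscores
    others = [s for a, s in subscores.items() if a != assessor]
    pa_exclusive = sum(others) / len(others)             # mean excluding the assessor
    return pa_score - pa_exclusive  # negative => DEPRESSED, positive => RAISED

adam = {"Edward": 53, "Mary": 63, "Stephanie": 78, "Josef": 100}
for who in adam:
    print(who, round(assessor_impact(adam, who), 1))
# Edward -6.8, Mary -3.5, Stephanie 1.5, Josef 8.8
```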
Quick links and related information
FAQ: How is the Peer Assessed (PA) Score calculated?
FAQ: What is a valid assessed team?
If a team member submits a self-assessment that is 'materially different' from the assessments given by their other team members, this difference gives rise to an Active Warning in Peer Assess Pro titled
Critical Warning 0040 Mismatched self-assessment
A team member's self assessment is materially different to the peer assessment given by their team
The extended detail for the Active Warning displays one or more messages such as:
Gregor’s self-assessment of 63 is UNDERCONFIDENT compared with the peer assessment of 93 given by others in team Charlie. IRSA = 148
Daphne’s self-assessment of 68 is OVERCONFIDENT compared with the peer assessment of 34 given by others in team Alpha. IRSA = 51
The warning will be generated only for members of a valid assessed team, as detailed in
FAQ: What is a valid assessed team?
Furthermore, the warning will only be generated when a student has completed their self-assessment as part of their peer assessment submission.
The Peer Assess Pro system constant ThresholdIrsaUnderconfident is defined as 115. Values greater than or equal to ThresholdIrsaUnderconfident will raise the UNDERCONFIDENT active warning. In general, about 7 % to 16 % of students will be flagged with this warning.
The Peer Assess Pro system constant ThresholdIrsaOverconfident is defined as 75. Values less than or equal to ThresholdIrsaOverconfident will raise the OVERCONFIDENT active warning. In general, about 7 % to 16 % of students will be flagged with this warning.
The Mismatched self-assessment warning is raised from the value of the Index of Realistic Self Assessment (IRSA) that is calculated for each student.
See FAQ
FAQ: How is the Index of Realistic Self Assessment (IRSA) calculated?
The warning of UNDERCONFIDENT is raised when IRSA for a student is greater than or equal to 115, the ThresholdIrsaUnderconfident.
The warning of OVERCONFIDENT is raised when IRSA for a student is less than or equal to 75, the ThresholdIrsaOverconfident.
Sample of several Peer Assessed Scores and self-assessments
Name | PA Score | PA Self | IRSA | Confidence |
Abel | 96.7 | 100 | 96.7 | |
Baker | 100.0 | 82.5 | 121.2 | UNDERCONFIDENT |
Charlie | 82.1 | 70 | 117.3 | UNDERCONFIDENT |
Daphne | 34.2 | 67.5 | 50.6 | OVERCONFIDENT |
Edward | 95.8 | 87.5 | 109.5 |
Consider the case of Daphne: IRSA = 100 x 34.2 / 67.5 = 50.6.
Since 50.6 is less than 75, then Daphne’s self-assessment is regarded as OVERCONFIDENT. Consequently, the Mismatched self-assessment warning is raised.
The extended detail message is:
Daphne’s self-assessment of 68 is OVERCONFIDENT compared with the peer assessment of 34 given by others in team Alpha. IRSA = 51
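A compact Python sketch of the threshold test follows; it reproduces the Confidence column of the sample table above. The constants come from the system constants stated earlier; the function name is illustrative, not Peer Assess Pro source code.

```python
THRESHOLD_IRSA_UNDERCONFIDENT = 115  # system constant from above
THRESHOLD_IRSA_OVERCONFIDENT = 75    # system constant from above

def confidence_flag(pa_score, pa_self):
    """Sketch: classify a self-assessment from its IRSA value."""
    irsa = 100 * pa_score / pa_self
    if irsa >= THRESHOLD_IRSA_UNDERCONFIDENT:
        return "UNDERCONFIDENT"
    if irsa <= THRESHOLD_IRSA_OVERCONFIDENT:
        return "OVERCONFIDENT"
    return ""

for name, pas, pa_self in [("Abel", 96.7, 100), ("Baker", 100.0, 82.5),
                           ("Charlie", 82.1, 70), ("Daphne", 34.2, 67.5),
                           ("Edward", 95.8, 87.5)]:
    print(name, confidence_flag(pas, pa_self))
```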
For UNDERCONFIDENT and OVERCONFIDENT students, Peer Assess Pro generates an email that the teacher can send optionally. The email recommends the student arrange an appointment to meet with the teacher to explore the reasons for the variation in self and others’ peer assessment.
Quick links and related information
FAQ: How is the Index of Realistic Self Assessment (IRSA) calculated?
FAQ: How do I interpret measures of realistic self-assessment?
FAQ: What is an ‘at risk’ team member? WARNING 0044
FAQ: How is the Peer Assessed (PA) Score calculated?
FAQ: What is a valid assessed team?
Suppose a team collectively submits a set of peer assessments that are both (1) high on average, and (2) spread over a narrow range.
This feature is an indication that the team may have engaged unconstructively with the peer assessment process. When these conditions are both fulfilled, an Active Warning in Peer Assess Pro is generated:
Critical Warning 0050 Low quality team rating
A team may have engaged unconstructively with peer assessment
The extended detail for the Active Warning displays one or more messages such as:
Team Alpha may have engaged unconstructively with peer assessment. High Peer Assessment Scores awarded. Average 100 and low range 0.
Team Bravo may have engaged unconstructively with peer assessment. High Peer Assessment Scores awarded. Average 96 and low range 8.
Team Charlie may have engaged unconstructively with peer assessment. High Peer Assessment Scores awarded. Average 96 and low range 8.
The warning will be generated only for members of a valid assessed team, as detailed in
FAQ: What is a valid assessed team?
The Peer Assess Pro system constant ThresholdTeamAverage is defined as 90.
The Peer Assess Pro system constant ThresholdTeamRange is defined as 11.
The Active Warning, YES, is generated for a team when both conditions are true: the Team Average Peer Assessed Score is greater than or equal to ThresholdTeamAverage (90), and the Team Range of Peer Assessed Scores is less than ThresholdTeamRange (11).
Suppose Team Mike contains 6 members, whose Peer Assessed Scores are shown below. The Average Peer Assessed Score and Range of Peer Assessed Scores are calculated.
Peer Assessed Scores for members of Team Mike
Name | Peer Assessed Score |
Annie | 93.75 |
Emma | 92.55 |
Joe | 90.85 |
Freddie | 92.50 |
Tammy | 95.88 |
Tilly | 88.32 |
Team Average | 92.3 (≈ 92) = 553.85 / 6 |
Team Range | 7.56 (≈ 8) = 95.88 - 88.32 |
The Team Average Peer Assessed Score and Team Range are examined for every team. A low quality team rating is identified for those teams that breach the Threshold parameters defined above.
Identification of low quality team ratings
Team | Team Average Peer Assessed Score | Team Range | Low Quality Team Rating |
Alpha | 100 | 0 | YES |
Mike | 92 | 8 | YES |
November | 87 | 8 | NO |
Oscar | 95 | 9 | YES |
Papa | 85 | 9 | NO |
Quebec | 95 | 10 | YES |
Romeo | 92 | 12 | NO |
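The test can be sketched in Python as follows. The comparison operators (at or above the average threshold, and strictly below the range threshold) are assumptions consistent with every row of the identification table above; the function name is illustrative, not Peer Assess Pro source code.

```python
THRESHOLD_TEAM_AVERAGE = 90  # system constant from above
THRESHOLD_TEAM_RANGE = 11    # system constant from above

def low_quality_team_rating(pa_scores):
    """Sketch: flag a team whose Peer Assessed Scores are high on
    average AND spread over a narrow range."""
    average = sum(pa_scores) / len(pa_scores)
    spread = max(pa_scores) - min(pa_scores)
    return average >= THRESHOLD_TEAM_AVERAGE and spread < THRESHOLD_TEAM_RANGE

team_mike = [93.75, 92.55, 90.85, 92.50, 95.88, 88.32]
print(low_quality_team_rating(team_mike))  # True: average 92.3, range 7.56
```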
In general, a team that has this warning may have engaged unconstructively with peer assessment. Most team members have not entered the spirit of the peer assessment process. They may have attempted to ‘game’ the peer assessment by giving everyone well above typical or average ratings.
Peer Assess Pro provides the facilitator with the option to send out an email to all members of the team suggesting they may wish to reconsider their ratings. Furthermore, the students are encouraged to provide qualitative evidence in support of the ratings they have provided.
In a small proportion of teams, it is possible that a high performing team will ALSO have this Active Warning generated. In a high performing team all team members contribute effectively to the results and team processes. This outcome will be evident to the teacher through the team gaining a high Team Result for their submitted work.
The following graph shows a large first year university class of 848 students who have undertaken their first, formative experience of Peer Assess Pro. Of the 84 valid teams, 24 teams are identified as potentially having a low quality team rating.
In a case like this, the teacher might consider guiding the class of students towards more constructive and discriminating peer assessment before undertaking the final, summative peer assessment. For example, remind students of the purpose of peer feedback, and how to provide useful feedback.
Quick links and related information
FAQ: What is a valid assessed team?
FAQ: What happens if I try to 'game' (fool? play? disrupt?) the peer assessment process?
FAQ: What is the purpose of peer assessment?
FAQ: How do I provide useful feedback to my team members?
FAQ: What is a low quality assessor rating?
Suppose a team member submits a set of peer assessments that are both (1) high on average, and (2) spread over a narrow range.
This feature is an indication that the team member may have engaged unconstructively with the peer assessment process. When these conditions are both fulfilled, an Active Warning in Peer Assess Pro is generated:
Critical Warning 0300 Low quality assessor rating
An assessor may have engaged unconstructively with peer assessment.
The extended detail for the Active Warning displays one or more messages such as:
Tony may have engaged unconstructively with peer assessment. High Peer Assessment Scores awarded. Average 100 and low range 0. Team Alpha
Kathy may have engaged unconstructively with peer assessment. High Peer Assessment Scores awarded. Average 85 and low range 8. Team Bravo
The warning will be generated only for members of a valid assessed team, as detailed in
FAQ: What is a valid assessed team?
The Peer Assess Pro system constant ThresholdAssessorAverage is defined as 85.
The Peer Assess Pro system constant ThresholdAssessorRange is defined as 9.
The Active Warning is generated for an assessor when BOTH conditions are true: the average of the PA Subscores the assessor awarded is greater than or equal to ThresholdAssessorAverage (85), and the range of those subscores is less than ThresholdAssessorRange (9).
Suppose Kathy, a member of Team Bravo, assesses all her fellow team members as follows:
Peer Assessed Subscores assessed by Kathy in Team Bravo
Name | Peer Assessed Subscore |
Garry | 90 |
Dan | 87.5 |
Sunny | 82.5 |
Freddie | 82.5 |
Robby | 82.5 |
Average | 85 = 425 / 5 |
Range | 7.5 = 90 - 82.5 |
The Average Peer Assessed Subscore and Range are examined for every assessor. A low quality assessor rating is identified for those individuals that breach the threshold parameters defined above.
Identification of low quality individual assessor ratings
Assessor | Average PA Score (awarded) | Range (PAS Units) | Low quality assessor rating |
Tony | 100 | 0.0 | Y |
Andy | 93 | 0.0 | Y |
Jess | 75 | 0.0 | N |
Johnny | 93 | 2.5 | Y |
Chance | 82 | 2.5 | N |
Kathy | 85 | 7.5 | Y |
Zara | 83 | 7.5 | N |
Riley | 97 | 10.0 | N |
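The assessor-level test mirrors the team-level sketch shown earlier, applied to the subscores one assessor awards. Again, the exact comparison operators are assumptions consistent with the identification table above, not confirmed Peer Assess Pro source code.

```python
THRESHOLD_ASSESSOR_AVERAGE = 85  # system constant from above
THRESHOLD_ASSESSOR_RANGE = 9     # system constant from above

kathy = [90, 87.5, 82.5, 82.5, 82.5]  # subscores Kathy awarded in Team Bravo
average = sum(kathy) / len(kathy)
spread = max(kathy) - min(kathy)
print(average >= THRESHOLD_ASSESSOR_AVERAGE
      and spread < THRESHOLD_ASSESSOR_RANGE)  # True: average 85, range 7.5
```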
In general, an individual with this warning may (or may not!) have engaged unconstructively with peer assessment. The team member may not have entered the spirit of the peer assessment process. They may have attempted to ‘game’ the peer assessment by giving everyone well above typical or average ratings.
Peer Assess Pro provides the facilitator with the option to send out an email to the assessor suggesting they may wish to reconsider their ratings. Furthermore, the student is encouraged to provide qualitative evidence in support of the ratings they have provided.
In a small proportion of teams, it is possible that a member of a high performing team will ALSO have this Active Warning generated. In a high performing team all team members contribute effectively to the results and team processes. Consequently, it is reasonable to expect a high average Peer Assessed Score to be awarded most members, with a concurrent low range. This outcome will be evident to the teacher through the assessor’s team gaining a high Team Result for their submitted work.
Quick links and related information
FAQ: What is a valid assessed team?
FAQ: What happens if I try to 'game' (fool? play? disrupt?) the peer assessment process?
FAQ: What is the purpose of peer assessment?
FAQ: How do I provide useful feedback to my team members?
FAQ: What is a low quality team rating?
Peer Assess Pro restricts the display of results to teachers and students when only a small number of peer assessments from a team have been submitted: fewer than three, or no more than half the team. This restriction gives rise to an Active Warning in Peer Assess Pro titled
Critical Warning 0022 Insufficient team responses
The number of responses from a team is insufficient for presenting valid results.
The following extended detail is provided
Alpha has received 2 team member responses. Minimum required 3 from team size 4
Bravo has received 3 team member responses. Minimum required 4 from team size 6
Peer Assess Pro restricts the display of results to valid assessed teams. The notion of a valid assessed team is to prevent the display of results to students (and facilitators) when only a small number of peer assessments from a team have been submitted. Such a low response situation could distort the reliability and accuracy of both the team's peer assessment and personal result calculations, and the ACTIVE WARNING messages for a team. Consequently, class statistics such as mean, maximum, range, and standard deviation are calculated only for team members that are designated as part of a valid assessed team.
Students can only view results if they belong to a valid assessed team.
A facilitator may only view results from valid assessed teams.
The Teacher's Dashboard Active Warnings and the (i) Information button inform you of the number of valid teams and valid assessments throughout the progress of managing the peer assessment responses. The Active Warning enables you to 'hunt down' the teams that have not yet achieved valid status.
Peer Assess Pro generates an email that the teacher can send optionally to members of a non-valid team who have not yet responded. The email reminds students to respond by the Due Date.
For teams with five or fewer members, a valid assessed team must have peer ratings from at least three members of the team. For teams with six or more team members, 'just over' half the team members must peer assess. The required minimum number of team members who must rate within a particular team of size n members is defined as:
m = MAX( 3, INT( n/2 + 1 ) )
Where
m = the minimum number of team members required to rate within a particular team
MAX is a function that selects the maximum of the calculated values
INT is a function that returns the integer part of the result
For teams of size 0, 1, and 2, peer assessment results are not calculated. The default Personal Result in these circumstances is the Team Result.
Team size, n | Required minimum assessors, m | Proportion of whole team |
3 | 3 | 100% |
4 | 3 | 75% |
5 | 3 | 60% |
6 | 4 | 66% |
7 | 4 | 57% |
8 | 5 | 62% |
9 | 5 | 56% |
10 | 6 | 60% |
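The reconstructed rule can be checked against the table above with a short Python function (illustrative only, not Peer Assess Pro source code):

```python
def minimum_assessors(n):
    """Sketch of m = MAX(3, INT(n/2 + 1)) for a team of n members (n >= 3)."""
    return max(3, int(n / 2 + 1))

for n in range(3, 11):
    print(n, minimum_assessors(n))  # reproduces the table above
```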
Quick links and related information
FAQ: How is the Peer Assessed (PA) Score calculated?
FAQ: How are peer assessment and personal results calculated and defined mathematically?
An ‘at risk’ team member has been rated amongst the bottom 10 % of students in the class as measured by either the Peer Assessed Score OR Recommendation.
A low rating on either of these assessments gives rise to an Active Warning in Peer Assess Pro titled
Critical Warning 0044 At risk team member
A team member has been rated amongst the lowest in class.
The following extended detail is provided
Anne Smith has been rated amongst the lowest in class. Low Recommendation 2.3 and/or low Peer Assessment Score 34. Team Alpha.
Peer Assess Pro generates an email that the teacher can send optionally to team members with a low rating. The email requests that the team member make an appointment to meet promptly with the teacher to discuss their peer assessment results so they can develop a more productive contribution to the team's future outputs, processes and leadership.
The facilitator should view the personal snapshot of low rated team members examining the qualitative feedback given. You should expect that the qualitative feedback will confirm the low Recommendation or Peer Assessed Scores. It will be helpful to have reviewed these snapshots prior to your interviewing and counselling the at risk students who visit you.
The Peer Assess Pro system constants ThresholdPaScore and ThresholdRecommendation are defined to identify approximately the bottom 10 % of students in the class.
Where
ThresholdPaScore = NORMINV( 0.10, MeanPas, SdPas ), calculated from the mean and standard deviation of the class Peer Assessed Scores
ThresholdRecommendation = NORMINV( 0.10, MeanRec, SdRec ), calculated from the mean and standard deviation of the class Recommendations
The Active Warning is generated for a team member when either condition is true: their Peer Assessed Score is below ThresholdPaScore, or their Recommendation is below ThresholdRecommendation.
The test is conducted only when the team has sufficient assessments to qualify as a valid assessed team.
Consider a class with the following histogram of Peer Assessed Scores.
The following statistics are determined from this class of 27 students
Now, applying NORMINV with probability 0.10 to these class statistics yields the ThresholdPaScore.
Therefore, team members whose Peer Assessed Score falls below ThresholdPaScore are flagged as at risk.
The two students Joslyn HOOVER and Kamryn MILLER will be identified as ‘at risk’ team members.
The following table is a sample of the bottom ranked students in class, according to PA Score.
Identification of at risk students
The foregoing data will generate the following WARNING messages.
Joslyn HOOVER has been rated amongst the lowest in class. Low Peer Assessment Score 0.8. Team Grey Warblers.
Kamryn MILLER has been rated amongst the lowest in class. Low Peer Assessment Score 29.1. Team Black Robins
The NORMINV function presumes the distribution of Peer Assessed Scores is normally-distributed. In practice, we find that the Peer Assessed Score histogram is typically skewed towards the 100 point end of the scale, as illustrated above. However, we do find the distribution of the Recommendations is more likely to be normally distributed, as the following example from the same class survey illustrates.
Now, applying NORMINV with probability 0.10 to the class Recommendation statistics yields the ThresholdRecommendation.
Therefore, team members whose Recommendation falls below ThresholdRecommendation are flagged as at risk.
From the table below, the student Muhammad HOLT will be identified as ‘at risk’ in addition to the two students Joslyn HOOVER and Kamryn MILLER identified using the Peer Assessed Score criterion.
In summary, the foregoing two tests will generate the following WARNING messages.
Joslyn HOOVER has been rated amongst the lowest in class. Low Peer Assessment Score 0.8 and/or low Recommendation 1.0. Team Grey Warblers.
Kamryn MILLER has been rated amongst the lowest in class. Low Peer Assessment Score 29.1 and/or low Recommendation 2.3. Team Black Robins
Muhammad HOLT has been rated amongst the lowest in class. Low Peer Assessment Score 54.1 and/or low Recommendation 2.0. Team Black Robins
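The NORMINV thresholds can be sketched with Python's statistics module. The class statistics used here are borrowed from the Figure 1 and Figure 2 histograms purely for illustration; they are not the statistics of the 27-student class discussed above.

```python
from statistics import NormalDist

def threshold(mean, sd, percentile=0.10):
    """Sketch of NORMINV(percentile, mean, sd): the value below which
    roughly the bottom 10% of a normal distribution falls."""
    return NormalDist(mean, sd).inv_cdf(percentile)

print(round(threshold(67, 11.3), 1))   # ThresholdPaScore        ~ 52.5
print(round(threshold(3.7, 0.53), 2))  # ThresholdRecommendation ~ 3.02
```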
The teacher has several graphical approaches to identifying the most at risk students in their class:
FAQ: What is a mismatched self-assessment (IRSA)? WARNING 0040
Identifying at risk students from sorted table of Recommendations
Identifying at risk students from sorted table of Peer Assessed Scores with concurrent examination of Self Assessment
Quick links and related information
FAQ: What steps can I take to get a better personal result?
FAQ: What is a valid assessed team?
FAQ: How is the Peer Assessed (PA) Score calculated?
FAQ: How is the Index of Realistic Self Assessment (IRSA) calculated?
FAQ: What is a mismatched self-assessment (IRSA)? WARNING 0040
Notifications History shows the email notifications that have been sent by the Peer Assess Pro platform to participants. The history also records event notifications sent by email to the Facilitator. The delivery status of the email is designated as SENT, DELIVERED or FAILED.
Notifications History shows emails that are sent automatically by the Peer Assess Pro platform, and those initiated by the facilitator in response to Active Warnings.
The Notifications History feature is presented at the very bottom of the Facilitator Dashboard. Click on the column Emails sent, then Message/View to examine the email sent to a specific recipient.
The Notifications History is helpful for audit purposes such as when a student denies receiving an email from you.
The delivered status of the message is designated as SENT, DELIVERED, or FAILED.
Email systems can take a few minutes, or even hours, to confirm the final state of SENT emails. You'll need to REFRESH or RELOAD your Running Activity to update the status of the Notifications History.
Overview of Survey Notification History feature in Peer Assess Pro
Delivered status of messages sent from Peer Assess Pro platform
Quick links and related information
FAQ: What is the content of emails sent by Peer Assess Pro to Participants?
FAQ: What is the content of emails sent by Peer Assess Pro to Facilitators?
FAQ - How do I fix an invalid, missing or failed email delivery? WARNING 0026
In the Active Warnings section of the Facilitator Dashboard, select Preview Email.
Note that a record of the email sent from Active Warnings is recorded in the Survey Notifications History. See FAQ - What emails have been sent by the platform?
The first table shows the SUBJECT title of each email, generated in response to various automated events and to warnings activated by the Facilitator.
The second table shows the detailed content of each email. The Facilitator can, of course, copy this template text, modify, and send their own email.
Some emails are automatically generated by the Peer Assess Pro platform, such as 0011 Request to COMPLETE peer assessment and 0013 RESUBMIT peer assessment due to TEAM CHANGE.
Other emails are sent under the direction of the Facilitator when they respond to an Active Warning. Examples include 0103 WARNING Request to RECONSIDER peer assessment: Assessor unconstructive
The content of email generated by Peer Assess Pro undergoes regular review and improvement. The emails sent by the platform may not match exactly the detail presented here.
Quick links and related information
FAQ: What is the content of emails sent by Peer Assess Pro to Facilitators?
FAQ - What emails have been sent by the platform?
FAQ - How do I fix an invalid, missing or failed email delivery? WARNING 0026
Email ID - Priority - Short Descriptor SUBJECT |
0011 CRITICAL - Participant - Request to COMPLETE peer assessment SUBJECT - Please complete peer assessment due by << Due Date >>. <<Activity Title>> |
0012 CRITICAL - Participant - REMINDER to complete peer assessment SUBJECT - REMINDER! Please complete peer assessment due by << Due Date >>. <<Activity Title>> |
0013 CRITICAL - Participant - RESUBMIT peer assessment due to TEAM CHANGE SUBJECT - RESUBMIT! Please complete peer assessment due by << Due Date >>. <<Activity Title>> |
0020 CRITICAL - Participant - ABANDONED Peer Assessment activity. SUBJECT - ABANDONED peer assessment due by << Due Date >>. <<Activity Title>> |
0103 WARNING - Participant - Request to RECONSIDER peer assessment: Assessor unconstructive SUBJECT - Request to reconsider peer assessment due by << Due Date >>. <<Activity Title>> |
1001 ADVISORY - Participant - Personal results PUBLISHED and available to view SUBJECT - Please view your personal results for peer assessment due by << Due Date >>. <<Activity Title>> |
1002 ADVISORY - Participant - REVISED personal results published and available to view SUBJECT - REVISED RESULTS! Please view your personal results for peer assessment due by << Due Date >>. <<Activity Title>> |
1003 ADVISORY - Participant - FINALISED personal results published and available to view SUBJECT - FINALISED RESULTS! Please view your personal results for peer assessment <<Activity Title>>. Available until << finalisation date + 2 weeks >> |
1004 ADVISORY - Participant - Personal results PUBLISHED but NOT available to view SUBJECT - Incomplete submissions from your team for peer assessment due by << Due Date >>. <<Activity Title>> |
1005 ADVISORY - Participant - FINALISED personal results published but NOT available to view SUBJECT - FINALISED RESULTS: Incomplete submissions from your team for peer assessment due by << Due Date >>. <<Activity Title>> |
Email ID - Priority - Recipient - Short Descriptor SUBJECT - Subject Detail |
0011 - CRITICAL - Participant - Request to COMPLETE peer assessment SUBJECT - Please complete peer assessment due by << Due Date >>. <<Activity Title>> Dear <<team member>>, Please complete the Peer Assess Pro peer assessment activity << Activity Title>> for your team before << Due Time>> << Due Date >>. To complete the activity, please visit the Activity URL << Activity Specific URL>>. The peer assessment requires a Login ID. Usually, the Login ID will be your student id, unless your teacher has advised an alternative. The Activity URL will become available for your responses from << Activity Start Time>> << Activity Start Date >>. Team membership check The following are your team members. If there is a mistake in this list please urgently advise your teacher the correct composition, using the email listed below. << Team Name>> <<List of Team members>> Further information For further information about preparing for, and using the results of the peer assessment, please visit https://www.peerassesspro.com/resources/resources-for-students/. You may find the answers to these Frequently Asked Questions helpful at this stage: FAQ: What is the purpose of peer assessment? FAQ: What questions are asked in the peer assessment survey? FAQ: How do I provide useful feedback to my team members? FAQ: I am unable to login. My login failed Do not reply to this email. Rather, contact your teacher whose email is listed below. Sent by Peer Assess Pro on behalf of << Teacher fullname >> << Teacher email >> |
0012 - CRITICAL - Participant - REMINDER to complete peer assessment SUBJECT - REMINDER! Please complete peer assessment due by << Due Date >>. <<Activity Title>> Dear <<team member>>, The peer assessment activity for << Activity Title>> will soon become unavailable for you to complete. Therefore, please complete the Peer Assess Pro peer assessment activity for your team before << Due Time>> << Due Date >>. To complete the activity, please visit the Activity URL << Activity Specific URL>>. The peer assessment requires a Login ID. Usually, the Login ID will be your student id, unless your teacher has advised an alternative. The Activity URL became available for your responses from << Activity Start Time>> << Activity Start Date >>. Further information For further information about preparing for, and using the results of the peer assessment, please visit https://www.peerassesspro.com/resources/resources-for-students/. You may find the answers to these Frequently Asked Questions helpful at this stage: FAQ: What is the purpose of peer assessment? FAQ: What questions are asked in the peer assessment survey? FAQ: How do I provide useful feedback to my team members? FAQ: I am unable to login. My login failed Do not reply to this email. Rather, contact your teacher whose email is listed below. Sent by Peer Assess Pro on behalf of << Teacher fullname >> << Teacher email >> |
0013 - CRITICAL - Participant - RESUBMIT peer assessment due to TEAM CHANGE SUBJECT - RESUBMIT! Please complete peer assessment due by << Due Date >>. <<Activity Title>> Dear <<team member>>, You may have already completed the Peer Assess Pro peer assessment for <<Activity Title>> due before << Due Time>> << Due Date >>. I regret to advise that I require you to resubmit your survey. Your response submitted to date has been deleted from the analysis. The reason for this request may be a change to the membership of your team, such as a deletion or addition of a team member. To complete the activity, please visit the Activity URL << Activity Specific URL>>. Please resubmit your peer assessment for <<Activity Title>> due before << Due Time>> << Due Date >>. We apologise for your inconvenience. Team membership check The following are your team members. If there is a mistake in this list please urgently advise your teacher the correct composition, using the email listed below. << Team Name>> <<List of Team members>> Further information For further information about preparing for, and using the results of the peer assessment, please visit https://www.peerassesspro.com/resources/resources-for-students/. You may find the answers to these Frequently Asked Questions helpful at this stage: FAQ: How do I login to my peer assessment Activity URL FAQ: How do I provide useful feedback to my team members? FAQ: I am unable to login. My login failed Do not reply to this email. Rather, contact your teacher whose email is listed below. Sent by Peer Assess Pro on behalf of << Teacher fullname >> << Teacher email >> |
0020 - CRITICAL - Participant - ABANDONED Peer Assessment activity. SUBJECT - ABANDONED peer assessment due by << Due Date >>. <<Activity Title>> Dear <<team member>>, You and your team members were invited to participate recently in the peer assessment for << Activity Title>> due << Due Date >>. The Teacher ABANDONED the activity on << Abandoned Date >> due to exceptional circumstances. Please disregard any previous interim published results. I apologise for the inconvenience. Do not reply to this email. Rather, contact your teacher whose email is listed below. Sent by Peer Assess Pro on behalf of << Teacher fullname >> << Teacher email >> |
0103 - WARNING - Participant - Request to RECONSIDER peer assessment: Assessor unconstructive SUBJECT - Request to reconsider peer assessment due by << Due Date >>. <<Activity Title>> Dear <<team member>>, You recently completed the peer assessment of << Activity Title>>. However, your teacher noted that your individual responses suggest you have not engaged constructively with the peer assessment process. Specifically, you may have: - Rated all team members over a narrow range and/or - Rated all team members overgenerously and/or - Provided qualitative comments that failed to justify the ratings you gave. If you feel that your ratings and feedback are justified, you need take no further action. For example, such high ratings may be justified for a team with evidence of exceptionally high performance on its tasks and outputs. Alternatively, if you wish to resubmit a more accurate survey, please use the URL below to submit a replacement peer assessment survey. Please take special care to provide useful and accurate qualitative feedback that will help your team member(s) and teacher understand the ratings you have provided. To complete the activity, please visit the Activity URL << Activity Specific URL>>. Complete the revised Peer Assess Pro peer assessment activity for your team before << Due Time>> << Due Date >>. Further information For further information about preparing for, and using the results of the peer assessment, please visit https://www.peerassesspro.com/resources/resources-for-students/. You may find the answers to these Frequently Asked Questions helpful at this stage: FAQ: How do I provide useful feedback to my team members? FAQ: What happens if I try to 'game' (fool? play? disrupt?) the peer assessment process? FAQ: Is my self-assessment used to calculate my Peer Assessed Score? Do not reply to this email. Rather, contact your teacher whose email is listed below. Sent by Peer Assess Pro on behalf of << Teacher fullname >> << Teacher email >> |
1001 - ADVISORY - Participant - Personal results PUBLISHED and available to view SUBJECT - Please view your personal results for peer assessment due by << Due Date >>. <<Activity Title>> Dear <<team member>>, You recently completed the peer assessment of << Activity Title>>. You may now view your Personal Result and feedback. Please visit the Activity URL << Activity Specific URL>>. Your results will be available for you to view for a period of two weeks following the finalisation of the activity. If you have specific questions or concerns about your Personal Results, please contact the teacher promptly so that a remedy can be determined. Further information For further information about using the results of the peer assessment, please visit https://www.peerassesspro.com/resources/resources-for-students/. You may find the answers to these Frequently Asked Questions helpful at this stage: FAQ: How do I interpret the feedback results I've received from the peer assessment? FAQ: What steps can I take to get a better personal result? FAQ: I don't understand what my teammates are trying to tell me. How do I ask for better feedback? FAQ: I believe I have been unfairly treated by the results of the peer assessment. How do I address my concern? Do not reply to this email. Rather, contact your teacher whose email is listed below. Sent by Peer Assess Pro on behalf of << Teacher fullname >> << Teacher email >> |
1002 - ADVISORY - Participant - REVISED personal results published and available to view SUBJECT - REVISED RESULTS! Please view your personal results for peer assessment due by << Due Date >>. <<Activity Title>> Dear <<team member>>, You recently completed the peer assessment of << Activity Title>>. You may now view your revised Personal Result and feedback. Please visit the Activity URL << Activity Specific URL>>. Your results may have been revised from those previously made available to you. Reasons for revisions include: - A change in Team Results - Late peer assessment responses - An adjustment to the method the teacher has used to calculate your personal result. Your results will be available for you to view for a period of two weeks following the finalisation of the activity. If you have specific questions or concerns about your Personal Results please contact the teacher promptly so that a remedy can be determined. Further information For further information about using the results of the peer assessment, please visit https://www.peerassesspro.com/resources/resources-for-students/. You may find the answers to these Frequently Asked Questions helpful at this stage: FAQ: How do I interpret the feedback results I've received from the peer assessment? FAQ: What steps can I take to get a better personal result? FAQ: I don't understand what my teammates are trying to tell me. How do I ask for better feedback? FAQ: I believe I have been unfairly treated by the results of the peer assessment. How do I address my concern? Do not reply to this email. Rather, contact your teacher whose email is listed below. Sent by Peer Assess Pro on behalf of << Teacher fullname >> << Teacher email >> |
1003 - ADVISORY - Participant - FINALISED personal results published and available to view SUBJECT - FINALISED RESULTS! Please view your personal results for peer assessment <<Activity Title>>. Available until << finalisation date + 2 weeks >> Dear <<team member>>, You recently completed the peer assessment of << Activity Title>>. You may now view your final Personal Result and feedback. Please visit the Activity URL << Activity Specific URL>>. Your results are available for you to view for a period of two weeks following the finalisation of the activity. That is, from now until << Finalisation date + two weeks >>. Your results may have been revised from those previously made available to you. The revisions may have been due to: - A change in Team Results - Late peer assessment responses - An adjustment to the method the teacher has used to calculate your personal result. If you have specific questions or concerns about your Personal Results please contact the teacher promptly so that a remedy can be determined. Further information For further information about using the results of the peer assessment, please visit https://www.peerassesspro.com/resources/resources-for-students/. You may find the answers to these Frequently Asked Questions helpful at this stage: FAQ: How do I interpret the feedback results I've received from the peer assessment? FAQ: What steps can I take to get a better personal result? FAQ: I don't understand what my teammates are trying to tell me. How do I ask for better feedback? FAQ: I believe I have been unfairly treated by the results of the peer assessment. How do I address my concern? Do not reply to this email. Rather, contact your teacher whose email is listed below. Sent by Peer Assess Pro on behalf of << Teacher fullname >> << Teacher email >> |
1004 - ADVISORY - Participant - Personal results PUBLISHED but NOT available to view SUBJECT - Incomplete submissions from your team for peer assessment due by << Due Date >>. <<Activity Title>> Dear <<team member>>, You recently completed the peer assessment << Activity Title>>. However, several of your team members have yet to complete their peer assessment. Consequently, you are restricted from viewing the results of the peer assessment as the results would not yet be valid. You may wish to take action by reminding your team members to complete the peer assessment. Once the remainder of your team have completed their peer assessments, you will be able to view your final Personal Result and feedback. Please visit the Activity URL << Activity Specific URL>>. Further information For further information about using the results of the peer assessment, please visit https://www.peerassesspro.com/resources/resources-for-students/. You may find the answers to these Frequently Asked Questions helpful at this stage: FAQ: What is a valid assessed team? FAQ: I believe I have been unfairly treated by the results of the peer assessment. How do I address my concern? Do not reply to this email. Rather, contact your teacher whose email is listed below. Sent by Peer Assess Pro on behalf of << Teacher fullname >> << Teacher email >> |
1005 - ADVISORY - Participant - FINALISED personal results published but NOT available to view SUBJECT - FINALISED RESULTS: Incomplete submissions from your team for peer assessment due by << Due Date >>. <<Activity Title>> Dear <<team member>>, You and your team members were invited to participate recently in the peer assessment for << Activity Title>> due << Due Date >>. The Teacher finalised the results on << Finalisation Date >>. However, several of your team members failed to complete their peer assessment. Consequently, you are restricted from viewing the results of the peer assessment as the results are not valid. Since the activity has been finalised, there is no option for further peer assessments to be submitted from your team. Further information For further information about using the results of the peer assessment, please visit https://www.peerassesspro.com/resources/resources-for-students/. You may find the answers to these Frequently Asked Questions helpful at this stage: FAQ: What is a valid assessed team? FAQ: I believe I have been unfairly treated by the results of the peer assessment. How do I address my concern? Do not reply to this email. Rather, contact your teacher whose email is listed below. Sent by Peer Assess Pro on behalf of << Teacher fullname >> << Teacher email >> |
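The << placeholder >> fields in the templates above are filled with activity-specific values when each email is generated. As an illustration only, the following Python sketch shows one way such substitution could work; the fill_template function and the field dictionary are hypothetical, not Peer Assess Pro's actual implementation.

    import re

    def fill_template(template: str, values: dict) -> str:
        # Replace each << Field Name >> placeholder with its value.
        # Hypothetical sketch: unknown placeholders are left intact
        # so that missing data is easy to spot during review.
        def substitute(match):
            key = match.group(1).strip()
            return str(values.get(key, match.group(0)))
        return re.sub(r"<<\s*(.*?)\s*>>", substitute, template)

    # Example using the SUBJECT of email 0011 above.
    subject = "Please complete peer assessment due by << Due Date >>. <<Activity Title>>"
    print(fill_template(subject, {
        "Due Date": "28 May 2020",
        "Activity Title": "BUS123 Team Project",
    }))
    # Please complete peer assessment due by 28 May 2020. BUS123 Team Project

Leaving an unknown placeholder intact, rather than raising an error, makes missing data visible when proofreading a generated email before it is sent.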
The first table shows the SUBJECT of the emails sent TO the Facilitator in response to various automated events happening during the launch and management of a Peer Assess Pro activity.
The second table shows the detailed content of each email.
The content of the emails generated by Peer Assess Pro undergoes regular review and improvement, so the emails you receive may not match exactly the detail presented here.
Quick links and related information
FAQ: What is the content of emails sent by Peer Assess Pro to Participants?
FAQ - What emails have been sent by the platform?
FAQ - How do I fix an invalid, missing or failed email delivery? WARNING 0026
Email ID - Priority - Recipient - Short Descriptor SUBJECT |
2001 - ADVISORY - Facilitator - Launch successful SUBJECT - SUCCESSFUL LAUNCH: Your peer assessment <<Activity Title>>. Due by <<Due Date>> |
2002 - ADVISORY - Facilitator - Manage progress SUBJECT - MANAGE PROGRESS: Your peer assessment <<Activity Title>>. Due by <<Due Date>> |
2006 - ADVISORY - Facilitator - Due Date imminent SUBJECT - DUE DATE IMMINENT: Review your peer assessment <<Activity Title>>. Due by <<Due Date>> |
2008 - ADVISORY - Facilitator - Due Date reached SUBJECT - DUE DATE REACHED: Finalise your peer assessment <<Activity Title>>. Due by <<Due Date>> |
Email ID - Priority - Recipient - Short Descriptor SUBJECT - Subject Detail |
2001 - ADVISORY - Facilitator - Launch successful SUBJECT - SUCCESSFUL LAUNCH: Your peer assessment <<Activity Title>>. Due by <<Due Date>> Dear << Teacher Fullname >>, You have successfully launched the peer assessment activity << Activity Title>>. Number of participants << class size>> allocated to << Number of teams >> teams. The peer assessment is available for students from <<Start Date >> and due for completion by <<Due Date>>. Students complete the peer assessment at the survey URL <<Students Survey URL>>. To manage this activity, view your Teacher's Peer Assess Pro Dashboard here <<Teacher's Activity URL>>. Manage the peer assessment The Peer Assess Pro Quickstart Guide reminds you of the next steps you will take as you wait for students to respond. See https://www.peerassesspro.com/quickstart-guide-for-teachers/ Prepare your students for peer assessment These multi-media resources will help you prepare your students for undertaking peer assessment by giving honest, fair, and useful feedback. See https://www.peerassesspro.com/resources/introducing-students-peer-assessment/ As a backup communication to your students, we recommend that the material in the next section 'Advise your students' be emailed and/or posted to your students on your Learning Management System messaging facility. Advise your students [Teacher, send this section by email and/or post on your LMS] +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ Dear students, The peer assessment << Activity Title>> is available for you to complete from <<Start Date >> and due for completion by <<Due Date>>. Complete the peer assessment at the survey URL <<Students Survey URL>>. You will be sent emails from @peerassesspro.com to advise you when feedback results are available and where to complete the survey. Please check your junk and spam email. Ensure you allow emails from @peerassesspro.com into your Important mailbox and add @peerassesspro.com as a Contact. You may find these Frequently Asked Questions (FAQs) relevant before you start the peer assessment; view them here https://www.peerassesspro.com/frequently-asked-questions-2/. FAQ: What is the purpose of peer assessment? FAQ: How do I provide useful feedback to my team members? FAQ: What questions are asked in the peer assessment survey? FAQ: What if I am unable to login to the peer assessment survey? Students can find further information about peer assessment here https://www.peerassesspro.com/resources/resources-for-students/. Kind regards << Teacher fullname >> << Teacher email >> +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ Further information For additional advice on managing a Peer Assess Pro activity, review the Frequently Asked Questions here https://www.peerassesspro.com/frequently-asked-questions-2/ FAQ: When and how is the peer assessment conducted? FAQ: How do I correct the Team Composition in a running peer assessment activity? FAQ: How do I take action on the Active Warnings presented in the Peer Assess Pro™ Teacher’s Dashboard? What if I ignore the Warnings? FAQ: How do I decide which Personal Result method to apply in my peer assessment activity? FAQ: What is the content of emails sent by Peer Assess Pro? FAQ: Can I adjust the Start Date or Due Date for a running peer assessment activity? Kind regards, Peer Assess Pro https://www.peerassesspro.com/support/ |
2002 - ADVISORY - Facilitator - Manage progress SUBJECT - MANAGE PROGRESS: Your peer assessment <<Activity Title>>. Due by <<Due Date>> Dear << Teacher Fullname >>, Your peer assessment activity << Activity Title>> is available for students from <<Start Date >> and due for completion by <<Due Date>>. However, students can continue submitting responses beyond the Due Date until you personally FINALISE the activity on the Peer Assess Pro Dashboard. To manage this activity, view your Teacher's Peer Assess Pro Dashboard here <<Teacher's Activity URL>>. Review Active Warnings At this mid-point of the peer assessment process, we suggest you review carefully the Active Warnings on your Peer Assess Pro dashboard. In particular: 1. Review the Class Statistics, particularly for students rated with the lowest Peer Assessed Scores. The Personal Snapshots for these students may identify absent students or students at risk of course failure 2. Review teams that have not yet submitted sufficient responses to be validly assessed. Remind the team's members to submit 3. Review students who have rated a team member significantly differently than the other team members have. This outlier rating may be a sign of dysfunction within the team 4. Review the Qualitative Feedback report, examining especially students who have been peer assessed with a low rating by other team member(s) 5. Review students with an OVERCONFIDENT or UNDERCONFIDENT Index of Realistic Self Assessment (IRSA). Suggest they meet with you to discuss their peer assessment results 6. Identify students or teams who have not engaged constructively with the peer assessment process. That is, they have rated their team members over a narrow range of scores or have rated their team members well above average. Encourage them to resubmit and justify the ratings they have provided. Enter team results At this stage you can enter your (provisional or final) Team Results. You can test the impact of the alternative methods for calculating students' Personal Results. Consider downloading the Statistics, Qualitative Feedback, and Teachers Feedback to preview the format of the final reports you will receive from Peer Assess Pro. Provisional publication of results Consider provisionally publishing the Personal Results for members of valid teams, that is, teams that have met the minimum required number of responses. You can preview students' Personal Snapshots in 'live' mode to see what students will see on their login dashboard before you Publish or Update the results. This is helpful for your quality management of the peer assessment process. Prepare your students for interpreting the results of their peer assessment These multi-media resources will help you prepare your students to learn productively from their peer feedback and results. See https://www.peerassesspro.com/resources/introducing-students-peer-assessment/ Furthermore, the class may find these FAQs relevant from now: FAQ: How do I interpret the feedback results I've received from the peer assessment? FAQ: How do I provide useful feedback to my team members? FAQ: I don't understand what my teammates are trying to tell me. How do I ask for better feedback? FAQ: I believe I have been unfairly treated by the results of the peer assessment. How do I address my concern? Students can find further information about peer assessment here https://www.peerassesspro.com/resources/resources-for-students/.
Managing the peer assessment The Peer Assess Pro Quickstart Guide reminds you of the next steps you will take as you remind the remaining students to respond. See https://www.peerassesspro.com/quickstart-guide-for-teachers/ Further information For additional advice on managing a Peer Assess Pro activity, review the Frequently Asked Questions here: https://www.peerassesspro.com/frequently-asked-questions-2/ FAQ: How do I take action on the Warnings presented in the Peer Assess Pro™ Teacher’s Dashboard? What if I ignore the Warnings? FAQ: How do I decide which Personal Result method to apply in my peer assessment activity? FAQ: What is a valid assessed team? FAQ: How is an outlier peer assessment rating identified? FAQ: How is the Index of Realistic Self Assessment (IRSA) calculated? Kind regards, Peer Assess Pro https://www.peerassesspro.com/support/ |
2006 - ADVISORY - Facilitator - Due Date imminent SUBJECT - DUE DATE IMMINENT: Review your peer assessment <<Activity Title>>. Due by <<Due Date>> Dear << Teacher Fullname >>, You scheduled the peer assessment activity << Activity Title>> due for completion by <<Due Date>>. However, students can continue submitting responses beyond the Due Date until you personally FINALISE the activity. View your Teacher's Peer Assess Pro Dashboard here <<Teacher's Activity URL>>. Review Active Warnings Prior to publishing and finalising the peer assessment, we suggest you review carefully the Active Warnings on your Peer Assess Pro dashboard. In particular: 1. Review the Class Statistics, particularly for students rated with the lowest Peer Assessed Scores. The Personal Snapshots for these students may identify absent students or students at risk of course failure 2. Review the Qualitative Feedback report, examining especially students who have been peer assessed with a low rating by other team member(s). There may be feedback comments that prompt you to intervene proactively with the assessor or the assessed student 3. Review students with an OVERCONFIDENT or UNDERCONFIDENT Index of Realistic Self Assessment (IRSA). Suggest they meet with you to discuss their peer assessment results. Enter team results At this stage you may enter your Team Results. Next, confirm the method for calculating the Personal Result that will be awarded to each student. Prepare your students for interpreting the results of their peer assessment These multi-media resources will help you prepare your students to learn productively from their peer feedback and results. See https://www.peerassesspro.com/resources/introducing-students-peer-assessment/ Furthermore, the class may find these FAQs relevant from now: FAQ: How do I interpret the feedback results I've received from the peer assessment? FAQ: I don't understand what my teammates are trying to tell me. How do I ask for better feedback? FAQ: I believe I have been unfairly treated by the results of the peer assessment. How do I address my concern? FAQ: How do I interpret measures of realistic self-assessment? Students can find further information about peer assessment here https://www.peerassesspro.com/resources/resources-for-students/. Further information For additional advice on managing and finalising a Peer Assess Pro activity, review the Frequently Asked Questions in the section 'Manage the peer assessment activity' here: https://www.peerassesspro.com/frequently-asked-questions-2/ FAQ: How do I take action on the Warnings presented in the Peer Assess Pro™ Teacher’s Dashboard? What if I ignore the Warnings? FAQ: What is a valid assessed team? FAQ: How is an outlier peer assessment rating identified? FAQ: What steps can I take to get a better personal result? Kind regards, Peer Assess Pro https://www.peerassesspro.com/support/ |
2008 - ADVISORY - Facilitator - Due Date reached SUBJECT - DUE DATE REACHED: Finalise your peer assessment <<Activity Title>>. Due by <<Due Date>> Dear << Teacher Fullname >>, You scheduled the peer assessment activity << Activity Title>> due for completion by <<Due Date>>. However, students can continue submitting responses beyond the Due Date until you personally FINALISE the activity. View your Teacher's Peer Assess Pro Dashboard here <<Teacher's Activity URL>>. Review Active Warnings Prior to publishing and finalising the peer assessment, we suggest you review carefully the Active Warnings on your Peer Assess Pro dashboard. In particular: 1. Review the Class Statistics, particularly for students rated with the lowest Peer Assessed Scores. The Personal Snapshots for these students may identify absent students or students at risk of course failure 2. Review the Qualitative Feedback report, examining especially students who have been peer assessed with a low rating by other team member(s). There may be feedback comments that prompt you to intervene proactively with the assessor or the assessed student 3. Review students with an OVERCONFIDENT or UNDERCONFIDENT Index of Realistic Self Assessment (IRSA). Suggest they meet with you to discuss their peer assessment results. Enter team results At this stage you should enter your Team Results and select the method for calculating the Personal Result that will be awarded to each student. Prepare your students for interpreting the results of their peer assessment These multi-media resources will help you prepare your students to learn productively from their peer feedback and results. See https://www.peerassesspro.com/resources/introducing-students-peer-assessment/ Furthermore, the class may find these FAQs relevant from now: FAQ: How do I interpret the feedback results I've received from the peer assessment? FAQ: I don’t understand what my teammates are trying to tell me. How do I ask for better feedback? FAQ: I believe I have been unfairly treated by the results of the peer assessment. How do I address my concern? FAQ: How do I interpret measures of realistic self-assessment? Students can find further information about peer assessment here https://www.peerassesspro.com/resources/resources-for-students/. Further information For additional advice on managing and finalising a Peer Assess Pro activity, review the Frequently Asked Questions in the section 'Manage the peer assessment activity' here: https://www.peerassesspro.com/frequently-asked-questions-2/ FAQ: How do I take action on the Warnings presented in the Peer Assess Pro™ Teacher’s Dashboard? What if I ignore the Warnings? FAQ: What is a valid assessed team? FAQ: How is an outlier peer assessment rating identified? FAQ: What steps can I take to get a better personal result? Kind regards, Peer Assess Pro https://www.peerassesspro.com/support/ |
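Emails 0103 and 2002 above describe the rating patterns that suggest an assessor has not engaged constructively: ratings spread over a narrow range, or ratings well above average. Peer Assess Pro's actual detection thresholds are not published; the Python sketch below illustrates the idea only, with assumed cut-offs and an assumed 1-10 rating scale.

    def flag_unconstructive(ratings: list[float],
                            min_range: float = 1.0,
                            generous_mean: float = 9.0) -> list[str]:
        # Flag the two patterns described in email 0103.
        # The thresholds and the 1-10 scale are assumptions for
        # illustration, not Peer Assess Pro's published algorithm.
        flags = []
        if max(ratings) - min(ratings) < min_range:
            flags.append("narrow range")
        if sum(ratings) / len(ratings) >= generous_mean:
            flags.append("overgenerous")
        return flags

    # An assessor who rated every teammate 9.5 or 10 trips both checks.
    print(flag_unconstructive([9.5, 10.0, 9.5, 10.0]))
    # ['narrow range', 'overgenerous']

As the emails note, such ratings may nevertheless be justified for an exceptionally high-performing team, so a flag like this is a prompt for review, not an automatic judgement.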
To access the Peer Assess Pro survey you require an activity-specific URL. In general, the format of the Activity URL is:
https://q.xorro.com/teacherid/activityid
Example
https://q.xorro.com/smup/23021
The teacherid is usually four letters, such as smup. The teacher is ALWAYS identified by these letters.
The activityid is usually several digits, such as 23021.
The Activity URL is provided to a student through:
- The invitation and reminder emails sent by Peer Assess Pro (emails 0011 and 0012 above)
- The teacher, who may also email the URL or post it on the Learning Management System (LMS)
- The teacher's Participant URL, described next.
The Participant URL lists ALL the currently running activities started by one teacher. The format is a truncated form of the Activity URL: no activityid, just the teacherid. For example: https://q.xorro.com/smup
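The two URL forms therefore differ only in whether the activityid is appended to the teacherid. The following Python sketch makes the relationship concrete; the function names are illustrative, and the validation rules (four lower-case letters, digits only) simply restate the 'usually' conventions described above.

    import re

    BASE = "https://q.xorro.com"

    def activity_url(teacherid: str, activityid: str) -> str:
        # Build an activity-specific URL, e.g. https://q.xorro.com/smup/23021.
        # The teacherid is usually four letters and the activityid several
        # digits; this sketch treats deviations as errors.
        if not re.fullmatch(r"[a-z]{4}", teacherid):
            raise ValueError(f"unexpected teacherid: {teacherid!r}")
        if not activityid.isdigit():
            raise ValueError(f"unexpected activityid: {activityid!r}")
        return f"{BASE}/{teacherid}/{activityid}"

    def participant_url(teacherid: str) -> str:
        # Truncated form listing all of one teacher's running activities.
        return f"{BASE}/{teacherid}"

    print(activity_url("smup", "23021"))  # https://q.xorro.com/smup/23021
    print(participant_url("smup"))        # https://q.xorro.com/smup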
When everything is working correctly, you follow the link to the Activity URL. You should see the Login Page. Note the Activity title in the top left corner. That information should confirm you have the correct Activity URL for the peer assessment you are required to undertake.
Login to Peer Assess Pro Activity
Enter your ID. For students, this is usually your Student ID or Student Registration. Your teacher or facilitator will advise you if a different system of identification is being used.
Successful login confirms your name and details about the institution and teacher, all of which should be familiar to you!
Select ‘Next’ to proceed to the peer assessment.
Successful login to Peer Assess Pro Activity
Quick links and related information
FAQ: I am unable to login. My login failed
There are several reasons why a student’s login may fail. Steps to remedy a failed login are detailed later.
This FAQ explains how to login correctly:
FAQ: How do I login to my Activity URL
FAQ: How do I correct the Team Composition in a running peer assessment activity?
Select the required Activity Title from the list of the teacher’s activities.