
Manage a Teammate Peer Assessment Activity using Peer Assess Pro

Reference Guide for Teachers and Students

Version 5.1 2023-06-27

Peter Mellalieu peter@peerassesspro.com 

+64 21 42 0118

Patrick Dodd patrick@peerassesspro.com 

+64 21 183 6315


QUICK START GUIDE

Follow these steps to register, launch, manage, and download the final gradebook for a Peer Assess Pro peer assessment using the Xorro Survey Management system.

The section number after each step points to the part of this Reference Guide that explains it.

1.1 REGISTER once for your Xorro Teacher’s account
1.2 LOGIN to your Xorro Teacher’s account dashboard, at https://qf.xorro.com/
1.3 ORIENT yourself to launching and managing a peer assessment activity
2.1 PARTICIPANTS arranged into teams from your class list (Tip: adapt the sample Participants CSV file, here)
2.3 LAUNCH a peer assessment activity by importing your Participants CSV
3.0 MANAGE your launched peer assessment activity
3.1 WARNINGS alert you to take action with at-risk teams and individuals
3.3 TEAM RESULTS entered as the basis for calculating a personal grade (optional)
3.4 SELECT the Personal Result Calculation Method (optional)
3.5 REVIEW class, team, and individual feedback and grades
3.6 PUBLISH provisional and final feedback to team members
4.0 FINALISE the activity to prevent further survey responses
4.3 DOWNLOAD the Finalised Teacher’s Gradebook, Qualitative and Teacher’s Feedback

Download interactive at www.peerassesspro.com/quickstart-guide-for-teachers


Launching Peer Assess Pro™ using Xorro-Q: Overview

Once logged in to Xorro-Q, you launch a peer assessment activity.

During the launch process, you import a Participants CSV that specifies each team member’s first and last name, login id, and email, arranged by team name.

The Participants CSV is a comma-separated values (CSV) file that must contain the column headers shown in the example below.

Peer Assess Pro emails an activity URL that enables each team member to complete the peer assessments of their team members.

Timely reminders and personalised feedback reports are communicated to the students from Peer Assess Pro using the email addresses you provided in the Participants CSV.

Active Warnings on the Teacher’s Dashboard provide the teacher with advice about at-risk teams and individuals, poor peer assessment rating behaviour, and other progress indicators.

>>> View the comprehensive online and eBook ‘Get Started with Peer Assess Pro’

Example Peer Assessment Participants CSV File

id,first,last,email,team,group_code
AMTO01,Amanda,Tolley,Amanda.Tolley@noreply.com,Bear,BUS123.101/PMell/TutB/2020-05-28/SUM
ANWO08,Anna,Worth,Anna.Worth@noreply.com,Bear,BUS123.101/PMell/TutB/2020-05-28/SUM
HOBR03,Holly,Brown,Holly.Brown@noreply.com,Bear,BUS123.101/PMell/TutB/2020-05-28/SUM
ALJO11,Alice,Jones,Alice.Jones@noreply.com,Panda,BUS123.101/PMell/TutB/2020-05-28/SUM
GRGR15,Greta,Green,Greta.Green@noreply.com,Panda,BUS123.101/PMell/TutB/2020-05-28/SUM
JEWA06,Jeff,Wang,Jeff.Wang@noreply.com,Panda,BUS123.101/PMell/TutB/2020-05-28/SUM
BOWI12,Bob,Wilson,Bob.Wilson@noreply.com,Tiger,BUS123.101/PMell/TutB/2020-05-28/SUM
HEJO19,Henry,Jones,Henry.Jones@noreply.com,Tiger,BUS123.101/PMell/TutB/2020-05-28/SUM
JOSM13,John,Smith,John.Smith@noreply.com,Tiger,BUS123.101/PMell/TutB/2020-05-28/SUM

>>> Download example CSV, Excel, or Google Sheet
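If you maintain your class list with a script, the following minimal sketch (Python; the filename participants.csv and the printed layout are illustrative assumptions, not part of Peer Assess Pro) reads a Participants CSV in the format above and lists each team’s members by group:

```python
# Sketch: read a Participants CSV (format above) and list team membership.
# Assumes the file is saved as "participants.csv" with lower-case headers.
import csv
from collections import defaultdict

teams = defaultdict(list)  # (group_code, team) -> list of member names
with open("participants.csv", newline="") as f:
    for row in csv.DictReader(f):
        if row.get("team"):  # the team field is optional and may be blank
            teams[(row["group_code"], row["team"])].append(
                f"{row['first']} {row['last']}")

for (group, team), members in sorted(teams.items()):
    print(f"{group} / {team}: {', '.join(members)}")
```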

Example Survey Questions for a Team Member

Most Frequently Asked Questions (FAQs)

FAQ - Show me a quick video overview demonstration of the whole Peer Assess Pro system

FAQ - I’m having problems importing my participants csv

FAQ - How do I correct the Team Composition in a running peer assessment activity?

FAQ - What is the purpose of peer assessment?

FAQ - What questions are asked in the peer assessment survey?

FAQ - View the comprehensive ‘Get Started with Peer Assess Pro’ online eBook

>>> More FAQs at www.peerassesspro.com/frequently-asked-questions-2 


Quick Link Map

Everyone | For teachers | For team members
Xorro-Q help | Login to Xorro-Q | Join peer assessment activity
www.peerassesspro.com | The peer assessment survey | The purpose of peer assessment
Reference guide | Login and orientation | Undertake the peer assessment
FAQs on the web at http://tinyurl.com/papFAQ | Launch peer assessment activity | Use peer assessment results for better performance
Videos | Manage the peer assessment activity | Quickstart guide for teachers
Definitions, calculations, and examples | Contact us | eBook - 7-step guide

Support, Feedback and Contact

Ask us for help, give us feedback, and request additional features.

https://www.peerassesspro.com/contact-us/

Patrick Dodd, patrick@peerassesspro.com, +64 21 183 6315

Peter Mellalieu, peter@peerassesspro.com, +64 21 42 0118 (Skype: myndsurfer)

The Seven-Step Formula for Effective Peer Assessment

 

Download interactive poster https://www.peerassesspro.com/infographic/

Complimentary eBook

Mellalieu, P. J. (2020). How to teach using group assignments: The 7-step formula for fair and effective team assessment. Peer Assess Pro. https://www.peerassesspro.com/ebook

Teachers Process Flowchart: Overview

>>> Hyperlinked chart at http://tinyurl.com/papChart

PEER ASSESS PRO REFERENCE GUIDE

QUICK START GUIDE        1

Example Peer Assessment Participants CSV File        2

Example Survey Questions for a Team Member        3

Most Frequently Asked Questions (FAQs)        3

Quick Link Map        4

Support, Feedback and Contact        4

The Seven-Step Formula for Effective Peer Assessment        5

Complimentary eBook        5

Teachers Process Flowchart: Overview        6

PEER ASSESS PRO REFERENCE GUIDE        7

1. Login to your Xorro HOME page        20

1.1 First time users: Register        21

Register a new Xorro Teacher’s Account as a Free Facilitator        21

Getting started with Xorro Q        21

1.2 Login from your registered Xorro Account        22

1.3 Orient yourself to the Xorro HOME Dashboard        23

1.4 Orient yourself to the Peer Assess Pro platform        25

Fast video flyby        25

Review the Peer Assess Pro facilitators dashboard        26

Overview of the steps required to launch a peer assessment        27

1.5 Peer Assess Pro system flowchart detail        28

2. Launch Peer Assessment activity        29

2.1 Quick start launch        30

Participants CSV templates        30

2.2 Create the peer assessment Participants CSV        31

Alternative Participants CSV templates        31

Instructions and column explanations for the peer assessment Participants CSV        32

Requirements for a peer assessment Participants CSV file        34

Create a CSV version of your Participants CSV file        34

Why won’t Xorro load my Participants CSV file?        35

Your spreadsheet editor will typically NOT create a CSV file, unless...        35

Good practice hint: Create distinctive group codes for every peer assessment activity you launch        35

Large, multi-cohort streams in a class        36

Here There Be Dragons!        36

2.3 Launch and create the peer assessment activity        37

Select ACTIVITIES from the top menu bar        37

Launch Peer Assessment        38

Good practice hint: Avoid using the Xorro default Due Date        38

The Due Date is advisory only        39

Initiate Create Activity        39

Point of no return!        39

View the Peer Assess Pro Teacher’s dashboard        40

Invite team members to respond and other automated activities        41

2.4 Use a Teamset Group to launch a peer assessment        44

From the Xorro HOME page select the PARTICIPANTS page        44

Select ‘Import Participants’        44

Browse to your Team Members Group CSV file        45

Load, check and confirm correct team membership, then Import        45

Check class and team membership        46

3. Manage the Peer Assessment Activity        48

3.1 Action responses to warnings        49

Adjusting team composition        50

Note: Changes to a Xorro Group have NO EFFECT on current Team Composition        51

3.2 Automated and manual notifications        51

3.3 Enter Team Results        51

3.4 Select the Personal Result Calculation Method        53

3.5 Review class, team, and individual statistics        54

Review Class Results        54

Good practice hint: How to identify at risk students        55

The Individual Personal Snapshot        55

Four possible views of the Individual Personal Snapshot        58

Team Statistics        60

Qualitative Feedback        61

Teacher’s Feedback        61

Advanced Statistics        62

3.6 Publish provisional Personal Results to team members        63

Unpublished status        63

Published status        64

Results hidden when insufficient responses        64

4. Finalise the peer assessment activity        66

4.1 Why Finalise?        67

4.2 Publish Finalised Results to students        67

4.3 Download Teacher’s Gradebook of Results        68

4.4 Finalise the Activity … irrevocably!        69

FREQUENTLY ASKED QUESTIONS        70

Quick Link Map        70

FAQs for teachers        71

The peer assessment survey        71

Login and orientation        71

Launch peer assessment activity        72

Manage the peer assessment activity        72

Responding to Active Warnings        73

Definitions, calculations, and examples        74

Miscellaneous        74

FAQs for team members        76

The purpose of peer assessment        76

Undertaking the peer assessment        76

Using the results from peer assessment for better performance        77

How peer assessment affects personal results        77

FAQ: What is the purpose of peer assessment?        79

Defining peer assessment        79

Developmental feedback        79

Determination of course personal result        79

Criteria for peer assessment in Peer Assess Pro™        80

Peer Assess Pro assesses competencies valued by employers        80

Video        81

FAQ: When and how is the peer assessment conducted?        83

Formative assessment: optional but valuable        83

No surprises!        84

FAQ: How do I provide useful feedback to my team members?        86

FAQ: I believe I have been unfairly treated by the results of the peer assessment. How do I address my concern?        88

Symptoms of an unfair assessment        88

Steps to address an unfair peer assessment        88

A note on appealing a peer assessment result        89

Prevention is better than cure        90

FAQ: How do I interpret the feedback results I've received from the peer assessment?        92

FAQ: I don’t understand what my teammates are trying to tell me. How do I ask for better feedback?        93

FAQ: What steps can I take to get a better personal result?        94

Raise your Team Result        94

Use your institution’s academic support services        94

Raise your Peer Assessed Score        95

How do I address proactively the challenges of team work?        95

Learning constructively from mid-course peer assessment feedback        96

FAQ: What happens if I try to 'game' (fool? play? disrupt?) the peer assessment process?        98

The tricks we know!        98

Examples: Highly specific and individualized information        99

1. Low quality assessor rating        99

2. Low quality team rating        99

3. Outlier individual rating        100

4. Mismatched self-assessment        100

5. At risk team member        101

Example: Better feedback. Better teams        101

Which teams will raise the Active Warning: Low quality team rating?        102

Which teams tend to have a higher team result?        103

Which teams have worked most productively as a team?        103

Active Warnings, thresholds parameters, and program logic        105

FAQ: Give me a quick overview of how to launch a Peer Assess Pro™ activity through Xorro        106

FAQ: What are the design objectives, key features, and benefits of the Peer Assess Pro development?        108

Design objectives        108

Benefits for students        108

Benefits for teachers        109

Peer Assess Pro™ is a work in progress        109

Where’s the latest?        110

FAQ: How do I find the Peer Assess Pro Xorro Teacher’s dashboard?        111

HOME: Running Activities        111

Alternative method: ACTIVITIES: Running Activities        111

FAQ: How do I navigate the PARTICIPANTS page for Peer Assess Pro?        114

Select PARTICIPANTS Tab        114

Orientation note: Select an existing Group        115

Inactive functions in PARTICIPANTS page        116

FAQ: How do I correct the team composition in a running peer assessment activity?        117

Adjust the team composition in a running peer assessment activity on an LMS        118

Take care! Here there be dragons!!        118

Overview        118

What happens with Synchronise All?        119

Precautions before Synchronise All!        120

Some survey responses might be deleted!        120

Team composition view prior to synchronise all        121

Team composition following synchronise all        122

Survey notifications        122

Adding orphan teams to a running activity        123

Why was my Synchronise All action rejected?        123

Good practice hint when creating a peer assessment activity        123

LMS team arrangement facilities for Moodle        124

Adjust the team composition in a running peer assessment activity on Xorro        125

Take care! Here there be dragons!!        125

Key check points        125

View the team composition        126

Correct the team composition        126

Subtle technical note        127

FAQ: Can I create a peer assessment activity without having all my teams correctly identified by team name and/or team membership?        128

FAQ: How do I create a CSV file from a Google Sheet?        129

Good practice hint!        129

Sample of participants csv file opened using a text editor        130

FAQ: How do I view a demonstration version of Peer Assess Pro?        131

FAQ: How do I correct the participants (team members) in a group already uploaded to Xorro?        132

Update a group’s team members for future use        132

Correct the team members associated with an existing Xorro TeamSet Group        133

FAQ: Where may I view the most recent version of the user guides?        135

Quickstart Guide        135

Video guides        135

Latest reference guide        135

Work in progress Google DOCS development version        135

Frequently Asked Questions for teachers and team members        136

Teachers Process Flowchart        136

FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?        137

FAQ: How are peer assessment and personal results calculated and defined mathematically?        143

Calculation methods that exclude a team result        144

Calculation methods that incorporate a team result from team outputs        144

FAQ: How do students know where and when to complete the peer assessment activity then review their results?        145

Automated communications to students        145

Standard operating mode        145

Alternative mode for student access to assessment and results        146

FAQ: How do I view and experience what the students experience?        147

View your student’s personal results feedback report directly from your Teacher’s Dashboard        147

View your students’ experience of the Peer Assess Pro™ survey        147

Enter your Participants’ URL into your browser        148

Select the activity you wish to experience        149

Login in using the Identification (id) of a student in the Team List Group used to create the activity        149

View a survey ready and waiting for responses        150

View a sample question        150

View a student’s published results        151

View the peer assessment survey for a demonstration class        153

FAQ: Why are different terms used to display peer assessment results in the Xorro and previous Google versions of Peer Assess Pro™?        154

FAQ: How do I take action on the Warnings presented in the Peer Assess Pro™ Teacher’s Dashboard?’ What if I ignore the Warnings?        155

Critical and catastrophic warnings!        155

Important warnings        155

Informational warnings        156

Optional emails generated for team members        156

FAQ: When, why, and how do I Refresh and Update Results?        157

When to recalculate        157

Why recalculate?        157

How to recalculate        158

FAQ: What questions are asked in the peer assessment survey?        159

Example Peer Assessment Survey: Quantitative        160

Example Peer Assessment Survey: Qualitative        161

FAQ: Why FINALISE a survey?        162

Best practice before FINALISE SURVEY        162

Download Teacher’s Gradebook of Results        165

Finalise the Survey … irrevocably!        166

FAQ: How is the Peer Assessed (PA) Score calculated?        167

The self-assessment is excluded from calculating PA Score        167

Mathematical definition of Peer Assessed Score, PA Score        168

Example calculations of Peer Assessed Score        171

Alternative mathematical formulations of PA Score        173

Calculation from Average Rating        173

Calculation from Average Team and Leadership Contributions        174

FAQ: How is the self-assessment used to calculate Peer Assessed Score?        175

Spider chart of individual and averaged team peer ratings        175

Index of Realistic Self-Assessment (IRSA)        176

FAQ: How is the Peer Assessed Index (PA Index) calculated?        177

Mathematical definition of Peer Assessed Index        177

Example calculations of Peer Assessed Index        178

FAQ: How is the Indexed Personal Result (IPR) calculated?        180

Mathematical definition of Indexed Personal Result        180

Example calculations of Indexed Personal Result        181

FAQ: How is the Normalised Personal Result (NPR) calculated?        183

Mathematical definition of Normalised Personal Result        184

Example calculations of Normalised Personal Result        185

Impact of adjusting the Spread Factor on Normalised Personal result        187

FAQ: How is the Rank-Based Personal Result (RPR) calculated?        189

Mathematical definition of Rank-Based Personal Result        190

Example calculations of Rank-Based Personal Result        191

Example calculation with tied ranks        194

Adjusting the range using a spread factor        194

Example calculation with spread factor        195

FAQ: How is the Rank Based Personal Result (RPR) calculated (Pre-2022)?        197

Mathematical definition of Rank-Based Personal Result        198

Example calculations of Rank-Based Personal Result        199

Example calculation with tied ranks        201

FAQ: How is Standard Peer Assessed Score (SPAS) calculated?        202

Design features of Standard Peer Assessed Score        203

Mathematical calculation        204

Example calculations of Standard Peer Assessed Score        206

Example charts for Standard Peer Assessed Score        209

Assumptions about Standard Peer Assessed Score        210

The impact of gaming peer assessment        211

FAQ: What is the influence on Standard Peer Assessed Score (SPAS) if a team rates ALL its members with a Peer Assessed Score of 100?        212

FAQ: Would a student receive the same Standard Peer Assessed Score (SPAS) if rated in another class?        212

FAQ: What is Employability? How is it calculated?        214

Mathematical calculation of Employability        214

Conditioning transformations to de-emphasise unsubstantiated precision        215

Example calculations of Employability        215

FAQ: How is the Index of Realistic Self Assessment (IRSA) calculated?        218

Mathematical definition of the Index of Realistic Self Assessment        218

Example calculations of the Index of Realistic Self Assessment        219

Why an IRSA of 100 is not a perfect score!        220

FAQ: How do I interpret measures of realistic self-assessment?        222

Interpreting the Index of Realistic Self Assessment (IRSA)        222

Typical IRSA        222

Overconfident IRSA        222

Underconfident IRSA        223

Developing an exceptionally realistic self-image, ERSA        223

What are the benefits of having an Exceptionally Realistic Self Assessment?        223

What can get in the way of having an Exceptionally Realistic Self-Image?        223

How do I develop my Exceptionally Realistic Self-Image, ERSI?        224

FAQ: How is insignificant intra-team agreement identified? WARNING 0041        226

Illustration of principle        226

Warning detail        227

Accounting for tied ratings and low-quality assessments        228

A team where all team members rate everyone the same        228

Recommended action        229

Award the same personal result        229

Special cases        230

The same personal result awarded updates according to teacher’s adjustments to team result or result method        230

‘Award the same personal result’ status is deactivated automatically        231

‘Exclude from calculations’ overrides actions that address insignificant team agreement        232

Defining concordance        232

Interpreting the concordance value, W        233

Exclusion of self-assessment ratings        233

Tied rankings        233

Where everyone is equally average        234

Significance of the concordance statistic        234

Requirement for calculating concordance        235

Team discrimination table shows class concordance statistics        235

Example calculations        237

Example A. High rating agreement amongst the team members        237

Ex A. Calculation of Concordance, W, from ranks        239

Conclusion for Example A        240

Example B. Low agreement amongst the team members        240

Conclusion for Example B        241

Example C. Teammates show low discernment by submitting tied ranks        242

A note on computational efficiency        244

FAQ: How is an outlier peer assessment rating identified? WARNING 0042        247

Warning detail        247

Failure to agree across the whole team        247

Example calculations        248

Threshold for warning of outlier individual peer rating        249

Alternative mathematical calculation of Assessor Impact        250

Alternative example calculations        250

FAQ: What is a mismatched self-assessment (IRSA)? WARNING 0040        252

Warning detail        252

Threshold for warning of mismatched self-assessment        253

Example calculations        253

Recommended action for facilitator        254

FAQ: What is an adjusted team arrangement request? WARNING 0006        255

Warning detail        255

Available actions        255

Special cases        258

Reassign a new participant to an existing or new team        258

FAQ: What is an inactive team member? WARNING 0048        260

Warning detail        260

Available actions        260

Good practice for addressing an alleged inactive student        261

Special cases        262

Reassigning a student to a new ‘team of one’        262

Exclude from calculations overrides actions that address insignificant team agreement        262

Example of exclusion from calculations        263

Impact of exclusion from calculations        266

FAQ: What is a low-quality team rating? WARNING 0050        267

Warning detail        267

Threshold for warning of low-quality team rating        268

Example calculations        268

Recommended action for facilitator        269

High performing teams        269

Example case        270

Active Warning 0050 Under development        271

Warning detail        271

Threshold for warning of low-quality team rating        272

Example calculations        272

Recommended action for facilitator        273

High-performing teams        273

Example case        274

FAQ: What is a low-quality assessor rating? WARNING 0300        275

Warning detail        275

Threshold for warning of low quality assessor rating        275

Example calculations        276

Recommended action for facilitator        277

High performing teams        277

FAQ: What is a valid assessed team? WARNING 0022        278

Warning detail        278

Results not displayed to members of non-valid assessed teams        278

How many valid and invalid teams do I have?        279

Recommended action for facilitator        279

Mathematical definition        279

Example calculations        280

FAQ: What is an ‘at-risk’ team member? WARNING 0036        281

Warning detail        281

Recommended action for facilitator        282

Threshold for warning of ‘at-risk’ team member        282

Example calculation        284

Alternative approaches to identifying at-risk students        287

FAQ: What is a team with low psychological safety? WARNING 0034        291

Warning detail        291

Definitions        291

Example calculation for Team Safety        293

Recommended action for facilitator        294

FAQ: What is an unsafe team member? WARNING 0035        296

Warning detail        296

Recommended action for facilitator        297

Limitations        297

FAQ - What emails have been sent by the platform?        299

Survey notifications history        299

Track-and-trace of emails to participants        299

FAQ: What is the content of emails sent by Peer Assess Pro to Participants?        302

Preview email from Active Warnings        302

Preview all emails available for sending        302

Table of email subjects sent to participants        304

Table of email body text sent to participants, listed by Email ID and Subject        306

FAQ: What is the content of emails sent by Peer Assess Pro to Facilitators?        316

Table of email subjects sent to facilitators        317

Table of email body text sent to facilitators, listed by Email ID and Subject        318

FAQ: How do I login to my peer assessment Activity URL?        326

Activity URL        326

Participant URL        326

Successful login through Activity URL        327

FAQ: I am unable to login. My login failed        329

Investigation and remedies for login failure        329

1. You entered your ID incorrectly.        329

2. Your teacher or facilitator has entered your ID incorrectly        330

3. The Xorro Activity related to the Activity URL has not yet reached its Start Date        330

4. The Xorro Activity related to the Activity URL has been Finalised and Finished.        331

5. The Xorro Activity related to the Activity URL has been Abandoned        332

6. The institution manager for Xorro has not maintained payment of the subscription to use Xorro and/or Peer Assess Pro        332

7. An exceptional system fault has occurred with the Xorro participants database entry for your ID: duplicate identical ids        333

FAQ: Can I adjust the Start Date or Due Date for a running activity?        335

Adjusting the Start Date        335

Adjusting the Due Date        335

The good news: The Due Date is advisory only        336

Advise students of your extended deadline        336

Worst case scenario: Abandon the peer assessment        336

FAQ - I’m having problems importing my participants csv        339

What are the common problems when importing a participants file?        339

Is my Participants CSV in the correct format for launching a peer assessment activity?        341

Sample of participants csv file opened using a text editor        341

Sorted Participants CSV viewed in Google Sheets        342

Extra for experts        343

Key points!        344

Error notifications upon upload of a participants CSV to a peer assessment        344

Examples of error notifications upon upload of a participants CSV to a peer assessment        345

Good practice hint!        345

Comprehensive list of potential errors when attempting to import participants csv        346

Potential errors in a participants csv        346

About Comma-separated values (CSV)        348

FAQ - Problems editing and creating participants CSV files        350

Trouble opening csv files with Excel due to regional settings        350

Explanation        351

Workaround solution        351

Solution 1: Use a simple text editor to find and replace semicolons with commas        351

Illustration using Mac TextEdit        351

Solution 2: Adjust Language and Region to use point as decimal marker        355

Illustration using Mac OS on an Apple computer        355

FAQ - How do I fix an invalid, missing or failed email delivery? WARNING 0026        359

Corrective action: avoid these emails        359

Corrective action: use these emails        360

FAQ - How do I resolve an unsynchronised team arrangement? ACTIVE WARNING 0021        362

Corrective Action        363

What is a team arrangement?        363

Synchronise all team arrangements        364

Here there be dragons!        365

Postpone        366

An example scenario        366

One week later        367

Alert to unsynchronised team arrangement        367

Unsynchronised Team Composition        368

Teacher’s actions        369

Before synchronisation        369

Successful synchronisation of the corrected team arrangement        370

FAQ - What is the benefit of a standardized peer assessment rubric?        373

FAQ - What if a student mistakenly advises they are in an incorrect team?        375

Team membership confirmation by student        375

Mistaken team membership notification        376

CONTACT US        378

1. Login to your Xorro HOME page


1.1 First time users: Register

Register a new Xorro Teacher’s Account as a Free Facilitator

Sign up as a Free Facilitator to trial the use of Peer Assess Pro using the Xorro-Q interface:

Sign up as a Free Facilitator 

https://www.xorro.com/free_accounts/pap/new

Getting started with Xorro Q

For related information relevant to registering as a new facilitator:

Getting Started with Xorro Q

For further details contact Patrick Dodd at the offices of Peer Assess Pro.


1.2 Login from your registered Xorro Account

After you log in, your Xorro HOME Dashboard page will display, as shown in Section 1.3 Orient yourself to the Xorro HOME Dashboard.

Now proceed to follow the steps in the Quickstart Guide, or the detailed explanations in Section 2. Launch Peer Assessment Activity

Quick links and related information

VIDEO: Login and orientation 

View: Quick Start Guide

Section 1.3 Orient yourself to the Xorro HOME Dashboard

FAQ: How do I find the Peer Assess Pro Teacher’s dashboard?


1.3 Orient yourself to the Xorro HOME Dashboard

Your Xorro HOME Dashboard page shows:

Quick links and related information

VIDEO: Login and orientation 

View: Quick Start Guide

FAQ: How do I find the Peer Assess Pro Teacher’s dashboard?

Peer Assess Pro system flowchart detail: http://tinyurl.com/papChart

Each process box in the PDF version of the flowchart links directly to the specific page in this Reference Guide that explains that step in the process.


1.4 Orient yourself to the Peer Assess Pro platform

Fast video flyby

View this short video illustrating many of the features, benefits, and processes involved in using the Peer Assess Pro platform (6 minutes).

http://tinyurl.com/digitalFlyBy


Review the Peer Assess Pro facilitators dashboard

These are the key features of the Facilitator’s Dashboard, accessed through the Xorro ACTIVITIES tab once you have launched a peer assessment activity.

Overview of the steps required to launch a peer assessment


1.5 Peer Assess Pro system flowchart detail

PDF with hyperlinks: Xorro Peer Assess Pro™ Teachers Process Flowchart, http://tinyurl.com/papChart

2. Launch Peer Assessment activity


2.1 Quick start launch

Create a file containing your class list that shows every team member organised into their teams. The required file format is comma-separated values (CSV). This is your Participants CSV file. A sample of the file format is shown in Section 2.2 Create the peer assessment Participants CSV.

Use any of the following templates to adapt and create your Participants CSV file using your preferred editor.

After editing the template, remember to create a CSV file type using  SAVE AS CSV, DOWNLOAD AS CSV or EXPORT AS CSV, depending on your spreadsheet editor.

Participants CSV templates

Excel sheet

Google Sheet

CSV file

If you are a registered Xorro user, use this link to launch a new peer assessment activity. You will be presented with an option to import your Participants CSV directly.

https://qf.xorro.com/pap/launches/new

If your CSV refuses to load, or the activity fails to create, review the detailed steps in the next sections to ensure your CSV is specified correctly.

FAQ - I’m having problems importing my participants csv

Check carefully that the specifications detailed in the INSTRUCTIONS and COLUMN EXPLANATIONS presented within the sample.csv template are followed strictly.


2.2 Create the peer assessment Participants CSV

Use a spreadsheet editor, such as Google Sheets, Excel, or Numbers, to produce a file that contains columns of data with these column headers: id, first, last, email, team, and group_code. Precise INSTRUCTIONS and COLUMN EXPLANATIONS for each column are detailed below.

Alternative Participants CSV templates

Use any of these templates to adapt and create your Participants CSV (comma separated variables file) using your preferred editor. The templates contain the example data and instructions shown below.

CSV file

Excel sheet

Google Sheet

In the sample files, only the group BUS123.101/PMell/TutB/2020-05-28/SUM is a valid teamset suitable for processing by Peer Assess Pro. This is the only group that specifies membership of teams by the students in the class, the teams being Panda, Bear and Tiger.
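In script form, the group-versus-teamset distinction is easy to check: a group is a teamset when its rows carry team names. A minimal sketch (again assuming the file is saved as participants.csv; illustrative only, Xorro makes this determination itself):

```python
# Sketch: identify which group_codes in a Participants CSV are teamsets,
# that is, groups whose rows carry a team name.
import csv
from collections import defaultdict

groups = defaultdict(set)
with open("participants.csv", newline="") as f:
    for row in csv.DictReader(f):
        groups[row["group_code"]].add(row.get("team") or None)

for code, teams in groups.items():
    teams.discard(None)
    kind = f"teamset with teams {sorted(teams)}" if teams else "plain group"
    print(code, "->", kind)
```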

Sample peer assessment Participants CSV

id,first,last,email,team,group_code
ANWO08,Anna,Worth,,,ARTS123.204/WShak/2021-02-28
GRGR15,Greta,Green,,,ARTS123.204/WShak/2021-02-28
AMTO01,Amanda,Tolley,Amanda.Tolley@noreply.com,Bear,BUS123.101/PMell/TutB/2020-05-28/SUM
ANWO08,Anna,Worth,Anna.Worth@noreply.com,Bear,BUS123.101/PMell/TutB/2020-05-28/SUM
HOBR03,Holly,Brown,Holly.Brown@noreply.com,Bear,BUS123.101/PMell/TutB/2020-05-28/SUM
ALJO11,Alice,Jones,Alice.Jones@noreply.com,Panda,BUS123.101/PMell/TutB/2020-05-28/SUM
GRGR15,Greta,Green,Greta.Green@noreply.com,Panda,BUS123.101/PMell/TutB/2020-05-28/SUM
JEWA06,Jeff,Wang,Jeff.Wang@noreply.com,Panda,BUS123.101/PMell/TutB/2020-05-28/SUM
BOWI12,Bob,Wilson,Bob.Wilson@noreply.com,Tiger,BUS123.101/PMell/TutB/2020-05-28/SUM
HEJO19,Henry,Jones,Henry.Jones@noreply.com,Tiger,BUS123.101/PMell/TutB/2020-05-28/SUM
JOSM13,John,Smith,John.Smith@noreply.com,Tiger,BUS123.101/PMell/TutB/2020-05-28/SUM
THWI18,Thomas,Windsor,Thomas.Windsor@noreply.com,Tiger,BUS123.101/PMell/TutB/2020-05-28/SUM
ANWO08,Anna,Worth,Anna.Worth@noreply.com,,COMP123.201/PDod/TutA/2020-10-01
HOBR03,Holly,Brown,Holly.Brown@noreply.com,,COMP123.201/PDod/TutA/2020-10-01
JOSM13,John,Smith,John.Smith@noreply.com,,COMP123.201/PDod/TutA/2020-10-01

Instructions and column explanations for the peer assessment Participants CSV

INSTRUCTIONS

1. Organise your participants data into columns corresponding to those shown in columns A to F: the first six columns, headed 'id' through 'group_code'.

You might find it helpful to paste your data from row 17, below the sample data provided in rows 2 through 16.

The sample data provided demonstrates ten unique individuals (ids), organised into three different groups. One group contains a further three teams.

A group might comprise all members of a class, or subdivisions such as streams, cohorts, sections, or tutorial groups.

In the group called BUS123.101/PMell/TutB/2020-05-28/SUM the participants are subdivided further into three different teams, Bear, Panda and Tiger.

Only group BUS123.101/PMell/TutB/2020-05-28/SUM is a Xorro teamset suitable for a peer assessment activity.

A group is not a team. A group (such as a class) may contain several teams, in which case that's a Xorro teamset.

2. If you are preparing a separate file, ensure you use exactly the same column headers for your list as shown in row 1.

That is, 'id', 'first', 'last', 'email', 'team', 'group_code'.

These headers are not case sensitive.

The sequence of column headings is NOT IMPORTANT.

You may optionally include additional headers and columns of data. This data will be ignored by Xorro.

Data may be sorted by any of the columns.

3. Read carefully the COLUMN EXPLANATIONS, below, for each type of data.

Some data is optional, and can be skipped, as shown for group_code ARTS123.204/WShak/2021-02-28

4. Delete the sample data, immediately below the header row.

That is, delete everything between row 2 and row 16.

CRITICAL: CHECK you do not have duplicate ids in the same group_code.

CHECK you do have all the ids in your class allocated to a group, and, optionally, a team.

5. If you have used this page as your template, you may DELETE this 'instructions' column.

That is, delete anything not part of your data.

Keep the column headers. The headers must be on row 1 of your file.

6. Save (Download, Export, Save As) the file as a CSV, giving it an appropriate filename.

7. From Xorro-Q, browse to PARTICIPANTS, then upload the CSV file.

Alternatively, when you Launch a Peer Assessment Activity, you can IMPORT the CSV directly to create or update the activity.

From this sample file, three groups would be created in Xorro upon upload: ARTS…, BUS…, and COMP….

Only one of the groups is a teamset containing the three teams Bear, Panda, and Tiger.

8. COLUMN EXPLANATIONS

id - Compulsory field.

Identifier for this participant, must be unique for the entire institution.

For a peer assessment activity, this is the participant's login id.

No blanks or characters such as #@$%&*()+

first - Compulsory field.

Participant's first name

last - Compulsory field.

Participant's last name

email - Optional field.

The participant's email.

Required in practice for a peer assessment activity if you want the autogenerated warnings and notifications from Peer Assess Pro.

team - Optional field. Required for peer assessment activity.

The name of the team in which the participant is a member.

The participant can be a member of NO MORE than one team within the same group.

A participant may belong to different teams in different groups.

group_code - Optional field. Required for a peer assessment activity.

The code for the group (i.e. course, class, stream, cohort) into which the participant is being enrolled.

If the participant is in multiple groups, supply a separate line for each group in which the participant is a member.

Good practice: append to your root code, such as BUS123.101, abbreviations that indicate the teacher, the activity date (start or due), the subdivision (stream, cohort), and whether the assessment is summative or formative.

Note that Anna Worth is enrolled in three groups and in one team within group BUS123.101/PMell/TutB/2020-05-28/SUM
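If you assemble large Participants CSVs, a scripted pre-flight check can catch the CRITICAL duplicate-id mistake noted in the instructions above before you upload. A minimal sketch, assuming the lower-case headers from the template; the function name and messages are illustrative, and Xorro performs its own validation in any case:

```python
# Sketch: pre-flight checks for a Participants CSV before uploading to Xorro.
import csv
from collections import Counter

REQUIRED = {"id", "first", "last", "email", "team", "group_code"}

def check_participants(path):
    """Report missing column headers and duplicate ids within a group_code."""
    with open(path, newline="") as f:
        reader = csv.DictReader(f)
        missing = REQUIRED - set(h.lower() for h in reader.fieldnames or [])
        if missing:
            print("Missing headers:", ", ".join(sorted(missing)))
            return
        # CRITICAL check: no duplicate ids within the same group_code.
        seen = Counter((row["group_code"], row["id"]) for row in reader)
    for (group, pid), count in seen.items():
        if count > 1:
            print(f"Duplicate id {pid} in group {group} ({count} rows)")

check_participants("participants.csv")
```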

Requirements for a peer assessment Participants CSV file

Create a CSV version of your Participants CSV file

After editing the template, remember to create a CSV version of your file. Depending on your editor, the appropriate command is:

FILE… SAVE AS … TEXT CSV

FILE… DOWNLOAD AS … Comma-separated values (.csv)

FILE… EXPORT AS CSV

FILE… EXPORT TO… CSV
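Alternatively, if your roster lives in a script rather than a spreadsheet, you can write the CSV directly with Python’s standard csv module, which sidesteps the export problem entirely. A minimal sketch; the single row shown is the sample data from this guide:

```python
# Sketch: write a Participants CSV directly, instead of exporting
# from a spreadsheet editor.
import csv

rows = [
    {"id": "AMTO01", "first": "Amanda", "last": "Tolley",
     "email": "Amanda.Tolley@noreply.com", "team": "Bear",
     "group_code": "BUS123.101/PMell/TutB/2020-05-28/SUM"},
    # ... one dict per participant, one line per group membership
]

with open("participants.csv", "w", newline="") as f:
    writer = csv.DictWriter(
        f, fieldnames=["id", "first", "last", "email", "team", "group_code"])
    writer.writeheader()
    writer.writerows(rows)
```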

Why won’t Xorro load my Participants CSV file?

First, see FAQ - I’m having problems importing my participants csv

Your spreadsheet editor will typically NOT create a CSV file, unless...

Frequently, using the FILE… SAVE command in your spreadsheet editor will produce a file in an incorrect format, such as .xls, .sheet, or .numbers.

Xorro will reject those file formats. Xorro accepts and loads only .csv.

Follow this advice

FAQ: How do I create a CSV file from a Google Sheet?

Good practice hint: Create distinctive group codes for every peer assessment activity you launch

We advise creating a new, unique group_code for each Xorro Activity you create, even for repeat peer assessments within the same class term or semester.

Use a group_code like this

BT123.101/PJM/2020-03-28/FORM 

We suggest your group_code include, as in the example above, the class code, the teacher’s initials, the activity date, and whether the assessment is formative (FORM) or summative (SUM).

The resulting group_code should distinguish uniquely this semester’s mid-semester formative peer assessment(s) from last semester’s end-of-class summative assessment, where, perhaps, the same institutional class code could cover a different set of student names.

The group_code is specified in the Participants CSV file you import prior to launching a Peer Assess Pro™ Activity.

Large, multi-cohort streams in a class

In the general case, a very large class could comprise several cohorts, streams or tutorial sets, each subclass containing several teams conducting one or more peer assessment activities. Consequently, your group_code should help distinguish these separate peer assessment activities. For example,

BT123.101/PJM/TutB/2020-05-28/SUM
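A sketch of composing such a group_code from its parts; the variable names are illustrative, and the separator follows the examples in this guide:

```python
# Sketch: build a distinctive group_code for one peer assessment activity.
course, teacher, stream = "BT123.101", "PJM", "TutB"
due_date, mode = "2020-05-28", "SUM"  # SUM = summative, FORM = formative

group_code = "/".join([course, teacher, stream, due_date, mode])
print(group_code)  # BT123.101/PJM/TutB/2020-05-28/SUM
```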

Here There Be Dragons! 

Consider two teachers at the same institution teaching the same course but with different tutorial groups. If they use the same group_code, such as BT101, they will load their own team sets into the same Xorro Participants’ Group, additively, causing mutual confusion and dismay. Similarly, a teacher who reuses the same group_code from term to term, semester to semester, and year to year will experience similar grief.

Quick links and related information

FAQ - I’m having problems importing my participants csv

FAQ: How do I correct the Team Composition in a running peer assessment activity?

FAQ: How do I correct the participants (team members) in a group I uploaded?

FAQ: How do I create a CSV file from a Google Sheet?


2.3 Launch and create the peer assessment activity

In summary

Select ACTIVITIES from the top menu bar


Launch Peer Assessment

Enter the following details, in this sequence

Good practice hint: Avoid using the Xorro default Due Date

Set a realistic Due Date: your target for when you expect and want most students to have completed the peer assessment. In practice, typical Due Dates are set four to seven days after the Start Date.

The Due Date is used by Peer Assess Pro to generate automatically:

If you accept the Xorro default Due Date, which is currently NOW (the Start Date), you will not receive the benefits of the automated processes that Peer Assess Pro triggers around a practical Due Date.

The Due Date is advisory only

The Due Date is advisory only. Students can CONTINUE to submit responses beyond the Due Date UNTIL the teacher Finalises the activity. After the Finalisation Date, students have no more than two weeks to review their results.

FAQ: How do I adjust the Due Date or deadline?

The short answer is ‘You can’t adjust the Due Date!’ You don’t need to!

Initiate Create Activity

After setting the Start At and Due Dates, select Create Activity.

Point of no return!

Double check your Start Date and Due Date carefully!

Once you Create Activity, you cannot adjust the Start Date. The peer assessment survey and the email notifications to students requesting their response are created when the Start Date is reached. Furthermore, the email advises the students of the Start Date and Due Date.

Therefore, adjusting the Start Date would confuse students: the Participant Activity URLs have already been announced to them, and those URLs could become unavailable if dates were adjustable.

For a similar reason, you cannot adjust the Due Date. However, the Due Date is advisory only. Students can CONTINUE to submit responses beyond the Due Date UNTIL the teacher Finalises the activity.

FAQ: Can I adjust the Start Date or Due Date for a running activity?

In short, No!

In a ‘worst case scenario’ you can abandon the activity and launch a new activity. Review the foregoing FAQ for details on how to Abandon a running Peer Assess Pro activity.

View the Peer Assess Pro Teacher’s dashboard

Peer Assess Pro Teacher’s Dashboard

Invite team members to respond and other automated activities

When the Start Date occurs, Peer Assess Pro automates several activities:

A unique peer assessment survey is created for every team and team member


Quick links and related information

FAQ - I’m having problems importing my participants csv

FAQ: How do I correct the Team Composition in a running peer assessment activity?

FAQ: Can I adjust the Start Date or Due Date for a running activity?

FAQ: How do I view a list of the participants (team members) in the group I uploaded?

FAQ: How do students know where and when to complete the peer assessment activity then review their results?

FAQ: How do I view and experience what the students experience?

FAQ: How do I find the Peer Assess Pro Teacher’s dashboard?


2.4 Use a Teamset Group to launch a peer assessment

This is an alternative approach to launching a peer assessment activity. It is a two-stage process in which you:

From the Xorro HOME page select the PARTICIPANTS page

(Image to come)

Select ‘Import Participants’

This uploads your Participants CSV within which you have classified your students into teams, as detailed in Section 2.2 Create the peer assessment Participants CSV file

Note that multiple teamset groups may be created using this import process. This is potentially useful for managing peer assessment in large, multi-stream classes.

Browse to your Team Members Group CSV file

Load, check and confirm correct team membership, then Import

You should see a list of all the students belonging to the class for whom you wish to run the peer assessment activity.

Note: The message ‘Exists’ or ‘Conflict’ means that the id (identification) code already exists within your institution or a previous Group you have uploaded. Carry on!

Check class and team membership

At this point you cannot confirm the team membership of your class. You must first launch a peer assessment activity, selecting one of the Group Codes that existed within the original Participants CSV.

Quick links and related information

FAQ - I’m having problems importing my participants csv

FAQ: How do I view a list of the participants (team members) in the group I uploaded?

FAQ: How do I view or change the participants (team members) in a group I uploaded?

FAQ: How do I correct the Team Composition in a running peer assessment activity?

FAQ: Can I adjust the Start Date or Due Date for a running activity?

In short, No! Please check carefully your Start Date and Due Dates before you Create Activity.

3. Manage the Peer Assessment Activity


3.1 Action responses to warnings

Active Warnings show when you need to take action to remedy an issue during execution of the peer assessment activity.

In the following example, one member of Team Brazilia has completed the assessment of their four team members. Consequently, a warning is generated for Team Brazilia that the number of responses from the team is insufficient for presenting valid results. In contrast, all four team members of Team Kublas have completed the assessment.

The warnings displayed in this case are

Click through the warning to gain advice on how to remedy the situation. For example, you can remind the students to complete the survey. Emails are automatically generated and sent on your behalf to all or selected students.

Quick links and related information

FAQ: What is the content of emails sent by Peer Assess Pro?

FAQ - What emails have been sent by the platform?

Adjusting team composition

Upon commencing the peer assessment survey, team members are first asked to confirm that the team members identified for their team are correct. If not, the student initiates a request notification to the teacher to readjust the team’s membership.

Once the peer assessment activity has been launched, you can modify the team composition as per the following FAQ.

FAQ: How do I correct the Team Composition in a running peer assessment activity?

Note: Changes to a Xorro Group have NO EFFECT on current Team Composition

Changes to a Xorro Group have NO EFFECT on a currently running activity, unless you Finalise, then Abandon, the activity and re-launch a new activity with the revised Group. This is an extreme response and should not generally be required if you follow the previous FAQ.

Quick links and related information

FAQ: How do I take action on the Warnings presented in the Peer Assess Pro™ Teacher’s Dashboard?’ What if I ignore the Warnings?

3.2 Automated and manual notifications

Students who have NOT completed the survey are sent an email reminder 72 hours, 24 hours and 12 hours before the Due Date.

Similarly, if a student is required to resubmit a response because a team has been reconstituted, an automatic reminder is sent.
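Peer Assess Pro sends these reminders automatically, so no action is needed on your part; purely as illustration, here is a sketch that computes the stated reminder times for an assumed example Due Date:

```python
# Sketch: reminder schedule of 72, 24, and 12 hours before the Due Date.
from datetime import datetime, timedelta

due_date = datetime(2020, 5, 28, 17, 0)  # assumed example Due Date
for hours in (72, 24, 12):
    print(f"Reminder {hours} h before due:", due_date - timedelta(hours=hours))
```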

Quick links and related information

FAQ: What is the content of emails sent by Peer Assess Pro?

FAQ - What emails have been sent by the platform?

3.3 Enter Team Results

The Team Results for each team must be entered if you intend to select any of the Personal Result calculation methods that incorporate a team result.

After you have entered or revised your Team Results, communicate the Personal Results to your class using the Publish or Update button.

Team Results are not used to calculate:

Upon entering Team Results, the Peer Assess Pro platform automatically selects the Normalised Personal Result (NPR) method for calculating participants’ Personal Result, with a Scale Factor of 1.0.


3.4 Select the Personal Result Calculation Method

The Personal Result Calculation Method calculates the Personal Result you will award to each team member.

When you first enter Team Results, the Peer Assess Pro platform automatically selects the Normalised Personal Result (NPR) method for calculating participants’ Personal Result, with a Scale Factor of 1.0.

To adjust the Personal Result Calculation Method and/or adjust the Scale Factor

Quick links and related information

FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?

3.5 Review class, team, and individual statistics

You can explore progress and final results at the class, team, and individual level.

Review Class Results

In the Class Results, select a Bucket Range to identify the specific students lying within the range of a histogram bar chart.

Before reviewing results, see:

FAQ: When, why, and how do I ‘Recalculate Results’?

Example class statistics


In any of the tables, you may

Good practice hint: How to identify at risk students

The Individual Personal Snapshot

The Individual Personal Snapshot enables you to view all data related to one student. The Student View version of the Personal Snapshot shows exactly the report the student will receive when the teacher Publishes the results of the current Peer Assess Pro activity.

However, the teacher may wish to preview how the results will appear to students BEFORE they are Published. Consequently, there are four possible views of an Individual Personal Snapshot, all variations on the following example. The four views are explained later.

  1. Teacher’s Live View
  2. Student’s Live View
  3. Student’s Published View
  4. Teacher’s Published View


Example Individual Personal Snapshot (1 of 3)


Example Individual Personal Snapshot (2 of 3)


Example Individual Personal Snapshot (3 of 3)

Four possible views of the Individual Personal Snapshot

Note there are four possible views of an Individual Personal Snapshot.

  1. Teacher’s Live View. Shows the feedback the student would view once the current (live) results are Published or Updated. Furthermore, this snapshot includes qualitative feedback in a transparent form where the teacher can view specifically ‘who said what’ and ‘who rated who’.
  2. Student’s Live View. The view not yet made available to the student, but what the student would view once the current results are Published or Updated. This snapshot includes qualitative feedback (‘who said what’) in anonymised form, just as the student would see the report.
  3. Student’s Published View. The view you may have already Published to the student, available for their viewing. This snapshot includes qualitative feedback (‘who said what’) in anonymised form, just as the student sees the report.
  4. Teacher’s Published View. Similar to the view that is Published to the student and available for their viewing. Additionally, like the Teacher’s Live View, this snapshot shows ‘who said what’ and ‘who rated who’.

If the view is not yet Published, the student will see this remark.

Results unpublished

The same message will also be displayed if the team is not a valid assessed team, even if the results have been Published to the class as a whole.


Team Statistics

Select an individual team to probe the results of its team members. Sort by Peer Assessed Score or Index of Realistic Self Assessment. Then you can quickly review the Individual Personal Snapshot of each team member as part of your diagnosis to identify ‘star performers’, ‘at risk’ team members, and those with outlier degrees of overconfidence or underconfidence.

Example Team Statistics


Qualitative Feedback

(To come)

Teacher’s Feedback

(To come)


Advanced Statistics

There are many advanced statistics and charts you can view. Furthermore, from ‘Available Actions’ you can Download Full Statistics to conduct more detailed investigations beyond the scope of what we have conceived.

Quick links and related information

FAQ: How is the Index of Realistic Self Assessment (IRSA) calculated?


3.6 Publish provisional Personal Results to team members

Results of the peer assessment are hidden from team members until you initiate Publish Survey on the Peer Assess Pro Teacher’s dashboard.

Before Publishing, see:

FAQ: When, why, and how do I ‘Recalculate Results’?

Unpublished status


Published status

The foregoing ‘Refresh and Recalculate’ steps give you the opportunity to review the quality of results before publishing and republishing personal results and qualitative peer feedback comments. In short, as the peer assessment activity progresses towards the due date, results ARE NOT automatically updated and made available for viewing by the students.

Take care! Once an activity is Published, the results can never be unpublished. However, you may re-publish results if new responses are submitted and/or you make adjustments to Team Results, Team Composition, and so on. To reiterate, even if interim results have been published to students, results ARE NOT automatically updated as the activity continues to progress towards the due date.

Results hidden when insufficient responses

Results will be hidden from the teacher and ALL team members in teams where fewer than one-half of team members have submitted the peer assessment, because peer assessment results may not yet be valid and representative at that stage of the survey activity. For small teams, at least three team members must have submitted a response. That is, team sizes of 3, 4, 5, and 6 team members require at least three team members to have peer assessed each other. A team of 7 or 8 requires a minimum of 4 responses. Team members who have already submitted a response will ALSO be advised that their results are hidden until more of their team members have submitted responses.
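The rule above amounts to ‘at least half the team, and never fewer than three responses’. A minimal sketch that reproduces the stated thresholds; the function name is illustrative, not part of the platform:

```python
# Sketch: minimum responses required before results are shown,
# consistent with the rule described above.
from math import ceil

def min_responses(team_size: int) -> int:
    return max(3, ceil(team_size / 2))

for n in range(3, 9):
    print(f"team of {n}: at least {min_responses(n)} responses")
    # teams of 3-6 need 3 responses; teams of 7 or 8 need 4
```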

Quick links and related information

FAQ: How do students know where and when to complete the peer assessment activity then review their results?

FAQ: How do I view and experience what the students experience?


4. Finalise the peer assessment activity


4.1 Why Finalise?

Survey responses from team members are received and available for incorporation into the peer assessment activity UNTIL you explicitly Finalise the survey. Even responses submitted after the Due Date announced to students at the launch of the activity will be incorporated UNTIL the survey is deliberately Finalised by the teacher. Until Finalisation, you can ask a student to reconsider their responses; they may then resubmit.

4.2 Publish Finalised Results to students


4.3 Download Teacher’s Gradebook of Results

From the Peer Assess Pro Teacher’s Dashboard, select either

Example Gradebook Summary Statistics

Example Gradebook Full Statistics

4.4 Finalise the Activity … irrevocably!

Quick links and related information

FAQ: How do students know where and when to complete the peer assessment activity then review their results?


FREQUENTLY ASKED QUESTIONS

Quick Link Map

Everyone

For teachers

For team members

Xorro-Q help

Login to Xorro-Q

Join peer assessment activity

www.peerassesspro.com

The peer assessment survey

The purpose of peer assessment

Table of contents: Reference guide

Login and orientation

Undertake the peer assessment

FAQs on the web at http://tinyurl.com/papFAQ

Launch peer assessment activity

Use peer assessment results for better performance

Videos

Manage the peer assessment activity

Quickstart guide for teachers

Definitions, calculations, and examples

Contact us

eBook - 7-step guide


FAQs for teachers

Quickstart Guide for teachers

The peer assessment survey

Login and orientation

Launch peer assessment activity

Manage the peer assessment activity

Responding to Active Warnings

Definitions, calculations, and examples

Miscellaneous

The peer assessment survey

FAQ - What is the benefit of a standardized peer assessment rubric?

FAQ - When and how is the peer assessment conducted?

FAQ - What is the purpose of peer assessment?

FAQ - What questions are asked in the peer assessment survey?

FAQ - How do students know where and when to complete the peer assessment activity then review their results?

FAQ - How are peer assessment and personal results calculated and defined mathematically?

FAQ - Is the self-assessment used to calculate Peer Assessed Score?

Login and orientation

FAQ - Give me a quick overview of how to launch a Peer Assess Pro™ activity through Xorro 

FAQ - How do I navigate the PARTICIPANTS page for Peer Assess Pro?

FAQ - How do I view and experience what the students experience?

Launch peer assessment activity

FAQ - I’m having problems importing my participants csv

FAQ - How do I create a CSV file from a Google Sheet?   

FAQ - Can I create a peer assessment activity without having all my teams correctly identified by team name and/or team membership?

FAQ - How do I correct the participants (team members) in a group already uploaded to Xorro?

FAQ - How do students know where and when to complete the peer assessment activity then review their results? 

FAQ - Can I adjust the Start Date or Due Date for a running activity?

FAQ - What if a student mistakenly advises they are in an incorrect team?

Manage the peer assessment activity

FAQ - Can I adjust the Start Date or Due Date for a running activity?

FAQ - Why FINALISE a survey?

FAQ - How do I correct the Team Composition in a running peer assessment activity?  

FAQ - How do I resolve an unsynchronised team arrangement? (ACTIVE WARNING 0021)

FAQ - What if a student mistakenly advises they are in an incorrect team?

FAQ - What is the content of emails sent by Peer Assess Pro to Participants?

FAQ - What is a valid assessed team?

FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?

FAQ - When, why, and how do I ‘Update and Recalculate Results’?

FAQ - How do I take action on the Warnings presented in the Peer Assess Pro™ Teacher’s Dashboard?’ What if I ignore the Warnings?

FAQ - What happens if [a student tries] to 'game' (fool? play? disrupt?) the peer assessment process?

FAQ - How do I advise a student who feels they have been unfairly treated?

FAQ - What emails have been sent by the platform? (Notifications History)

Responding to Active Warnings

FAQ - How do I take action on the Warnings presented in the Peer Assess Pro™ Teacher’s Dashboard?’ What if I ignore the Warnings?

FAQ: What is an adjusted teamset request? WARNING 0006

FAQ - How do I resolve an unsynchronised team arrangement? ACTIVE WARNING 0021

FAQ -  What is the content of emails sent by Peer Assess Pro to Participants?

FAQ -  What is a valid assessed team? WARNING 0022

FAQ -  How do I fix an invalid, missing or failed email delivery? WARNING 0026

FAQ -  What is a team with low psychological safety? WARNING 0034

FAQ -  What is an unsafe team member? WARNING 0035

FAQ -  What is an ‘at risk’ team member? WARNING 0036

FAQ -  What is a mismatched self-assessment (IRSA)? WARNING 0040 

FAQ -   How is insignificant intra-team agreement identified? WARNING 0041

FAQ - How is an outlier peer assessment rating identified? WARNING 0042 

FAQ - What is an inactive team member? WARNING 0048

FAQ -  What is a low-quality team rating? WARNING 0050

FAQ -  What is a low quality assessor rating? WARNING 0300

Definitions, calculations, and examples

FAQ -  How are peer assessment and personal results calculated and defined mathematically?

FAQ - What is the benefit of a standardized peer assessment rubric?

FAQ -  How is the Peer Assessed (PA) Score calculated?

FAQ -  Is the self-assessment used to calculate Peer Assessed Score?

FAQ -  How is the Peer Assessed Index (PA Index) calculated?

FAQ -  How is the Indexed Personal Result (IPR) calculated?

FAQ -  How is the Normalised Personal Result (NPR) calculated?

FAQ -  How is the Rank Based Personal Result (RPR) calculated?

FAQ -  How is Standard Peer Assessed Score (SPAS) calculated?

FAQ -  What is Employability? How is it calculated?

FAQ -  How is the Index of Realistic Self Assessment (IRSA) calculated?

FAQ -  How do I decide which Personal Result method to apply in my peer assessment activity?

Miscellaneous

FAQ - I’m having problems importing my participants csv

FAQ -  How do I contact people at Peer Assess Pro?

FAQ -  Where may I view the most recent version of the user guides?

FAQ -  Why are different terms used to display peer assessment results in the Xorro and previous Google versions of Peer Assess Pro™?  

FAQ -  What are the design objectives, key features, and benefits of the Peer Assess Pro development? 


FAQs for team members

The purpose of peer assessment

Undertaking the peer assessment

Using peer assessment results for better performance

How peer assessment affects personal results

The purpose of peer assessment

FAQ -  What is the purpose of peer assessment?

FAQ -  How are peer assessment and personal results calculated and defined mathematically?

Undertaking the peer assessment

FAQ -  What questions are asked in the peer assessment survey?

FAQ -  I am unable to login. My login failed

FAQ -  How do I login to my peer assessment Activity URL

FAQ - What if I mistakenly advise the survey I am in an incorrect team?

FAQ -  When and how is the peer assessment conducted?

FAQ -  How do I provide useful feedback to my team members?

FAQ - How do I know where and when to complete the peer assessment activity then review my results?

FAQ -  How do I view and experience what the students experience?

FAQ -  Is the self-assessment used to calculate Peer Assessed Score?

FAQ -  What happens if I try to 'game' (fool? play? disrupt?) the peer assessment process?

Using the results from peer assessment for better performance

FAQ -  How do I interpret the feedback results I've received from the peer assessment?

FAQ -  How do I interpret measures of realistic self-assessment?

FAQ -  What steps can I take to get a better personal result?

FAQ -  Is the self-assessment used to calculate Peer Assessed Score?

FAQ -  I don’t understand what my teammates are trying to tell me. How do I ask for better feedback?

FAQ -  I believe I have been unfairly treated by the results of the peer assessment. How do I address my concern?

FAQ -  What is a team with low psychological safety?

FAQ -  What is an unsafe team member?

FAQ -  What is an ‘at risk’ team member?

FAQ -  What is Employability? How is it calculated?

How peer assessment affects personal results

FAQ -  How are peer assessment and personal results calculated and defined mathematically?

FAQ -  Is the self-assessment used to calculate Peer Assessed Score?

FAQ -  What steps can I take to get a better personal result?

FAQ -  What happens if I try to 'game' (fool? play? disrupt?) the peer assessment process?

FAQ -  I believe I have been unfairly treated by the results of the peer assessment. How do I address my concern?


FAQ: What is the purpose of peer assessment?

Defining peer assessment

Peer assessment is an educational activity in which students judge the performance of their peers, typically their teammates. Peer assessment takes several forms, including the forms described in the sections that follow.

Developmental feedback

The ability to give and receive constructive feedback is an essential skill for team members, leaders, and managers.

Consequently, your teacher has chosen to use Peer Assess Pro™ to help you provide developmental feedback to your team members, for formative and/or summative purposes.

The goal of developmental feedback is to highlight both positive aspects of performance plus areas for performance improvement. The result of feedback is to increase both individual and team performance (Carr, Herman, Keldsen, Miller, & Wakefield, 2005).

Determination of course personal result

Additionally, your teacher may use the quantitative results calculated by Peer Assess Pro™ to determine your Personal Result for the team work conducted by your team. Your Personal Result may contribute to the final (summative) assessment grade you gain for the course in which Peer Assess Pro™ is applied.

In general, your Personal Result is calculated from two factors: your Team Result and your Peer Assessed Score.

Criteria for peer assessment in Peer Assess Pro™

There are many possible criteria for assessing your contribution to your team’s work. Peer Assess Pro places equal weight on two groups of factors, Task Accomplishment and Contribution to Leadership and Team Processes, based on a well-established instrument devised by Deacon Carr, Herman, Keldsen, Miller, & Wakefield (2005):

Peer Assess Pro assesses competencies valued by employers

The selection of criteria used in Peer Assess Pro is reinforced by the results of a recent survey that asked employers to rate the importance of several competencies they expect to see in new graduates from higher education. The figure shows that teamwork, collaboration, professionalism, and oral communication rate amongst the most highly needed Career Readiness Competencies (CRCs) sought by employers. All these Career Readiness Competencies rate at least as ‘Essential’, with Teamwork and Collaboration rating almost ‘Absolutely Essential’ (National Association of Colleges and Employers, 2018).

Employers rate their essential need for Career Readiness Competencies

Source: National Association of Colleges and Employers (NACE). (2018). Figure 42, p. 33.

Quick links and related information

FAQ: What questions are asked in the peer assessment survey?

FAQ: How are peer assessment and personal results calculated and defined mathematically?

FAQ: How is the Peer Assessed (PA) Score calculated?

Video

Mellalieu, P. J. (2021, June 9). Why Peer Assessment? The key to improved group assignments. Better Feedback. Better Teams. https://www.peerassesspro.com/why-peer-assessment/


References

Deacon Carr, S., Herman, E. D., Keldsen, S. Z., Miller, J. G., & Wakefield, P. A. (2005). Peer feedback. In The Team Learning Assistant Workbook. New York: McGraw Hill Irwin.

National Association of Colleges and Employers (NACE). (2018). Job Outlook 2019. Bethlehem, PA. https://www.naceweb.org/


FAQ: When and how is the peer assessment conducted?

The best practice for conducting peer assessment in an academic course follows several stages.

  1. Introduction. The teacher introduces the team activity and related course assignments
  2. Peer assessment purpose. The teacher explains the role, purpose, and process of peer assessment
  3. The team activity commences
  4. A formative peer assessment is conducted using Peer Assess Pro™ early or mid-way through the team activity.
  5. Team members receive formative feedback generated by Peer Assess Pro™  that indicates their provisional peer assessment rating and (optionally) their indicative end of class personal result. More importantly, they receive qualitative information that provides guidance on what behaviors are required to improve their contribution towards the results the team is seeking.
  6. The team activity continues towards its conclusion. Team members confirm informally with each other that they are correctly applying the more productive behaviours identified through the formative peer assessment.
  7. The team activity concludes. 
  8. The summative peer assessment using Peer Assess Pro™ is conducted at the conclusion of the team activity and/or before the conclusion of the course.

Formative assessment: optional but valuable

The midpoint formative peer assessment is an optional element of peer assessment within the classroom. As a minimum, the formative peer assessment gives the team members experience of the Peer Assess Pro™ mechanism including the questions that will be used to conduct the final, summative peer assessment.

More importantly, the midpoint formative assessment helps ensure that team members have the opportunity to respond proactively to the peer feedback they receive immediately after the formative peer assessment concludes. Through undertaking appropriate corrective action mid-way through the course, team members have the opportunity to raise their peer assessment rating, their team’s results, and, therefore, their end of course personal results.

No surprises!

The intention of formative assessment is that, ideally, a team member should face no surprises when they receive their final personal result and peer assessment feedback at the conclusion of the course. For instance, a free-rider should receive clear feedback that the rest of their team observes they are free-riding. Consequently, the free-rider learns in a timely manner that they will be penalised at the concluding summative assessment unless they remediate their behaviour. It is equally important that an overachieving student who does most of the work is given timely feedback that they need to learn to involve and engage the other team members in the team’s planning and execution of tasks. The Peer Assess Pro™ survey specifically targets these aspects of leadership and team process contributions, so this particular style of overachieving student should be identified through the peer assessment ratings they receive.

To minimise the risk of surprises, it is important, therefore, that the peer assessment you provide to your team members at the midpoint of a team activity is


Quick links and related information

FAQ: What questions are asked in the peer assessment survey?

FAQ: How do I provide useful feedback to my team members?

FAQ: How do students know where and when to complete the peer assessment activity then review their results?

FAQ: How do I view and experience what the students experience?

FAQ: How do I interpret the feedback results I've received from the peer assessment?

FAQ: How are peer assessment and personal results calculated and defined mathematically?

FAQ: Is the self-assessment used to calculate Peer Assessed Score?


FAQ: How do I provide useful feedback to my team members?

It is essential that the feedback a team member provides to their teammates through peer assessment is:

Ohland et al. (2012) provide a table of Behaviorally Anchored Ratings covering high and low contributions to team effectiveness. The table offers guidance on how team members might give accurate, effective, and productive feedback to their teammates through peer assessment.

Examples of high and low contributions to team effectiveness

CONTRIBUTION

HIGH

  • Does more or higher quality work than expected.
  • Makes important contributions that improve the team's work.
  • Helps to complete the work of teammates who are having difficulty.
  • Completes a fair share of the team's work with acceptable quality.

LOW

  • Does not do a fair share of the team's work.
  • Delivers sloppy or incomplete work.
  • Misses deadlines. Is late, unprepared, or absent for team meetings.
  • Does not assist teammates. Quits if the work becomes difficult.

INTERACTION

HIGH

  • Asks for and shows an interest in teammates' ideas and contributions.
  • Improves communication among teammates.
  • Provides encouragement or enthusiasm to the team.
  • Asks teammates for feedback and uses their suggestions to improve.
  • Listens to teammates and respects their contributions.
  • Communicates clearly.
  • Shares information with teammates.
  • Participates fully in team activities.
  • Respects and responds to feedback from teammates.

LOW

  • Interrupts, ignores, bosses, or makes fun of teammates.
  • Takes actions that affect teammates without their input.
  • Does not share information.
  • Complains, makes excuses, or does not interact with teammates.
  • Accepts no help or advice.

KEEPING FOCUS

HIGH

  • Watches conditions affecting the team and monitors the team's progress.
  • Makes sure that teammates are making appropriate progress.
  • Gives teammates specific, timely, and constructive feedback.
  • Notices changes that influence the team's success.
  • Knows what everyone on the team should be doing and notices problems.
  • Alerts teammates or suggests solutions when the team's success is threatened.

LOW

  • Is unaware of whether the team is meeting its goals.
  • Does not pay attention to teammates' progress.
  • Avoids discussing team problems, even when they are obvious.

CAPABLE

HIGH

  • Demonstrates the knowledge, skills, and abilities to do excellent work.
  • Acquires new knowledge or skills to improve the team's performance.
  • Able to perform the role of any team member if necessary.
  • Has sufficient knowledge, skills, and abilities to contribute to the team's work.
  • Acquires knowledge or skills needed to meet requirements.
  • Able to perform some of the tasks normally done by other team members.

LOW

  • Missing basic qualifications needed to be a member of the team.
  • Unable to perform any of the duties of other team members.
  • Unable or unwilling to develop knowledge or skills to contribute to the team.

Source: Ohland et al. (2012)

Adapted by Mellalieu (2017) from Ohland, M. W., Loughry, M. L., Woehr, D. J., Bullard, L. G., Felder, R. M., Finelli, C. J., … Schmucker, D. G. (2012). APPENDIX B: Behaviorally Anchored Rating Scale (BARS) Version, from Comprehensive Assessment of Team Member Effectiveness. Academy of Management Learning & Education, 11(4), 609–630. Retrieved from http://amle.aom.org/content/11/4/609.short

Quick links and related information

FAQ: I believe I have been unfairly treated by the results of the peer assessment. How do I address my concern?

For teachers: How do I advise a student who feels they have been unfairly treated?

Symptoms of an unfair assessment

Here are some symptoms that you may have been treated unfairly by one or more teammates in their peer assessment of you:

Steps to address an unfair peer assessment

If you believe you may have been unfairly treated, these are the steps you should pursue, in this order:

  1. Ensure that you understand the results of your peer assessment, and how the results are calculated. See the Quick Links below for suggestions.
  2. Email then arrange to discuss your concerns with your teacher. Your teacher can check the peer assessments provided by your team members. For example, perhaps one or more extreme peer assessments have negatively affected your result. The teacher can explore that possibility with you and the team member concerned. You should be able to present evidence to the teacher of the work you agreed to complete, your record of attending meetings, the work you produced, and relevant communications from your team members expressing their satisfaction or otherwise with your contributions to team outputs and/or working processes.
  3. Meet with your team members to ensure you fully understand WHY they have awarded you the ratings they gave. In particular, ensure you understand accurately what you must do to achieve a better peer assessment rating in the future. In addition, when you meet your teacher in Step 2, you might request the teacher’s attendance at this meeting as a meeting observer or facilitator ‘to keep the peace’.
  4. If there remains a dispute between your self-assessment and your team’s assessment, the teacher MAY with discretion request that one or more of your team members resubmit their peer assessment. This step can only be undertaken if the peer assessment activity is not finalised.
  5. If you continue to dispute your peer assessment and/or personal result then you should pursue your institution’s policy for appealing an assessment result. This policy is usually mentioned in your course syllabus, assignment specification, programme overview, and/or learning management system.

A note on appealing a peer assessment result

An appeal against a peer assessment result is likely to fail if one or more of the following circumstances have prevailed:

Prevention is better than cure

Take these steps to avoid a mismatch between the peer assessment result you expect and the result you receive.

Quick links and related information

How peer assessment affects personal results

FAQ: How are peer assessment and personal results calculated and defined mathematically?

FAQ: Is the self-assessment used to calculate Peer Assessed Score?

FAQ: What happens if I try to 'game' (fool? play? disrupt?) the peer assessment process?

Using the results from peer assessment for better performance

FAQ: How do I interpret the feedback results I've received from the peer assessment?

FAQ: I don’t understand what my teammates are trying to tell me. How do I ask for better feedback?

FAQ: How do I interpret measures of realistic self-assessment?

FAQ: What steps can I take to get a better personal result?


FAQ: How do I interpret the feedback results I've received from the peer assessment?

(To be published)

Quick links and related information

FAQ: How do I interpret measures of realistic self-assessment?


FAQ: I don’t understand what my teammates are trying to tell me. How do I ask for better feedback?

Begin by viewing this video. Watch especially for the question that is introduced soon after minute 15 by Harvard University professor Sheila Heen.

Heen, S. (2015). How to use others’ feedback to learn and grow. TEDx. Retrieved from https://www.youtube.com/watch?v=FQNbaKkYk_Q

As Heen and Stone observe

“Feedback is less likely to set off your emotional triggers if you request it and direct it. So don’t wait until your annual performance review. Find opportunities to get bite-size pieces of coaching from a variety of people throughout the year. Don’t invite criticism with a big, unfocused question like “Do you have any feedback for me?” Make the process more manageable by asking a colleague, a boss, or a direct report,

“What’s one thing you see me doing (or failing to do) that holds me back?”

That person may name the first behavior that comes to mind or the most important one on his or her list. Either way, you’ll get concrete information and can tease out more specifics at your own pace.” (Heen & Stone, 2014)

Quick links and related information

Heen, S., & Stone, D. (2014). Find the Coaching in Criticism. Harvard Business Review, 9. Retrieved from https://medschool.duke.edu/sites/medschool.duke.edu/files/field/attachments/find-the-coaching-in-criticism.pdf


FAQ: What steps can I take to get a better personal result?

Your Personal Result is determined from a combination of your Team Result and your Peer Assessed Score. Consequently, to raise your Personal Result you need to apply balanced effort to raising both these contributing factors.

Raise your Team Result

Typically, your Team Result is earned from your team’s assignment outputs, such as a report and/or a presentation. Consequently, the grade for the Team Result is determined by the teacher, based on the rubric (marking guideline) they apply to assess your team’s outputs. Ensure you understand the assignment elements and how each will be assessed. Seek out exemplars of good practice. Pursue the guidance found in:

Mellalieu, P. (2013, March 15). Creating The A Plus Assignment: A Project Management Approach (Audio). Innovation & chaos ... in search of optimality website: http://pogus.tumblr.com/post/45403052813/this-audio-tutorial-helps-you-plan-out-the-time

Use your institution’s academic support services

In addition to your teacher and their assistant tutors, your academic institution will offer personal and group coaching to guide you on the specific success factors related to the type of assignment you are pursuing. Schedule appointments to make use of these support facilities early in your project. Locate the online resources these coaching support services have curated for your guidance.


Raise your Peer Assessed Score

Group and team projects present special challenges of coordination, motivation, communication, and leadership. These challenges are normal! Furthermore, an essential part of your job as a team member is to overcome these challenges proactively as part of your academic learning journey.

As you overcome these challenges you will achieve several benefits directly instrumental in raising your Personal Result:

You will also develop team work and leadership competencies that will both raise your future employability, and your effectiveness in future teamwork, as discussed in:

FAQ: What is the purpose of peer assessment?

How do I address proactively the challenges of team work?

Whilst there are many resources to help address the challenges of team work in academic settings, we suggest you familiarise yourself with these resources early in your team project. Since “Any fool can learn from their own mistakes. It takes genius to learn from the mistakes of others” (Einstein), be proactive rather than foolish in learning effective team working skills from:

Turner, K., Ireland, L., Krenus, B., & Pointon, L. (2011). Collaborative learning: Working in groups. In Essential Academic Skills (2nd ed., pp. 193–217).

Carr, S. D., Herman, E. D., Keldsen, S. Z., Miller, J. G., & Wakefield, P. A. (2005). The Team Learning Assistant Workbook.

Learning constructively from mid-course peer assessment feedback

Good practice peer assessment management by your teacher will provide you with two opportunities for peer assessment and peer feedback through your course, formative and summative.

Your first, mid-course, formative assessment provides you with early advice about your strengths and opportunities for development as perceived by your team members. Make use of this formative feedback at the earliest opportunity as you proceed towards the conclusion of your team work, and your final, summative peer assessment. Usually, this final, summative assessment is where you earn the significant contribution to your course grade from the Personal Result earned from your Peer Assessed Score awarded by your team members.

Consequently, take proactive action following the mid-course formative assessment through referring to:

FAQ: How do I interpret the feedback results I've received from the peer assessment?

Maybe you don’t understand or don’t agree with the feedback your teammates are providing. In that case, refer to

FAQ: I don’t understand what my teammates are trying to tell me. How do I ask for better feedback?


Quick links and related information

The purpose of peer assessment

FAQ: What is the purpose of peer assessment?

Undertaking the peer assessment

Using peer assessment results for better performance

How peer assessment affects personal results

FAQ: How are peer assessment and personal results calculated and defined mathematically?

FAQ: What steps can I take to get a better personal result?

FAQ: What happens if I try to 'game' (fool? play? disrupt?) the peer assessment process?

FAQ: Is the self-assessment used to calculate Peer Assessed Score?


FAQ: What happens if I try to 'game' (fool? play? disrupt?) the peer assessment process?

What happens if a team member attempts to 'game' the peer assessment process?

The designers of Peer Assess Pro have many decades’ experience working with students. We know the tricks that students attempt to play with peer assessment. We have anticipated those tricks, so Peer Assess Pro warns the teacher when a trick may be in play. Furthermore, the teacher receives highly specific, individualized information about each incident. The teacher may then undertake overt or covert action to address the issue to which they have been alerted. For example, the trick-playing student or team may receive a request to reconsider and resubmit their peer assessment. In more extreme incidents, the student or team may receive an invitation to visit the teacher for a counselling consultation.

The tricks we know!

Here follow a few of the ‘tricks’ that Peer Assess Pro identifies and warns the teacher about during the survey process. Examples follow later.

  1. Low quality assessor rating: An assessor may have engaged unconstructively with peer assessment. Also known as “I can’t be bothered with this. I’ll give everyone the same rating.”
  2. Low quality team rating: A team may have engaged unconstructively with peer assessment. Also known as “If we give everyone a high mark, we’ll get a better personal result, won’t we?”
  3. Outlier individual rating: A team member has assessed another team member very differently than the other team members. Also known as “This is my friend, I’ll rate her well” or “We’ve never got on well together. Now this is my chance for revenge!”
  4. Mismatched self-assessment: A team member’s self-assessment is materially different from the peer assessment given by their team. Also known as “If I give myself a high rating, I’ll get a better personal result, won’t I?”
  5. At risk team member: A team member has been rated amongst the lowest in class. Also known as “Darn! I thought I could hide under the radar.”

Examples: Highly specific and individualized information

Here are some examples of the highly specific and individualized Active Warnings a teacher receives about each incident.

1. Low quality assessor rating

Madison may have engaged unconstructively with peer assessment. High Peer Assessment Scores awarded. Average 100 and low range 3. Team Alpha

Ben may have engaged unconstructively with peer assessment. High Peer Assessment Scores awarded. Average 86 and low range 5. Team Bravo

This message warns the teacher that the team member has given everyone a near perfect Peer Assessed Score or a similar score (narrow range). Practically, from the student’s point of view they are ‘wasting their votes’. If everyone is scored with the same or similar score then students who have contributed substantially to the team’s result will not be adequately recompensed. Furthermore, if EVERY team member pursued this same approach, then every team member would be awarded the Team Result. In this case, the team member just looks stupid in the eyes of the teacher. Furthermore, the team member fails to gain practice at being a leader where giving accurate assessments of team members’ contributions is a valued management competency.
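For illustration only, here is a sketch of how such a flag could be computed from the scores an assessor awards. The threshold values are our assumptions for the sketch, not Peer Assess Pro’s published parameters.

Sketch: detecting a low quality assessor rating (Python)

def low_quality_assessor(scores, high_average=85.0, low_range=10.0):
    """Flag an assessor who awards uniformly high scores: a high average
    combined with a narrow range, as in 'Average 100 and low range 3'.
    The thresholds here are illustrative assumptions."""
    if not scores:
        return False
    average = sum(scores) / len(scores)
    spread = max(scores) - min(scores)
    return average >= high_average and spread <= low_range

# Ratings like Madison's in the example warning above: average 99, range 3.
print(low_quality_assessor([100, 100, 99, 97]))  # True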

2. Low quality team rating

Team Bravo may have engaged unconstructively with peer assessment. High Peer Assessment Scores awarded. Average 98 and low range 8.

Team Echo may have engaged unconstructively with peer assessment. High Peer Assessment Scores awarded. Average 94 and low range 10.

These messages warn the teacher that the team collectively may have arranged to give everyone a near perfect Peer Assessed Score or the same score. Practically, from the students’ point of view, this trick is a waste of time. If everyone is scored with the same score, or a perfect Peer Assessed Score of 100, then every team member will be awarded the Team Result … which is usually not 100. The team members just look stupid in the eyes of the teacher. Furthermore, they may not receive useful qualitative feedback and ratings that help guide focussed development of their future productivity in team assignments and their future professional work in teams.

3. Outlier individual rating

The warning highlights a situation where the team members appear to be inconsistent in rating high, medium and low contributors to the team’s process and results.

This example is a symptom that there may be disruptive team dynamics or bullying within the team.

Harrison Ford assessed Steven Spielberg, awarding a PA Subscore of 38. Compared with the average rating by the other team members of 70, this subscore DEPRESSED the PA Score to 64 by 7 PA Score Units. Team Alpha

This message warns the teacher that there may be favouritism between friends or allies.

Donald Trump assessed Vladimir Putin, awarding a PA Subscore of 90. Compared with the average rating by the other team members of 57, this subscore RAISED the PA Score to 64 by 7 PA Score Units. Team Charlie

4. Mismatched self-assessment

Anna’s self-assessment of 68 is OVERCONFIDENT compared with the peer assessment of 34 given by others in team Charlie. IRSA = 51

This message warns the teacher that the team member has a very much higher opinion of their performance than is evidenced by the rating provided by their peers. The teacher may request an interview with the student to explore the reasons for this divergence, and how the student can develop a more realistic self-assessment.

Alternatively, the team member may be scapegoated by the remainder of the team, and that possibility will be discussed with the team member for whom this warning is raised.

5. At risk team member

Anna has been rated amongst the lowest in class. Low Recommendation 2.3 and/or low Peer Assessment Score 34. Team Alpha

This message warns the teacher that the team member is rated very poorly compared with most of the class. It’s often a symptom of little or no attendance or contribution by the team member, which the teacher can verify by examining the qualitative feedback provided by the team members. Again, the teacher may request an interview with the student, in this case to explore the reasons behind the poor rating and how the student might remediate their contribution.

Example: Better feedback. Better teams

Examine the following teacher’s dashboard graphic revealing a real class that undertook a peer assessment.


Teachers dashboard: visible identification of teams with low quality team rating

Which teams will raise the Active Warning: Low quality team rating?

Teams 1, 14, 13, 2, 7, 5, and 11: over half of the teams in the class!

Observation: This class was poorly briefed on how to make the best use of peer assessment and feedback. With a better briefing, fewer than 10% of teams should raise this warning.

Which teams tend to have a higher team result?

The lower Team Results are associated with teams that had a low quality team rating. Apart from Team 15, all teams with an adequate quality team rating had a Team Result equal to or greater than the Class median Team Result of 73.3. For example, Team 10 (Team Result 90) through to Team 4 (Team Result 76.7) according to the sort by Range in the foregoing table.

Which teams have worked most productively as a team?

Team 10, with a Team Result of 90 is clearly a high performing team. The moderately low Range of PA Scores (10) across the team suggests IN THIS CASE that everyone contributed relatively equally and effortfully towards a great Team Result. Reminder: The Team Result is awarded by the teacher: it is independent of the Peer Assessed Scores of the team.

However, Team 3 is also a good candidate for being a fair and productive team. They engaged honestly with peer assessment, awarding a wide spread of Peer Assessed Scores (Range 18.8) and a team average PA Score of 78.3. This team average was not outrageously high, in contrast to teams 1 (100!), 14 (100), 13, 2, 7, and 5. Furthermore, Team 3 earned the class median Team Result of 73.3, which then appears to have been allocated according to the peer assessed contribution of the team members. This fair distribution is illustrated in the following graph and table: team member Charlie earned the highest Personal Result of 81, whilst Able earned 65.3. Similar reasoning applies to Team 6, to a slightly lesser degree, since its Range is not so wide.

Note from the following graph how teams 14, 5, 13, 1, 2 and 7 are again glaringly identified in the Teachers Dashboard as outlier teams poorly engaged with the peer assessment process: the low vertical spread in the graph. This low vertical spread in the Personal Result (NPR in this case) derives from the low range of Peer Assessed Scores across each team.

With this admittedly small sample, we advance the proposition that ‘better feedback leads to better teams’, and/or ‘better teams give better feedback’. In conclusion, let’s say: Better feedback. Better teams.


Teachers dashboard: a fairly productive team

Quick links and related information

FAQ: What is the purpose of peer assessment?

FAQ: How do I provide useful feedback to my team members?

FAQ: How do I interpret the feedback results I've received from the peer assessment?

FAQ: I don’t understand what my teammates are trying to tell me. How do I ask for better feedback?

FAQ: I believe I have been unfairly treated by the results of the peer assessment. How do I address my concern?

Active Warnings, thresholds parameters, and program logic

The following section explains how the teacher should respond to the Active Warnings displayed on their dashboard. The threshold parameters and program logic for raising the warnings are also provided.

Responding to Active Warnings


FAQ: Give me a quick overview of how to launch a Peer Assess Pro™ activity through Xorro

If this is your first time using Peer Assess Pro, we strongly recommend that you glance briefly at our Frequently Asked Questions so you are prepared to address your own and your students' concerns - https://www.peerassesspro.com/frequently-asked-questions-2/

Download a pdf of the Quickstart Guide and this Reference Guide here - http://tinyurl.com/papRefPdf

View a quick video overview demonstration of the whole Peer Assess Pro system

Contact

Patrick Dodd - https://www.peerassesspro.com/contact/


Quick links and related information

View the web Quickstart Guide at tinyurl.com/pdfQuickWeb

View our comprehensive online and eBook introduction Get Started with Peer Assess Pro

FAQ: How do I contact people at Peer Assess Pro?

FAQ: Where may I view the most recent version of the User Guide?

FAQ: What are the design objectives, key features, and benefits of the Peer Assess Pro development?


FAQ: What are the design objectives, key features, and benefits of the Peer Assess Pro development?

Design objectives

Our overall objectives for Peer Assess Pro™ are

Benefits for students

Benefits for teachers

Peer Assess Pro™ is a work in progress

We appreciate your participation in this pre-market release of our substantially revised Peer Assess Pro™ in conjunction with the Xorro advanced quiz and survey platform.

As we proceed through this pre-market refinement phase, we respond almost daily to your suggestions for improving both the software applications and the user documentation. These improvements are implemented at any time whilst we undergo our Beta Development phase. We anticipate that our implementations are robust enough to prevent loss of your data and wasting of your time. We crave your forgiveness if we have been over-optimistic in keeping Murphy’s Law at a distance.

Where’s the latest?

You need not take any action to use the latest versions of the Peer Assess Pro™ Xorro Teacher’s Dashboard. Those updates happen in the background and will automatically use any data and activities you have initiated. However, if you use the PDF version of this user guide, you will need to update regularly to the latest version here.

Quick links and related information

FAQ: Where may I view the most recent version of the Reference Guide?


FAQ: How do I find the Peer Assess Pro Xorro Teacher’s dashboard?

If you quit your browser and then wish to return to the Teachers Dashboard, use either of the following routes:

HOME: Running Activities

Alternative method: ACTIVITIES: Running Activities


From HOME Tab


From Activities Tab: Running Activities

Quick links and related information


FAQ: How do I navigate the PARTICIPANTS page for Peer Assess Pro?

Select PARTICIPANTS Tab

 

Note the list of ‘All participants’ currently known to Xorro in your institution.

Note a list of all other Groups uploaded by other Teachers in your institution. A group is a list of participants, such as students in a class. The minimum requirement for a Group is id, first name, last name.

However, for a peer assessment activity a Group must include team membership for all team members. This team membership is not required for most other Xorro activities. Accordingly, Groups set up for other teachers or by other teachers will rarely contain the correct team membership data required for your Peer Assess Pro™ activity.

Orientation note: Select an existing Group

Select Group ClassAM101.6. This group selection displays a list of about 25 students in the class titled AM101.6.


Inactive functions in PARTICIPANTS page

Quick links and related information

FAQ: How do I correct the participants (team members) in a group I uploaded?

FAQ: How do I correct  the Team Composition in a running peer assessment activity?


FAQ: How do I correct the team composition in a running peer assessment activity?

In a launched, running peer assessment activity, you often need to make these adjustments:

Select the context you require

LMS (Moodle, Canvas) Adjust the team composition in a running peer assessment activity on an LMS

Xorro Adjust the team composition in a running peer assessment activity on Xorro


Adjust the team composition in a running peer assessment activity on an LMS

Context: Peer Assess Pro running on an LMS (Moodle, Canvas, Blackboard)

Take care! Here there be dragons!!

Ensure you read ALL of this FAQ before proceeding.

A mistake in this process may lead to the unrecoverable loss of survey responses received to date.

Overview

In the LMS version of Peer Assess Pro, the facilitator is alerted to requests to adjust the team composition in the running activity from several Active Warnings. These Active Warnings and their associated available actions streamline the facilitator’s workflow for managing adjustments to the team arrangement.

For example, new participants may have been enrolled on the LMS since the peer assessment activity was launched. Additionally, a participant may have alerted the facilitator that team participant(s) (or themself) have been

  1. Assigned into an incorrect team
  2. Withdrawn from the course
  3. An inactive contributor to the team’s project.

The relevant Active Warnings are described in these FAQs

FAQ - What is an adjusted team arrangement request? WARNING 0006

FAQ - What is an inactive team member? WARNING 0048

FAQ - How do I resolve an unsynchronised team arrangement? ACTIVE WARNING 0021

The facilitator responds to the suggested actions presented on the Teachers Dashboard for the relevant Active Warnings to first produce a proposed team arrangement on the LMS side. The facilitator will

  1. Confirm or adjust a request by a participant that another participant(s) (or themself) should be reassigned into another existing team, (in response to WARNING 0006 and WARNING 0048). When the facilitator confirms this request Peer Assess Pro automatically reassigns the participant to the correct team on the LMS, but DOES NOT synchronise the reassignment with the running Peer Assess Pro activity
  2. Apply the standard LMS team arrangement facilities to add, remove, and reassign participants to teams (in response to WARNING 0048) and/or
  3. Apply the standard LMS team arrangement facilities to add, remove or rename teams.

The facilitator now must

  1. Review the proposed changes to the arrangement of participants and/or teams on the LMS. Depending on the LMS in use this proposed team arrangement is termed the LMS grouping, group set, or teamset
  2. Examine carefully the Team Composition shown on the Peer Assess Pro Teacher Dashboard Available Actions. The Team Composition explicitly highlights the differences in team membership that exist between the LMS (proposed arrangement) and the running Peer Assess Pro activity. Since there are differences, Active Warning 0021 Unsynchronised team arrangement will be raised on the Teacher Dashboard
  3. FINALLY, the facilitator initiates Synchronise All, the available action in Active Warning 0021 Unsynchronised team arrangement.

What happens with Synchronise All?

Before the facilitator initiates Synchronise All, the refreshed Team Composition view for the Peer Assess Pro activity will show the updated team arrangement (Step 5 above). The updated view combines the intentions just stated by the facilitator on the LMS (the proposed team arrangement) with the current status of the team arrangement within the running activity. The symbols (+, -) signal mismatches: the lack of synchronisation between the LMS and the current Peer Assess Pro activity.

Precautions before Synchronise All!

Prior to the facilitator committing to synchronisation, the facilitator’s proposed changes are clearly indicated, as the example illustrates. Crucially, this Team Composition view provides a ‘last chance’ opportunity for the facilitator to review carefully the changes they intend.

In the Team Composition view, the symbol (-) signifies that, following synchronisation, a participant would be dropped from a team, or a team deleted from the current peer assessment activity. The symbol (+) indicates that a participant or team will be added.

For example, in the following Team Composition view, once the facilitator has initiated Synchronise All, Kamryn MILLER will be reassigned from team Black Robins to team Brown Kiwis. The two teams Black Robins and Brown Kiwis will also gain the new class participants Jill ROBERTSON and Jason SMITH respectively. The team Waxeyes will be added, along with its team members Kael BRIDGES, Jonathan CHANG, and Kyleigh COHEN.
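To make the (+) and (-) notation concrete, here is a minimal sketch, entirely our own and not Peer Assess Pro’s implementation, of how a proposed LMS team arrangement could be compared against the running activity’s arrangement to produce the annotated view. The names are a cut-down version of the example below.

Sketch: diffing a proposed team arrangement against the running activity (Python)

def diff_team_arrangement(current, proposed):
    """Render a Team Composition-style view: (-) marks a participant
    or team dropped by synchronisation, (+) marks one added."""
    view = {}
    for team in sorted(current.keys() | proposed.keys()):
        was = current.get(team, set())
        will = proposed.get(team, set())
        members = [f"{m} (-)" for m in sorted(was - will)]
        members += sorted(was & will)
        members += [f"{m} (+)" for m in sorted(will - was)]
        if team not in current:
            team = f"{team} (+)"   # team will be added
        elif team not in proposed:
            team = f"{team} (-)"   # team will be deleted
        view[team] = members
    return view

current = {"Black Robins": {"Kamryn MILLER", "Alexander SAMPSON"},
           "Brown Kiwis": {"Estrella HAWKINS", "August DAUGHERTY"}}
proposed = {"Black Robins": {"Alexander SAMPSON", "Jill ROBERTSON"},
            "Brown Kiwis": {"August DAUGHERTY", "Kamryn MILLER"},
            "Waxeyes": {"Kael BRIDGES", "Jonathan CHANG"}}
for team, members in diff_team_arrangement(current, proposed).items():
    print(f"{team}: {', '.join(members)}")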

Some survey responses might be deleted!

Note that any survey responses already generated by Kamryn MILLER for team Black Robins will be irretrievably deleted following synchronisation. Similarly, Estrella HAWKINS’ responses will be irretrievably deleted, since she is dropped from team Brown Kiwis.

Team composition view prior to synchronise all

Black Robins

Kamryn MILLER (-), Alexander SAMPSON , Mikaela RAY , Ramon MCKNIGHT, Jill ROBERTSON (+)

Brown Kiwis

Estrella HAWKINS (-), August DAUGHERTY , Nehemiah MCCONNELL,  Kamryn MILLER (+), Jason SMITH (+)

Grey Warblers

Joslyn HOOVER , Alyvia YANG , Mariyah POLLARD , Arianna SCHROEDER

Pukekos

Dorian SULLIVAN, Elisha NUNEZ , Muhammad HOLT , Skylar MCCLURE

Red Rooks < Red Ruru

Alberto UNDERWOOD , Annika KLINE , June MCKINNEY , Jaylee MURRAY

Waxeyes (+)

Kael BRIDGES (+), Jonathan CHANG (+), Kyleigh COHEN (+)

Team composition following synchronise all

Black Robins

Alexander SAMPSON , Mikaela RAY , Ramon MCKNIGHT, Jill ROBERTSON

Brown Kiwis

August DAUGHERTY , Nehemiah MCCONNELL, Kamryn MILLER, Jason SMITH

Grey Warblers

Joslyn HOOVER , Alyvia YANG , Mariyah POLLARD , Arianna SCHROEDER

Pukekos

Dorian SULLIVAN, Elisha NUNEZ , Muhammad HOLT , Skylar MCCLURE

Red Rooks

Alberto UNDERWOOD , Annika KLINE , June MCKINNEY , Jaylee MURRAY

Waxeyes

Kael BRIDGES, Jonathan CHANG, Kyleigh COHEN

Survey notifications

The founding members of teams Black Robins and Brown Kiwis, as well as the relocated Kamryn MILLER, will be alerted by an automated notification from Peer Assess Pro that they must submit an updated survey response for their re-configured teams. Notification 0013: RESUBMIT peer assessment due to TEAM CHANGE.

The newly-enrolled class members in the new team Waxeyes, Kael BRIDGES, Jonathan CHANG, and Kyleigh COHEN, will be advised to submit the peer assessment survey. Notification 0011: Request to COMPLETE peer assessment.

Note that if a founding participant has not yet submitted, they will receive a notification to complete the peer assessment, updated to include the new team arrangement. Notification 0011: Request to COMPLETE peer assessment.

There will be no impact on the survey responses for the participants in teams Grey Warblers and Pukekos.

Adding orphan teams to a running activity

An orphan team is a team that has not been assigned to a teamset (grouping, group set).

Teams that are to be added to the activity must be added to the same teamset (grouping, group set) that was used to create and launch the peer assessment activity. Furthermore, ALL the teams available in the teamset must have been selected for the initial launch.

If the Peer Assess Pro activity was launched using orphan teams, then no additional teams can be added. In this case, participants can only be added to or deleted from teams used to create the initial activity.

Why was my Synchronise All action rejected?

A Synchronise All action will be rejected for several reasons when the proposed team arrangement is not feasible.

Good practice hint when creating a peer assessment activity

Before launching a Peer Assess Pro activity always

Quick links and related information

FAQ - What is an adjusted team arrangement request? WARNING 0006

FAQ - What is an inactive team member? WARNING 0048

FAQ - How do I resolve an unsynchronised team arrangement? ACTIVE WARNING 0021

LMS team arrangement facilities for Moodle

Groups—MoodleDocs. https://docs.moodle.org/400/en/Groups

Grouping users—MoodleDocs. https://docs.moodle.org/400/en/Grouping_users

Groupings—MoodleDocs. https://docs.moodle.org/400/en/Groupings

Groups versus groupings—MoodleDocs. https://docs.moodle.org/19/en/Groups_versus_groupings


Adjust the team composition in a running peer assessment activity on Xorro

Context: Peer Assess Pro running on Xorro

Take care! Here there be dragons!!

Ensure you read ALL of this FAQ before proceeding.

If you make a mistake in this process, the consequence may be the unrecoverable loss of survey responses received to date.

Key check points

View the team composition

Select the ‘Team Composition’ button for the running Peer Assessment Activity for which you wish to adjust the team composition.

Correct the team composition

During the re-import, the changes to the teamset will be presented to you so that you can check and confirm the adjustment process. Take care!

Upon completion of the re-import process, the running Peer Assess Pro Activity will continue.

All students in teams affected by a change in composition are now required to resubmit their peer assessment responses. Reason: They now have different team members to rate. The remaining teams of the class will be unaffected. Their responses remain submitted and evident within Peer Assess Pro.

Affected team members will be notified of their need to re-submit by an automatically generated email from Peer Assess Pro.

Subtle technical note

You cannot change the participants in the Xorro Group used to create the running activity, as explained in the FAQ:

FAQ: How do I correct the participants (team members) in a group already uploaded to Xorro?

Reason: whenever a Xorro activity is created, a snapshot is taken of the Group used to create the activity. From that moment this snapshot, known as a Xorro Teamset, is inextricably connected with the activity. That activity-specific teamset can be updated during a running activity only through the procedure in the FAQ detailed above, via the Team Composition section of the Peer Assess Pro dashboard.

In the image above, the Group used to create the peer assessment activity is BT101. Any changes made to that group WILL NOT affect the running activity. The teamset created from the Group BT101 is denoted 2019-02-24 BT101 by Beta Beta. That name indicates what date the teamset was created, from which Group, and by whom.


FAQ: Can I create a peer assessment activity without having all my teams correctly identified by team name and/or team membership?

You can add, swap, or delete team members at any time before launching the activity, and at any time before the peer assessment activity is finalised.

Good Practice Hint. Get your team composition list absolutely correct before the activity is launched and made available for response by your students. Reason: All students in teams affected by a change in composition will be required to resubmit their peer assessment responses. The students now have different team members to rate. However, the remaining teams of the class will be unaffected.

Quick links and related information

FAQ: How do I correct the Team Composition in a running peer assessment activity?

FAQ: How do I create a CSV file from a Google Sheet?

Good practice hint!

Confirm that the Participants CSV file you created is in the correct format. Open the Participants CSV using a text editor (Apple Textedit, or Microsoft Windows Notepad).

The format should appear as illustrated below. Note that

  1. The first row must contain the column headers id, first, last, group_code, team, email. All are REQUIRED, in any order.
  2. The data on each row is separated by the delimiter comma (,). Alternatively, a semicolon (;) may be used. The delimiter is illustrated below in the Sample of participants csv file opened using a text editor.
  3. The definitions and rules for what data may be listed within each of the subsequent rows are explained in Xorro help Importing Participants, Teams and Groups

Sample of participants csv file opened using a text editor

id,first,last,group_code,team,email,

BOWI12,Bob,Wilson,123.101,Tiger,Bob.Wilson@xorroinstitution.com,

ALJO11,Alice,Jones,123.101,Panda,Alice.Jones@xorroinstitution.com,

JOSM13,John,Smith,123.101,Tiger,John.Smith@xorroinstitution.com,

JOSM13,John,Smith,123.202,,John.Smith@xorroinstitution.com,

GRGR15,Greta,Green,123.101,Panda,Greta.Green@xorroinstitution.com,

GRGR15,Greta,Green,123.204,,,

HEJO19,Henry,Jones,123.101,Tiger,Henry.Jones@xorroinstitution.com,

AMTO01,Amanda,Tolley,123.101,Bear,Amanda.Tolley@xorroinstitution.com,

JEWA06,Jeff,Wang,123.101,Panda,Jeff.Wang@xorroinstitution.com,

HOBR03,Holly,Brown,123.101,Bear,Holly.Brown@xorroinstitution.com,

HOBR03,Holly,Brown,123.202,,Holly.Brown@xorroinstitution.com,

THWI18,Thomas,Windsor,123.101,Tiger,Thomas.Windsor@xorroinstitution.com,

ANWO08,Anna,Worth,123.101,Bear,Anna.Worth@xorroinstitution.com,

ANWO08,Anna,Worth,123.202,,Anna.Worth@xorroinstitution.com,

ANWO08,Anna,Worth,123.204,,,
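As a quick pre-import check, a short script can confirm that the required headers are present and that the delimiter is a comma or semicolon. This is an illustrative sketch only, not a Xorro tool, and the file name participants.csv is a placeholder for your own file.

Sketch: validating a Participants CSV before import (Python)

import csv

REQUIRED = {"id", "first", "last", "group_code", "team", "email"}

def validate_participants_csv(path):
    """Return a list of problems found; an empty list means the file
    looks importable. Headers may appear in any order; ',' and ';'
    are both accepted as delimiters."""
    problems = []
    with open(path, newline="", encoding="utf-8") as f:
        # Detect whether the file uses ',' or ';' as its delimiter.
        dialect = csv.Sniffer().sniff(f.read(2048), delimiters=",;")
        f.seek(0)
        reader = csv.DictReader(f, dialect=dialect)
        headers = {h.strip() for h in (reader.fieldnames or []) if h}
        missing = REQUIRED - headers
        if missing:
            problems.append(f"missing column headers: {sorted(missing)}")
        for row_number, row in enumerate(reader, start=2):
            if not (row.get("id") or "").strip():
                problems.append(f"row {row_number}: empty id")
    return problems

print(validate_participants_csv("participants.csv"))  # [] if all is well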

Quick links and related information

FAQ - I’m having problems importing my participants csv


FAQ: How do I view a demonstration version of Peer Assess Pro?

A Beta Test demonstration site has been established with these credentials:

Browse to: https://qf.staging.xorro.com/

Enter: Username BetaTest, Password Secret

This Beta Test user is established for you to view. But don’t touch too hard!

View


FAQ: How do I correct the participants (team members) in a group already uploaded to Xorro?

WARNING! HERE THERE BE DRAGONS!!

If a peer assessment activity is launched and running, then you cannot update team membership details in that running activity using the procedure described in this FAQ!!! Instead, apply the procedure described in FAQ: How do I correct the Team Composition in a running peer assessment activity?

Reason: whenever a Xorro activity is created, a snapshot is taken of the Group used when creating the activity. From that moment this snapshot, known as a Xorro Teamset, is inextricably connected with the activity. That activity-specific teamset can be updated during a running activity only through the FAQ referenced above.

Update a group’s team members for future use

  1. On the Xorro Dashboard, Select Participants.
  2. In the Groups column left, select the Group name to view a list of all students in your class (Group).

Correct the team members associated with an existing Xorro TeamSet Group

  1. Recreate the Team Members Group CSV file.
  2. Load then import the Team Members file according to Section 2, Launch Peer Assessment Activity.
  3. Note that the data loaded from the TeamSet CSV operates according to the following rules (items 4 to 7; see also the sketch after this list).
  4. If the id does not exist in the Xorro Group, then the supplied attributes (id, team, email, first name, last name) ADD a NEW record to the Group and to the institution’s All Participants records.
  5. If the id already exists in the Xorro Group, then altered attributes in the CSV, such as team, email, first name, and last name, UPDATE the EXISTING record identified by that id in the Group and All Participants records.
  6. Note that team membership is Group specific.
  7. Caution: If the existing Xorro Group contains an id that is NOT supplied in the new CSV file, then that id will remain in the Xorro Group with its original attributes, including team membership. HERE THERE BE DRAGONS! You must explicitly delete these surplus members from the Group.
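To make these rules concrete, here is a minimal Python sketch of the same add-or-update logic. It is illustrative only: the function and field names are hypothetical, and this is not the Xorro implementation.

import csv

def import_teamset(group, csv_path):
    """Apply the TeamSet CSV rules to an in-memory Group (id -> record)."""
    seen = set()
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            pid = row["id"]
            seen.add(pid)
            if pid not in group:
                group[pid] = dict(row)      # Rule 4: unknown id ADDs a NEW record
            else:
                group[pid].update(row)      # Rule 5: known id UPDATEs the record
    surplus = set(group) - seen
    if surplus:
        # Rule 7: ids absent from the CSV keep their original attributes.
        # Xorro does NOT delete them; the teacher must remove them explicitly.
        print("Surplus members to delete manually:", sorted(surplus))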

Quick links and related information

FAQ: How do I correct the Team Composition in a running peer assessment activity?


FAQ: Where may I view the most recent version of the user guides?

Quickstart Guide

Quickstart Guide for Peer Assess Pro: Xorro. (2019, March 6). Peer Assess Pro. http://tinyurl.com/pdfQuickWeb

Pdf version: http://tinyurl.com/pdfQuick

Video guides

Login and orientation. (2019). Auckland: Peer Assess Pro.

Launch a Peer Assess Pro Activity. (2019). Auckland: Peer Assess Pro.

Student survey experience. (2019). Auckland: Peer Assess Pro.

Latest reference guide

Peer Assess Pro. (2019, March 5). Manage a Peer Assessment Activity using Xorro: Reference Guide for Teachers. Auckland: Peer Assess Pro

Web version http://tinyurl.com/papRefWeb2

Pdf version http://tinyurl.com/papRefPdf

Work in progress Google DOCS development version

Google Docs version.

Feel welcome to make suggestions or ask questions using the Comment feature of the Google Docs development version. Shows work in progress improvements.

Frequently Asked Questions for teachers and team members

Frequently Asked Questions (FAQs) (2019). In Manage a Peer Assessment Activity using Xorro: Reference Guide for Teachers [web]. Auckland, New Zealand: Peer Assess Pro. http://tinyurl.com/papFAQ

Teachers Process Flowchart

Peer Assess Pro. (2019). Xorro Peer Assess Pro™ Teachers Process Flowchart: Overview and Detail. http://tinyurl.com/papChart

Quick links and related information


FAQ: How do I decide which Personal Result method to apply in my peer assessment activity

The choice of calculation method for determining a team member’s personal result is governed by how strongly the teacher prefers to reward team members who have contributed significantly to their teams, and to under-reward team members who are peer assessed as weak contributors. The figure illustrates the statistical features, such as team average, range, and standard deviation, associated with each method.

Alternative calculation methods for Personal Result (PR) illustrating effect on team average and spread for a given Team Result

The teacher can select either the Peer Assessed Score (PA Score) or Peer Assessed Index (PA Index) if they wish to exclude a team result in calculating the Personal Result (PR).

More usually, the Peer Assessed Score and Team Result (TR) are combined mathematically to produce a Personal Result. There are three alternative methods. As the figure illustrates, the Indexed Personal Result (IPR) is the least discriminating method, whilst the Rank-Based Personal Result (RPR) is the most discriminating in terms of favouring significant team contributors and penalising weak contributors. Most teachers select the Normalised Personal Result, often with a spread factor of 1.5 to 2.0.

Complementing the graphical illustration above, the following table summarises the example calculations presented in the series of FAQs that give the mathematical definition and a worked example for each method.

Comparison of Personal Results calculated by several methods in a team of four members

ASSESSEE | Bridget | Julian | Lydia | Nigella | Mean | Range
Rank Reversed | 1 | 2 | 4 | 3 | |
Peer Assessed Score, PA Score | 54 | 74 | 82 | 78 | 75 | 28
Peer Assessed Index, PA Index | 66 | 90 | 100 | 95 | 88 | 34
Team Result, TR | 50 | 50 | 50 | 50 | 50 | 0
Indexed Personal Result, IPR | 33 | 45 | 50 | 48 | 44 | 17
Normalised Personal Result, NPR (SpreadFactor = 1) | 39 | 51 | 56 | 54 | 50 | 17
Normalised Personal Result, NPR (SpreadFactor = 2) | 28 | 52 | 62 | 58 | 50 | 34
Rank-Based Personal Result, RPR | 20 | 40 | 80 | 60 | 50 | 60

Source: FAQ: How are peer assessment and personal results calculated and defined mathematically?


Definitions and features of calculation methods used in Peer Assess Pro

Attribute | Abbreviation | Definition

Peer Assessed Score

PA Score

A relative measure of the degree to which a team member has contributed to their team’s overall achievement, team processes, and leadership. The Peer Assessed Score (PA Score) is calculated for each team member directly from their Average Team Contribution (ATC) and Average Leadership Contribution (ALC). That is, from the ten components of the Team and Leadership Contribution survey in the peer assessment.

A Peer Assessed Score is generally used to compare the relative contribution of students WITHIN the same team, rather than BETWEEN teams. The Team Result has NO impact on the value of the Peer Assessed Score. Values for the PA Score range from zero through 100.

Peer Assessed Index

PA Index

The Peer Assessed Score (PA Score) is indexed upwards so that the person in the team with the highest Peer Assessed Score is awarded a Peer Assessed Index of 100. All other team members receive a proportionally lower PA Index in the ratio PA Score / max(PA Score). The Team Result has NO impact on the value of the Peer Assessed Index.

Team Result

TR

The result awarded to the team for the outputs of their work. The teacher typically derives the Team Result (TR) from grades for team reports, presentations, and results of Team Readiness Assurance Tests.

The teacher may choose to combine a student’s Peer Assessed Index (PA Index) with their team’s Team Result (TR) to calculate a Personal Result (PR) for each student, reflecting their relative contribution to the Team Result as assessed by their peer team members. Peer Assess Pro enables the teacher to select from several methods to combine the Team Result and Peer Assessed Index (PA Index) to produce a Personal Result: the Indexed Personal Result (IPR), the Normalised Personal Result (NPR), and the Rank-Based Personal Result (RPR).

Measures of a student's personal result

Personal Result

PR

A student's personal result gained from combining their Peer Assessed Index (PA Index) and, optionally, their Team Result (TR).

The teacher selects from one of several Calculation Methods to calculate the Personal Result that incorporates the Team Result. These methods are Indexed Personal Result (IPR), Normalised Personal Result (NPR), and Rank-Based Personal Result (RPR).

The choice of method is determined by the teacher’s preference for compensating more strongly students who have contributed significantly to their teams, and under-rewarding students who are peer assessed as weak contributors. Figure 1 illustrates the statistical features, such as team average, range, and standard deviation, associated with each method. The IPR is the least discriminating method, whilst the RPR is the most discriminating in terms of favouring significant team contributors and penalising weak contributors, as the figure illustrates.

Indexed Personal Result

IPR

The Indexed Personal Result is calculated from the Team Result (TR) combined with the student's specific Peer Assessed Index (PA Index). The Indexed Personal Result method awards the Team Result to the TOP RATED student in the team, since, by definition, their Peer Assessed Index is 100. All remaining students in the same team earn the Team Result downwards, directly proportional to their PA Index.

The Indexed Personal Result calculation means that NO team member can earn an Indexed Personal Result greater than the Team Result. That is, values for the Indexed Personal Result range from zero up to the Team Result.

Normalised Personal Result

NPR

The Normalised Personal Result is calculated from the Team Result combined with the student's specific Indexed Personal Result (IPR). However, in contrast to the IPR method, the Normalised Personal Result method awards the AVERAGE student in the team the Team Result (TR). All remaining students are awarded a Personal Result ABOVE or BELOW the Team Result depending on whether their IPR is above or below that team's average.

Features of the Normalised Personal Result are that (a) in contrast to the IPR method, the Normalised Personal Result method calculates a Personal Result ABOVE the Team Result for the above-average peer-rated students in the team; (b) the average of the team’s Normalised Personal Results matches the Team Result; (c) the spread of the team’s Normalised Personal Results matches the spread of the Indexed Personal Results (IPR) calculated for that team. Spread is measured by the standard deviation statistic.

Optional feature: To enhance the effect of rewarding high contributors and penalising weak contributors the tutor can increase the Spread Factor (SF) from the default value of 1.0. Increasing the Spread Factor increases the spread of the results centred around the Team Result. However, an increase in the Spread Factor will maintain a team average NPR that matches that team's Team Result. A Spread Factor of 1.5 to 2.0 is recommended, especially in classes where team members are reluctant to penalise weak contributors and/or reward the highest contributors through their peer assessment rating responses.

Values for the NPR range from zero to 100. Calculations that fall outside this range are clipped to fit within zero to 100.

Rank Based Personal Result

RPR

The Rank Based Personal Result is calculated from the Team Result combined with the student’s specific Rank Within Team based on that student’s Peer Assessed Score. Like the Normalised Personal Result, the RPR method awards the AVERAGE student in the team the Team Result. All remaining students are awarded a personal result above or below the Team Result depending on whether their Rank Within Team is above or below that team’s middle-ranked student.

Features of the Rank Based Personal Result (RPR) calculation method are that (a) a team’s RPR values are spread over a MUCH WIDER range than under the NPR and IPR methods: small differences in PA Scores within a team are amplified significantly by this method; (b) in contrast to the IPR method, the RPR method calculates a Personal Result significantly ABOVE the Team Result for the top-ranked student in the team; (c) like the NPR method, the average of the team’s RPR values matches the Team Result. Values for the Rank Based Personal Result range from zero to 100. Calculations that exceed this range are clipped to fit within zero to 100.

Note that in the Xorro version of Peer Assess Pro, we have renamed the following Personal Result Methods from those used in the Google Docs version of Peer Assess Pro.

Renaming of terms for Peer Assess Pro

Peer Assess Pro | Abbreviation | Google Peer Assess Pro | Abbreviation
Peer Assessed Score | PA Score | Team Based Learning Score | TBL Score
Peer Assessed Index | PA Index | Team Based Learning Index | TBL Index

Quick links and related information

FAQ: How are peer assessment and personal results calculated and defined mathematically?


FAQ: How are peer assessment and personal results calculated and defined mathematically?

A teacher has several alternative calculation methods to determine a personal result from a team member’s Peer Assess Pro assessment. The teacher will usually advise team members about the method they have chosen.

The teacher’s choice of calculation method for a personal result is determined by the teacher’s preference for compensating more strongly students who have contributed significantly to their teams, and under-rewarding students who are peer assessed as weak contributors.

These choices are illustrated in this figure.

A student’s Personal Result emerges from the Teacher’s choice of Calculation Method, relative Peer Assessed Score, and Team Result

Calculation methods that exclude a team result

The teacher should select either the Peer Assessed Score (PA Score) or Peer Assessed Index (PA Index) if they wish to exclude the team result in calculating the personal result. Alternatively, set the team result, TR, equal for all teams and use either the IPR, NPR, or RPR methods.

Quick links and related information

FAQ: How is the Peer Assessed (PA) Score calculated?

FAQ: How is the Peer Assessed Index (PA Index) calculated?

Calculation methods that incorporate a team result from team outputs

More usually, the Peer Assessed Score (PA Score) and team result are combined through one of three methods. The following methods are listed in order of increasing impact for compensating more strongly students who have contributed significantly to their teams, and under-rewarding students who are peer assessed as weak contributors.

FAQ: How is the Indexed Personal Result (IPR) calculated?

FAQ: How is the Normalised Personal Result (NPR) calculated?

FAQ: How is the Rank Based Personal Result (RPR) calculated?

Quick links and related information

FAQ: What factors are measured in the peer assessment survey?

FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?

FAQ: How do students know where and when to complete the peer assessment activity then review their results?

Automated communications to students

The Teachers Process Flowchart: Detail illustrates the points throughout the peer assessment process where emails are sent to students to advise them

In most cases, the emails are generated automatically by the Peer Assess Pro system. In the case of warnings, the teacher has the option of initiating an email request to a student, or ignoring that warning.

Copies of all emails are sent to the teacher whose Xorro account was used to launch the activity

Standard operating mode

When you create and launch a Peer Assess Pro™ Peer Assessment activity in Xorro AND the Start Date has been reached:

Alternative mode for student access to assessment and results

Alternatively, the teacher can direct students to the Participant URL shown at the top left of the Xorro HOME page. The student must then select from a list the correct peer assessment activity for their response. The teacher may deliver other Xorro-based test activities, from which the student must select the correct Peer Assess Pro™ activity, distinguished by the Activity Title specified by the teacher.

FAQ: How do I view and experience what the students experience?

FAQ: What questions are asked in the peer assessment survey?


FAQ: How do I view and experience what the students experience?

View your student’s personal results feedback report directly from your Teacher’s Dashboard

Click on the name of a student, and you will view the feedback report available for the student.

There are four possible views.

The student’s views are anonymised.

View your students’ experience of the Peer Assess Pro™ survey

Enter your Participants’ URL into your browser


Select the activity you wish to experience

Log in using the Identification (id) of a student in the Team List Group used to create the activity


View a survey ready and waiting for responses

The student will see this view when all of the following are TRUE:

Note that students can continue to submit responses AFTER the Due Date UNTIL the teacher has Finalised the activity.

View a sample question

View a student’s published results

The student will be able to see their Personal Results when all the following are true:

A student with a Xorro Plus account may view their results any time after the Activity is Finalised by the Teacher.

The student views

Example results for a student

View the peer assessment survey for a demonstration class

Xxx TO DO xxx

Quick links and related information

FAQ: What questions are asked in the peer assessment survey?

FAQ: How is the Peer Assessed (PA) Score calculated?

FAQ: How do students know where and when to complete the peer assessment activity then review their results?


FAQ: Why are different terms used to display peer assessment results in the Xorro and previous Google versions of Peer Assess Pro™?

The following terms used in the Google version of Peer Assess Pro have been renamed in the Xorro version of Peer Assess Pro.

Renaming of terms for Peer Assess Pro

Peer Assess Pro | Abbreviation | Google Peer Assess Pro | Abbreviation
Peer Assessed Score | PA Score | Team Based Learning Score | TBL Score
Peer Assessed Index | PA Index | Team Based Learning Index | TBL Index

Quick links and related information

FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?


FAQ: How do I take action on the Warnings presented in the Peer Assess Pro™ Teacher’s Dashboard?’ What if I ignore the Warnings?

In general, see the Sections

Action responses to warnings

Responding to Active Warnings

Warning messages are under constant development and refinement as we respond to facilitators’ and team members’ experience of Peer Assess Pro.

Critical and catastrophic warnings!

These warnings must be resolved, otherwise utterly invalid results will arise, and students’ time will be wasted completing incorrect surveys.

Example: The composition of a team needs adjusting. See Adjusting team composition.

Important warnings

Peer Assess Pro will not be able to present results for all teams unless these warnings are resolved.

Example: Insufficient responses from a team are received

See Results hidden from team members and teacher

Example: Enter Team Results:

See Enter Team Results

Informational warnings

Advisory warnings do not affect critically the operation of Peer Assess Pro. However, the teacher would be prudent to review the details to ensure that peer assessments have been conducted fairly and honestly.

Example: Overgenerous or parsimonious ratings by a team member.

FAQ: How is an outlier peer assessment rating identified? WARNING 0042 

Optional emails generated for team members

Several warnings give the facilitator the option to despatch an email to students advising them of exceptional conditions and requesting their action.

The criteria used to generate these warnings, and the recommended response by the facilitator, are detailed in this section:

Responding to Active Warnings

For example, in the case of a Mismatched self-assessment, the team member is invited to meet with the teacher to explore the reasons for the mismatch, and develop approaches to narrow the gap.

Quick links and related information

FAQ: What is the content of emails sent by Peer Assess Pro?

Responding to Active Warnings


FAQ: When, why, and how do I Refresh and Update Results?

When to recalculate

 You select Refresh and Update results when

Why recalculate?

The most important reason is that you as a teacher MUST be able to review results BEFORE displaying (publishing) results to students. After examining the results to date, you might publish an interim snapshot of the results for view by students.

Students may review the interim results and raise issues such as a questionable peer assessment rating, such as scapegoating. Alternatively, you may need to adjust a Team Result, or experiment with another method of Personal Result Calculation.

In this situation, we have presumed you do not want new responses, nor adjustments to be immediately viewable by students. In particular,  you need the opportunity to review the effect of adjustments before explicitly publishing revised results to students.

How to recalculate

Quick links and related information


FAQ: What questions are asked in the peer assessment survey?

The Peer Assess Pro survey measures one overall assessment, Recommendation, followed by ten quantitative ratings, then several qualitative questions.

The ten quantitative ratings are used to calculate the Peer Assessed Score (PA Score). The ten ratings are categorised into two classes: Contribution to Task, and Contribution to Leadership and Teamwork, as shown in the example survey below.

In addition, two qualitative questions are asked that request examples of behaviours supporting the quantitative ratings in relation to Contribution to Task, and Contribution to Leadership and Teamwork. Finally, the assessor is asked to provide Development Feedback. That is, advice that would help the team member improve their future contribution to the team.
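For readers who prefer a compact summary, the following sketch records that survey structure as a Python data structure. The field names are illustrative only, not Peer Assess Pro’s internal schema; the item names are those listed in the example survey below.

# Illustrative schema of one assessor's survey response (field names hypothetical).
SURVEY = {
    "recommendation": "How likely is it that you would recommend this team member "
                      "to a friend, colleague or employee? (1-5)",
    "task_items": [            # Contribution to Task Accomplishment (rated 1-5)
        "Initiative", "Attendance", "Contribution", "Professionalism",
        "Ideas and learning",
    ],
    "leadership_items": [      # Contribution to Leadership and Team Processes (rated 1-5)
        "Focus and task allocation", "Encourages contribution",
        "Listens and welcomes", "Conflict management and harmony", "Chairmanship",
    ],
    "qualitative": [
        "Examples supporting the Task Accomplishment ratings",
        "Examples supporting the Leadership and Team Processes ratings",
        "Development feedback",
    ],
}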

Quick links and related information

FAQ - What is the benefit of a standardized peer assessment rubric?

FAQ: How is the Peer Assessed (PA) Score calculated?

FAQ: How do students know where and when to complete the peer assessment activity then review their results?

FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?

The ten questions used as the basis for calculating the Peer Assessed Score are adapted from:

Deacon Carr, S., Herman, E. D., Keldsen, S. Z., Miller, J. G., & Wakefield, P. A. (2005). Peer feedback. In The Team Learning Assistant Workbook. New York: McGraw Hill Irwin.


Example Peer Assessment Survey: Quantitative

My name is: ________

I am rating my team member: Team Member A / Team Member B / Team Member C / Self

My Team name is: ________

Recommendation

How likely is it that you would recommend this team member to a friend, colleague or employee?

1 = Highly unlikely, 5 = Extremely likely

Contribution to Task Accomplishment

Rate the team member on a 5-point scale.

Rating scale:

1 = Almost never, 2 = Seldom, 3 = Average, 4 = Better than most, 5 = Outstanding

Rate your typical or average team member a mid-level rating of 3.

Initiative

Shows initiative by doing research and analysis.

Takes on relevant tasks with little prompting or suggestion.

Attendance

Prepares for, and attends scheduled team and class meetings.

Contribution

Makes positive contributions to meetings.

Helps the team achieve its objectives.

Professionalism

Reliably fulfils assigned tasks.

Work is of professional quality.

Ideas and learning

Contributes ideas to the team's analysis.

Helps my learning of course and team project concepts.

Contribution to Leadership and Team Processes

Focus and task allocation

Keeps team focused on priorities.

Facilitates goal setting, problem solving, and task allocation to team members.

Encourages contribution

Supports, coaches, or encourages all team members to contribute productively.

Listens and welcomes

Listens carefully and welcomes the contributions of others.

Conflict management and harmony

Manages conflict effectively.

Helps the team work in a harmonious manner.

Chairmanship

Demonstrates effective leadership for the team.

Chairs meetings productively.


Example Peer Assessment Survey: Qualitative

Peer Assessment Survey:

Feedback to the team member

Submit one copy of this form for each team member

My name is:

I am a member of Team Number and Name:

I am assessing (student’s name):

Contribution to Task Accomplishment

For the team member you have assessed, provide specific examples of productive or ineffective behaviours related to your ratings of Contribution to Task Accomplishment. For example, shows initiative; attends meetings; makes positive contributions; helps team achieve objectives; is reliable; contributes quality work; contributes to learning of course concepts. Further examples here http://tinyurl.com/BARSOhland

Contribution to Leadership and Team Processes

For the team member you have assessed, provide specific examples of productive or ineffective behaviours related to your ratings of Contribution to Leadership and Team Processes. For example: keeps team focused on priorities; supports, coaches and encourages team members; listens carefully; manages conflict effectively; demonstrates effective leadership.

Development feedback

What specific behaviours or attitudes would help your team member contribute more effectively towards your team's accomplishments, leadership, and processes? Please provide specific positive or constructive feedback that could enable the team member to improve their behaviour productively. Considering your team member's strengths, how could that person coach other team members to acquire similar strengths for Task Accomplishment, Team Processes, and Leadership?

Source: Peer Assess Pro (2019).


FAQ: Why FINALISE a survey?

Survey responses from Team Members are received and available for incorporation into the Peer Assessment activity UNTIL you explicitly Finalise the Survey. Specifically, responses submitted by students after the Due Date announced to students at the launch of the Activity will be available for incorporation UNTIL the survey is deliberately Finalised by the Teacher. Until you Finalise, you can request a student to reconsider their responses. Students can optionally resubmit their responses until you Finalise.

When you FINALISE, the current state of the Live View of Gradebook results will be Published to students for their view.

Best practice before FINALISE SURVEY


Recommended steps prior to FINALISE: LMS platform

Recommended steps prior to FINALISE: Xorro platform


Download Teacher’s Gradebook of Results

From the Peer Assess Pro Teacher’s Dashboard, select either

Example Gradebook Summary Statistics

Example Gradebook Full Statistics

Finalise the Survey … irrevocably!

Quick links and related information

FAQ: How do students know where and when to complete the peer assessment activity then review their results?

FAQ - What is an inactive team member? WARNING 0048


FAQ: How is the Peer Assessed (PA) Score calculated?

The Peer Assessed Score, PA Score, is a relative measure of the degree to which a team member has contributed to their team's overall achievement, team processes, and leadership.

A Peer Assessed Score is generally used to compare the relative contribution of students WITHIN the same team, rather than BETWEEN teams. The Team Result has NO impact on the value of the Peer Assessed Score.

The PA Score is calculated for each team member directly by summing the ten ratings of Team and Leadership Contribution surveyed in the peer assessment. The sum of ratings is adjusted by scale factors to give values for the PA Score that range from zero through 100.

The Peer Assessed Score is an essential factor used as the basis for calculating several alternative measures of Personal Result including the Peer Assessed Index (PA Index), Indexed Personal Result (IPR), Normalised Personal Result (NPR), and Rank Based Personal Result (RPR).

The self-assessment is excluded from calculating PA Score

The self-assessment conducted by a team-member is EXCLUDED from the calculation of their Peer Assessed Score. The self-assessment, PA (self), is used to enable the student to compare their self-perception with that of their team members, and the class as a whole. One method of comparison, the IRSA, is based on the ratio PA Score / PA (self), as detailed in the FAQ:

FAQ: How is the Index of Realistic Self Assessment (IRSA) calculated?


Mathematical definition of Peer Assessed Score, PA Score

There are ten Peer Rating components, r = 1 … 10, awarded by each Assessor, a, to each Assessee, s, in a team of t members. The mathematical task is to combine all these ratings into one Peer Assessed Score for each team member.

The Peer Assessed SubScore, $\mathrm{PASubScore}_{a,s}$, is defined as the peer assessment score awarded by Assessor a to Assessee s:

$$\mathrm{PASubScore}_{a,s} = 2.5 \times \sum_{r=1}^{10} \mathrm{PeerRating}_{a,s,r} - 25$$

Where

$\mathrm{PeerRating}_{a,s,r}$ = the Peer Rating for each of the ten peer assessment components, r, submitted by the Assessor a for the assessed team member, the Assessee, s. The student’s self-assessment is excluded from the calculation of the PA Score. The Recommendation rating is excluded from calculation of the PA Score.

To ensure the PA Score ranges from zero through 100, the formula incorporates these features: the scale factor 2.5 and the offset -25 map the minimum possible sum of ratings (10 x 1 = 10) to zero, and the maximum possible sum (10 x 5 = 50) to 100.

The Peer Assessed Score, $\mathrm{PAScore}_s$, for team member s is the mean of the PA SubScores awarded by the other (t - 1) team members to the team member s:

$$\mathrm{PAScore}_s = \frac{1}{t-1} \sum_{a \in t,\; a \neq s} \mathrm{PASubScore}_{a,s}$$

Where

t = the number of team members in the team in which s is a team member.

$\mathrm{PASubScore}_{a,s}$ = the peer assessment score awarded by Assessor a to Assessee s, mathematically defined earlier.

Note that Peer Assessed Score takes NO account of the team’s Team Result. The Team Result is accounted for in the Indexed Personal Result (IPR), Normalised Personal Result (NPR) and Rank-Based Personal Result (RPR) methods discussed elsewhere.

An example calculation is shown below. In the first table, the team member Bridget (ASSESSEE) is rated by her three team members (ASSESSORS), plus her own self-rating. The subsequent tables show the calculation of the Peer Assessed Score for all four team members based on all team members’ assessment ratings. The long-form calculations show the arithmetic in detail.

Quick links and related information

FAQ: What questions are asked in the peer assessment survey?

FAQ: How do students know where and when to complete the peer assessment activity then review their results?

Alternative but equivalent methods for calculating the Peer Assessed Score are detailed below in the section:

Alternative mathematical formulations of PA Score

 

Example table of assessments for assessed team member Bridget

ASSESSEE: Bridget. Team Name: Kubla.

Ratings by team member. Rating scale: 1 = Almost never, 2 = Seldom, 3 = Average, 4 = Better than most, 5 = Outstanding. Item descriptions are as given in the Example Peer Assessment Survey: Quantitative above.

Item | Bridget (Self) | Julian | Lydia | Nigella | Mean Rating
Contribution to Task Accomplishment | | | | |
Initiative | 5 | 5 | 3 | 1 | 9/3
Attendance | 4 | 4 | 4 | 1 | 9/3
Contribution | 4 | 5 | 5 | 1 | 11/3
Professionalism | 4 | 3 | 4 | 1 | 8/3
Ideas and learning | 5 | 5 | 5 | 1 | 11/3
Contribution to Leadership and Team Processes | | | | |
Focus and task allocation | 5 | 5 | 3 | 1 | 9/3
Encourages contribution | 4 | 4 | 4 | 1 | 9/3
Listens and welcomes | 5 | 5 | 3 | 1 | 9/3
Conflict management and harmony | 4 | 4 | 4 | 1 | 9/3
Chairmanship | 5 | 5 | 5 | 1 | 11/3
SubTotal = Task + Leadership | 45 | 45 | 40 | 10 | # 95/30 (3.167)
Peer Assessed Score: PA Score = (2.5 x SubTotal) - 25 | * 87.5 | 87.5 | 75 | 0 | 54.2

The Mean Rating column excludes Bridget’s self-ratings; each value is the mean of the ratings by Julian, Lydia, and Nigella.

* The self-assessment ratings are excluded from calculation of the PA Score. So, 54.2 = (87.5 + 75 + 0) / 3.

# Alternatively, PA Score = (25 x Mean Rating) - 25. So, 54.2 = (25 x 95/30) - 25 = (25 x 3.167) - 25.
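The subscore arithmetic is small enough to check in a few lines of Python. This is a sketch of the formulas above, not Peer Assess Pro’s implementation; the function name is illustrative.

def pa_subscore(ratings):
    """PA SubScore = 2.5 x sum of the ten 1-5 ratings - 25, giving 0-100."""
    assert len(ratings) == 10 and all(1 <= r <= 5 for r in ratings)
    return 2.5 * sum(ratings) - 25

# Ratings of Bridget from the table above (ten items per assessor).
julian  = [5, 4, 5, 3, 5, 5, 4, 5, 4, 5]   # subtotal 45
lydia   = [3, 4, 5, 4, 5, 3, 4, 3, 4, 5]   # subtotal 40
nigella = [1] * 10                          # subtotal 10

subscores = [pa_subscore(r) for r in (julian, lydia, nigella)]
print(subscores)                      # [87.5, 75.0, 0.0]
print(round(sum(subscores) / 3, 1))   # 54.2, Bridget's PA Score (self excluded)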


Example calculations of Peer Assessed Score

Suppose that the Peer Assessed SubScores determined from all four team members rating each other appear as follows. Bridget’s PA SubScores are copied from the previous table, forming the Bridget column here.

Since PASubScore = (2.5 x SubTotal) - 25, the assessment by Julian of Bridget yields 2.5 x 45 - 25 = 87.5. Now consider the assessment by Lydia of Bridget: 2.5 x 40 - 25 = 75.

In the previous table, note how Nigella rated Bridget with the minimum possible rating of one for all ten components. By definition, that gives a PA Score of zero. Similarly, if an assessor had rated a team member the maximum rating of five across all ten components, then a PA Score of 100 would have resulted.

Peer Assessed Sub-Scores for a team of four members

ASSESSOR \ ASSESSEE | Bridget | Julian | Lydia | Nigella
Bridget | 87.5 | 62.5 | 75 | 72.5
Julian | 87.5 | 92.5 | 87.5 | 82.5
Lydia | 75 | 82.5 | 77.5 | 80
Nigella | 0 | 77.5 | 82.5 | 82.5

(The diagonal entries are each member’s self-assessment.)


Now the PA Score for each ASSESSEE team member is calculated from the mean of the PA SubScores provided by the other ASSESSORS in their team, as shown in the following table. The self-assessments of each ASSESSOR are excluded from the calculation. For example, the PA Score for Nigella is determined as follows from the ratings by her three teammates Bridget, Julian and Lydia:

Since the PA Score is the mean of the SubScores awarded by the other three team members, then for Nigella: PA Score = (72.5 + 82.5 + 80) / 3 = 78.3.

Calculation of Peer Assessed (PA) Scores for a team of four members

ASSESSOR \ ASSESSEE | Bridget | Julian | Lydia | Nigella
Bridget | - | 62.5 | 75 | 72.5
Julian | 87.5 | - | 87.5 | 82.5
Lydia | 75 | 82.5 | - | 80
Nigella | 0 | 77.5 | 82.5 | -
Peer Assessed Score | 54.2 | 74.2 | 81.7 | 78.3
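The whole-team calculation can be reproduced with a short sketch that averages each ASSESSEE’s column while excluding self-assessments. Again, the structure and names are illustrative, not Peer Assess Pro’s code.

# Assessor -> {assessee: PA SubScore}; self-assessments omitted, as above.
subscores = {
    "Bridget": {"Julian": 62.5, "Lydia": 75.0, "Nigella": 72.5},
    "Julian":  {"Bridget": 87.5, "Lydia": 87.5, "Nigella": 82.5},
    "Lydia":   {"Bridget": 75.0, "Julian": 82.5, "Nigella": 80.0},
    "Nigella": {"Bridget": 0.0, "Julian": 77.5, "Lydia": 82.5},
}

def pa_scores(subscores):
    """PA Score of each assessee = mean SubScore from the other members."""
    members = list(subscores)
    return {
        s: round(sum(subscores[a][s] for a in members if a != s)
                 / (len(members) - 1), 1)
        for s in members
    }

print(pa_scores(subscores))
# {'Bridget': 54.2, 'Julian': 74.2, 'Lydia': 81.7, 'Nigella': 78.3}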


Note how Nigella’s rating of Bridget (PA SubScore = 0) seems an outlier when compared with the much higher ratings given by Julian and Lydia (87.5 and 75). Peer Assess Pro warns the teacher when outlier ratings like this occur.

This outlier issue is discussed in

FAQ: How is an outlier peer assessment rating identified? WARNING 0042                                         

Alternative mathematical formulations of PA Score

The following equations provide the identical mathematical result for the calculation of PA Score.

Calculation from Average Rating

$$\mathrm{PAScore}_s = \frac{100}{4} \times (\mathrm{AverageRating}_s - 1) = 25 \times \mathrm{AverageRating}_s - 25$$

Where:

$\mathrm{AverageRating}_s$ is the average rating of an assessed student s, averaged over all ten components of the rating for that student, by their team members. The Average Rating lies between 1 and 5.

The factor (-1) adjusts the Average Rating value to the range zero through 4. The scale factor 100/4 adjusts the PA Score to lie between zero and 100.

Notice from the first table showing ratings of Bridget that the average rating across all ten components contributing to her Peer Assessed Score, given by her three team members, was 95/30 = 3.167.

Therefore, the PA Score is calculated directly from the average rating: PA Score = 25 x 3.167 - 25 = 54.2.

Calculation from Average Team and Leadership Contributions

Finally,

$$\mathrm{PAScore}_s = 12.5 \times (\mathrm{ATC}_s - 1) + 12.5 \times (\mathrm{ALC}_s - 1)$$

Where:

ATC and ALC are the average ratings for the five components that comprise the Task and Leadership contributions, respectively. Mathematically:

$$\mathrm{ATC}_s = \frac{1}{5} \sum_{r \in \mathrm{Task}} \mathrm{AverageRating}_{s,r}, \qquad \mathrm{ALC}_s = \frac{1}{5} \sum_{r \in \mathrm{Leadership}} \mathrm{AverageRating}_{s,r}$$

ATC and ALC range over the values 1 through 5. The factor (-1) adjusts those values to the range zero through 4. The scale factor 50/4 (= 12.5) ensures that the PA Score achieves a range from zero to 100.

Quick links and related information

FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?

FAQ: How are peer assessment and personal results calculated and defined mathematically?

FAQ: How is the self-assessment used to calculate Peer Assessed Score?

The self-assessment conducted by a team-member when they rate their team members is EXCLUDED from the calculation of that team member’s Peer Assessed Score. Instead, their self-assessment, PA (self), is used to enable the team member to compare their self-perception with that of their team members, and the class as a whole. This comparison is provided to the team member through a Spider Chart and the calculation of their Index of Realistic Self Assessment (IRSA).

Spider chart of individual and averaged team peer ratings

The Spider Chart shows each of their eleven ratings provided by themself, compared with the average of the ratings provided to them by their peer team members. The class average ratings for each of the 11 factors are also provided.  In this example, the team member has significantly UNDERRATED themself on nearly all factors (innermost plots), when compared with the ratings provided by their team members (orange).

Spider Chart comparison of self and other team members’ contribution ratings

Index of Realistic Self-Assessment (IRSA)

Another method of comparison, the IRSA, is based on the ratio of the Peer Assessed Score to the self-assessed score, IRSA = 100 x PA Score / PA (self), as detailed in the FAQ:

FAQ: How is the Index of Realistic Self Assessment (IRSA) calculated?

For the team member illustrated in the foregoing Spider Chart, their Peer Assessed Score, PA Score, is 92 and their self-assessed score, PA (self), is 75. The ratio results in an Index of Realistic Self Assessment (IRSA) of 122 = 100 x 92 / 75.

An IRSA between 75 and 95 is typical of about ⅔ of team members in a class. About ⅙ of team members achieve an IRSA below 75. Such people appear excessively OVERCONFIDENT in their abilities when contrasted with their team members’ assessment of them. In contrast, an IRSA above 95 suggests the team member has a tendency to UNDERESTIMATE their team contribution when contrasted with the assessment perceived by their team members.
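A minimal sketch of the ratio and the interpretation bands described above follows; the function names and the wording of the labels are illustrative only.

def irsa(pa_score, pa_self):
    """Index of Realistic Self Assessment = 100 x PA Score / PA(self)."""
    return 100 * pa_score / pa_self

def interpret(value):
    # Bands as described above: <75 overconfident, >95 underestimates.
    if value < 75:
        return "tends to OVERESTIMATE own contribution"
    if value > 95:
        return "tends to UNDERESTIMATE own contribution"
    return "realistic self-assessment (typical of about 2/3 of a class)"

print(round(irsa(92, 75)))        # 123 unrounded 122.67; the example reports 122
print(interpret(irsa(92, 75)))    # tends to UNDERESTIMATE own contribution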

Quick links and related information

FAQ: How are peer assessment and personal results calculated and defined mathematically?


FAQ: How is the Peer Assessed Index (PA Index) calculated?

The Peer Assessed Index is defined such that the team member with the maximum PA Score in each team is assigned a PA Index of 100. All other team members in the same team are scaled in relation to the maximum PA Score for that team.

In a gradebook of results, the PA Index is useful for identifying the team members most highly rated by their peers, as they have PA Indexes in the 90 to 100 range. In combination with the Team Result, the PA Index is used to calculate the Indexed Personal Result, (IPR), Normalised Personal Result, (NPR) and Rank-Based Personal Result (RPR).

Mathematical definition of Peer Assessed Index

$$\mathrm{PAIndex}_{s,t} = 100 \times \frac{\mathrm{PAScore}_{s,t}}{\max_{m \in t}\left(\mathrm{PAScore}_{m,t}\right)}$$

Where

$\mathrm{PAScore}_{s,t}$ = the Peer Assessed Score for a team member s in team t, as defined in: FAQ: How is the Peer Assessed (PA) Score calculated?

$\max_{m \in t}(\mathrm{PAScore}_{m,t})$ = the maximum value of PA Score found across all members in team t.


Example calculations of Peer Assessed Index

Consider a team of four team members, whose PA Scores are shown in the following table. Lydia has a PA Score of 82, the highest for the team. Therefore, Lydia’s PA Index is 100, by definition.

Calculation of Peer Assessed Index (PA Index) for a team of four members

ASSESSOR \ ASSESSEE | Bridget | Julian | Lydia | Nigella
Bridget | - | 62.5 | 75 | 72.5
Julian | 87.5 | - | 87.5 | 82.5
Lydia | 75 | 82.5 | - | 80
Nigella | 0 | 77.5 | 82.5 | -
Peer Assessed Score | 54 | 74 | 82 | 78
Peer Assessed Index | 66 | 90 | 100 | 95

Bridget has a PA Score of 54, the lowest for the team. Therefore, since PA Index = 100 x PA Score / max(PA Score), Bridget’s PA Index = 100 x 54 / 82 = 66.

Note that, as expected, Lydia, holding the team’s maximum PA Score of 82, receives a PA Index of exactly 100.
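The scaling is a one-liner in Python; this sketch uses the rounded PA Scores from the table above and an illustrative function name.

def pa_index(pa_scores):
    """Scale each PA Score against the team maximum (top member -> 100)."""
    top = max(pa_scores.values())
    return {name: round(100 * score / top) for name, score in pa_scores.items()}

print(pa_index({"Bridget": 54, "Julian": 74, "Lydia": 82, "Nigella": 78}))
# {'Bridget': 66, 'Julian': 90, 'Lydia': 100, 'Nigella': 95}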

The data for the previous table is drawn from

FAQ: How is the Peer Assessed (PA) Score calculated?

Quick links and related information

FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?

FAQ: How is the Peer Assessed (PA) Score calculated?


FAQ: How is the Indexed Personal Result (IPR) calculated?

The Indexed Personal Result (IPR) is calculated from the Team Result (TR) combined with the team member’s specific Peer Assessed Index (PA Index). The Indexed Personal Result method awards the Team Result to the TOP RATED team member in the team, since, by definition, their Peer Assessed Index is 100. All remaining team members in the same team earn the Team Result downwards, directly proportional to their PA Index.

The definition of Indexed Personal Result means that NO team member can earn an Indexed Personal Result greater than the Team Result. That is, values for the Indexed Personal Result range from zero up to the Team Result. Consequently, the IPR disadvantages team members who have been rated unfavourably by their peers. However, no reward is made for the team member(s) who have been rated as the most contributing team members. In contrast, the Normalised Personal Result and Rank-Based Personal Result do award a Personal Result above the Team Result for those team members who contribute above average to the team’s outputs, as assessed by their peers.

Mathematical definition of Indexed Personal Result

For each team member s, in their team, t:

$$\mathrm{IPR}_{s,t} = \mathrm{TR}_t \times \frac{\mathrm{PAIndex}_{s,t}}{100}$$

Where

$\mathrm{TR}_t$ = the team result awarded by the teacher for the outputs of team t

$\mathrm{PAIndex}_{s,t}$ = the Peer Assessed Index for the team member s, as defined in

FAQ: How is the Peer Assessed Index (PA Index) calculated?

Example calculations of Indexed Personal Result

Suppose that the following team has a Team Result, TR, of 50 and Peer Assessed Indexes previously calculated as follows. The example data is taken from:

FAQ: How is the Peer Assessed Index (PA Index) calculated?

Calculation of Indexed Personal Result in a team of four members

ASSESSEE | Bridget | Julian | Lydia | Nigella
Peer Assessed Score, PA Score | 54 | 74 | 82 | 78
Peer Assessed Index, PA Index | 66 | 90 | 100 | 95
Team Result, TR | 50 | 50 | 50 | 50
Indexed Personal Result, IPR | 33 | 45 | 50 | 47.5

Bridget has a PA Index of 66, the lowest for the team. Therefore, since IPR = TR x PA Index / 100, Bridget’s IPR = 50 x 66 / 100 = 33.

In contrast, Lydia has the highest PA Score in the team, and hence a PA Index of 100. Therefore her IPR = 50 x 100 / 100 = 50.

The IPR for Lydia is equivalent to the Team Result, 50, as defined.
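A sketch of the IPR calculation, reproducing the table above (function name illustrative):

def ipr(team_result, pa_index):
    """Indexed Personal Result: the top-rated member earns the full Team Result."""
    return {name: team_result * idx / 100 for name, idx in pa_index.items()}

print(ipr(50, {"Bridget": 66, "Julian": 90, "Lydia": 100, "Nigella": 95}))
# {'Bridget': 33.0, 'Julian': 45.0, 'Lydia': 50.0, 'Nigella': 47.5}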

Quick links and related information

FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?

FAQ: How is the Peer Assessed Index (PA Index) calculated?

FAQ: How is the Peer Assessed (PA) Score calculated?


FAQ: How is the Normalised Personal Result (NPR) calculated?

The Normalised Personal Result, NPR, is calculated from the Team Result combined with the team member’s specific Indexed Personal Result (IPR). The Normalised Personal Result method awards the average student in the team the Team Result (TR). All remaining students are awarded a Personal Result above or below the Team Result depending on whether their IPR is above or below that team's average IPR.

Features of the Normalised Personal Result method are that (a) in contrast to the IPR method, it calculates a Personal Result ABOVE the Team Result for the above-average peer-rated students in the team; (b) the average of the team’s NPR values matches the Team Result; (c) the spread of the team’s NPR values reflects the spread of the team’s IPR values, scaled by the Spread Factor.

Use the Normalised Personal Result method with a high Spread Factor if you teach classes where team members are reluctant to penalise weak contributors and/or reward the highest contributors through their peer assessment ratings.

Mathematical definition of Normalised Personal Result

For each team member s, in their team, t:

$$\mathrm{NPR}_{s,t} = \mathrm{TR}_t + SF \times \left(\mathrm{IPR}_{s,t} - \overline{\mathrm{IPR}}_t\right)$$

Where

$\mathrm{TR}_t$ = the team result awarded by the teacher for the outputs of team t

$\overline{\mathrm{IPR}}_t = \frac{1}{n} \sum_{m=1}^{n} \mathrm{IPR}_{m,t}$. That is, the mean value of the IPR values found for team t, containing n team members.

$SF$ = a factor chosen optionally by the teacher that will S T R E T C H each team’s intrinsic spread of NPRs, as measured by the team’s standard deviation of NPR results. The default Spread Factor is 1.0. However, a Spread Factor of between 1.5 and 2.0 is recommended.

Values of NPR are trimmed to be within the range zero to 100.


Example calculations of Normalised Personal Result

Suppose that the following team has a Team Result, TR, of 50 and Indexed Personal Result previously calculated as follows. This first example illustrates a Spread Factor of 2.0. The example data is taken from:

FAQ: How is the Indexed Personal Result (IPR) calculated?

Calculation of Normalised Personal Result in a team of four members (SpreadFactor = 2.0)

ASSESSEE | Bridget | Julian | Lydia | Nigella | Mean
Peer Assessed Score, PA Score | 54 | 74 | 82 | 78 |
Peer Assessed Index, PA Index | 66 | 90 | 100 | 95 |
Team Result, TR | 50 | 50 | 50 | 50 |
Indexed Personal Result, IPR | 33 | 45 | 50 | 48 | 44
Correction Factor (SpreadFactor = 2) | -22 | +2 | +12 | +8 | 0
Normalised Personal Result, NPR (SpreadFactor = 2) | 28 | 52 | 62 | 58 | 50

Bridget has a PA Index of 66 and hence an IPR of 33, the lowest for the team.

The mean IPR for the four-member team is 44, calculated from ¼ x (33 + 45 + 50 + 48).

Since NPR = TR + SF x (IPR - mean IPR), then for Bridget: NPR = 50 + 2 x (33 - 44) = 28.

In contrast, the Normalised Personal Result for Lydia, with her IPR of 50, is calculated as follows: NPR = 50 + 2 x (50 - 44) = 62.

Note how Lydia’s NPR of 62 is above the Team Result of 50. Note also how the mean of the NPR values across the team is 50 = (28 + 52 + 62 + 58) / 4, identical to the Team Result of 50.
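A sketch of the NPR calculation, including the 0–100 clipping noted in the definition (function name illustrative):

def npr(team_result, ipr, spread_factor=1.0):
    """Normalised Personal Result: centre results on the Team Result,
    stretching each member's deviation from the team-mean IPR."""
    mean_ipr = sum(ipr.values()) / len(ipr)
    return {
        name: min(100, max(0, team_result + spread_factor * (v - mean_ipr)))
        for name, v in ipr.items()
    }

ipr_values = {"Bridget": 33, "Julian": 45, "Lydia": 50, "Nigella": 48}
print(npr(50, ipr_values, spread_factor=2.0))
# {'Bridget': 28.0, 'Julian': 52.0, 'Lydia': 62.0, 'Nigella': 58.0}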


Impact of adjusting the Spread Factor on Normalised Personal result

The previous example showed calculations of NPR using a Spread Factor of 2.0. The following table shows the results of calculating the Normalised Personal Result for the team using a more modest Spread Factor of 1.0.

Note the following:

The default Spread Factor is 1.0. However, a Spread Factor of between 1.5 and 2.0 is recommended.

Calculation of Normalised Personal Result in a team of four members (SpreadFactor = 1.0)

ASSESSEE | Bridget | Julian | Lydia | Nigella | Mean
Peer Assessed Score, PA Score | 54 | 74 | 82 | 78 |
Peer Assessed Index, PA Index | 66 | 90 | 100 | 95 |
Team Result, TR | 50 | 50 | 50 | 50 | 50
Indexed Personal Result, IPR | 33 | 45 | 50 | 48 | 44
Correction Factor (SpreadFactor = 1) | -11 | +1 | +6 | +4 | 0
Normalised Personal Result, NPR (SpreadFactor = 1) | 39 | 51 | 56 | 54 | 50

Quick links and related information

FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?

FAQ: How is the Peer Assessed Index (PA Index) calculated?

FAQ: How is the Peer Assessed (PA) Score calculated?

FAQ: How are peer assessment and personal results calculated and defined mathematically?


FAQ: How is the Rank-Based Personal Result (RPR) calculated?

The Rank-Based Personal Result is calculated from the Team Result combined with the student’s specific Rank Within Team based on that student’s Peer Assessed Score. Like the Normalised Personal Result, the RPR method awards the AVERAGE student in the team the Team Result. All remaining students are awarded a personal result above or below the Team Result depending on whether their Rank Within Team is above or below that team’s middle-ranked student.

The Extended version of the RPR method, due for implementation in 2023, incorporates the use of a Spread Factor.

Features of the Rank-Based Personal Result (RPR) calculation method are that (a) a team’s RPR values are spread over a MUCH WIDER range than under the NPR and IPR methods: small differences in PA Scores within a team are amplified significantly; (b) in contrast to the IPR method, the RPR method calculates a Personal Result significantly ABOVE the Team Result for the top-ranked student in the team; (c) like the NPR method, the average of the team’s RPR values matches the Team Result.

Mathematical definition of Rank-Based Personal Result

For student s in their team t with n team members:

$$\mathrm{RPR}_{s,t} = \mathrm{TR}_t \times n \times \mathrm{ShareFraction}_{s,t}$$

Where

$\mathrm{TR}_t$ = the team result awarded by the teacher for the outputs of team t

$\mathrm{ShareFraction}_{s,t} = \dfrac{\mathrm{RankAverage}_{s,t}}{\sum_{m=1}^{n} \mathrm{RankAverage}_{m,t}} = \dfrac{\mathrm{RankAverage}_{s,t}}{n(n+1)/2}$ = the fraction of the ‘pieces of cake’ allocated to team member s. The numerator is an integer, except in the case of tied ranks.

$\mathrm{RankAverage}_{s,t}$ = the rank average of the team member s in team t containing n members, where the team member with the lowest Peer Assessed Score in that team is ranked as 1, calculated using the rank.average method. Equal ranks are permitted and are calculated as the average of the rankings they would take.

n = number of members in team t

Values of RPR are trimmed to lie within the range zero to 100.

Alternative mathematical formulation

The formula defined above can be simplified for calculation purposes by substituting the formula for ShareFraction:

$$\mathrm{RPR}_{s,t} = \mathrm{TR}_t \times \frac{2 \times \mathrm{RankAverage}_{s,t}}{n+1}$$

Example calculations of Rank-Based Personal Result

Suppose that the following team with n = 4 team members has a Team Result, TR, of 50 and Peer Assessed Scores previously calculated. The example data is taken from:

FAQ: How is the Peer Assessed (PA) Score calculated?

Calculation of Rank-Based Personal Result in a team of four members

ASSESSEE | Bridget | Julian | Lydia | Nigella | Mean
Peer Assessed Score, PA Score | 54 | 74 | 82 | 78 |
Rank Average | 1 | 2 | 4 | 3 |
Share Fraction | 1/10 | 2/10 | 4/10 | 3/10 |
Team Result, TR | 50 | 50 | 50 | 50 | 50
Rank-Based Personal Result, RPR | 20 | 40 | 80 | 60 | 50

Observe how there are ten pieces of cake to be allocated, calculated from the sum of the n = 4 ranks:

$$\sum_{m=1}^{4} \mathrm{RankAverage}_{m} = 1 + 2 + 3 + 4 = \frac{n(n+1)}{2} = 10$$

Thus, the poorest-ranked team member, Bridget, gets one piece, a 1/10 share fraction. In contrast, the best-ranked team member, Lydia, gains four pieces, a 4/10 share fraction.

Bridget has a PA Score of 54, the lowest for the team. Her rank in the team is therefore 1, and her RPR = 50 x 4 x 1/10 = 20.

Note how Julian, ranked 2nd from the lowest, receives double the ShareFraction and, consequently, double the RPR of Bridget: RPR = 50 x 4 x 2/10 = 40. Applying the alternative formula, RPR = 50 x 2 x 2 / (4 + 1) = 40.

Lydia, the top-ranked student, with RankAverage = 4, receives four times the RPR that Bridget received: RPR = 50 x 4 x 4/10 = 80.

Observe how the mean of the RPR values matches the Team Result for team t of 50: (20 + 40 + 80 + 60) / 4 = 200 / 4 = 50.

Observe that, by definition, the sum of the ShareFractions across the team is exactly 10/10 = 100%. All ten pieces of cake are allocated across the team, and each person gets one more piece than the next poorer-ranked student (except in the case of ties).
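The rank-average method, including tie handling, is easy to reproduce in a short sketch. This mirrors the definitions above; the function names are illustrative and this is not Peer Assess Pro’s code.

def rank_average(scores):
    """Ascending average ranks: lowest score ranks 1; ties share the mean rank."""
    ordered = sorted(scores.values())
    return {
        name: sum(i + 1 for i, v in enumerate(ordered) if v == score)
              / ordered.count(score)
        for name, score in scores.items()
    }

def rpr(team_result, scores):
    """Rank-Based Personal Result: TR x n x ShareFraction, clipped to 0-100."""
    n = len(scores)
    total = n * (n + 1) / 2          # sum of ranks: the 'pieces of cake'
    return {
        name: min(100, max(0, team_result * n * r / total))
        for name, r in rank_average(scores).items()
    }

print(rpr(50, {"Bridget": 54, "Julian": 74, "Lydia": 82, "Nigella": 78}))
# {'Bridget': 20.0, 'Julian': 40.0, 'Lydia': 80.0, 'Nigella': 60.0}
print(rpr(50, {"Bridget": 54, "Julian": 74, "Lydia": 82, "Nigella": 74}))
# Tied case: {'Bridget': 20.0, 'Julian': 50.0, 'Lydia': 80.0, 'Nigella': 50.0}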


Example calculation with tied ranks

The following example shows a case where two team members, Julian and Nigella, have the same Peer Assessed Score of 74 and therefore share the rank average 2.5 = (2 + 3) / 2. The Google Sheets RANK.AVG function, with its is_ascending flag set to TRUE so that the lowest score ranks 1, delivers this ranking behaviour.

Calculation of Rank-Based Personal Result with tied scores

ASSESSEE | Bridget | Julian | Lydia | Nigella | Mean
Peer Assessed Score, PA Score | 54 | 74 = | 82 | 74 = |
Rank Average | 1 | 2.5 = | 4 | 2.5 = |
Share Fraction | 1/10 | 2.5/10 | 4/10 | 2.5/10 |
Team Result, TR | 50 | 50 | 50 | 50 | 50
Rank-Based Personal Result, RPR | 20 | 50 | 80 | 50 | 50

Adjusting the range using a spread factor

In a manner similar to the use of the Spread Factor for the Normalised Personal Result method, NPR, a Spread Factor can be applied in the calculation of the Rank-Based Personal Result.

A SpreadFactor of 2.0 provides the same ‘natural’ values of RPR as determined by the earlier formula. The default Spread Factor is 1.0, which reduces the ‘natural’ spread given by the earlier definition, and brings the spread of RPR values into line with the values typically found using the Normalised Personal Result method with its default SpreadFactor of 1. A Spread Factor of between 1.5 and 2.0 is recommended. Consistent with the worked values in the table below, the adjusted result is $\mathrm{RPR}^{SF}_{s,t} = \mathrm{TR}_t + \frac{SF}{2} \times (\mathrm{RPR}_{s,t} - \mathrm{TR}_t)$, where $\mathrm{RPR}_{s,t}$ is the ‘natural’ value. The following example shows the calculations for RPR with Spread Factors of 2.0, 1.0, and 1.5.

Calculation of Rank-Based Personal Result with several spread factors

ASSESSEE | Bridget | Julian | Lydia | Nigella | Mean | Range
Peer Assessed Score, PA Score | 54 | 74 = | 82 | 74 = | | 28
Rank Average | 1 | 2.5 = | 4 | 2.5 = | |
Share Fraction | 1/10 | 2.5/10 | 4/10 | 2.5/10 | |
Team Result, TR | 50 | 50 | 50 | 50 | 50 |
Rank-Based Personal Result, RPR (SpreadFactor = 2.0) | 20 | 50 | 80 | 50 | 50 | 60
Rank-Based Personal Result, RPR (SpreadFactor = 1.0) | 35 | 50 | 65 | 50 | 50 | 30
Rank-Based Personal Result, RPR (SpreadFactor = 1.5) | 27.5 | 50 | 72.5 | 50 | 50 | 45

Observe from the table how increasing the Spread Factor widens the range of RPR values around the team mean, which remains equal to the Team Result of 50.

Example calculation with spread factor

Consider Lydia, ranked best in team, with a Spread Factor of 1.5. Her ‘natural’ RPR (SpreadFactor = 2.0) is 80, so RPR = 50 + (1.5 / 2) x (80 - 50) = 50 + 22.5 = 72.5.
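The spread-factor adjustment inferred from the worked table above can be sketched as follows; a SpreadFactor of 2.0 reproduces the natural values exactly. Function and variable names are illustrative.

def rpr_with_spread(team_result, natural_rpr, spread_factor=1.0):
    """Shrink or stretch the 'natural' RPR deviations around the Team Result."""
    return {
        name: min(100, max(0, team_result + spread_factor / 2 * (v - team_result)))
        for name, v in natural_rpr.items()
    }

natural = {"Bridget": 20, "Julian": 50, "Lydia": 80, "Nigella": 50}
for sf in (2.0, 1.0, 1.5):
    print(sf, rpr_with_spread(50, natural, sf))
# 2.0 -> 20, 50, 80, 50;  1.0 -> 35, 50, 65, 50;  1.5 -> 27.5, 50, 72.5, 50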

Quick links and related information

FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?

FAQ: How is the Peer Assessed Index (PA Index) calculated?

FAQ: How is the Normalised Personal Result (NPR) calculated?

FAQ: How are peer assessment and personal results calculated and defined mathematically?


FAQ: How is the Rank Based Personal Result (RPR) calculated (Pre-2022)?

From late-2022 this method of calculation will be superseded by

FAQ: How is the Rank Based Personal Result (RPR) calculated?

The Rank Based Personal Result is calculated from the Team Result combined with the student’s specific Rank Within Team based on that student’s Peer Assessed Score. Like the Normalised Personal Result, the RPR method awards the AVERAGE student in the team the Team Result. All remaining students are awarded a personal result above or below the Team Result depending on whether their Rank Within Team is above or below that team’s middle-ranked student.

Features of the Rank Based Personal Result (RPR) calculation method are that (a) a team’s RPR values are spread over a MUCH WIDER range than under the NPR and IPR methods; (b) in contrast to the IPR method, the RPR method calculates a Personal Result significantly ABOVE the Team Result for the top-ranked student in the team; (c) like the NPR method, the average of the team’s RPR values matches the Team Result.

Mathematical definition of Rank-Based Personal Result

For student s in their team t with n team members:

$$\mathrm{RPR}_{s,t} = \mathrm{TR}_t \times n \times \mathrm{ShareFraction}_{s,t}, \qquad \mathrm{ShareFraction}_{s,t} = \frac{\mathrm{Rank}_{s,t}}{\sum_{m=1}^{n} \mathrm{Rank}_{m,t}}$$

Where

$\mathrm{TR}_t$ = the team result awarded by the teacher for the outputs of team t

$\mathrm{Rank}_{s,t}$ = the reversed rank of the team member s in team t, where the team member with the lowest Peer Assessed Score in that team is defined as 1. Equal ranks are permitted.

n = number of members in team t

Values of RPR are trimmed to be within the range zero to 100.


Example calculations of Rank-Based Personal Result

Suppose that the following team has a Team Result, TR, of 50 and Peer Assessed Scores previously calculated as follows. The example data is taken from:

FAQ: How is the Peer Assessed (PA) Score calculated?

Calculation of Rank-Based Personal Result in a team of four members

ASSESSEE | Bridget | Julian | Lydia | Nigella | Mean
Peer Assessed Score, PA Score | 54 | 74 | 82 | 78 |
Rank (Reversed) | 1 | 2 | 4 | 3 |
Share Fraction | 1/10 | 2/10 | 4/10 | 3/10 |
Team Result, TR | 50 | 50 | 50 | 50 | 50
Rank-Based Personal Result, RPR | 20 | 40 | 80 | 60 | 50

First calculate the sum of ranks for the team of four members, n = 4. This number is the denominator for calculating the ShareFraction for each team member: 1 + 2 + 3 + 4 = 10.

Consequently, there are 10 ‘pieces of cake’ to be shared amongst the 4 team members, in proportion to their reversed rank.

Bridget has a PA Score of 54, the lowest for the team. Her rank in the team is therefore 1, and her RPR = 50 x 4 x 1/10 = 20.

Note how the second-ranked student, Julian, receives double the ShareFraction and, consequently, double the RPR of Bridget: RPR = 50 x 4 x 2/10 = 40.

Lydia, the top-ranked student, with reversed rank 4, receives four times the RPR that Bridget received: RPR = 50 x 4 x 4/10 = 80.

Note how the mean of the RPR values matches the Team Result for team t of 50: (20 + 40 + 80 + 60) / 4 = 50.

Note that, by definition, the sum of the ShareFractions across the team is exactly 10/10 = 100%.

Example calculation with tied ranks

The following example shows a case where two team members have the same Peer Assessed Score of 74. Note how Lydia has a reverse rank of 4, not 3. The Google Sheets RANK function, with the optional is_ascending flag set to 1, demonstrates this ranking behaviour.

Calculation of Rank-Based Personal Result with tied scores

ASSESSEE | Bridget | Julian | Lydia | Nigella | Mean
Peer Assessed Score, PA Score | 54 | 74 | 82 | 74 |
Rank (Reversed) | 1 | 2 = | 4 | 2 = |
Share Fraction | 1/9 | 2/9 | 4/9 | 2/9 |
Team Result, TR | 50 | 50 | 50 | 50 | 50
Rank-Based Personal Result, RPR | 22 | 44 | 89 | 44 | 50
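For comparison with the current method, this sketch reproduces the superseded pre-2022 calculation, in which ties share the lower competition rank and the ShareFraction denominator is the actual sum of ranks (9 in the tied example). The function names are illustrative.

def reversed_rank(scores):
    """Pre-2022 ranking: ascending competition ranks; ties share the LOWER rank
    (54, 74, 74, 82 -> 1, 2, 2, 4), like Google Sheets RANK with is_ascending=1."""
    ordered = sorted(scores.values())
    return {name: ordered.index(score) + 1 for name, score in scores.items()}

def rpr_pre2022(team_result, scores):
    """RPR = TR x n x rank / (sum of ranks), clipped to 0-100."""
    ranks = reversed_rank(scores)
    total = sum(ranks.values())      # 9 in the tied example, not 10
    n = len(scores)
    return {name: min(100, max(0, round(team_result * n * r / total)))
            for name, r in ranks.items()}

print(rpr_pre2022(50, {"Bridget": 54, "Julian": 74, "Lydia": 82, "Nigella": 74}))
# {'Bridget': 22, 'Julian': 44, 'Lydia': 89, 'Nigella': 44}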

Quick links and related information

From mid-2022, to be superseded by

FAQ: How is the Rank Based Personal Result (RPR) calculated?

FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?

FAQ: How is the Peer Assessed Index (PA Index) calculated?

FAQ: How are peer assessment and personal results calculated and defined mathematically?

FAQ: How is Standard Peer Assessed Score (SPAS) calculated?

How do we compare students within a class, and between classes based on their Peer Assessed Scores?

The short answer is: We can use the Peer Assessed Score to compare students ONLY within their team. A PA Score above that specific team’s average PA Score suggests that team member has contributed more than a team member with a lower PA score.

A Peer Assessed Score of 90 indicates that a student in the same team has contributed clearly more to their team’s outcomes than a student in the same team with a Peer Assessed Score of 30. However, a Peer Assessed Score achieved by a student in one team does not meaningfully compare with the Peer Assessed Score of a student in another team. A Peer Assessed Score of 60 in Team t1 is no better nor worse than a PA Score of 90 achieved by a student in another team t2.

We cannot conclude from comparing the Peer Assessed Score which is the better student in terms of team contribution and/or leadership when the students are from different teams. Why? Some students and teams diligently commit to rating each other so the average student in their team does rate ⅗ on each of the ten items in the peer assessment survey, as intended. Meanwhile, other teams believe they are all above average, having come from their local equivalent of Lake Wobegon. By chance and/or good team functioning, some teams achieve that desired state where all members work productively and effectively together: the Holy Grail of the Dream Team. Other teams comprising high performers can conversely fall into the desolation of dismal performance characterised by the Apollo Syndrome (Belbin). 

The long answer is that through applying appropriate data analytics, we can develop three related numbers that enable comparison of peer assessed team members both within and between classes, and over time. These measures are Standard Peer Assessed Score (SPAS), Employability, and Net Employability. In essence, the data analytic processes can be likened to a forensic photoanalyst attempting to read an automobile’s number plate. Imagine the original photo image has been photographed through smog, on a dark night, from a far distance, with a low resolution setting, using a poor quality lens and a poor imaging sensor. But through advanced algorithms that remove background noise, amplify relevant signals, and enhance clarity, a readable, useful image can be discerned, as illustrated in the example from Acclaim Software.

Source: Acclaim Software. (2015). Forensics - Recovering the Most Detail from Your Image - Focus Magic. http://www.focusmagic.com/forensics-tutorial.htm

The Standard Peer Assessed Score (SPAS) is our first measure designed to enable a more realistic relative comparison of peer assessment ratings between members of a whole class. The Standard Peer Assessed Score combines normalised values of the Recommendation and Peer Assessed Score for each team member. The normalisation applies several data analytic processes to correct for the biases introduced by some students and teams in their rating. The SPAS approach is not perfect, but it’s a start. Furthermore, the determination of Standard Peer Assessed Score is a necessary precursor to the calculation of Employability and Net Employability, discussed elsewhere.

Design features of Standard Peer Assessed Score

The whole-of-class values of Standard Peer Assessed Score for a particular class response dataset are targeted to have these features:

Mean: 50

Standard Deviation: 20

Maximum possible range: from 0 to 100

By virtue of the definition of the Standard Peer Assessed Score, the following effects occur by design:

One half of the class values of Standard Peer Assessed Score will fall in the range 0 to 50 (below the target average). Naturally, the remaining half will fall in the range 50 to 100 (above average).

Approximately ⅔ of Standard Peer Assessed Scores in the class will lie between 30 and 70, that is, within one standard deviation of the mean value of 50. More precisely, if SPAS were normally distributed, 68.3 per cent of the class values of SPAS would lie within plus or minus one standard deviation of the mean.

Approximately ⅙ of students in the class will receive a Standard Peer Assessed Score either greater than 70 or less than 30. More precisely, 15.9 per cent of values will lie in each of these ranges.

Finally, given the wonders of the normal distribution, 95% of all class members will lie in the SPAS range 10 through 90. That implies that a student with a SPAS above 90 is in the top 2.5% of members of the class. Conversely, a student with a SPAS below 10 is in the bottom 2.5% of the class. This knowledge allows the teacher to identify more reliably their star students, and students at risk, rather than relying simply on the Peer Assessed Score.
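These band proportions follow directly from the normal distribution. The following minimal sketch, in Python with only the standard library, reproduces the quoted proportions for the SPAS targets of mean 50 and standard deviation 20; the function names are illustrative, not part of Peer Assess Pro.

```python
# A minimal sketch: proportions of a normally distributed SPAS
# (mean 50, SD 20) falling in the bands discussed above.
from math import erf, sqrt

MEAN, SD = 50, 20

def normal_cdf(x: float) -> float:
    """P(SPAS <= x) assuming SPAS ~ Normal(MEAN, SD)."""
    return 0.5 * (1 + erf((x - MEAN) / (SD * sqrt(2))))

def band(low: float, high: float) -> float:
    """Probability that SPAS falls between low and high."""
    return normal_cdf(high) - normal_cdf(low)

print(f"Within one SD (30-70):    {band(30, 70):.1%}")        # ~68.3%, about 2/3 of the class
print(f"Within two SDs (10-90):   {band(10, 90):.1%}")        # ~95.4%
print(f"Star students (above 90): {1 - normal_cdf(90):.1%}")  # ~2.3%, the top of the class
```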

Mathematical calculation

The general approach to creating the Standard Peer Assessed Score is to apply z-score normalisation to a student's (raw) Recommendation, $R$, and Peer Assessed Score, $PAS$. The two z-scores ($z_R$ and $z_{PAS}$) are averaged, then re-scaled to achieve, for the class dataset as a whole, the target mean, $\mu_{SPAS} = 50$, and target standard deviation, $\sigma_{SPAS} = 20$, required for the SPAS statistic. Note that z-score normalisation of any dataset yields normalised data with a mean of zero and standard deviation of 1.0, as detailed later.

The Standard Peer Assessed Score for student s is defined as

$$SPAS_s = \mu_{SPAS} + k \cdot \sigma_{SPAS} \cdot z_s, \qquad z_s = \frac{z_{R,s} + z_{PAS,s}}{2}$$

Where

$\mu_{SPAS}$ = Target mean for the SPAS statistic, by definition a constant of 50

$\sigma_{SPAS}$ = Target standard deviation for the SPAS statistic, by definition a constant of 20

$k$ = a correction factor to ensure the standardisation process achieves the target standard deviation, $\sigma_{SPAS}$. The factor is required because, in practice, the distributions of the raw data are not normally distributed but tend to have strong negative skew, due to such factors as the Lake Wobegon effect mentioned earlier. A factor of 1.2 has been found appropriate in practice.

$z_{R,s} = \dfrac{R_{s,t} - \mu_R}{\sigma_R}$ = the z-score normalisation of the Recommendation rating for student s by their team members

$z_{PAS,s} = \dfrac{PAS_{s,t} - \mu_{PAS,t}}{\sigma_{PAS,t}}$ = the z-score normalisation of the Peer Assessed Score rating for student s by their team members

$R_{s,t}$ = Recommendation rating awarded to student s by their team members in team t

$PAS_{s,t}$ = Peer Assessed Score awarded to student s by their team members in team t

$\mu_R$ = estimate of the class mean Recommendation rating derived over all valid assessed teams in the class responses dataset

$\sigma_R$ = estimate of the class standard deviation of the Recommendation ratings derived over all valid assessed teams in the class responses dataset

$\mu_{PAS,t}$ = the population mean of the Peer Assessed Scores derived over all team members in the team t in which student s is a member

$\sigma_{PAS,t}$ = the population standard deviation of the Peer Assessed Scores derived over all team members in the team t in which student s is a member

Notes

Divisor of 2. The two z-score normalised variables each have unit standard deviation, so their sum has a standard deviation of up to 2.0. Consequently, the divisor of 2 is required in the calculation of SPAS so that the combined z-score, $z_s$, has a mean of zero and a standard deviation of 1.

Trimming. Values of Standard Peer Assessed Score that calculate above +100 are trimmed down to +100. Similarly, values of Standard Peer Assessed Score that calculate below 0 are trimmed up to 0.
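Under these definitions, the calculation can be sketched in a few lines of Python. The function name and signature are illustrative assumptions, not Peer Assess Pro's implementation; the constants and the worked figures come from the definitions above and from Michael Mass's column in the example table below.

```python
# A minimal sketch of the SPAS calculation defined above.
MU_SPAS = 50      # target mean, by definition
SIGMA_SPAS = 20   # target standard deviation, by definition
K = 1.2           # correction factor for the negative skew of real class data

def spas(recommendation: float, mu_r: float, sigma_r: float,
         pa_score: float, mu_pas_team: float, sigma_pas_team: float) -> float:
    z_r = (recommendation - mu_r) / sigma_r              # class-level z-score
    z_pas = (pa_score - mu_pas_team) / sigma_pas_team    # team-level z-score
    z_combined = (z_r + z_pas) / 2                       # divisor of 2: see Notes above
    raw = MU_SPAS + K * SIGMA_SPAS * z_combined
    return max(0.0, min(100.0, raw))                     # trim to the range 0..100

# Michael Mass (example table below): R = 4.5, PAS = 50, team A (mean 40, SD 10)
print(spas(4.5, 3.0, 0.5, 50, 40, 10))   # 50 + 1.2 * 20 * ((+3 + 1) / 2) = 98.0
```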

Example calculations of Standard Peer Assessed Score

The following table shows example calculations of Standard Peer Assessed Score for three students in two teams, A and B. Note how Michael Mass (Team A) and Lydia Loaded (Team B) have both been awarded the same Peer Assessed Score of 50 by their team members. However, because of their different team means and standard deviations, the z-score normalisations realise +1 and +2 respectively.

As part of the journey towards calculating SPAS, the intermediate calculation of the combined z-scores provides the basis for calculating the percentage proportion of the entire class who would fall below that combined z-score. This can be interpreted as the percentage of the class who would recommend the specific team member to an employer, a colleague, or another team. This percentage is rounded conservatively to produce the student's Employability rating, the methodology for which is detailed in the FAQ

FAQ: What is Employability? How is it calculated?


Example calculations for Standard Peer Assessed Score (SPAS) and Employability

| Student, s | Peter Johns | Michael Mass | Lydia Loaded |
| --- | --- | --- | --- |
| Recommendation, $R_s$ | 2.0 | 4.5 | 3.0 |
| Mean of class Recommendation, $\mu_R$ | 3.0 | 3.0 | 3.0 |
| Standard deviation of class Recommendation, $\sigma_R$ | 0.5 | 0.5 | 0.5 |
| Normalised Recommendation, $z_{R,s}$ | (2 - 3)/0.5 = -2 | (4.5 - 3)/0.5 = +3 | (3.0 - 3.0)/0.5 = 0 |
| Peer Assessed Score, $PAS_s$ | 30 | 50 | 50 |
| Team, $t$ | A | A | B |
| Mean of team Peer Assessed Score, $\mu_{PAS,t}$ | 40 | 40 | 20 |
| Standard deviation of team Peer Assessed Score, $\sigma_{PAS,t}$ | 10 | 10 | 15 |
| Normalised Peer Assessed Score, $z_{PAS,s}$ | (30 - 40)/10 = -1 | (50 - 40)/10 = +1 | (50 - 20)/15 = +2 |
| Combined z-score, $z_s$ | (-2 - 1)/2 = -1.5 | (+3 + 1)/2 = +2 | (0 + 2)/2 = +1 |
| Target standard deviation, $\sigma_{SPAS}$ | 20 | 20 | 20 |
| Correction factor, $k$ | 1.2 | 1.2 | 1.2 |
| Target mean, $\mu_{SPAS}$ | 50 | 50 | 50 |
| Standard Peer Assessed Score, SPAS | 50 + 1.2 × 20 × (-1.5) = 50 - 36 = 14 | 50 + 1.2 × 20 × 2 = 50 + 48 = 98 | 50 + 1.2 × 20 × 1 = 50 + 24 = 74 |
| Proportion of class below combined z-score | 0.5 + GAUSS(-1.5) = 0.5 - 0.4332 = 6.7% | 0.5 + GAUSS(+2) = 0.5 + 0.4772 = 97% | 0.5 + GAUSS(+1) = 0.5 + 0.34 = 84% |
| Employability | 10 | 90 | 80 |

Example charts for Standard Peer Assessed Score

The following figures show a Standard Peer Assessed Score histogram, and the histograms for the Recommendation and Peer Assessed Score data that contribute to the Standard Peer Assessed Score chart.

Figure 1: Histogram of Recommendation. Mean = 3.7, standard deviation = 0.53

Figure 2: Histogram of Peer Assessed Score. Mean = 67, standard deviation = 11.3

Figure 3: Histogram of Standard Peer Assessed Score. Mean = 0, standard deviation = 20

Assumptions about Standard Peer Assessed Score

The calculation of Standard Peer Assessed Score assumes several conditions, described as follows.

The statistical distributions of the Recommendation and Peer Assessed Scores (PA_Score) are assumed to be normally distributed. In practice, the distributions are typically asymmetric with negative skew. See Figures 1 and 2 earlier.

The Recommendation score awarded to a student s1 in team t1 is assumed to be absolutely comparable to a similar Recommendation score awarded to another student s2 in another team t2. In other words, a Recommendation score of 3.5 awarded to student s1 in team t1 means exactly the same as a Recommendation score of 3.5 awarded to student s2 in team t2. Similarly, a difference in Recommendation ratings of 1.0 unit means the same in any team. In practice, the Recommendations made by one team may not be consistent with the Recommendation values assigned by another team. However, given that Recommendation is a 'top of mind' peer assessment made at the start of the Peer Assess Pro survey, we consider this a reasonable approximation. Consequently, the Recommendation values are z-score normalised using the mean and standard deviation of the entire class of responses.

In contrast, in normalising the Peer Assessed Score, it is well recognised that different teams award quite different Peer Assessed Scores to students who would ordinarily achieve the same Peer Assessed Score in an ideal world of perfect raters. Consequently, it is assumed that each team possesses a uniform, random mix of student capabilities drawn from the entire class. Therefore, all things being equal, one would expect the mean and standard deviation of each team's Peer Assessed Scores to be equivalent. In practice, this equivalence is rarely observed. Consequently, the need arises to z-score normalise the Peer Assessed Score for each team, to achieve a set of normalised Peer Assessed Scores with mean zero and standard deviation 1 FOR EACH TEAM.

The impact of gaming peer assessment

The Peer Assessed Score awarded to a student s1 in team t1 is assumed NOT to be comparable to a similar Peer Assessed Score that might be awarded to another student s2 in another team t2. Why? Some teams honestly peer assess each other, whilst others attempt to 'game' the peer assessment process, such as by awarding everyone above average, or even the full 5/5 rating on each of the team peer assessment factors. To compensate, it is assumed that the Peer Assessed Score of the average student in team t1 should be adjusted to match the peer rating of the average student in another team t2, even though the arithmetic values of the (original) Peer Assessed Scores usually differ. The same reasoning applies to the spread of Peer Assessed Score values within teams: the best team member in team t1 should be rated comparably with the best team member in team t2, even if their Peer Assessed Scores differ. Consequently, the Peer Assessed Scores WITHIN a team are scaled to match the relative values within other teams through normalisation using each team's mean and standard deviation.

FAQ: What is the influence on Standard Peer Assessed Score (SPAS) if a team rates ALL its members with a Peer Assessed Score of 100?

In that case, the z-score normalised Peer Assessed Score for every team member is set to 0.5.

A future option under consideration is to exclude the members of such a 'misguided team' from receiving a calculated SPAS.

FAQ: Would a student receive the same Standard Peer Assessed Score (SPAS) if rated in another class?

In general, 'NO'. A student is motivated differently in each of the classes they take. By the luck of the draw, they may work with a superior or inferior team, who will rate them relatively differently.

Quick links and related information

FAQ: What is Employability? How is it calculated?

FAQ: How are peer assessment and personal results calculated and defined mathematically?


FAQ: What is Employability? How is it calculated?

For a specific Peer Assess Pro assessment, Employability is the statistical probability that team members from the class would recommend the specific team member to an employer, a colleague, or another team.

Employability is a proprietary measure defined by Peer Assess Pro™, drawn from the calculation of a student's Standard Peer Assessed Score (SPAS). SPAS combines a student's Peer Assessed Score and their Recommendation score through various statistical treatments such as z-score normalisation. The resulting Employability score is a statistical probability ranging from 5 to 95 per cent. Employability is the best available estimate of the degree to which team members from the class in which the student has participated in a team project would recommend that specific team member to an employer, a colleague, or another team.

Mathematical calculation of Employability

$$E_s = \mathrm{MROUND}\big(95 \times \left(0.5 + \mathrm{GAUSS}(z_s)\right) + 2.5,\ 5\big)$$

Where

$E_s$ = the Employability for student s, ranging over values from 5 to 95 in steps of 5.

$\mathrm{GAUSS}(z)$ = the Gaussian distribution: the statistical probability that a random variable, z, drawn from a normal distribution, will lie between the mean and z standard deviations above (or below) the mean. The GAUSS function returns values between -0.5 and +0.5.

$z_s$ = the combined z-score resulting from combining the z-score normalisation of the Recommendation and Peer Assessed Scores for student s, as explained in the mathematical calculations for the Standard Peer Assessed Score. Through the process of normalisation, $z_s$ has a mean of zero and standard deviation of 1, which is the required input for the GAUSS function.

$\mathrm{MROUND}(x, m)$ = a mathematical function that rounds one number to the nearest integer multiple of another. In the case of Employability, m = 5. For example, $\mathrm{MROUND}(12.4, 5) = 10$ and $\mathrm{MROUND}(13, 5) = 15$. The MROUND function coupled with the attenuation factor of 95 achieves a step interval of 5 units.

The constant 0.5 adds the probability that a z-score lies between minus infinity and the mean, which is, by definition, 50 per cent.

Conditioning transformations to de-emphasise unsubstantiated precision

The following transformations are applied to remove the impression of an over-precise measure of Employability, and to reduce the possibilities of elation or despair in response to extreme values of Employability. Specifically, we apply a Principle of Conservatism, the result of which is that Employability is conditioned to lie between 5 and 95, and rounded to increase in steps of 5, rather than the theoretically possible values of zero to 100 with apparently infinite precision!

The MROUND to the closest multiple of 5 coupled with attenuation by 95 achieves the step interval of 5 units.

The constant 2.5 is a translation factor that compensates for the downward shift in mean values caused by the attenuation factor of 95.
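Assembled from the components above, the whole transformation is short enough to sketch in standard-library Python. The gauss and mround helpers are stand-ins for the GAUSS and MROUND functions named above; all names are illustrative, not Peer Assess Pro's implementation.

```python
# A minimal sketch of the Employability calculation assembled above.
from math import erf, sqrt

def gauss(z: float) -> float:
    """Area under the standard normal curve between the mean and z (range -0.5..+0.5)."""
    return 0.5 * erf(z / sqrt(2))

def mround(x: float, multiple: int) -> int:
    """Round x to the nearest integer multiple of `multiple`."""
    return multiple * round(x / multiple)

def employability(z_combined: float) -> int:
    """Employability from a combined z-score: values 5..95 in steps of 5."""
    return mround(95 * (0.5 + gauss(z_combined)) + 2.5, 5)

# Reproduce the example table below:
for z in (-3, -1.5, -1, -0.5, 0, 0.5, 1, 1.5, 2, 3):
    print(z, employability(z))   # 5, 10, 20, 30, 50, 70, 80, 90, 95, 95
```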

Example calculations of Employability

The following table shows example calculations of Employability based on the most likely range of possible values for combined z-scores arising from the generation of the Standard Peer Assessed Score, SPAS.

The subsequent graph shows the data from the calculations of Employability charted against Combined z-scores.

As an example, consider a student achieving a SPAS of zero, arising from their combined z-score of -3. According to the normal distribution, only about 1 in 1,000 students would recommend this student, as indicated by the proportion of the class who would fall below a combined z-score of -3. The calculation of Employability generously raises the assessment of the student, suggesting that 5% of the class would recommend them! The same conservatism applies at the other extreme, where a brilliantly contributing student (above a combined z-score of +2) achieves an Employability of 95%, whereas, if the normal distribution were to be believed, they might expect 98% of the class to recommend them.

Example calculations of Employability from Combined z-scores

| Combined z-score, $z_s$ | -3 | -1.5 | -1 | -0.5 | 0 | 0.5 | 1 | 1.5 | 2 | 3 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| GAUSS($z_s$) | -0.50 | -0.43 | -0.34 | -0.19 | 0 | 0.19 | 0.34 | 0.43 | 0.48 | 0.50 |
| Standard Peer Assessed Score, SPAS | -22 | 14 | 26 | 38 | 50 | 62 | 74 | 86 | 98 | 122 |
| SPAS (trimmed to 0 to 100) | 0 | 14 | 26 | 38 | 50 | 62 | 74 | 86 | 98 | 100 |
| Proportion of class below combined z-score | 0.1% | 7% | 16% | 31% | 50% | 69% | 84% | 93% | 98% | 99.9% |
| Employability | 5 | 10 | 20 | 30 | 50 | 70 | 80 | 90 | 95 | 95 |

Quick links and related information

FAQ: How is Standard Peer Assessed Score (SPAS) calculated?

FAQ: How are peer assessment and personal results calculated and defined mathematically?


FAQ: How is the Index of Realistic Self Assessment (IRSA) calculated?

Having a good sense of who you are enables you to build upon your strengths and correct your weaknesses. In turn, that can make you more successful at work and in your personal life. You are able to better understand, predict and cope with others more effectively. You can better distinguish valid and invalid informal and formal feedback from others. You are more likely to select (and achieve!) realistic personal goals. (‘ERSI: Exceptionally Realistic Self-Image’, 2012)

The Index of Realistic Self Assessment (IRSA) is a first step in providing data upon which to develop an Exceptionally Realistic Self-Image (ERSI).

Mathematical definition of the Index of Realistic Self Assessment

The Index of Realistic Self Assessment (IRSA) is a ratio-based measure of the extent to which a team member's SELF-assessment is matched by the assessment of the OTHER members of their team.

$$IRSA_s = 100 \times \frac{PAS_{others}}{PAS_{self}}$$

Where

$PAS_{others}$ = the Peer Assessed Score assigned to the student by their team members

$PAS_{self}$ = the Peer Assessed Score the student has assessed themself

IRSA typically lies in the range 50 to 120, although, theoretically, IRSA could lie between zero and infinity. IRSA values generally calculate as:

Below 75: overconfident self-assessment

Between 75 and 95: typical of most team members

Above 95: underconfident self-assessment

IRSA is calculated only when these two conditions occur:

The student has completed their self-assessment as part of their peer assessment submission

The student is a member of a valid assessed team

Extreme values of IRSA are notified in the teacher’s Active Warnings, as detailed in the FAQ

FAQ: What is a mismatched self-assessment (IRSA)?
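As a minimal sketch, the ratio is a one-liner in Python; the function name is illustrative, and the worked figures are Bridget's and Lydia's from the example table below.

```python
# A minimal sketch of the IRSA ratio defined above.
def irsa(pas_others: float, pas_self: float) -> float:
    """Index of Realistic Self Assessment: the peers' view relative to the self-view."""
    return 100 * pas_others / pas_self

print(round(irsa(54, 87)))   # 62  -> Bridget, overconfident
print(round(irsa(82, 78)))   # 105 -> Lydia, underconfident
```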

Example calculations of the Index of Realistic Self Assessment

The data for the following table is drawn from

FAQ: How is the Peer Assessed (PA) Score calculated?

Calculations of the Index of Realistic Self Assessment for four team members

| ASSESSOR \ ASSESSEE | Bridget | Julian | Lydia | Nigella |
| --- | --- | --- | --- | --- |
| Bridget | 87 | 62.5 | 75 | 72.5 |
| Julian | 87.5 | 93 | 87.5 | 82.5 |
| Lydia | 75 | 82.5 | 78 | 80 |
| Nigella | 0 | 77.5 | 82.5 | 82 |
| Peer Assessed Score (others) | 54 | 74 | 82 | 78 |
| Peer Assessed Score (self) | 87 | 93 | 78 | 82 |
| Index of Realistic Self Assessment | 62 | 80 | 105 | 95 |
| Indication | Overconfident | Typical | Underconfident | Typical (borderline underconfident) |

Self-assessments appear on the diagonal and are excluded from the Peer Assessed Score (others).

Lydia has been assessed by others with a PA Score of 82. Her self-assessment has produced her $PAS_{self}$ of 78. Therefore, IRSA = 100 × 82/78 = 105.

Lydia's IRSA of 105 indicates that she is an outlier when compared with most team members in a typical class. Specifically, she is underconfident in assessing her strengths when compared with how others perceive her.

Why an IRSA of 100 is not a perfect score!

From our experience using Peer Assess Pro in many classes, we find most team members overrate themselves when compared with how their team members rate them. This overrating results in a self-assessed Peer Assessed Score typically 7 to 10 points higher than the Peer Assessed Score awarded by the other members of that same team. This phenomenon of overrating one's self-assessment is well established in the literature, termed self-enhancement bias (see, for instance, Loughnan et al., 2011). Informally, self-enhancement bias is also known as the Lake Wobegon Effect, a phenomenon observed in a fictional town "where all the women are strong, all the men are good looking, and all the children are above average" ('Lake Wobegon effect', n.d.; 'Lake Wobegon: The Lake Wobegon Effect', 2017).

Quick links and related information

FAQ: What is a mismatched self-assessment (IRSA)?

FAQ: What is a valid assessed team?

FAQ: How do I interpret measures of realistic self-assessment?

Lake Wobegon effect. (n.d.). Retrieved 25 July 2017, from http://psychology.wikia.com/wiki/Lake_Wobegon_effect

Lake Wobegon: The Lake Wobegon Effect. (2017). In Wikipedia. Retrieved from https://en.wikipedia.org/w/index.php?title=Lake_Wobegon&oldid=787029148#The_Lake_Wobegon_effect

Loughnan, S., Kuppens, P., Allik, J., Balazs, K., de Lemus, S., Dumont, K., … Haslam, N. (2011). Economic Inequality Is Linked to Biased Self-Perception. Psychological Science, 22(10), 1254–1258. https://doi.org/10.1177/0956797611417003


FAQ: How do I interpret measures of realistic self-assessment?

From our experience using Peer Assess Pro in many classes, we find most team members overrate themselves when compared with how their team members rate them. This overrating results in a self-assessed Peer Assessed Score typically 7 to 10 points higher than the Peer Assessed Score awarded by the other members of that same team. This phenomenon of overrating one's self-assessment is well established in the literature, termed self-enhancement bias (see, for instance, Loughnan et al., 2011). Informally, self-enhancement bias is also known as the Lake Wobegon Effect, a phenomenon observed in a fictional town "where all the women are strong, all the men are good looking, and all the children are above average" ('Lake Wobegon effect', n.d.; 'Lake Wobegon: The Lake Wobegon Effect', 2017).

Interpreting the Index of Realistic Self Assessment (IRSA)

The usual tendency of team members is to apply a self-enhancement bias when rating themselves using Peer Assess Pro. Consequently, we can interpret Index of Realistic Self Assessment (IRSA) scores in one of three ways: typical, overconfident, and underconfident.

Typical IRSA

An IRSA score between 75 and 95 suggests the assessed team member understands realistically their team contribution when contrasted with the assessment perceived by other team members. A score between 75 and 95 is typical of about ⅔ of team members in a class.

Overconfident IRSA

An IRSA below 75 suggests the assessed team member OVERESTIMATES their team contribution as perceived by other team members. An index below 75 suggests the team member take proactive action to understand their areas for development by informally soliciting further feedback and guidance from their team members. About ⅙ of team members achieve an index below 75.

Underconfident IRSA

An IRSA above 95 suggests the assessed team member has a tendency to UNDERESTIMATE their team contribution when contrasted with the assessment perceived by other team members. The team member should consider developing more confidence in applying and displaying their strengths. About ⅙ of team members achieve an index of above 95.
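A minimal sketch that applies these three interpretation bands, using the boundaries of 75 and 95 stated in this FAQ, follows; the function name is illustrative.

```python
# A minimal sketch of the IRSA interpretation bands described above.
def interpret_irsa(irsa: float) -> str:
    if irsa < 75:
        return "Overconfident"    # about 1/6 of team members
    if irsa > 95:
        return "Underconfident"   # about 1/6 of team members
    return "Typical"              # about 2/3 of team members

for value in (62, 80, 95, 105):
    print(value, interpret_irsa(value))   # Overconfident, Typical, Typical, Underconfident
```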

Developing an exceptionally realistic self-image, ERSI

An Index of Realistic Self Assessment that is not in the 'typical' range of 75 to 95 suggests that the team member take active steps to develop a more realistic self-image.

What are the benefits of having an Exceptionally Realistic Self Assessment?

What can get in the way of having an Exceptionally Realistic Self-Image?

How do I develop my Exceptionally Realistic Self-Image, ERSI?

A three-step programme to develop an Exceptionally Realistic Self-Image includes

Quick links and related information

FAQ: What is a mismatched self-assessment (IRSA)?

FAQ: How is the Index of Realistic Self Assessment (IRSA) calculated?

ERSI: Exceptionally Realistic Self-Image. (2012). Orange County Human Resource Services Portal. Retrieved from http://bos.ocgov.com/hr/hrportal/docs/docs_hr_leadership_forum/minutes_2012/minutes_030812/ersi.doc

Lake Wobegon effect. (n.d.). Retrieved 25 July 2017, from http://psychology.wikia.com/wiki/Lake_Wobegon_effect

Lake Wobegon: The Lake Wobegon Effect. (2017). In Wikipedia. Retrieved from https://en.wikipedia.org/w/index.php?title=Lake_Wobegon&oldid=787029148#The_Lake_Wobegon_effect

Loughnan, S., Kuppens, P., Allik, J., Balazs, K., de Lemus, S., Dumont, K., … Haslam, N. (2011). Economic Inequality Is Linked to Biased Self-Perception. Psychological Science, 22(10), 1254–1258. https://doi.org/10.1177/0956797611417003


FAQ: How is insignificant intra-team agreement identified? WARNING 0041

An important test of the validity of teammate peer assessment is the degree to which all the members of a team agree on the ratings they have awarded their teammates, termed intra-team agreement.

Peer Assess Pro conducts an advanced statistical test to verify that all the ratings provided by a team’s members are concordant in their agreement with each other. The degree of concordance is tested for statistical significance. If the ratings fail to agree to an acceptable level of significance, the following Active Warning is raised.

CRITICAL 0041 Insignificant team agreement

The team members’ ratings of each other fail to agree to an acceptable level of statistical significance. It is neither fair nor valid to award a different peer-assessed personal result to each student in the team.

Illustration of principle

The following table shows Peer Assessed Scores calculated from the four ratings provided by each teammate in a team. Self-assessments are excluded. Note how Yode is rated very low by Andy and Mike, but very high by Pat and Pete. Similarly, Pat is rated lowest in the team by Yode, Mike and Pete, but highest by Andy.

In this example, the teammates, as assessors, fail to agree in their ratings of each other. Consequently, the resulting Peer Assessed Scores, from which contribution-based personal results are calculated, are not trustworthy. Technically, they are imprecise measures of the Peer Assessed Score. For example, perhaps the Peer Assessed Score for Pat should be zero (the mode), 1.5 (the median), or 25 (the mean)?

In the discussion that follows, we’ll assert that the fair and valid approach to dealing with this team’s untrustworthy peer assessments is to consider awarding the same result to each teammate, such as the team result.

The measure of a team's agreement in their ratings of each other is termed concordance, which ranges from zero to 1. In this example, the concordance, W, is 0.333. The statistical significance of this concordance, p(W), is 50%, which does not reach the acceptable level of 10% or better that we demand in Peer Assess Pro.

In conclusion, we can state ‘The teammates agree slightly at a level that is essentially insignificant’.

These statistics are explained with calculation examples later in this FAQ.

Untrustworthy intra-team agreement in a peer assessment survey

| ASSESSEE \ ASSESSOR | Peer Assessed Score | Yode | Andy | Mike | Pete | Pat |
| --- | --- | --- | --- | --- | --- | --- |
| Yode | 52 | 85 | 8 | 3 | 98 | 100 |
| Andy | 63 | 68 | 75 | 60 | 63 | 60 |
| Mike | 57 | 65 | 53 | 100 | 50 | 58 |
| Pete | 45 | 23 | 48 | 58 | 50 | 50 |
| Pat | 25 | 3 | 95 | 0 | 0 | 100 |

Peer Assessed Score = MEAN(Peer Assessed Subscores), excluding the self-assessed score (shown on the diagonal).

GREEN shows the highest-ranked ASSESSEE by the ASSESSOR; RED shows the lowest-ranked ASSESSEE by the ASSESSOR.

Concordance, W = 0.33. Significance, 25% < p(W) < 50%.

For calculations, see Example B below.

Warning detail

The extended detail for the Active Warning displays one or more messages for each relevant team such as

Team Alpha's ratings of each other fail to agree with acceptable significance. Concordance (W) = 0.52. Significance p(W) = 20%.

The Insignificant team agreement Active Warning is raised when the statistical significance, p(W) of the concordance value, W, exceeds the threshold level of 10 per cent.

The degree of agreement is measured by Kendall’s Concordance statistic, W. When W=1, there is complete agreement amongst the ratings given by the team members. When W=0, there is complete disagreement; the ratings by the team are essentially random.

However, the statistical significance of the agreement, p(W) is of more practical value than the degree of agreement, W.  A numerically smaller significance value, p(W), between 0 and 10 per cent, indicates a highly significant degree of agreement (concordance) amongst the team members. A highly significant degree of agreement (concordance) implies greater confidence in the validity and fairness of personal results determined from the peer assessments in that team.

In contrast, the Active Warning is raised only when the significance value, p(W), lies above 10 per cent. A significance value above 10 per cent indicates an unacceptable or weak degree of agreement (concordance) amongst the team members, that is, insignificant intra-team agreement. A weakly significant concordance implies low confidence in the validity and fairness of personal results determined from the peer assessments in that team.

Accounting for tied ratings and low-quality assessments

All other factors remaining equal, if a team member rates two or more team members with the same peer-assessed score, those ratings are termed tied ratings. As the proportion of tied ratings within a team increases, the degree of agreement, W, reduces towards zero. Consequently, the numerical value of the significance, p(W), increases, and the intra-team agreement becomes increasingly unacceptable.

A team where all team members rate everyone the same

In the extreme case where all team members rate everyone the same, the concordance is W = 0 and the significance p(W) is 100%. These results signify no agreement because the assessors have made no distinction in relative ranking amongst the teammates they are assessing.

Example C shows the impact of tied ranking in the calculation of the concordance statistics when compared with similar team results without tied rankings in Example A.


Recommended action

When the team's ratings agree with weak significance, that is, 10% < p(W) <= 100%, then it is neither fair nor valid to award a peer-assessed personal result to each student in the team. Accordingly, the teacher is presented with several approaches to resolve the lack of agreement.

  1. Encourage the team to engage in a ‘courageous conversation’ to review and discuss the basis for how they have awarded Expert (superior), Intermediate (average) and Novice (inferior) performance to each of their teammates
  2. Invite the team to review and resubmit a more accurate survey taking special care to provide useful and accurate qualitative feedback
  3. Award the same personal result to each team member.

The Active Warning presents to the teacher a proforma message, explaining the available options, to send to all the team members of the selected teams.

Award the same personal result

The same teacher-determined Team Result will be applied to each team member if the teacher has selected one of the contribution-based Personal Result Methods: Normalised Personal Result (NPR), Indexed Personal Result (IPR), or Rank-Based Personal Result (RBR).

If the teacher subsequently updates the Team Result, then the updated Team Result will be applied to the members of teams for whom 'Award the same personal result' has been activated. See Special Cases below.

A Personal Result of 50/100, the mid-range value, is applied to each team member if the teacher has selected the Peer Assessed Score (PAS) Personal Result Method.

In their Personal Feedback Report, the student will view the original Peer Assessed Score and quantitative ratings derived from the peer assessment survey. However, the Personal Result will show as either the Team Result or 50/100 depending on the Personal Result Method selected at the time the students’ personal feedback reports were published or updated.

At any time, the teacher can view in the Full Statistics csv download the original peer assessed scores and personal results calculated by each of the available personal result methods.  

Special cases

The same personal result awarded updates according to teacher’s adjustments to team result or result method

Once the ‘Award the same personal result’ status is activated for a team’s personal results, then the personal result awarded to each team member will update in response to subsequent adjustments made by the teacher on the platform.

Specifically, if the teacher, at any time, adjusts

  1. The Team Result, then the (same) result awarded to each team member will also update to the new Team Result when the personal result method is Normalised Personal Result (NPR), Indexed Personal Result (IPR) or Rank-Based Personal Result (RBR).
  2. The Personal Result Method, then the (same) result awarded to each team member will also adjust according to the new result method selected. For example, if the teacher adjusts the Personal Result method from Peer Assessed Score (PAS) to Normalised Personal Result (NPR) then the result awarded of 50/100 will adjust to the Team Result then in effect. Note that the Personal Result Method automatically adjusts from Peer Assessed Score (PAS) to Normalised Personal Result (NPR) when Team Results are first entered.

In other words, the personal result presented to a student in their personal feedback report depends on the Personal Result Method selected at the time their personal feedback reports are published or updated.

‘Award the same personal result’ status is deactivated automatically

In general, if the students in a team resubmit their responses but the recalculated value of p(W) remains greater than the threshold level of 10 per cent, then the 'Award the same personal result' status remains in effect, since the team's ratings of each other continue to fail to agree with acceptable significance.

However, when the ‘Award the same personal result’ status for a team is in effect, that status is deactivated when

  1. One or more students in the team update their survey responses AND the significance of the ratings of each other IMPROVE to agree with acceptable significance. That is, the significance, p(W) of the recalculated concordance value, W, improves so that it is less than the threshold level of 10 per cent, 0<= p(W) <= 10%
  2. A team member has been deleted from the team AND the recalculated significance of the ratings of each other IMPROVE to agree with acceptable significance, 0<= p(W) <= 10%
  3. A new team member has been added to the team AND submits their survey response. The status will ALWAYS be deactivated when a new team member has been added, so the teacher can reappraise their decision.
  4. The ‘Undo’ action is deliberately applied by the teacher to deactivate the action of ‘Award the same personal result’.

When the ‘Award the same personal result’ status is deactivated, then the personal result (individual result) applied to each team member is the personal result calculated according to the team result, peer assessed scores, and personal result method in effect at that instant. As stated in the earlier special case, the personal result updates subsequently in response to the teacher’s adjustments to team result or result method.

‘Exclude from calculations’ overrides actions that address insignificant team agreement

When the Exclude from calculations action has been set for a student through a teacher's response to Active Warning 0048 Inactive Team Member, this action OVERRIDES certain responses to Active Warning 0041 Insignificant Team Agreement.

The concordance value, W, and significance, p(W) is calculated ONLY for the team members who have not been excluded. In effect, the team size is reduced by the number of excluded students.

If the teacher chooses the action Award same personal result to all members of the team then only the active team members will earn the relevant personal result. The excluded student(s) will earn a missing result, as dictated by the ‘Exclude from Calculations’ action.

Defining concordance

The quantitative degree of agreement amongst several judges evaluating several objects is termed concordance. In the case of teammate peer assessment, the teammates are both the judges (assessors) and the 'objects' judged (assessees).

Concordance is calculated using the non-parametric statistic Kendall’s W (Gibbons & Chakraborti, 2020; Kendall & Babington-Smith, 1939). The statistic is calculated not from the raw peer assessment ratings, but from the relative rankings of the raw ratings each teammate provides. The use of relative rankings rather than raw peer assessment ratings corrects for several assessment issues such as a judge (student) who overall rates their teammates near-Expert and/or over a narrow range.

Conceptually, the notion of concordance is analogous to the statistical notion of correlation. Correlation is a statistical measure that expresses the extent to which two variables are linearly related, meaning they change together at a constant rate (JMP Statistical Discovery, 2022). In contrast, statistical concordance applies to three or more variables, in our case, the three or more members of a team who are peer assessing each other.

For example, consider three instruments for measuring the temperature of water: a mercury thermometer, an alcohol thermometer, and a digital thermometer. We place each of these instruments in several samples of water. Perhaps one sample contains ice and salt, another is boiling, and another is taken directly from the kitchen faucet. We want to know: to what extent are the measurements observed by the three instruments in agreement?

Interpreting the concordance value, W

If the concordance statistic W is 1, then all the peer assessment survey respondents have been unanimous: each respondent has assigned the same rank order to the list of their teammates (as derived from their raw peer assessment ratings). If W is 0, then there is no overall trend of agreement among the respondents, and the team's responses may be regarded as essentially random. Intermediate values of W indicate a greater or lesser degree of unanimity among the various responses.

Note that a correlation coefficient statistic can range from -1.0 to +1.0, with zero indicating the complete absence of a linear relationship. An inverse relationship is signalled by -1.0. In contrast, concordance, W, can lie only between +1 (complete agreement) and zero (no agreement).

Exclusion of self-assessment ratings

In Peer Assess Pro, the calculation of the concordance statistic, W, is modified to exclude the self-assessed scores of each teammate (Willerman, 1955; Lewis & Johnson, 1971; Gibbons & Chakraborti, 2020). This modification can be applied only when all members of the team have conducted their peer assessment survey AND there are at least four team members.

When self-assessments are excluded, but all team members provide a peer assessment rating then the set of ratings is known as a Youden Square Design. Technically speaking, a Youden Square Design is a ‘balanced incomplete block design with square data matrices and zero major diagonal’. In plain language, in a team of five members, every member is rated four times, and also provides four ratings of the other team members.

Tied rankings

The effect of ties is to reduce the value of W. However, this effect is small unless there are a large number of ties. Peer Assess Pro applies the standard statistical correction to account for ties in the calculation of W (Gibbons & Chakraborti, 2020, ch. 12; Siegel & Castellan, 1988, p. 266, and Zaiontz, 2021). This correction is demonstrated in Example C below.

Where everyone is equally average

When there are many tied ranks, this is a symptom of one or more low-quality individual assessments. In the extreme case of low-quality team assessment, where everyone rates everyone the same, the concordance value W will be zero: no agreement, because the assessors have made no distinction in relative ranking amongst the teammates they are assessing. If the team is truly and honestly unable to make a distinction about the relative contribution of its teammates (W = 0), then, by definition, the team result will be awarded by Peer Assess Pro to the teammates.

Significance of the concordance statistic

Several significance tests can be used to test the concordance statistic, W, against the null hypothesis of no agreement. In other words, if the significance value p(W) is GREATER than 10 per cent, we conclude that there is NO statistically valid agreement amongst the teammates. More correctly, we FAIL TO REJECT the null hypothesis that there is NO AGREEMENT. Conversely, if the significance value p(W) is LESS than 10 per cent, we ACCEPT that THERE EXISTS statistically valid agreement amongst the peer assessment ratings provided by the teammates to each other in their team.

Conventionally, the chi-square test is used to test the concordance W for significance for large team sizes (Kendall & Gibbons, 1990). For team sizes <= 10, alternative tests for significance are required, such as the F-test or the results of a Monte Carlo permutation test.

When self-assessments are excluded, Willerman (1955) provided a significance test based on the Beta distribution restricted to significance values of 1 per cent and 5 per cent. To provide greater range and precision, Peer Assess Pro has implemented significance tests derived from a Monte Carlo permutation test presented in Lewis & Johnson (1971). The Lewis and Johnson test values cover significance values over the range from p(W) = 0.1 per cent (Exceptionally significant) through p(W) = 99 per cent (Essentially insignificant). This ranging enables the teacher to more clearly see the extent to which their entire class’ peer assessment ratings and agreement are significant.

Requirement for calculating concordance

The concordance statistic, W, and its significance, p(W), can be determined practicably only for teams of at least four members in which ALL team members have submitted their peer assessment survey.

For teams of three members, the concordance statistic can be calculated. However, even when W = 1, the agreement is only weakly significant, p(W) > 10%. The same personal result should be awarded to all team members.

Team discrimination table shows class concordance statistics

A table of concordance statistics for the entire class, W and p(W), is presented in the Team Discrimination display available from the Teachers Dashboard in Advanced Statistics. You can sort the table by selecting the column header.

In general, the teacher would accept as valid the contribution-based personal results derived from peer assessment for the teams Sparkling Violetear and Black Robins. In contrast, the teacher would generally choose to award the Team Result or a Peer Assessed Score of 50/100 to the teams Red Ruru and Xinjiang Ground-Jay.

Several measures of team discrimination in teammates' peer assessment ratings

| Team | Range | PA Score (Mean) | Team Result | Concordance W | Significance p(W) |
| --- | --- | --- | --- | --- | --- |
| Sparkling Violetear | 71.3 | 47.9 | 75.0 | 0.80 | 1.0% |
| Black Robins | 55.9 | 60.6 | 95.0 | 1.00 | 2.5% |
| Red Ruru | 34.1 | 77.3 | 30.0 | 0.25 | 98.0% |
| Xinjiang Ground-Jay | 61.6 | 57.5 | 80.0 | 0.24 | 98.0% |
| Pukekos | 8.8 | 54.5 | 85.0 | | |
| Grey Warblers | 89.2 | 57.4 | 75.0 | | |

Example calculations

Five teammates in team Sparkling Violetear have rated each other on three separate occasions, A, B, C. The results of the detailed examples are summarised in the table, with the interpretation of the level of significance.

| Example | Concordance W | Significance p(W) | Interpretation |
| --- | --- | --- | --- |
| A | 0.80 | 1% | The teammates agree very strongly at a level that is exceptionally significant. |
| B | 0.333 | 50% | The teammates agree slightly at a level that is essentially insignificant. |
| C (with ties) | 0.76 | 1% | The teammates agree strongly at a level that is exceptionally significant. |

Example A. High rating agreement amongst the team members

Five teammates in team Sparkling Violetear have rated each other with the following peer-assessed subscores calculated from their teammate peer assessment survey.

Ex. A. Peer assessed subscores: Team Sparkling Violetear

| ASSESSEE \ ASSESSOR | Peer Assessed Score | Yode | Andy | Mike | Pete | Pat |
| --- | --- | --- | --- | --- | --- | --- |
| Yode | 74 | 85 | 95 | 3 | 98 | 100 |
| Andy | 63 | 68 | 75 | 60 | 63 | 60 |
| Mike | 56 | 65 | 53 | 100 | 50 | 58 |
| Pete | 44 | 23 | 48 | 58 | 50 | 50 |
| Pat | 3 | 3 | 8 | 0 | 0 | 100 |

The Peer Assessed Score for each Assessee excludes the Self-Assessed Score (shown on the diagonal).

Excluding the self-assessment scores yields the following table of rankings, where the top-ranked assessee is awarded a rank of 1.

Ex A. Ranked peer-assessed subscores, excluding self-assessed score

| ASSESSEE \ ASSESSOR | Yode | Andy | Mike | Pete | Pat |
| --- | --- | --- | --- | --- | --- |
| Yode | | 1 | 3 | 1 | 1 |
| Andy | 1 | | 1 | 2 | 2 |
| Mike | 2 | 2 | | 3 | 3 |
| Pete | 3 | 3 | 2 | | 4 |
| Pat | 4 | 4 | 4 | 4 | |

Kendall's Concordance W is calculated from the formula by Willerman, and Lewis and Johnson:

$$W = \frac{12S}{(n-2)^2\,(n^3 - n)}$$

Where

$S = \sum_{i=1}^{n} \left(R_i - \bar{R}\right)^2$, the sum of squared deviations

$n$ = number of teammates in the team

$r_{i,j}$ = the rank given to Assessee i by Assessor j

$R_i = \sum_{j \neq i} r_{i,j}$, the total rank awarded to Assessee i by the other Assessors j, excluding the self-assessment

$\bar{R} = \frac{1}{n}\sum_{i=1}^{n} R_i$, the mean value of the total ranks


Ex A. Calculation of Concordance, W, from ranks

| ASSESSEE | Total Ranks, $R_i$ | $(R_i - \bar{R})^2$ | Yode, j=1 | Andy, j=2 | Mike, j=3 | Pete, j=4 | Pat, j=5 |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Yode, i=1 | 6 | 16 | | 1 | 3 | 1 | 1 |
| Andy, i=2 | 6 | 16 | 1 | | 1 | 2 | 2 |
| Mike, i=3 | 10 | 0 | 2 | 2 | | 3 | 3 |
| Pete, i=4 | 12 | 4 | 3 | 3 | 2 | | 4 |
| Pat, i=5 | 16 | 36 | 4 | 4 | 4 | 4 | |
| Totals | 50 | S = 72 | | | | | |

$\bar{R}$ = 50/5 = 10

From the previous table of ranked peer-assessed subscores:

$n$ = number of teammates in the team = 5

The calculation of $(R_i - \bar{R})^2$ for Yode, where i = 1:

$(6 - 10)^2 = 16$

Now, Kendall's Concordance is

$$W = \frac{12 \times 72}{(5-2)^2 \times (5^3 - 5)} = \frac{864}{1080} = 0.80$$

Significance is determined from the table Values of S Associated with a Given Probability and a Given Number of Judges (Lewis and Johnson), adapted to enable lookup by the value of W, since W is proportional to S for a given team size n.

For n = 5, the critical values are W = 0.756 and W = 0.822 at the 1 per cent and 0.5 per cent levels respectively. Since W = 0.80 is greater than the critical value of 0.756 but less than 0.822, we adopt the conservative significance level p(W) of 1 per cent.

Conclusion for Example A

Given that the concordance statistic is significant at the 1 per cent level, we REJECT the null hypothesis that there is NO AGREEMENT amongst the raters. From a statistical perspective, we accept that the team has submitted a fair and valid teammate peer assessment. We could say ‘The teammates agree very strongly at a level that is exceptionally significant.’ No Active Warning is generated for this Example A.

Note how it is the rating by Assessor Mike of Teammate Yode that reduces the concordance from a perfect 1.0 down to 0.80. This outlier rating is identified in Peer Assess Pro through the Active Warning 0042 Outlier individual rating. However, in this case, the outlier rating does not have a material impact on the Peer Assessed Scores and, therefore, the contribution-based Personal Results that would be awarded.
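The complete procedure of Example A — rank each assessor's raw subscores with the self-assessment excluded, total the ranks per assessee, then apply the Willerman formula — can be sketched in Python as follows. The data structure and function name are illustrative assumptions; the sketch assumes no tied ratings (Example C adds the tie correction).

```python
# A minimal sketch of the self-excluded concordance calculation of Example A.
def concordance_w(ratings: dict[str, dict[str, float]]) -> float:
    """Kendall's W with self-assessments excluded (Willerman, 1955).

    ratings[assessor][assessee] = raw peer-assessed subscore; no ties assumed.
    """
    names = list(ratings)
    n = len(names)
    totals = dict.fromkeys(names, 0)          # R_i: total rank per assessee
    for assessor, given in ratings.items():
        others = [a for a in names if a != assessor]
        ordered = sorted(others, key=lambda a: given[a], reverse=True)
        for rank, assessee in enumerate(ordered, start=1):
            totals[assessee] += rank          # highest subscore earns rank 1
    mean_total = sum(totals.values()) / n     # equals n(n-1)/2 for a Youden Square
    s = sum((r - mean_total) ** 2 for r in totals.values())
    return 12 * s / ((n - 2) ** 2 * (n ** 3 - n))

example_a = {
    "Yode": {"Andy": 68, "Mike": 65, "Pete": 23, "Pat": 3},
    "Andy": {"Yode": 95, "Mike": 53, "Pete": 48, "Pat": 8},
    "Mike": {"Yode": 3, "Andy": 60, "Pete": 58, "Pat": 0},
    "Pete": {"Yode": 98, "Andy": 63, "Mike": 50, "Pat": 0},
    "Pat":  {"Yode": 100, "Andy": 60, "Mike": 58, "Pete": 50},
}
print(round(concordance_w(example_a), 2))   # 0.80

# Example B: Andy's ratings of Yode and Pat are swapped, collapsing the agreement.
example_b = {**example_a, "Andy": {"Yode": 8, "Mike": 53, "Pete": 48, "Pat": 95}}
print(round(concordance_w(example_b), 2))   # 0.33
```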

Example B. Low agreement amongst the team members

On another occasion, the five teammates in Team Sparkling Violetear rated each other with the following peer-assessed subscores calculated from their teammate peer assessment survey. The ratings are similar to those in Example A, except that Andy's two ratings of Yode and Pat are swapped, signalled by *.

Ex. B. Peer assessed subscores: Team Sparkling Violetear

| ASSESSEE \ ASSESSOR | Peer Assessed Score | Yode | Andy | Mike | Pete | Pat |
| --- | --- | --- | --- | --- | --- | --- |
| Yode | 52 | 85 | 8 * | 3 | 98 | 100 |
| Andy | 63 | 68 | 75 | 60 | 63 | 60 |
| Mike | 57 | 65 | 53 | 100 | 50 | 58 |
| Pete | 45 | 23 | 48 | 58 | 50 | 50 |
| Pat | 25 | 3 | 95 * | 0 | 0 | 100 |


Ex B. Calculation of Concordance, W, from ranks

| ASSESSEE | Total Ranks, $R_i$ | $(R_i - \bar{R})^2$ | Yode, j=1 | Andy, j=2 | Mike, j=3 | Pete, j=4 | Pat, j=5 |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Yode, i=1 | 9 | 1 | | 4 | 3 | 1 | 1 |
| Andy, i=2 | 6 | 16 | 1 | | 1 | 2 | 2 |
| Mike, i=3 | 10 | 0 | 2 | 2 | | 3 | 3 |
| Pete, i=4 | 12 | 4 | 3 | 3 | 2 | | 4 |
| Pat, i=5 | 13 | 9 | 4 | 1 | 4 | 4 | |
| Totals | 50 | S = 30 | | | | | |

$\bar{R}$ = 50/5 = 10

Now, Kendall's Concordance is

$$W = \frac{12 \times 30}{(5-2)^2 \times (5^3 - 5)} = \frac{360}{1080} = 0.333$$

For Example B, the concordance, W, of 0.333 yields a significance value between 25% and 50% (Lewis and Johnson, 1971). Conservatively, we adopt a significance value of p(W) = 50 per cent. Compared with Example A (concordance W = 0.80), the concordance here, W = 0.33, is lower and essentially insignificant. Consequently, we can state 'The teammates agree slightly at a level that is essentially insignificant'.

Conclusion for Example B

For Example B, Peer Assess Pro will present the Active Warning

Team Sparkling Violetear's ratings of each other fail to agree with acceptable significance. Significance p(W) = 50%. Concordance (W) = 0.33.

Given the poor significance, worse than the threshold of 10 per cent, we fail to reject the hypothesis that there is no agreement amongst the raters. From a statistical perspective, the team has submitted a potentially unfair and invalid teammate peer assessment. In this example, the teacher should either

  1. Award the same personal result to each team member: A Peer Assessed Score of 50, or the same Team Result, or
  2. Encourage the team to engage in a ‘courageous conversation’ to review and discuss the basis for how they have awarded Expert (superior), Intermediate (average) and Novice (inferior) performance to each of their teammates, and/or
  3. Request Andy and Mike to reconsider their assessment of the teammates, according to the guidance presented in Active Warning 0042 Outlier individual rating. 

In this case, we note that it is Assessors Andy and Mike who are the outlier assessors compared with the other three teammates, particularly in their assessment of Yode. But can we be confident about that suspicion? Perhaps we can only verify by consulting with the team.

Example C. Teammates show low discernment by submitting tied ranks

This example revisits team Sparkling Violetear, where several teammates, Pat, Pete and Mike, have been less discerning in their ratings compared with Example A. The adjustments are signalled with * in the following table.

Mike has rated two teammates the same, Pete has rated three teammates the same, and Pat has rated everyone except himself with the same low rating!

Ex. C. Peer assessed subscores: Team Sparkling Violetear

| ASSESSEE \ ASSESSOR | Peer Assessed Score | Yode | Andy | Mike | Pete | Pat |
| --- | --- | --- | --- | --- | --- | --- |
| Yode | 23 | 85 | 95 | 3 | 50 * | 30 * |
| Andy | 55 | 68 | 75 | 60 * | 63 | 30 * |
| Mike | 50 | 65 | 53 | 100 | 50 * | 30 * |
| Pete | 40 | 23 | 48 | 60 * | 50 | 30 * |
| Pat | 3 | 3 | 8 | 0 | 0 | 100 |

The Peer Assessed Score for each Assessee excludes the Self-Assessed Score.

The ranking for these peer assessment subscores is shown here. Note that tied ranks are calculated using the RANK.AVERAGE function. For example, the ranks awarded by Pete are {(2 + 3)/2, 1, (2 + 3)/2, 4} = {2.5, 1, 2.5, 4}. Similarly, the four identical ranks awarded by Pat are each (1 + 2 + 3 + 4)/4 = 10/4 = 2.5.
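A minimal Python equivalent of that spreadsheet-style RANK.AVERAGE is sketched below; the function name is illustrative.

```python
# A minimal sketch of average ranking for tied ratings: tied values share the
# mean of the rank positions they occupy; the highest rating earns rank 1.
def average_ranks(ratings: list[float]) -> list[float]:
    ordered = sorted(ratings, reverse=True)   # best rating first
    return [(2 * ordered.index(r) + 1 + ordered.count(r)) / 2 for r in ratings]

print(average_ranks([50, 63, 50, 0]))    # Pete: [2.5, 1.0, 2.5, 4.0]
print(average_ranks([30, 30, 30, 30]))   # Pat:  [2.5, 2.5, 2.5, 2.5]
```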

Ex C. Calculation of Concordance, W, from ranks

| ASSESSEE | Total Ranks, $R_i$ | Total Ranks Squared, $R_i^2$ | Yode, j=1 | Andy, j=2 | Mike, j=3 | Pete, j=4 | Pat, j=5 |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Yode, i=1 | 11.5 | 132.25 | | 3.5 | 3 | 2.5 | 2.5 |
| Andy, i=2 | 6 | 36 | 1 | | 1.5 | 1 | 2.5 |
| Mike, i=3 | 8 | 64 | 2 | 1 | | 2.5 | 2.5 |
| Pete, i=4 | 9 | 81 | 3 | 2 | 1.5 | | 2.5 |
| Pat, i=5 | 15.5 | 240.25 | 4 | 3.5 | 4 | 4 | |
| Totals | 50 | $\sum R_i^2$ = 553.5 | | | | | |

Now, Kendall's Concordance, correcting for ties in a Youden Square, is

$$W = \frac{12\sum_{i=1}^{n} R_i^2 - 3n^3(n-1)^2}{(n-2)^2(n^3 - n) - (n-2)\,T} = \frac{12 \times 553.5 - 3 \times 5^3 \times 4^2}{9 \times 120 - 3 \times 78} = \frac{642}{846} = 0.76$$

Where

$R_i^2$ = the square of the total ranks awarded to Assessee i by the other Assessors j, excluding the self-assessment

$T$ = the correction factor accounting for tied ranks, in this case, T = 78

For this revised example for team Sparkling Violetear, the concordance, W, is calculated as 0.76, which is significant at the 1 per cent level. Consequently, we can state 'The teammates agree strongly at a level that is exceptionally significant'.

Note that compared with Example A, Andy is now ranked best in the team, indicated by his lowest sum of ranks, 6. Yode is now ranked one of the poor performers in this team with sum of ranks 11.5. Despite Pat’s attempts to manipulate the peer assessment, he still remains ranked worst in the team indicated by the highest sum of ranks 15.5, since his self-assessment is excluded from the calculation of the Peer Assessed Score and Personal Result.

The W value for Example C is corrected for the tied rankings awarded by Mike, Pete and Pat with the correction factor T = 78. The calculation of the correction factor, T, is beyond the scope of this explanation but see Gibbons & Chakraborti, (2020, ch. 12), Siegel & Castellan (1988, p. 266) and a worked example in Zaiontz, (2021).

In general, the effect of ties is to reduce the value of W. In this case, the uncorrected value of W = 0.59 has a lesser significance, between 5 and 10 per cent. These values compare with the tie-corrected values of W = 0.76 and a significance of 1 per cent.

A note on computational efficiency

Note that the calculation of the numerator in Example C is numerically equivalent to the formula used in Examples A and B. However, the method of Example C is computationally more efficient in not requiring the calculation of $\bar{R}$. In other words, in the specific case of an n-sized Youden Square,

$$\sum_{i=1}^{n} R_i = \frac{n^2(n-1)}{2} \quad\text{and}\quad \bar{R} = \frac{n(n-1)}{2}$$

Therefore,

$$12S = 12\sum_{i=1}^{n}\left(R_i - \bar{R}\right)^2 = 12\sum_{i=1}^{n} R_i^2 - 3n^3(n-1)^2$$

Where

$R_i^2$ = the square of the total ranks awarded to Assessee i by the other Assessors j, excluding the self-assessment

$n$ = number of teammates in the team

$r_{i,j}$ = the rank given to Assessee i by Assessor j

$T$ = the correction factor accounting for tied ranks as per Gibbons & Chakraborti (2020, ch. 12), Siegel & Castellan (1988, p. 266), and Zaiontz (2021)

Quick links and related information

FAQ: How is an outlier peer assessment rating identified? WARNING 0042 

Training: Practice developing and applying a peer assessment survey rubric

Guidelines for conducting a team-based courageous conversation

Durbin, J. (1951). Incomplete Blocks in Ranking Experiments. British Journal of Statistical Psychology, 4(2), 85–90. https://doi.org/10.1111/j.2044-8317.1951.tb00310.x

Gibbons, J. D., & Chakraborti, S. (2020). Nonparametric Statistical Inference (Apple Books; 6th ed.). Chapman and Hall/CRC Press.

Kendall, Maurice. G., & Babington-Smith, B. (1939). The Problem of m Rankings. The Annals of Mathematical Statistics, 10(3), 275–287. https://doi.org/10.1214/aoms/1177732186

Lewis, G. H., & Johnson, R. G. (1971). Kendall’s Coefficient of Concordance for Sociometric Rankings with Self Excluded. Sociometry, 34(4), 496–503. https://doi.org/10.2307/2786195

Mellalieu, P. J. (2021, April 28). Is peer assessment valid for determining individual grades in group work? Better Feedback. Better Teams. https://www.peerassesspro.com/is-peer-assessment-valid-for-determining-individual-grades-in-group-work/

Siegel, S., & Castellan, N. J., Jr. (1988). Nonparametric Statistics for the Behavioral Sciences (2nd ed.). New York: McGraw-Hill.

Willerman, B. (1955). The adaptation and use of Kendall’s Coefficient of Concordance (W) to sociometric-type rankings. Psychological Bulletin, 52(2), 132–133. https://doi.org/10.1037/h0041665

Zaiontz, C. (2021). Kendall’s Concordance (W) Coefficient. Real Statistics Using Excel. https://www.real-statistics.com/reliability/interrater-reliability/kendalls-w/


FAQ: How is an outlier peer assessment rating identified? WARNING 0042

If one member of a team submits a peer assessment for an assessee ‘materially different’ than the assessments given by the other team members, this difference gives rise to an Active Warning in Peer Assess Pro titled

Critical Warning 0042 Outlier individual rating

A team member has assessed another team member very differently than the other team members.

Warning detail

The extended detail for the Active Warning displays one or more messages such as:

Harris assessed Michael, awarding a PA Subscore of 38. Compared with the average rating by the other team members of 70, this subscore DEPRESSED the PA Score to 64 by 7 PA Score Units. Team Alpha

Josef assessed Alvin, awarding a PA Subscore of 100. Compared with the average rating by the other team members of 66, this subscore RAISED the PA Score to 73 by 7 PA Score Units. Team Alpha

An Outlier individual rating warning will be raised ONLY if one assessor's rating raises or lowers the assessee's Peer Assessed Score by more than 5 PA Score Units relative to the average rating given by the other members of the team. Note: the threshold will be adjusted to 10 PA Score Units by August 2022.

The warning will be generated only for members of a valid assessed team, as detailed in

FAQ: What is a valid assessed team?

Failure to agree across the whole team

When the team members in a particular team all fail to agree on their ratings, most of the team will receive this Active Warning. This is a symptom of poor training in peer assessment, or poor application of that training. This situation is identified through the Active Warning CRITICAL 0041 Insignificant team agreement: A team's ratings of each other fail to agree.

See FAQ:  How is insignificant team agreement identified? WARNING 0041        

Example calculations

Consider team Alpha containing 5 members where Adam has been assessed with the following Peer Assessed Subscores by the other four team members.

Impact of removing one Assessor from the calculation of Peer Assessed Score for Adam

| Assessee | Assessor | PA Subscore | Team Size | PA Score | PA Score Exclusive | Assessor Impact | Impact Direction |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Adam | Edward | 53 | 5 | 73 | 80 | -7 | DEPRESSED |
| Adam | Mary | 63 | 5 | 73 | 77 | -4 | |
| Adam | Stephanie | 78 | 5 | 73 | 72 | 1 | |
| Adam | Josef | 100 | 5 | 73 | 64 | 9 | RAISED |

Adam has a Peer Assessed Score of 73 (more precisely, 73.5), calculated from his four team members' subscores as follows:

= (Edward + Mary + Stephanie + Josef) / (Team size - 1)

= (53 + 63 + 78 + 100) / (5 - 1)

= 294 / 4

= 73.5

To determine the impact of Edward’s assessment of Adam, we can calculate the Peer Assessed Score Adam would receive from just the other three members as follows:

PA Score Exclusive         = (Mary + Stephanie + Josef) / (Team size - 2)

= (63 + 78 + 100) / 3

= 241 / 3

= 80.3

Therefore, the Assessor’s impact is the difference between the whole-of-team’s originally-calculated PA Score and the PA Score Exclusive

Impact = PA Score - PA Score Exclusive

= 73.5 - 80.3

= -6.8

We observe that the impact of Edward’s relatively low assessment of Adam has an impact that DEPRESSED Adam’s overall Peer Assessed Score by about 7 PA Units.

Peer Assessed Pro presents the following detail:

Edward assessed Adam, awarding a PA Subscore of 53. Compared with the average rating by the other team members of 80, this subscore DEPRESSED the PA Score to 73 by 7 PA Score Units. Team Alpha

In contrast, we see that Josef's rating of 100 had an impact that RAISED Adam's Peer Assessed Score. The following detailed outlier warning is presented:

Josef assessed Adam, awarding a PA Subscore of 100. Compared with the average rating by the other team members of 64, this subscore RAISED the PA Score to 73 by 9 PA Score Units. Team Alpha

Note that Adam’s self-assessed score is never used to determine a Peer Assessed Score. Therefore, there is no requirement to test for the impact of dropping his self-assessment from calculation of the Peer Assessed Score.

Threshold for warning of outlier individual peer rating

The threshold for raising this Warning in Peer Assess Pro is +/- 5 PA Score Units, the ThresholdOutlier constant. That is, if one assessor’s rating would shift the PA Score awarded to an assessee by more than 5 Units, the Outlier Warning is raised.

In the previous example, the impact on Adam by assessors Mary and Stephanie is within the ThresholdOutlier constant of 5 PA Score units, so no outlier warning message is generated for these two assessors.

Note. The threshold will be adjusted to 10 PA Score Units by August 2022.

Alternative mathematical calculation of Assessor Impact

A more elegant method for calculating the Assessor Impact follows. First, calculate the Peer Assessed Score for Assessee $s$, excluding the PA Subscore awarded by Assessor $a$. Self-assessments are excluded:

$PAS_s^{-a} = \dfrac{\left(\sum_{j \neq s} r_{j,s}\right) - r_{a,s}}{t - 2}$

Where

$t$ = the number of team members in the team in which $s$ is a team member

$PAS_s$ = the Peer Assessed Score for assessed student $s$

$r_{a,s}$ = the peer assessment subscore awarded by Assessor $a$ to Assessee $s$

The Assessor Impact, $I_{a,s}$, of removing Assessor $a$’s assessment of Assessee $s$ is

$I_{a,s} = PAS_s - PAS_s^{-a}$

No calculation is made for the case a = s, since self-assessments are excluded from the process.

Alternative example calculations

Consider Edward’s assessment of Adam using the data from the table above:

$PAS_{Adam}^{-Edward} = \dfrac{(53 + 63 + 78 + 100) - 53}{5 - 2} = \dfrac{241}{3} = 80.3$

$I_{Edward,Adam} = 73.5 - 80.3 = -6.8$
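The calculation is compact enough to sketch in code. The following Python fragment is a minimal illustration of the definitions above; the function name and data structure are illustrative, not the platform’s implementation.

```python
# Minimal sketch of the Assessor Impact calculation and the outlier test,
# following the definitions above. Names are illustrative.

THRESHOLD_OUTLIER = 5  # PA Score Units (per the note, to be adjusted to 10)

def assessor_impacts(subscores):
    """subscores maps each assessor to the PA Subscore awarded to one assessee.
    Self-assessments are excluded before this point."""
    n = len(subscores)                  # number of assessors = team size - 1
    total = sum(subscores.values())
    pa_score = total / n                # whole-of-team PA Score
    # PA Score Exclusive: recompute without each assessor's subscore
    return {assessor: pa_score - (total - subscore) / (n - 1)
            for assessor, subscore in subscores.items()}

# Adam's subscores from the worked example
adam = {"Edward": 53, "Mary": 63, "Stephanie": 78, "Josef": 100}
for assessor, impact in assessor_impacts(adam).items():
    if abs(impact) > THRESHOLD_OUTLIER:
        print(assessor, round(impact, 1), "DEPRESSED" if impact < 0 else "RAISED")
# Edward -6.8 DEPRESSED
# Josef 8.8 RAISED
```

Mary (impact -3.5) and Stephanie (impact 1.5) fall inside the threshold, so no outlier message is generated for them, consistent with the table above.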

Quick links and related information

FAQ: How is the Peer Assessed (PA) Score calculated?

FAQ: What is a valid assessed team?

FAQ:  How is insignificant team agreement identified? WARNING 0041        


FAQ: What is a mismatched self-assessment (IRSA)? WARNING 0040

If a team member submits a self-assessment that is ‘materially different’ from the assessments given by their other team members, this difference gives rise to an Active Warning in Peer Assess Pro titled

Critical Warning 0040 Mismatched self-assessment

A team member's self assessment is materially different to the peer assessment given by their team

Warning detail

The extended detail for the Active Warning displays one or more messages such as:

Gregor’s self-assessment of 63 is UNDERCONFIDENT compared with the peer assessment of 93 given by others in team Charlie. IRSA = 148

Daphne’s self-assessment of 68 is OVERCONFIDENT compared with the peer assessment of 34 given by others in team Alpha. IRSA = 51

The warning will be generated only for members of a valid assessed team, as detailed in

FAQ: What is a valid assessed team?

Furthermore, the warning will only be generated when a student has completed their self-assessment as part of their peer assessment submission.


Threshold for warning of mismatched self-assessment

The Peer Assess Pro system constant ThresholdIrsaUnderconfident is defined as 115. IRSA values greater than or equal to ThresholdIrsaUnderconfident raise the UNDERCONFIDENT active warning. In general, about 7% to 16% of students will be flagged with this warning.

The Peer Assess Pro system constant ThresholdIrsaOverconfident is defined as 75. IRSA values less than or equal to ThresholdIrsaOverconfident raise the OVERCONFIDENT active warning. In general, about 7% to 16% of students will be flagged with this warning.

Example calculations

The Mismatched self-assessment warning is raised from the value of the Index of Realistic Self Assessment (IRSA) that is calculated for each student.

See FAQ

FAQ: How is the Index of Realistic Self Assessment (IRSA) calculated?

The warning of UNDERCONFIDENT is raised when IRSA for a student is greater than 115, the ThresholdIrsaUnderconfident.

The warning of OVERCONFIDENT is raised when IRSA for a student is less than 75, the ThresholdIrsaOverconfident.

Sample of several Peer Assessed Scores and self-assessments

| Name | PA Score | PA Self | IRSA | Confidence |
|---|---|---|---|---|
| Abel | 96.7 | 100 | 96.7 | |
| Baker | 100.0 | 82.5 | 121.2 | UNDERCONFIDENT |
| Charlie | 82.1 | 70 | 117.3 | UNDERCONFIDENT |
| Daphne | 34.2 | 67.5 | 50.6 | OVERCONFIDENT |
| Edward | 95.8 | 87.5 | 109.5 | |

Consider the case of Daphne. Since her IRSA of 50.6 is less than 75, Daphne’s self-assessment is regarded as OVERCONFIDENT. Consequently, the Mismatched self-assessment warning is raised.

The extended detail message is:

Daphne’s self-assessment of 68 is OVERCONFIDENT compared with the peer assessment of 34 given by others in team Alpha. IRSA = 51
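As an illustration, the following Python sketch classifies a student’s self-assessment against the thresholds above. The IRSA formula shown (100 × PA Score / PA Self) is inferred from the sample table; the authoritative definition is in FAQ: How is the Index of Realistic Self Assessment (IRSA) calculated?

```python
# Sketch of the mismatched self-assessment test. The IRSA formula is inferred
# from the sample table above; the thresholds follow the FAQ.

THRESHOLD_IRSA_UNDERCONFIDENT = 115
THRESHOLD_IRSA_OVERCONFIDENT = 75

def classify_self_assessment(pa_score, pa_self):
    irsa = 100 * pa_score / pa_self
    if irsa >= THRESHOLD_IRSA_UNDERCONFIDENT:
        return irsa, "UNDERCONFIDENT"
    if irsa <= THRESHOLD_IRSA_OVERCONFIDENT:
        return irsa, "OVERCONFIDENT"
    return irsa, ""  # no warning raised

print(classify_self_assessment(34.2, 67.5))   # Daphne: (50.7, 'OVERCONFIDENT')
print(classify_self_assessment(100.0, 82.5))  # Baker: (121.2, 'UNDERCONFIDENT')
```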

Recommended action for facilitator

For UNDERCONFIDENT and OVERCONFIDENT students, Peer Assess Pro generates an email that the teacher can send optionally. The email recommends the student arrange an appointment to meet with the teacher to explore the reasons for the variation in self and others’ peer assessment.

Quick links and related information

FAQ: How is the Index of Realistic Self Assessment (IRSA) calculated?

FAQ: How do I interpret measures of realistic self-assessment?

FAQ: What is an ‘at risk’ team member? WARNING 0036

FAQ: How is the Peer Assessed (PA) Score calculated?

FAQ: What is a valid assessed team?


FAQ: What is an adjusted team arrangement request? WARNING 0006

Context: LMS version only

Through the Peer Assess Pro survey, one or more respondents may indicate that their team membership is incorrect. Perhaps the respondent has been assigned to an incorrect team, or the respondent observes that team members should be added to, or deleted from, their team. Peer Assess Pro streamlines the process of handling these adjustments and ensures that the LMS team arrangement matches the team arrangement in the active, running peer assessment activity.

The survey response gives rise to an Active Warning in Peer Assess Pro titled

Critical Warning 0006 Adjusted teamset request

A participant has advised that the membership of a team requires urgent adjustment.

Warning detail

The extended detail for the Active Warning displays one or more messages such as:

Participant Jayden Williams in Team Khan advises that a team requires adjustment to its membership. Jayden Williams should be reassigned to Team Bravo.

Participant Jane Seymour in Team Bravo advises that a team requires adjustment to its membership. Michael Brown should be added to Team Charlie.

Available actions

  1. Peer Assess Pro presents the teacher with a schedule of the proposed reassignments.
  2. The teacher can override the proposed target team by selecting from a dropdown menu that shows all teams currently known within the running peer assessment activity.
  3. Reassign: The teacher initiates the proposed reassignment of the student to a new team.
  4. Peer Assess Pro confirms the proposed reassignment to the new team, then immediately completes the reassignment in the team arrangement on the LMS, overriding the current LMS arrangement.
  5. The teacher, having completed the reassignment of students, reviews Team Composition. The Team Composition will now highlight the current mismatch between the new LMS Team Arrangement and the current team arrangement in the running Peer Assess Pro activity. This step helps the teacher double-check that the new team arrangement is correct before making their final commitment to the adjustments.
  6. In response to the Active Warning 0021 Team arrangement unsynchronised, the teacher initiates ‘Synchronise all’, which commits the new team arrangement.
  7. The results of the reassignments are now synchronised between the LMS team arrangement and that found within Peer Assess Pro.
  8. Notifications will be sent automatically to participants in affected teams:
     a. Students who have already completed the survey will be asked to update their survey response by supplying additional reviews for the student(s) ADDED to the team.
     b. Students reassigned to a new team will be asked to submit a new survey for their new team.
     c. Students in teams where the only change is a REMOVAL of a team member will not receive a notification. Those teams’ survey responses will be retained for the existing team members.

Special cases

Reassign a new participant to an existing or new team

There are two circumstances where the teacher can respond to the Adjusted teamset request Active Warning only by using the LMS team arrangement facilities:

  1. Where an additional team is required that is not yet known to the running peer assessment
  2. Where an additional participant is required who is not yet known to the running peer assessment.

In these special cases the reassignment action must be undertaken by either

  1. Creating the new team using the LMS team arrangement facilities or
  2. Assigning the additional participant to an existing or a new team

In both cases, the new or existing team must then be incorporated into the team arrangement (grouping, group set, teamset) that was used to launch the peer assessment.

Upon returning to the Teachers Dashboard in Peer Assess Pro, the updated team arrangement will be detected and will raise the Active Warning 0021 Team arrangement unsynchronised

  1. Review and confirm the proposed team arrangement in Available Actions > Team Composition to confirm
  2. Initiate ‘Synchronise all’ in response to the Active Warning 0021 Team arrangement unsynchronised.

Quick links and related information

FAQ: How do I correct the team composition in a running peer assessment activity?

FAQ - How do I resolve an unsynchronised team arrangement? ACTIVE WARNING 0021

FAQ - What is an inactive team member? WARNING 0048


FAQ: What is an inactive team member? WARNING 0048

Context: LMS version only

Through the Peer Assess Pro survey, one or more respondents may indicate that one or more of their team members has been inactive. An inactive student may have

  1. Delivered contributions of work that were absent, late, plagiarised, or required substantial rework
  2. Generally failed to attend scheduled team meetings or class activities
  3. Generally failed to respond to communications from team members
  4. Generally failed to deliver on promises made to the team.

 The survey response gives rise to an Active Warning in Peer Assess Pro titled

Critical Warning 0048 Inactive team member

A team member has been reported as essentially absent or unproductive.

Warning detail

The extended detail for the Active Warning displays one or more messages such as:

Lazy Lizzie in Team Zealous has been reported as essentially inactive by 2 team member(s): Zane Gray, Zoe Green.

Absent Abe in Team High Achievers has been reported as essentially inactive by 1 team member(s): Dean Smith.

Team members with the highest number of inactive reports are presented at the top of the list: those most deserving of the teacher’s attention.

Available actions

Peer Assess Pro presents the teacher with several available actions

  1. Send message: Request a meeting with the alleged inactive student to explore the circumstances.
  2. Exclude from calculations: Eliminate the teammate peer assessment ratings given by and awarded to this student from the calculation of the team’s peer assessed scores and personal results. The inactive student remains in the team, earns a missing (zero) grade and can access their personal feedback report.
  3. Reassignment: Remove the student from the peer assessment activity; Reassign the student to another team; or reassign to a new ‘team of one’.

Good practice for addressing an alleged inactive student

  1. Review the personal feedback report of the alleged inactive student. Are the peer assessments of the inactive student consistently low? Do the qualitative remarks by most team members support the low rating and alleged inactivity?
  2. Message: Send a notification to the alleged inactive student.
  3. Determine your choice of action in response to your communications with the alleged inactive student.
  4. Mid-course formative: Gain a commitment from the inactive student to materially improve their contribution to the group work. Determine whether to reassign the student to another team, or to a ‘team of one’. Optionally, implement the reassignment option from the Active Warning on the Peer Assess Pro platform (LMS), or through the platform team arrangement features (Xorro and LMS).
  5. Final summative: If the peer assessment is near the end of the course, you would generally not transfer the student to another team. In this case, select Exclude from calculations to remove the student’s peer assessment ratings. The result is that
     a. The inactive student is awarded missing (zero) for their Personal Result and Peer Assessed Score.
     b. The personal feedback report made available to the inactive student contains all the survey ratings and qualitative feedback provided by their teammates. However, their Personal Result and Peer Assessed Score is missing (zero).
     c. The potential upward bias in calculating personal results for the active teammates that arises from including an inactive student is eliminated. This elimination of bias is illustrated in the example calculations.

Special cases

Reassigning a student to a new ‘team of one’

This reassignment action must be undertaken by creating the new team using the LMS team arrangement facilities.

  1. Create a new team within the team arrangement (grouping, group set, teamset) that was used to launch the peer assessment.
  2. Assign the inactive student to the new team.
  3. ‘Synchronise all’ in response to the Active Warning 0021 Team arrangement unsynchronised that will be raised in response to the creation of the new ‘team of one’

Exclude from calculations overrides actions that address insignificant team agreement

When the Exclude from calculations action has been set for a student through a teacher’s response to Active Warning 0048 Inactive Team Member, this action OVERRIDES certain responses to Active Warning 0041 Insignificant Team Agreement. If the teacher chooses the action Assign same personal result to all members of the team, then only the active team members will earn the relevant personal result. The inactive student will earn a missing result, as dictated by the ‘Exclude from calculations’ action.


Example of exclusion from calculations

The following example shows how excluding the peer assessment results of team member Bridget leads to

  1. An adjustment in the Peer Assessed Scores of the three remaining students, either up or down
  2. A reduction in the range of Personal Results awarded to the remaining active students when using the Normalised Personal Result Method (NPR).
  3. A downwards shift in Personal Results awarded to the remaining active students when using the Normalised Personal Result Method (NPR).
  4. The mean of the results for the remaining team members equals the Team Result in the specific case of the Normalised Personal Result (NPR) and Rank-Based Personal Result Method (RBPR)

Consider the following Peer Assessed SubScores awarded by each team member (Assessor) to their teammates (Assessees).

Calculation of Peer Assessed Scores (PAS) for a team of four members

| ASSESSOR \ ASSESSEE | Bridget | Julian | Lydia | Nigella |
|---|---|---|---|---|
| Bridget | - | 62.5 | 75 | 72.5 |
| Julian | 87.5 | - | 87.5 | 82.5 |
| Lydia | 75 | 82.5 | - | 80 |
| Nigella | 0 | 77.5 | 82.5 | - |
| Peer Assessed Score † | 54 | 74 | 82 | 78 |

† Peer Assessed Score = mean of the scores awarded to the assessee by each assessor, excluding the self-assessed score.

For Lydia, PAS = (75 + 87.5 + 82.5) / 3 = 81.7 ≈ 82

See FAQ: How is the Peer Assessed (PA) Score calculated?


From the Peer Assessed Score for each Assessee we can now calculate a variety of Personal Results from which the teacher may select.

Calculation of Personal Results in a team of four members - original

Spreadfactor = 2.0

| Measure | Bridget | Julian | Lydia | Nigella | Mean |
|---|---|---|---|---|---|
| Peer Assessed Score † | 54 | 74 | 82 | 78 | |
| Peer Assessed Index, PA Index | 66 | 90 | 100 | 95 | |
| Team Result, TR | 50 | 50 | 50 | 50 | |
| Indexed Personal Result, IPR | 33 | 45 | 50 | 48 | 44 |
| Normalised Personal Result, NPR (Spreadfactor = 2) | 28 | 52 | 62 | 58 | 50 |

By definition for the NPR method, the mean of a team’s NPR values equals the team result.

The method of calculation is specified in FAQ: How is the Normalised Personal Result (NPR) calculated? 

Suppose that Assessor Nigella advised the teacher that Assessee Bridget was essentially inactive. The teacher reviews Bridget’s Personal Feedback Report, consults with Bridget, and determines that the ratings of Bridget by Julian and Lydia were somewhat generous. It’s an end-of-course summative peer assessment. Consequently, the teacher chooses to Exclude from calculations the peer assessments relating to Bridget.

The revised calculation of Peer Assessed Scores for the team yields the following table based on the remaining, active team members. The subscores awarded by the three active assessors remain unchanged, but the Peer Assessed Scores will usually be adjusted, possibly up or down for each student due to the exclusion of the inactive student’s ratings.


Re-calculation of Peer Assessed Scores (PAS) with excluded team member

| ASSESSOR \ ASSESSEE | Bridget | Julian | Lydia | Nigella |
|---|---|---|---|---|
| Bridget | - | - | - | - |
| Julian | - | - | 87.5 * | 82.5 * |
| Lydia | - | 82.5 * | - | 80 * |
| Nigella | - | 77.5 * | 82.5 * | - |
| Peer Assessed Score | - | 80 | 85 | 81.25 |

 * Indicates no change from original table

For Lydia, PAS = (87.5 + 82.5) / 2 = 85
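The recalculation can be expressed compactly in code. The following Python sketch recomputes the Peer Assessed Scores with and without the excluded member, reproducing both tables; the data structure is illustrative, not the platform’s implementation.

```python
# Sketch: Peer Assessed Scores from a matrix of subscores, with optional
# exclusion of an inactive member. ratings maps assessor -> {assessee: subscore}.

def peer_assessed_scores(ratings, excluded=()):
    scores = {}
    for assessee in ratings:
        if assessee in excluded:
            continue  # excluded students earn a missing result
        received = [given[assessee]
                    for assessor, given in ratings.items()
                    if assessor != assessee        # self-assessment never used
                    and assessor not in excluded]  # drop the excluded assessor
        scores[assessee] = sum(received) / len(received)
    return scores

ratings = {
    "Bridget": {"Julian": 62.5, "Lydia": 75, "Nigella": 72.5},
    "Julian":  {"Bridget": 87.5, "Lydia": 87.5, "Nigella": 82.5},
    "Lydia":   {"Bridget": 75, "Julian": 82.5, "Nigella": 80},
    "Nigella": {"Bridget": 0, "Julian": 77.5, "Lydia": 82.5},
}
print(peer_assessed_scores(ratings))
# Bridget ~54.2, Julian ~74.2, Lydia ~81.7, Nigella ~78.3 (unrounded)
print(peer_assessed_scores(ratings, excluded={"Bridget"}))
# Julian 80.0, Lydia 85.0, Nigella 81.25
```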

The revised Peer Assessed Scores yield these consequential adjustments to the personal results of the team.

Calculation of Personal Results with excluded team member

Spreadfactor = 2.0

| Measure | Bridget | Julian | Lydia | Nigella | Mean |
|---|---|---|---|---|---|
| Peer Assessed Score, PA Score | - | 80 | 85 | 81 | |
| Peer Assessed Index, PA Index | - | 94 | 100 * | 95 | |
| Team Result, TR | 50 * | 50 * | 50 * | 50 * | |
| Indexed Personal Result, IPR | - | 47 | 50 * | 48 | 48 |
| Normalised Personal Result, NPR (Spreadfactor = 2) | - | 48 | 53 | 49 | 50 * |

 * Indicates no change from original table

By definition, the mean of a team’s NPR values equals the team result.

Impact of exclusion from calculations

Comparing the two tables of personal results, observe that

  1. The Peer Assessed Scores of the remaining team members are changed. The impact could be up or down for any student
  2. The Peer Assessed Index (PAI) and Indexed Personal Result (IPR) for the top-ranked team member remain unchanged at 100 and 50 respectively. However, it is possible that the exclusion of the inactive student’s ratings could affect the within-team ranking of the active team members, in response to adjustments to the Peer Assessed Scores.
  3. Most notably for the Normalised Personal Result (NPR) method
     a. The Personal Results of the active lower-ranked students shift from typically lying above the Team Result to lying below it.
     b. The range of Personal Results awarded to the remaining active students is reduced. For the original whole-team case, including Bridget, the NPR range is 28 to 62, a range of 34. Considering only the active students in the original case, the NPR range is 52 to 62, a range of 10. With Bridget excluded from the calculations, the range is 48 to 53, a range of 5: a drop from 10 points to 5 points in this case.
     c. The Personal Results awarded to the remaining active students shift downwards.

Quick links and related information

FAQ: How is the Peer Assessed (PA) Score calculated?

FAQ: How is the Normalised Personal Result (NPR) calculated?

FAQ: How are peer assessment and personal results calculated and defined mathematically?

FAQ: What is an ‘at risk’ team member? WARNING 0036

FAQ: What is an adjusted teamset request? WARNING 0006


FAQ: What is a low-quality team rating? WARNING 0050

Suppose a team collectively submits a set of peer assessments that are both

  1. HIGH on average, relative to typical Peer Assessed Scores, and
  2. clustered within a LOW (narrow) range.

These features indicate that the team may have engaged unconstructively with the peer assessment process. When both conditions are fulfilled, an Active Warning in Peer Assess Pro is generated:

Critical Warning 0050 Low-quality team rating

A team may have engaged unconstructively with peer assessment

Warning detail

The extended detail for the Active Warning displays one or more messages such as:

Team Alpha may have engaged unconstructively with peer assessment. High Peer Assessment Scores awarded. Average 100 and low range 0.

Team Bravo may have engaged unconstructively with peer assessment. High Peer Assessment Scores awarded. Average 96 and low range 8.

Team Charlie may have engaged unconstructively with peer assessment. High Peer Assessment Scores awarded. Average 96 and low range 8.

The warning will be generated only for members of a valid assessed team, as detailed in

FAQ: What is a valid assessed team?


Threshold for warning of low-quality team rating

The Peer Assess Pro system constant ThresholdTeamAverage is defined as 90.

The Peer Assess Pro system constant ThresholdTeamRange is defined as 11.

The Active Warning Low-quality team rating 0050 is generated for a team when both conditions are true:

  1. Team Average Peer Assessed Score >= ThresholdTeamAverage (90), and
  2. Team Range of Peer Assessed Scores < ThresholdTeamRange (11).

Example calculations

Suppose Team Mike contains 6 members, whose Peer Assessed Scores are shown below. The Average Peer Assessed Score and Range of Peer Assessed Scores are calculated.

Peer Assessed Scores for members of Team Mike

| Name | Peer Assessed Score |
|---|---|
| Annie | 93.75 |
| Emma | 92.55 |
| Joe | 90.85 |
| Freddie | 92.50 |
| Tammy | 95.88 |
| Tilly | 88.32 |
| Team Average | 92 (= 553.85 / 6, rounded) |
| Team Range | 8 (= 95.88 - 88.32 = 7.56, rounded) |

The Team Average Peer Assessed Score and Team Range are examined for every team. A low-quality team rating is identified for those teams that breach the Threshold parameters defined above.

Identification of low-quality team ratings

| Team | Team Average Peer Assessed Score | Team Range | Low Quality Team Rating? |
|---|---|---|---|
| Alpha | 100 | 0 | YES |
| Mike | 92 | 8 | YES |
| November | 87 | 8 | NO |
| Oscar | 95 | 9 | YES |
| Papa | 85 | 9 | NO |
| Quebec | 95 | 10 | YES |
| Romeo | 92 | 12 | NO |
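A minimal sketch of the test follows, assuming the two conditions inferred above (average at or above 90, range below 11); the function name is illustrative.

```python
# Sketch: low-quality team rating test. Condition directions are inferred
# from the worked examples in the table above.

THRESHOLD_TEAM_AVERAGE = 90
THRESHOLD_TEAM_RANGE = 11

def low_quality_team_rating(pa_scores):
    average = sum(pa_scores) / len(pa_scores)
    spread = max(pa_scores) - min(pa_scores)
    return average >= THRESHOLD_TEAM_AVERAGE and spread < THRESHOLD_TEAM_RANGE

team_mike = [93.75, 92.55, 90.85, 92.50, 95.88, 88.32]
print(low_quality_team_rating(team_mike))  # True (average ~92.3, range ~7.6)
```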

Recommended action for facilitator

In general, a team that has this warning may have engaged unconstructively with peer assessment. Most team members have not entered the spirit of the peer assessment process. They may have attempted to ‘game’ the peer assessment by giving everyone well above typical or average ratings.

Peer Assess Pro provides the facilitator with the option to send out an email to all members of the team suggesting they may wish to reconsider their ratings. Furthermore, the students are encouraged to provide qualitative evidence in support of the ratings they have provided.

High performing teams

In a small proportion of teams, it is possible that a high performing team will ALSO have this Active Warning generated. In a high performing team all team members contribute effectively to the results and team processes. This outcome will be evident to the teacher through the team gaining a high Team Result for their submitted work.


Example case

The following graph shows a large first year university class of 848 students who have undertaken their first, formative experience of Peer Assess Pro. Of the 84 valid teams, 24 teams are identified as potentially having a low quality team rating.

[Graph: 84 valid teams, of which 24 are identified as potentially having a low-quality team rating]

In a case like this, the teacher might consider guiding the class of students towards more constructive and discriminating peer assessment before undertaking the final, summative peer assessment. For example, remind students of the purpose of peer feedback, and how to provide useful feedback.

Quick links and related information

FAQ: What is a valid assessed team?

FAQ: What happens if I try to 'game' (fool? play? disrupt?) the peer assessment process?

FAQ: What is the purpose of peer assessment?

FAQ: How do I provide useful feedback to my team members?

FAQ: What is a low quality assessor rating?

Active Warning 0050 Under development

Scheduled for deployment November 2022

Aim: Expanded scope to identify LOW RANGE of Peer Assessment Scores OR High Average Scores

Suppose a team collectively submits a set of peer assessments that is EITHER

  1. HIGH on average, relative to typical Peer Assessed Scores, or
  2. clustered within a LOW (narrow) range.

These features are an indication that the team may have engaged unconstructively with the peer assessment process. When EITHER of these conditions is fulfilled, an Active Warning in Peer Assess Pro is generated:

Critical Warning 0050 Low-quality team rating

A team may have engaged unconstructively with peer assessment

Warning detail

The extended detail for the Active Warning displays one or more messages such as:

Team Alpha may have engaged unconstructively with peer assessment. HIGH Peer Assessment Scores awarded. Average 100. LOW range 10.

Team Mike may have engaged unconstructively with peer assessment. HIGH Peer Assessment Scores awarded. Average 96. Range 40.

Team Quebec may have engaged unconstructively with peer assessment. Peer Assessment Scores awarded. Average 5. LOW range 8.

The warning will be generated only for members of a valid assessed team, as detailed in

FAQ: What is a valid assessed team?


Threshold for warning of low-quality team rating

The Peer Assess Pro system constant ThresholdTeamAverage is defined as 90.

The Peer Assess Pro system constant ThresholdTeamRange is defined as 10.

The Active Warning is generated for a team when EITHER condition is true:

  1. Team Average Peer Assessed Score >= ThresholdTeamAverage (90), or
  2. Team Range of Peer Assessed Scores <= ThresholdTeamRange (10).

Example calculations

Suppose Team Mike contains 5 members, whose Peer Assessed Scores are shown below. The Average Peer Assessed Score and Range of Peer Assessed Scores are calculated.

Peer Assessed Scores for members of Team Mike

| Name | Peer Assessed Score |
|---|---|
| Emma | 93 |
| Joe | 90 |
| Freddie | 92 |
| Tammy | 96 |
| Tilly | 89 |
| Team Average | 92 = 460 / 5 |
| Team Range | 7 = 96 - 89 |

The following Active Warning is generated for Team Mike

Team Mike may have engaged unconstructively with peer assessment. HIGH Peer Assessment Scores awarded. Average 92. LOW Range 7.

The Team Average Peer Assessed Score and Team Range are examined for every valid-assessed team. A low-quality team rating is identified for those teams that breach either of the Threshold parameters defined earlier.

Identification of low-quality team ratings

| Team | Team Average Peer Assessed Score | Team Range | Low Quality Team Rating? |
|---|---|---|---|
| Alpha | 100 HIGH | 0 LOW | YES |
| Mike | 92 HIGH | 7 LOW | YES |
| November | 90 HIGH | 20 | YES |
| Oscar | 87 | 8 LOW | YES |
| Papa | 85 | 12 | NO |
| Quebec | 50 | 10 LOW | YES |
| Romeo | 50 | 12 | NO |
| Sierra | 20 | 20 | NO |

Recommended action for facilitator

In general, a team that has this warning may have engaged unconstructively with peer assessment. Most team members have not entered the spirit of the peer assessment process. They may have attempted to ‘game’ the peer assessment by giving everyone well above typical or average ratings.

Peer Assess Pro provides the facilitator with the option to send out an email to all members of the team suggesting they may wish to reconsider their ratings. Furthermore, the students are encouraged to provide qualitative evidence in support of the ratings they have provided.

High-performing teams

In a small proportion of teams, it is possible that a high-performing team will ALSO have this Active Warning generated. In a high-performing team all team members contribute effectively to the results and team processes. This outcome will be evident to the teacher through the team gaining a high Team Result for their submitted work.

Example case

The following graph shows a large first year university class of 848 students who have undertaken their first, formative experience of Peer Assess Pro. Of the 84 valid teams, 24 teams are identified as potentially having a low quality team rating.

[Graph: 84 valid teams, of which 24 are identified as potentially having a low-quality team rating]

In a case like this, the teacher might consider guiding the class of students towards more constructive and discriminating peer assessment before undertaking the final, summative peer assessment. For example, remind students of the purpose of peer feedback, and how to provide useful feedback.

Quick links and related information

FAQ: What is a valid assessed team?

FAQ: What happens if I try to 'game' (fool? play? disrupt?) the peer assessment process?

FAQ: What is the purpose of peer assessment?

FAQ: How do I provide useful feedback to my team members?

FAQ: What is a low quality assessor rating?


FAQ: What is a low-quality assessor rating? WARNING 0300

Suppose a team member submits a set of peer assessments that are both

  1. HIGH on average, relative to typical Peer Assessed Subscores, and
  2. clustered within a LOW (narrow) range.

These features indicate that the team member may have engaged unconstructively with the peer assessment process. When both conditions are fulfilled, an Active Warning in Peer Assess Pro is generated:

Critical Warning 0300 Low-quality assessor rating

An assessor may have engaged unconstructively with peer assessment.

Warning detail

The extended detail for the Active Warning displays one or more messages such as:

Tony may have engaged unconstructively with peer assessment. High Peer Assessment Scores awarded. Average 100 and low range 0. Team Alpha

Kathy may have engaged unconstructively with peer assessment. High Peer Assessment Scores awarded. Average 85 and low range 8. Team Bravo

The warning will be generated only for members of a valid assessed team, as detailed in

FAQ: What is a valid assessed team?

Threshold for warning of low quality assessor rating

The Peer Assess Pro system constant ThresholdAssessorAverage is defined as 85.

The Peer Assess Pro system constant ThresholdAssessorRange is defined as 9.

The Active Warning is generated for an assessor when BOTH conditions are true:

  1. Average of the PA Subscores awarded by the assessor >= ThresholdAssessorAverage (85), and
  2. Range of the PA Subscores awarded by the assessor < ThresholdAssessorRange (9).

Example calculations

Suppose Kathy, a member of Team Bravo, assesses all her fellow team members as follows:

Peer Assessed Subscores awarded by Kathy in Team Bravo

| Name | Peer Assessed Subscore |
|---|---|
| Garry | 90 |
| Dan | 87.5 |
| Sunny | 82.5 |
| Freddie | 82.5 |
| Robby | 82.5 |
| Average | 85 = 425 / 5 |
| Range | 7.5 = 90 - 82.5 |

The Average Peer Assessed Subscore and Range are examined for every assessor. A low-quality assessor rating is identified for those individuals that breach the threshold parameters defined above.

Identification of low-quality individual assessor ratings

| Assessor | Average PA Score (awarded) | Range (PAS Units) | Low quality assessor rating? |
|---|---|---|---|
| Tony | 100 | 0.0 | Y |
| Andy | 93 | 0.0 | Y |
| Jess | 75 | 0.0 | N |
| Johnny | 93 | 2.5 | Y |
| Chance | 82 | 2.5 | N |
| Kathy | 85 | 7.5 | Y |
| Zara | 83 | 7.5 | N |
| Riley | 97 | 10.0 | N |
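The per-assessor test mirrors the team-level test, with the assessor thresholds above. A minimal sketch, with the condition directions again inferred from the worked examples:

```python
# Sketch: low-quality assessor rating test over the subscores one assessor
# awarded. Condition directions inferred from the table above.

THRESHOLD_ASSESSOR_AVERAGE = 85
THRESHOLD_ASSESSOR_RANGE = 9

def low_quality_assessor_rating(awarded):
    average = sum(awarded) / len(awarded)
    spread = max(awarded) - min(awarded)
    return (average >= THRESHOLD_ASSESSOR_AVERAGE
            and spread < THRESHOLD_ASSESSOR_RANGE)

kathy = [90, 87.5, 82.5, 82.5, 82.5]
print(low_quality_assessor_rating(kathy))  # True (average 85.0, range 7.5)
```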

Recommended action for facilitator

In general, an individual with this warning may (or may not!) have engaged unconstructively with peer assessment. The team member may not have entered the spirit of the peer assessment process. They may have attempted to ‘game’ the peer assessment by giving everyone well above typical or average ratings.

Peer Assess Pro provides the facilitator with the option to send out an email to the assessor suggesting they may wish to reconsider their ratings. Furthermore, the student is encouraged to provide qualitative evidence in support of the ratings they have provided.

High performing teams

In a small proportion of teams, it is possible that a member of a high performing team will ALSO have this Active Warning generated. In a high performing team all team members contribute effectively to the results and team processes. Consequently, it is reasonable to expect a high average Peer Assessed Score to be awarded most members, with a concurrent low range. This outcome will be evident to the teacher through the assessor’s team gaining a high Team Result for their submitted work.

Quick links and related information

FAQ: What is a valid assessed team?

FAQ: What happens if I try to 'game' (fool? play? disrupt?) the peer assessment process?

FAQ: What is the purpose of peer assessment?

FAQ: How do I provide useful feedback to my team members?

FAQ: What is a low quality team rating?


FAQ: What is a valid assessed team? WARNING 0022

Peer Assess Pro restricts the display of results to teachers and students when only a small number of peer assessments from a team have been submitted: fewer than three, or fewer than half the team. This restriction gives rise to an Active Warning in Peer Assess Pro titled

Critical Warning 0022 Insufficient team responses

The number of responses from a team is insufficient for presenting valid results.

Warning detail

The following extended detail is provided

Alpha has received 2 team member responses. Minimum required 3 from team size 4

Bravo has received 3 team member responses. Minimum required 4 from team size 6

Results not displayed to members of non-valid assessed teams

Peer Assess Pro restricts the display of results to valid assessed teams. The notion of a valid assessed team prevents the display of results to students (and facilitators) when only a small number of peer assessments from a team has been submitted. Such a low-response situation could distort the reliability and accuracy of the team’s peer assessment and personal result calculations, and of the ACTIVE WARNING messages for a team. Consequently, class statistics such as mean, maximum, range, and standard deviation are calculated only for team members that are designated as part of a valid assessed team.

Students can only view results if they belong to a valid assessed team.

A facilitator may only view results from valid assessed teams.


How many valid and invalid teams do I have?

The Teacher’s Dashboard Active Warnings and (i) Information button inform you of the number of valid teams and valid assessments throughout the progress of managing the peer assessment responses. The Active Warning enables you to ‘hunt down’ the teams that have not yet achieved valid status.

Recommended action for facilitator

Peer Assess Pro generates an email that the teacher can send optionally to members of a non-valid team who have not yet responded. The email reminds students to respond by the Due Date.

Mathematical definition

For teams with five or fewer members, a valid assessed team must have peer ratings from at least three members of the team. For teams with six or more members, ‘just over’ half the team members must peer assess. The required minimum number of team members $m$ who must rate within a particular team of size $n$ members is defined as:

$m = \operatorname{MAX}\left(3, \operatorname{INT}\left(\frac{n}{2} + 1\right)\right)$

Where

$m$ = the minimum number of team members required to rate within a particular team

$\operatorname{MAX}$ is a function that selects the maximum of the calculated values

$\operatorname{INT}$ is a function that calculates the integer value of the result

For teams of size 0, 1 and 2, peer assessment results are not calculated. The default Personal Result in these circumstances is the Team Result.
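A minimal sketch of the rule, with math.floor standing in for the INT function:

```python
# Sketch: minimum assessors required for a valid assessed team,
# m = MAX(3, INT(n/2 + 1)), per the definition above.

import math

def min_required_assessors(n):
    return max(3, math.floor(n / 2 + 1))

for n in range(3, 11):
    print(n, min_required_assessors(n))
# 3 3, 4 3, 5 3, 6 4, 7 4, 8 5, 9 5, 10 6
```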

Example calculations

| Team size, n | Required minimum assessors, m | Proportion of whole team |
|---|---|---|
| 3 | 3 | 100% |
| 4 | 3 | 75% |
| 5 | 3 | 60% |
| 6 | 4 | 66% |
| 7 | 4 | 57% |
| 8 | 5 | 62% |
| 9 | 5 | 56% |
| 10 | 6 | 60% |

Quick links and related information

FAQ: How is the Peer Assessed (PA) Score calculated?

FAQ: How are peer assessment and personal results calculated and defined mathematically?


FAQ: What is an ‘at-risk’ team member? WARNING 0036

An ‘at-risk’ team member has been rated amongst the bottom 10 per cent of students in the class as measured by

  1. Personal Result,
  2. Recommendation, or
  3. Peer Assessed Score.

A low rating on any of these assessments gives rise to an Active Warning in Peer Assess Pro titled

Critical Warning 0036 At-risk team member

A team member has been rated at-risk.

Warning detail

The following extended detail is provided

Anne Smith is at-risk. Personal Result 25 LOW. Recommendation 2.3 LOW. Peer Assessment Score 90. IRSA 60 OVERCONFIDENT. Team Alpha.

The warning identifies with ‘LOW’ which of the measures presented falls in the at-risk range.

The Index of Realistic Self Assessment, IRSA, is also flagged OVERCONFIDENT or UNDERCONFIDENT when the threshold for overconfidence or underconfidence is exceeded, as defined in Critical Warning 0040 Mismatched self-assessment.

The most at-risk students are listed first. Specifically, the list of at-risk students is sorted by

  1. Personal Result ascending, then
  2. Recommendation ascending, then
  3. Peer Assessed Score ascending

Recommended action for facilitator

Peer Assess Pro generates an email that the teacher can send optionally to team members with an ‘at risk’ rating. The email requests that the team member make an appointment to meet promptly with the teacher to discuss their peer assessment results so they can develop a more productive contribution to the team's future outputs, processes and leadership.

The facilitator could view the Personal Feedback Reports of low-rated team members, examining the qualitative feedback given. You should expect that the qualitative feedback will confirm the low Recommendation or Peer Assessed Scores. It will be helpful to have reviewed these Personal Feedback Reports before interviewing and counselling the at-risk students who visit you.

Furthermore, a low value of IRSA, less than 75, suggests that the at-risk student is likely to be surprised or angered by the low peer assessment and/or recommendation provided by their teammates. See the WARNING that is generated by this latter condition:

FAQ: What is a mismatched self-assessment (IRSA)? WARNING 0040

Threshold for warning of ‘at-risk’ team member

The Peer Assess Pro system constants ThresholdPaScore, ThresholdRecommendation and ThresholdPersonalResult are defined to identify approximately the bottom 10 per cent of students in the class.

Where

ThresholdPersonalResult = PERCENTILE(10%, Personal Results of the class)

ThresholdRecommendation = PERCENTILE(10%, Recommendations of the class)

ThresholdPaScore = PERCENTILE(10%, Peer Assessed Scores of the class)

The Active Warning is generated for a team member when ANY condition is true:

  1. Personal Result <= ThresholdPersonalResult, or
  2. Recommendation <= ThresholdRecommendation, or
  3. Peer Assessed Score <= ThresholdPaScore.

The test is conducted only when the team has sufficient assessments to qualify as a valid assessed team.

The PERCENTILE function identifies the point in the sorted array of results below which 10 per cent of the scores in the array fall. Note that the MEDIAN of a set of results is equivalent to PERCENTILE(50%,...) of those results.


Example calculation

Consider this dataset of results from an end-of-course group assignment comprising 16 students in four teams.

Original dataset of personal results and other peer assessment data sorted by personal results

| Name | Team | Personal result | Recommendation | Peer Assessed Score | IRSA | Realism |
|---|---|---|---|---|---|---|
| Jesse Crane | Game Plan | 27.6 | 3.0 | 70.0 | 72 | OVERCONFIDENT |
| Jakob Bradley | Game Plan | 29.3 | 4.5 | 73.8 | 87 | REALISTIC |
| Brogan Madden | Game Plan | 36.1 | 4.0 | 88.8 | 178 | UNDERCONFIDENT |
| Jayson Mayo | Atomic Bombs | 40.7 | 3.0 | 41.7 | 48 | OVERCONFIDENT |
| Cecilia Rosales | Cupcakes | 50.6 | 4.0 | 75.0 | 97 | REALISTIC |
| Alfredo Koch | Cupcakes | 62.9 | 5.0 | 90.0 | 113 | REALISTIC |
| Ean Cisneros | Atomic Bombs | 64.3 | 3.7 | 63.3 | 101 | REALISTIC |
| Kylan Shea | Cupcakes | 64.9 | 4.7 | 92.5 | 92 | REALISTIC |
| Aubrey Jarvis | Atomic Bombs | 69.5 | 3.5 | 68.1 | MISSING | |
| Jesse Hughes | Cupcakes | 69.7 | 4.7 | 98.4 | 98 | REALISTIC |
| Elsie Riggs | College Dropouts | 73.3 | 3.3 | 59.2 | 64 | OVERCONFIDENT |
| Dalton Vincent | College Dropouts | 75.4 | 3.7 | 60.9 | 66 | OVERCONFIDENT |
| Aryan Huffman | Atomic Bombs | 83.3 | 4.0 | 80.8 | 92 | REALISTIC |
| Ariel Peterson | Atomic Bombs | 84.3 | 4.7 | 81.7 | 91 | REALISTIC |
| Cortez Farmer | College Dropouts | 100.0 | 5.0 | 97.5 | 126 | UNDERCONFIDENT |
| Dakota Stafford | College Dropouts | 100.0 | 5.0 | 95.1 | 106 | REALISTIC |

The following Active Warnings will be presented, with the most at-risk presented first in the list.

A team member has been rated at-risk in class (4)

Active Warning 0036 At-risk team member

Jesse Crane is at-risk. Personal Result (NPR) 28 LOW. Recommendation 3.0 LOW. Peer Assessed Score (PAS) 70. Self-assessment (IRSA) 72 OVERCONFIDENT. Team Game Plan.

Jakob Bradley is at-risk. Personal Result (NPR) 29 LOW. Recommendation 4.5. Peer Assessed Score (PAS) 74. Self-assessment (IRSA) 87 REALISTIC. Team Game Plan.

Jayson Mayo is at-risk. Personal Result (NPR) 41. Recommendation 3.0 LOW. Peer Assessed Score (PAS) 42 LOW. Self-assessment (IRSA) 48 OVERCONFIDENT. Team Atomic Bombs.

Elsie Riggs is at-risk. Personal Result (NPR) 73. Recommendation 3.3. Peer Assessed Score (PAS) 59 LOW. Self-assessment (IRSA) 64 OVERCONFIDENT. Team College Dropouts.

The thresholds that determine the foregoing at-risk selections are calculated from the class results in the following table, alongside other class statistics.

Calculation of 10 per cent thresholds

| Selected statistics | Personal result | Recommendation (p_rec) | PA Score (pa_score) |
|---|---|---|---|
| SDEV | 22.6 | 0.7 | 16.2 |
| MAXIMUM | 100.0 | 5.0 | 98.4 |
| MEAN | 64.5 | 4.1 | 77.3 |
| MEDIAN | 67.2 | 4.0 | 77.9 |
| MINIMUM | 27.6 | 3.0 | 41.7 |
| THRESHOLD @ 10% PERCENTILE | 29.3 | 3.0 | 59.2 |

In this case of n = 16 assessed students, the 10th percentile occurs at the 2nd lowest item in the sorted results for each variable, according to the nearest-rank formula

$k = \lceil 0.10 \times n \rceil = \lceil 1.6 \rceil = 2$

The original dataset of results is colour-coded below, in which the at-risk candidates for selection are highlighted with *. Threshold values are highlighted by ¶, the second item counted upwards in the sorted column of results, since k = 2 for this case.

Note how Brogan Madden is not selected, as he is better than the cutoff threshold on all three statistics. In contrast, Jayson Mayo is selected because of his below-threshold Recommendation and Peer Assessed Score, the lowest values in the dataset. Similarly, Elsie Riggs is selected because of her low-ranking Peer Assessed Score.

Source data sorted, identifying threshold points and at-risk selections

| Name | Team | Personal result | Recommendation | Peer Assessed Score | IRSA | Realism |
|---|---|---|---|---|---|---|
| * Jesse Crane | Game Plan | 27.6 | 3.0 | 70.0 | 72 | OVERCONFIDENT |
| * Jakob Bradley | Game Plan | ¶ 29.3 | 4.5 | 73.8 | 87 | REALISTIC |
| Brogan Madden | Game Plan | 36.1 | 4.0 | 88.8 | 178 | UNDERCONFIDENT |
| * Jayson Mayo | Atomic Bombs | 40.7 | ¶ 3.0 | 41.7 | 48 | OVERCONFIDENT |
| Cecilia Rosales | Cupcakes | 50.6 | 4.0 | 75.0 | 97 | REALISTIC |
| Alfredo Koch | Cupcakes | 62.9 | 5.0 | 90.0 | 113 | REALISTIC |
| Ean Cisneros | Atomic Bombs | 64.3 | 3.7 | 63.3 | 101 | REALISTIC |
| Kylan Shea | Cupcakes | 64.9 | 4.7 | 92.5 | 92 | REALISTIC |
| Aubrey Jarvis | Atomic Bombs | 69.5 | 3.5 | 68.1 | MISSING | |
| Jesse Hughes | Cupcakes | 69.7 | 4.7 | 98.4 | 98 | REALISTIC |
| * Elsie Riggs | College Dropouts | 73.3 | 3.3 | ¶ 59.2 | 64 | OVERCONFIDENT |
| Dalton Vincent | College Dropouts | 75.4 | 3.7 | 60.9 | 66 | OVERCONFIDENT |
| Aryan Huffman | Atomic Bombs | 83.3 | 4.0 | 80.8 | 92 | REALISTIC |
| Ariel Peterson | Atomic Bombs | 84.3 | 4.7 | 81.7 | 91 | REALISTIC |
| Dakota Stafford | College Dropouts | 100.0 | 5.0 | 97.5 | 106 | REALISTIC |
| Cortez Farmer | College Dropouts | 100.0 | 5.0 | 95.1 | 126 | UNDERCONFIDENT |

* At-risk students selected

¶ Threshold value based on 10th PERCENTILE
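The threshold and selection logic can be sketched as follows, assuming the nearest-rank percentile rule shown earlier; the field names are illustrative.

```python
# Sketch: nearest-rank 10th-percentile thresholds and the ANY-condition
# at-risk test described above. Field names are illustrative.

import math

def percentile_nearest_rank(values, p=0.10):
    ordered = sorted(values)
    k = math.ceil(p * len(ordered))  # k = 2 when n = 16
    return ordered[k - 1]

def at_risk(students):
    t_pr = percentile_nearest_rank([s["personal_result"] for s in students])
    t_rec = percentile_nearest_rank([s["recommendation"] for s in students])
    t_pas = percentile_nearest_rank([s["pa_score"] for s in students])
    flagged = [s for s in students
               if s["personal_result"] <= t_pr
               or s["recommendation"] <= t_rec
               or s["pa_score"] <= t_pas]
    # most at-risk first: personal result, then recommendation, then PA Score
    return sorted(flagged, key=lambda s: (s["personal_result"],
                                          s["recommendation"],
                                          s["pa_score"]))
```

Applied to the 16-student dataset above, this selects Jesse Crane, Jakob Bradley, Jayson Mayo, and Elsie Riggs, matching the Active Warnings presented earlier.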

Alternative approaches to identifying at-risk students

The teacher has several graphical approaches to identifying the most at-risk students in their class. See also

FAQ: What is a mismatched self-assessment (IRSA)? WARNING 0040

[Figure: Identifying at-risk students from a sorted table of Recommendations]

[Figure: Identifying at-risk students from a sorted table of Peer Assessed Scores with concurrent examination of Self-Assessment]

Quick links and related information

FAQ: What steps can I take to get a better personal result?

FAQ: What is a valid assessed team?

FAQ: How is the Peer Assessed (PA) Score calculated?

FAQ: How is the Index of Realistic Self Assessment (IRSA) calculated?

FAQ: What is a mismatched self-assessment (IRSA)? WARNING 0040


FAQ: What is a team with low psychological safety? WARNING 0034

Low team psychological safety is often associated with poor teamwork processes and, therefore, the prospect of poor team results (Cauwelier et al., 2016; Edmondson, 1999; Edmondson & Lei, 2014; Google Re:Work, n.d.; Google re:Work, n.d.; Kim et al., 2020).

A team is collectively designated ‘unsafe’ when the team’s median response to the Peer Assess Pro survey psychological safety statement lies amongst the lowest responses in the class, at or below a threshold of 1.5.

Critical Warning 0034 Low team psychological safety

A team has identified that several members feel psychologically unsafe in their team.

Warning detail

The following extended detail is provided

Team Bravo teammates have stated they feel strongly unsafe about taking risks, making mistakes, or challenging their teammates. Team Safety 1.5.

The warning details are sorted by increasing value of Team Safety. Those teams at the top of the list are more likely to require the teacher’s intervention.

Definitions

The Active Warning 0034 Low team psychological safety is raised when

$TeamSafety_t \leq ThresholdTeamSafety$

The team psychological safety for team $t$ is defined as

$TeamSafety_t = \operatorname{median}(PS_1, PS_2, \ldots, PS_{n_t})$

Where

$PS_i$ = the psychological safety of team member $i$

$n_t$ = the number of survey respondents in team $t$

$ThresholdTeamSafety$ = 1.5

Rather than using the mean value, the median is used because it is not skewed by a small proportion of extremely large or small values. The median provides a better representation of a ‘typical’ value for the team.

In general, psychological safety is measured using multiple statements to which team members are asked to rate their strength of agreement or disagreement (Edmondson, 1999; Google Re:Work, n.d.).

  1. If you make a mistake on this team, it is often held against you.
  2. Members of this team are able to bring up problems and tough issues.
  3. People on this team sometimes reject others for being different.
  4. It is safe to take a risk on this team.
  5. It is difficult to ask other members of this team for help.
  6. No one on this team would deliberately act in a way that undermines my efforts.
  7. Working with members of this team, my unique skills and talents are valued and utilized.

In pursuit of parsimony, Peer Assess Pro uses one statement that captures the most important essence of the construct of team psychological safety.

‘In my team, I feel safe to take risks, make mistakes, speak about tough issues, or challenge my teammates.’

On a five-point Likert scale from 1 to 5, the team members respond from 1 = ‘Strongly disagree’ through 5 = ‘Strongly agree’. This response is the value $PS_i$ presented in the earlier definition of Team Safety.

Team Safety is calculated only for teams that meet the definition of a ‘valid assessed team’.

Example calculation for Team Safety

The team members of Team Alpha have self-reported their psychological safety as follows.

Example 1

Psychological safety self-reported from a team of five members

| Team | Bridget | Patrick | Julian | Lydia | Nigella |
|---|---|---|---|---|---|
| Alpha | 1 | 5 | 2 | 1 | 5 |

Therefore, for Team Alpha

$TeamSafety_{Alpha} = \operatorname{median}(1, 5, 2, 1, 5) = 2$

Since 2 is greater than the threshold of 1.5, no warning is raised for Team Alpha.

Example 2

The team members of Team Bravo have self-reported their psychological safety as follows.

Psychological safety self-reported from a team of four members

| Team | Gillian | Jenny | Jules | Jollion |
|---|---|---|---|---|
| Bravo | 1 | 5 | 1 | 2 |

Therefore, for Team Bravo

$TeamSafety_{Bravo} = \operatorname{median}(1, 5, 1, 2) = 1.5$

Since 1.5 is at or below the threshold of 1.5, the warning is raised.

The Active Warning 0034 Low team psychological safety will be raised for Team Bravo, but not for Team Alpha.

Note that the related Critical Warning 0035 Unsafe team member will be raised for Bridget and Lydia in Team Alpha, and for Gillian and Jules in Team Bravo.
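A minimal sketch of the team-safety test, using Python’s statistics.median:

```python
# Sketch: the low-team-psychological-safety test, per the definition above.

from statistics import median

THRESHOLD_TEAM_SAFETY = 1.5

def low_team_safety(responses):
    """responses: each member's 1-5 Likert rating of the safety statement."""
    return median(responses) <= THRESHOLD_TEAM_SAFETY

print(low_team_safety([1, 5, 2, 1, 5]))  # Team Alpha: median 2 -> False
print(low_team_safety([1, 5, 1, 2]))     # Team Bravo: median 1.5 -> True
```

The median, rather than the mean, keeps one extreme rating from dominating the team's result, as noted above.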

Recommended action for facilitator

Peer Assess Pro generates an email that the teacher can send optionally to all members of teams that present an unsafe team psychological health. The email requests that the team members make an appointment to meet promptly with the teacher.

The email states

…several of your team members stated that they 'Strongly Disagree' with the survey question 'In my team, I feel safe to take risks, make mistakes, speak about tough issues, or challenge my teammates'. The team's response is a symptom that your team may have low psychological safety. Low team psychological safety is often associated with poor teamwork processes by your team as a whole, and, therefore, poor team results in the future. Please make an appointment to meet promptly with your teacher to discuss the team's response, so the teacher can coach your team to improve its future processes, leadership and outputs.

Consider undertaking these actions with the team.

  1. Help the team apply the Manager Actions for Psychological Safety (Google Re:Work)
  2. Promote courageous conversations among your students (Step 6 in Mellalieu, 2020).
  3. Conduct a feedback event amongst the team under your guidance (Step 6 in Mellalieu, 2020).
  4. Help students understand and respond to personal feedback reports that mismatch their self-expectation.

Quick links and related information

FAQ: What is an unsafe team member? WARNING 0035

FAQ: What is an ‘at risk’ team member? WARNING 0036

Cauwelier, P., Ribière, V. M., & Bennet, A. (2016). Team Psychological Safety and Team Learning: A Cultural Perspective. The Learning Organization, 23(6), 458–468.

Edmondson, A. C. (1999). Psychological Safety and Learning Behavior in Work Teams. Administrative Science Quarterly, 44(2), 350–383. https://doi.org/10.2307/2666999

Edmondson, A. C., & Lei, Z. (2014). Psychological Safety: The History, Renaissance, and Future of an Interpersonal Construct. Annual Review of Organizational Psychology and Organizational Behavior, 1(1), 23–43. https://doi.org/10.1146/annurev-orgpsych-031413-091305

Google Re:Work. (n.d.). [Re_Work] Manager Actions for Psychological Safety. https://docs.google.com/document/d/1PsnDMS2emcPLgMLFAQCXZjO7C4j2hJ7znOq_g2Zkjgk/export?format=pdf

Google re:Work. (n.d.). Understand Team Effectiveness [Guide]. Retrieved 18 November 2019, from https://rework.withgoogle.com/print/guides/5721312655835136/

Kim, S., Lee, H., & Connerton, T. P. (2020). How Psychological Safety Affects Team Performance: Mediating Role of Efficacy and Learning Behavior. Frontiers in Psychology, 11, 1581. https://doi.org/10.3389/fpsyg.2020.01581

Mellalieu, P. J. (2020). STEP 6—Promote courageous conversations among your students. In How to teach using group assignments: The 7 step formula for fair and effective team assessment (ePub 1.0, Chapter 8). Peer Assess Pro. https://www.peerassesspro.com/encourage-courageous-conversations/


FAQ: What is an unsafe team member? WARNING 0035

Low team psychological safety is often associated with poor teamwork processes and, therefore, the prospect of poor team results (Cauwelier et al., 2016; Edmondson, 1999; Edmondson & Lei, 2014; Google Re:Work, n.d.; Google re:Work, n.d.; Kim et al., 2020).

A team member is designated ‘unsafe’ when they respond ‘Strongly disagree’ to the survey statement

‘In my team, I feel safe to take risks, make mistakes, speak about tough issues, or challenge my teammates.’

On a five-point Likert scale from 1 to 5, the student has responded 1 = ‘Strongly disagree’.

A ‘Strongly disagree’ rating gives rise to an Active Warning in Peer Assess Pro titled

Critical Warning 0035 Unsafe team member

A team member has identified they feel psychologically unsafe in their team.

Warning detail

The following extended detail is provided

Anna Smith has stated they feel strongly unsafe about taking risks, making mistakes, or challenging their teammates. Recommendation 1.2. Team Alpha.

The warning details are sorted by increasing value of Recommendation. Those team members at the top of the list are more likely to require the teacher’s intervention.

Recommended action for facilitator

Peer Assess Pro generates an email that the teacher can send optionally to team members with an unsafe self-rating. The email requests that the team member make an appointment to meet promptly with the teacher. The team member is advised

‘Your response is a symptom that your team as a whole has relatively weak psychological safety. Low team psychological safety is often associated with poor teamwork processes and, therefore, poor team results in the future. Please make an appointment to meet promptly with your teacher to discuss your response, so the teacher can coach your team to improve its future processes, leadership and outputs.’

For further guidance on how to improve a team member’s psychological health, see the recommended actions for FAQ: What is a team with low psychological safety? WARNING 0034

Limitations

In pursuit of parsimony, Peer Assess Pro uses one statement to capture the most important essence of the construct of team psychological safety. In general, psychological safety is measured using multiple statements to which team members are asked to rate their strength of agreement or disagreement (Edmondson, 1999; Google Re:Work, n.d.).

  1. If you make a mistake on this team, it is often held against you.
  2. Members of this team are able to bring up problems and tough issues.
  3. People on this team sometimes reject others for being different.
  4. It is safe to take a risk on this team.
  5. It is difficult to ask other members of this team for help.
  6. No one on this team would deliberately act in a way that undermines my efforts.
  7. Working with members of this team, my unique skills and talents are valued and utilized.


Quick links and related information

FAQ: What is a team with low psychological safety? WARNING 0034

FAQ: What is an ‘at risk’ team member? WARNING 0036

Cauwelier, P., Ribière, V. M., & Bennet, A. (2016). Team Psychological Safety and Team Learning: A Cultural Perspective. The Learning Organization, 23(6), 458–468.

Edmondson, A. C. (1999). Psychological Safety and Learning Behavior in Work Teams. Administrative Science Quarterly, 44(2), 350–383. https://doi.org/10.2307/2666999

Edmondson, A. C., & Lei, Z. (2014). Psychological Safety: The History, Renaissance, and Future of an Interpersonal Construct. Annual Review of Organizational Psychology and Organizational Behavior, 1(1), 23–43. https://doi.org/10.1146/annurev-orgpsych-031413-091305

Google Re:Work. (n.d.). [Re_Work] Manager Actions for Psychological Safety. https://docs.google.com/document/d/1PsnDMS2emcPLgMLFAQCXZjO7C4j2hJ7znOq_g2Zkjgk/export?format=pdf

Google re:Work. (n.d.). Understand Team Effectiveness [Guide]. Retrieved 18 November 2019, from https://rework.withgoogle.com/print/guides/5721312655835136/

Kim, S., Lee, H., & Connerton, T. P. (2020). How Psychological Safety Affects Team Performance: Mediating Role of Efficacy and Learning Behavior. Frontiers in Psychology, 11, 1581. https://doi.org/10.3389/fpsyg.2020.01581


FAQ - What emails have been sent by the platform?

Survey notifications history

Notifications History shows the email notifications that have been sent by the Peer Assess Pro platform to participants. The history also records event notifications sent by email to the Facilitator. The delivery status of the email is designated as SENT, DELIVERED or FAILED.

Notifications History shows emails that are sent automatically by the Peer Assess Pro platform, and those initiated by the facilitator in response to Active Warnings.

The Notifications History feature is presented at the very bottom of the Facilitator Dashboard. Click on the column Emails sent, then Message/View to examine the email sent to a specific recipient.

Track-and-trace of emails to participants

The Notifications History is helpful for audit purposes such as when a student denies receiving an email from you.

The delivered status of the message is designated as SENT, DELIVERED, or FAILED.

Confirmation of the final state of SENT emails can take a few minutes, or even hours. You’ll need to REFRESH or RELOAD your Running Activity to update the status of the Notifications History.

[Figure: Overview of the Survey Notifications History feature in Peer Assess Pro]

[Figure: Delivered status of messages sent from the Peer Assess Pro platform]

Quick links and related information

FAQ: What is the content of emails sent by Peer Assess Pro to Participants?

FAQ: What is the content of emails sent by Peer Assess Pro to Facilitators?

FAQ - How do I fix an invalid, missing or failed email delivery? WARNING 0026


FAQ: What is the content of emails sent by Peer Assess Pro to Participants?

Preview email from Active Warnings

In the Active Warnings section of the Facilitators Dashboard, select Preview Email

  1. You can send the email using the Peer Assess Pro platform selecting Send Emails
  2. Optionally, you can copy the body of the email into your own email app, then edit and send as you desire.

Note that a record of the email sent from Active Warnings is recorded in the Survey Notifications History. See FAQ - What emails have been sent by the platform?

Preview all emails available for sending

The first table shows the SUBJECT title of each email generated in response to various automated events, and in response to warnings actioned by the Facilitator.

The second table shows the detailed content of each email. The Facilitator can, of course, copy this template text, modify, and send their own email.

Some emails are automatically generated by the Peer Assess Pro platform, such as 0011 Request to COMPLETE peer assessment and 0013 RESUBMIT peer assessment due to TEAM CHANGE.

Other emails are sent under the direction of the Facilitator when they respond to an Active Warning. Examples include 0103 WARNING Request to RECONSIDER peer assessment: Assessor unconstructive

The content of the emails generated by Peer Assess Pro undergoes regular review and improvement, so the emails you receive may not match exactly the details presented here.

Quick links and related information

FAQ: What is the content of emails sent by Peer Assess Pro to Facilitators?

FAQ - What emails have been sent by the platform?

FAQ - How do I fix an invalid, missing or failed email delivery? WARNING 0026


Table of email subjects sent to participants

Email ID - Priority - Short Descriptor

SUBJECT

0011 CRITICAL - Participant - Request to COMPLETE peer assessment

SUBJECT - Please complete peer assessment due by << Due Date >>. <<Activity Title>>

0012 CRITICAL - Participant - REMINDER to complete peer assessment

SUBJECT - REMINDER! Please complete peer assessment due by << Due Date >>. <<Activity Title>>

0013 CRITICAL - Participant - RESUBMIT peer assessment due to TEAM CHANGE

SUBJECT - RESUBMIT! Please complete peer assessment due by << Due Date >>. <<Activity Title>>

0020 CRITICAL - Participant - ABANDONED Peer Assessment activity.

SUBJECT - ABANDONED peer assessment due by << Due Date >>. <<Activity Title>>

0103 WARNING - Participant - Request to RECONSIDER peer assessment: Assessor unconstructive

SUBJECT - Request to reconsider peer assessment due by << Due Date >>. <<Activity Title>>

1001 ADVISORY - Participant - Personal results PUBLISHED and available to view

SUBJECT - Please view your personal results for peer assessment due by << Due Date >>. <<Activity Title>>

1002 ADVISORY - Participant - REVISED personal results published and available to view

SUBJECT - REVISED RESULTS! Please view your personal results for peer assessment due by << Due Date >>. <<Activity Title>>

1003 ADVISORY - Participant - FINALISED personal results published and available to view

SUBJECT - FINALISED RESULTS! Please view your personal results for peer assessment <<Activity Title>>. Available until << finalisation date + 2 weeks >>

1004 ADVISORY - Participant - Personal results PUBLISHED but NOT available to view

SUBJECT - Incomplete submissions from your team for peer assessment due by << Due Date >>. <<Activity Title>>

1005 ADVISORY - Participant - FINALISED personal results published but NOT available to view

SUBJECT - FINALISED RESULTS: Incomplete submissions from your team for peer assessment due by << Due Date >>. <<Activity Title>>


Table of email body text sent to participants, listed by Email ID and Subject

Email ID - Priority - Recipient - Short Descriptor

SUBJECT - Subject

Detail

0011 - CRITICAL - Participant - Request to COMPLETE peer assessment

SUBJECT - Please complete peer assessment due by << Due Date >>. <<Activity Title>>

Dear <<team member>>,

Please complete the Peer Assess Pro peer assessment activity << Activity Title>> for your team before << Due Time>> << Due Date >>.

To complete the activity, please visit the Activity URL << Activity Specific URL>>.

The peer assessment requires a Login ID. Usually, the Login ID will be your student id, unless your teacher has advised an alternative.

The Activity URL will become available for your responses from << Activity Start Time>> << Activity Start Date >>.

Team membership check

The following are your team members. If there is a mistake in this list, please urgently advise your teacher of the correct team composition, using the email address listed below.

<< Team Name>>

<<List of Team members>>

Further information

For further information about preparing for, and using the results of the peer assessment, please visit https://www.peerassesspro.com/resources/resources-for-students/.

You may find the answers to these Frequently Asked Questions helpful at this stage:

FAQ: What is the purpose of peer assessment?

FAQ: What questions are asked in the peer assessment survey?

FAQ: How do I provide useful feedback to my team members?

FAQ: I am unable to login. My login failed

Do not reply to this email. Rather, contact your teacher whose email is listed below.

Sent by Peer Assess Pro on behalf of

<< Teacher fullname >>

<< Teacher email >>

0012 - CRITICAL - Participant - REMINDER to complete peer assessment

SUBJECT - REMINDER! Please complete peer assessment due by << Due Date >>. <<Activity Title>>

Dear <<team member>>,

The peer assessment activity for << Activity Title>> will soon become unavailable for you to complete. Therefore, please complete the Peer Assess Pro peer assessment activity for your team before << Due Time>> << Due Date >>.

To complete the activity, please visit the Activity URL << Activity Specific URL>>.

The peer assessment requires a Login ID. Usually, the Login ID will be your student id, unless your teacher has advised an alternative.

The Activity URL became available for your responses from << Activity Start Time>> << Activity Start Date >>.

Further information

For further information about preparing for, and using the results of the peer assessment, please visit https://www.peerassesspro.com/resources/resources-for-students/.

You may find the answers to these Frequently Asked Questions helpful at this stage:

FAQ: What is the purpose of peer assessment?

FAQ: What questions are asked in the peer assessment survey?

FAQ: How do I provide useful feedback to my team members?

FAQ: I am unable to login. My login failed

Do not reply to this email. Rather, contact your teacher whose email is listed below.

Sent by Peer Assess Pro on behalf of

<< Teacher fullname >>

<< Teacher email >>

0013 - CRITICAL - Participant - RESUBMIT peer assessment due to TEAM CHANGE

SUBJECT - RESUBMIT! Please complete peer assessment due by << Due Date >>. <<Activity Title>>

Dear <<team member>>,

You may have already completed the Peer Assess Pro peer assessment for <<Activity Title>> due before << Due Time>> << Due Date >>.

I regret to advise that I require you to resubmit your survey. Your response submitted to date has been deleted from the analysis. The reason for this request may be a change to the membership of your team, such as the deletion or addition of a team member.

To complete the activity, please visit the Activity URL << Activity Specific URL>>.

Please resubmit your peer assessment for <<Activity Title>> due before << Due Time>> << Due Date >>.

We apologise for the inconvenience.

Team membership check

The following are your team members. If there is a mistake in this list, please urgently advise your teacher of the correct team composition, using the email address listed below.

<< Team Name>>

<<List of Team members>>

Further information

For further information about preparing for, and using the results of the peer assessment, please visit https://www.peerassesspro.com/resources/resources-for-students/.

You may find the answers to these Frequently Asked Questions helpful at this stage:

FAQ: How do I login to my peer assessment Activity URL

FAQ: How do I provide useful feedback to my team members?

FAQ: I am unable to login. My login failed

Do not reply to this email. Rather, contact your teacher whose email is listed below.

Sent by Peer Assess Pro on behalf of

<< Teacher fullname >>

<< Teacher email >>

0020 - CRITICAL - Participant - ABANDONED Peer Assessment activity.

SUBJECT - ABANDONED peer assessment due by << Due Date >>. <<Activity Title>>

Dear <<team member>>,

You and your team members were invited to participate recently in the peer assessment for << Activity Title>> due << Due Date >>.

The Teacher ABANDONED the activity on << Abandoned Date >> due to exceptional circumstances. Please disregard any previous interim published results.

I apologise for the inconvenience.

Do not reply to this email. Rather, contact your teacher whose email is listed below.

Sent by Peer Assess Pro on behalf of

<< Teacher fullname >>

<< Teacher email >>

0103 - WARNING - Participant - Request to RECONSIDER peer assessment: Assessor unconstructive

SUBJECT - Request to reconsider peer assessment due by << Due Date >>. <<Activity Title>>

Dear <<team member>>,

You recently completed the peer assessment of << Activity Title>>. However, your teacher noted that your individual responses suggest you have not engaged constructively with the peer assessment process. Specifically, you may have:

- Rated all team members over a narrow range and/or

- Rated all team members overgenerously and/or

- Provided qualitative comments in your feedback that fail to justify the ratings you provided.

If you feel that your ratings and feedback are justified, you need take no further action. For example, such high ratings may be justified for a team with evidence of exceptionally high performance on its tasks and outputs.

Alternatively, if you wish to resubmit a more accurate survey, please use the URL below to submit a replacement peer assessment survey. Please take special care to provide useful and accurate qualitative feedback that will help your team member(s) and teacher understand the ratings you have provided.

To complete the activity, please visit the Activity URL << Activity Specific URL>>.

Complete the revised Peer Assess Pro peer assessment activity for your team before << Due Time>> << Due Date >>.

Further information

For further information about preparing for, and using the results of the peer assessment, please visit https://www.peerassesspro.com/resources/resources-for-students/.

You may find the answers to these Frequently Asked Questions helpful at this stage:

FAQ: How do I provide useful feedback to my team members?

FAQ: What happens if I try to 'game' (fool? play? disrupt?) the peer assessment process?

FAQ: Is my self-assessment used to calculate my Peer Assessed Score?

Do not reply to this email. Rather, contact your teacher whose email is listed below.

Sent by Peer Assess Pro on behalf of

<< Teacher fullname >>

<< Teacher email >>

1001 - ADVISORY - Participant - Personal results PUBLISHED and available to view

SUBJECT - Please view your personal results for peer assessment due by << Due Date >>. <<Activity Title>>

Dear <<team member>>,

You recently completed the peer assessment of << Activity Title>>. You may now view your Personal Result and feedback. Please visit the Activity URL << Activity Specific URL>>.

Your results will be available for you to view for a period of two weeks following the finalisation of the activity.

If you have specific questions or concerns about your Personal Results please contact the teacher promptly so that a remedy can be determined.

Further information

For further information about using the results of the peer assessment, please visit https://www.peerassesspro.com/resources/resources-for-students/.

You may find the answers to these Frequently Asked Questions helpful at this stage:

FAQ: How do I interpret the feedback results I've received from the peer assessment?

FAQ: What steps can I take to get a better personal result?

FAQ: I don't understand what my teammates are trying to tell me. How do I ask for better feedback?

FAQ: I believe I have been unfairly treated by the results of the peer assessment. How do I address my concern?

Do not reply to this email. Rather, contact your teacher whose email is listed below.

Sent by Peer Assess Pro on behalf of

<< Teacher fullname >>

<< Teacher email >>

1002 - ADVISORY - Participant - REVISED personal results published and available to view

SUBJECT - REVISED RESULTS! Please view your personal results for peer assessment due by << Due Date >>. <<Activity Title>>

Dear <<team member>>,

You recently completed the peer assessment of << Activity Title>>. You may now view your revised Personal Result and feedback. Please visit the Activity URL << Activity Specific URL>>.

Your results may have been revised from those previously made available to you. Reasons for revisions include:

- A change in Team Results

- Late peer assessment responses

- An adjustment to the method the teacher has used to calculate your personal result.

Your results will be available for you to view for a period of two weeks following the finalisation of the activity.

If you have specific questions or concerns about your Personal Results please contact the teacher promptly so that a remedy can be determined.

Further information

For further information about using the results of the peer assessment, please visit https://www.peerassesspro.com/resources/resources-for-students/.

You may find the answers to these Frequently Asked Questions helpful at this stage:

FAQ: How do I interpret the feedback results I've received from the peer assessment?

FAQ: What steps can I take to get a better personal result?

FAQ: I don't understand what my teammates are trying to tell me. How do I ask for better feedback?

FAQ: I believe I have been unfairly treated by the results of the peer assessment. How do I address my concern?

Do not reply to this email. Rather, contact your teacher whose email is listed below.

Sent by Peer Assess Pro on behalf of

<< Teacher fullname >>

<< Teacher email >>

1003 - ADVISORY - Participant - FINALISED personal results published and available to view

SUBJECT - FINALISED RESULTS! Please view your personal results for peer assessment <<Activity Title>>. Available until << finalisation date + 2 weeks >>

Dear <<team member>>,

You recently completed the peer assessment of << Activity Title>>. You may now view your final Personal Result and feedback. Please visit the Activity URL << Activity Specific URL>>.

Your results are available for you to view for a period of two weeks following the finalisation of the activity. That is, from now until << Finalisation date + two weeks >>.

Your results may have been revised from those previously made available to you. The revisions may have been due to:

- A change in Team Results

- Late peer assessment responses

- An adjustment to the method the teacher has used to calculate your personal result.

If you have specific questions or concerns about your Personal Results please contact the teacher promptly so that a remedy can be determined.

Further information

For further information about using the results of the peer assessment, please visit https://www.peerassesspro.com/resources/resources-for-students/.

You may find the answers to these Frequently Asked Questions helpful at this stage:

FAQ: How do I interpret the feedback results I've received from the peer assessment?

FAQ: What steps can I take to get a better personal result?

FAQ: I don't understand what my teammates are trying to tell me. How do I ask for better feedback?

FAQ: I believe I have been unfairly treated by the results of the peer assessment. How do I address my concern?

Do not reply to this email. Rather, contact your teacher whose email is listed below.

Sent by Peer Assess Pro on behalf of

<< Teacher fullname >>

<< Teacher email >>

1004 - ADVISORY - Participant - Personal results PUBLISHED but NOT available to view

SUBJECT - Incomplete submissions from your team for peer assessment due by << Due Date >>. <<Activity Title>>

Dear <<team member>>,

You recently completed the peer assessment << Activity Title>>. However, several of your team members have yet to complete their peer assessment. Consequently, you are restricted from viewing the results of the peer assessment, as the results would not yet be valid.

You may wish to take action by reminding your team members to complete the peer assessment. Once the remainder of your team have completed their peer assessments, you will be able to view your final Personal Result and feedback. Please visit the Activity URL << Activity Specific URL>>.

Further information

For further information about using the results of the peer assessment, please visit https://www.peerassesspro.com/resources/resources-for-students/.

You may find the answers to these Frequently Asked Questions helpful at this stage:

FAQ: What is a valid assessed team?

FAQ: I believe I have been unfairly treated by the results of the peer assessment. How do I address my concern?

Do not reply to this email. Rather, contact your teacher whose email is listed below.

Sent by Peer Assess Pro on behalf of

<< Teacher fullname >>

<< Teacher email >>

1005 - ADVISORY - Participant - FINALISED personal results published but NOT available to view

SUBJECT - FINALISED RESULTS: Incomplete submissions from your team for peer assessment due by << Due Date >>. <<Activity Title>>

Dear <<team member>>,

You and your team members were invited to participate recently in the peer assessment for << Activity Title>> due << Due Date >>.

The Teacher finalised the results on << Finalisation Date >>. However, several of your team members failed to complete their peer assessment. Consequently, you are restricted from viewing the results of the peer assessment as the results are not valid.

Since the activity has been finalised, there is no option for further peer assessments to be submitted from your team.

Further information

For further information about using the results of the peer assessment, please visit https://www.peerassesspro.com/resources/resources-for-students/.

You may find the answers to these Frequently Asked Questions helpful at this stage:

FAQ: What is a valid assessed team?

FAQ: I believe I have been unfairly treated by the results of the peer assessment. How do I address my concern?

Do not reply to this email. Rather, contact your teacher whose email is listed below.

Sent by Peer Assess Pro on behalf of

<< Teacher fullname >>

<< Teacher email >>

FAQ: What is the content of emails sent by Peer Assess Pro to Facilitators?

The first table shows the SUBJECT of the emails sent TO the Facilitator in response to various automated events happening during the launch and management of a Peer Assess Pro activity.

The second table shows the detailed content of each email.

The content of the emails generated by Peer Assess Pro undergoes regular review and improvement, so the emails you receive may not match exactly the details presented here.

Quick links and related information

FAQ: What is the content of emails sent by Peer Assess Pro to Participants?

FAQ - What emails have been sent by the platform?

FAQ - How do I fix an invalid, missing or failed email delivery? WARNING 0026


Table of email subjects sent to facilitators

Email ID - Priority - Short Descriptor

SUBJECT

2001 ADVISORY - Facilitator - Launch successful

SUBJECT - SUCCESSFUL LAUNCH: Your peer assessment <<Activity Title>>. Due by <<Due Date>>

2002 ADVISORY - Facilitator - Manage progress

SUBJECT - MANAGE PROGRESS: Your peer assessment <<Activity Title>>. Due by <<Due Date>>

2006 ADVISORY - Facilitator - Due Date imminent

SUBJECT - DUE DATE IMMINENT: Review your peer assessment <<Activity Title>>. Due by <<Due Date>>

2008 ADVISORY - Facilitator - Due Date reached

SUBJECT - DUE DATE REACHED: Finalise your peer assessment <<Activity Title>>. Due by <<Due Date>>


Table of email body text sent to facilitators, listed by Email ID and Subject

Email ID - Priority - Recipient - Short Descriptor

SUBJECT - Subject

Detail

2001 - ADVISORY - Facilitator - Launch successful

SUBJECT - SUCCESSFUL LAUNCH: Your peer assessment <<Activity Title>>. Due by <<Due Date>>

Dear << Teacher Fullname >>,

You have successfully launched the peer assessment activity << Activity Title>>.

<< class size >> participants have been allocated to << Number of teams >> teams.

The peer assessment is available for students from <<Start Date >> and due for completion by <<Due Date>>.

Students complete the peer assessment at the survey URL <<Students Survey URL>>.

To manage this activity, view your Teacher's Peer Assess Pro Dashboard here <<Teacher's Activity URL>>.

Manage the peer assessment

The Peer Assess Pro Quickstart Guide reminds you of the next steps you will take as you wait for students to respond. See https://www.peerassesspro.com/quickstart-guide-for-teachers/

Prepare your students for peer assessment

These multi-media resources will help you prepare your students for undertaking peer assessment through giving honest, fair, and useful feedback. See https://www.peerassesspro.com/resources/introducing-students-peer-assessment/

As a backup communication, we recommend emailing the material in the next section, 'Advise your students', and/or posting it to your students via your Learning Management System's messaging facility.

Advise your students [Teacher, send this section by email and/or post on your LMS]

+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Dear students,

The peer assessment << Activity Title>> is available for you to complete from <<Start Date >> and due for completion by <<Due Date>>.

Complete the peer assessment at the survey URL <<Students Survey URL>>.

You will be sent emails from @peerassesspro.com advising you when feedback results are available and where to complete the survey. Please check your junk and spam email folders. Ensure you allow emails from @peerassesspro.com into your Important mailbox and add @peerassesspro.com as a Contact.

You may find these Frequently Asked Questions (FAQs) relevant before you start the peer assessment, view here https://www.peerassesspro.com/frequently-asked-questions-2/.

FAQ: What is the purpose of peer assessment?

FAQ: How do I provide useful feedback to my team members?

FAQ: What questions are asked in the peer assessment survey?

FAQ: What if I am unable to login to the peer assessment survey?

Students can find further information about peer assessment here https://www.peerassesspro.com/resources/resources-for-students/.

Kind regards

<< Teacher fullname >>

<< Teacher email >>

+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Further information

For additional advice on managing a Peer Assess Pro activity, review the Frequently Asked Questions here https://www.peerassesspro.com/frequently-asked-questions-2/

FAQ: When and how is the peer assessment conducted?

FAQ: How do I correct the Team Composition in a running peer assessment activity?

FAQ: How do I take action on the Active Warnings presented in the Peer Assess Pro™ Teacher’s Dashboard? What if I ignore the Warnings?

FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?

FAQ: What is the content of emails sent by Peer Assess Pro?

FAQ: Can I adjust the Start Date or Due Date for a running peer assessment activity?

Kind regards,

Peer Assess Pro

https://www.peerassesspro.com/support/

2002 - ADVISORY - Facilitator - Manage progress

SUBJECT - MANAGE PROGRESS: Your peer assessment <<Activity Title>>. Due by <<Due Date>>

Dear << Teacher Fullname >>,

Your peer assessment activity << Activity Title>> is available for students from <<Start Date >> and due for completion by <<Due Date>>. However, students can continue submitting responses beyond the Due Date until you personally FINALISE the activity on the Peer Assess Pro Dashboard.

To manage this activity, view your Teacher's Peer Assess Pro Dashboard here <<Teacher's Activity URL>>.

Review Active Warnings

At this mid-point of the peer assessment process, we suggest you review carefully the Active Warnings on your Peer Assess Pro dashboard. In particular:

1. Review the Class Statistics, particularly for students rated with the lowest Peer Assessed Scores. The Personal Snapshots for these students may identify absent students or students at risk of course failure.

2. Review teams that have not yet submitted sufficient responses to be validly assessed. Remind the team's members to submit.

3. Review students who have rated a team member significantly differently than the other team members have. This outlier rating may be a sign of dysfunction within the team.

4. Review the Qualitative Feedback report, examining especially students who have been peer assessed with a low rating by other team member(s).

5. Review students with an OVERCONFIDENT or UNDERCONFIDENT Index of Realistic Self Assessment (IRSA). Suggest they meet with you to discuss their peer assessment results.

6. Identify students or teams who have not engaged constructively with the peer assessment process; that is, they have rated their team members over a narrow range of scores or well above average. Encourage them to resubmit and justify the ratings they have provided.

Enter team results

At this stage you can enter your (provisional or final) Team Results. You can test the impact of the alternative methods for calculating students' Personal Results. Consider downloading the Statistics, Qualitative Feedback, and Teacher's Feedback to preview the format of the final reports you will receive from Peer Assess Pro.

Provisional publication of results

Consider publishing provisionally the Personal Results for members of valid teams, that is, teams that have met the minimum required number of responses. You can preview students' Personal Snapshots in 'live' mode to see what students will see on their login dashboard before you Publish or Update the results. This preview supports your quality management of the peer assessment process.

Prepare your students for interpreting the results of their peer assessment

These multi-media resources will help your students learn productively from their peer feedback and results. See https://www.peerassesspro.com/resources/introducing-students-peer-assessment/

Furthermore, the class may find these FAQs relevant from now on:

FAQ: How do I interpret the feedback results I've received from the peer assessment?

FAQ: How do I provide useful feedback to my team members?

FAQ: I don't understand what my teammates are trying to tell me. How do I ask for better feedback?

FAQ: I believe I have been unfairly treated by the results of the peer assessment. How do I address my concern?

Students can find further information about peer assessment here https://www.peerassesspro.com/resources/resources-for-students/.

Managing the peer assessment

The Peer Assess Pro Quickstart Guide reminds you of the next steps you will take as you remind the remaining students to respond. See https://www.peerassesspro.com/quickstart-guide-for-teachers/

Further information

For additional advice on managing a Peer Assess Pro activity, review the Frequently Asked Questions here: https://www.peerassesspro.com/frequently-asked-questions-2/

FAQ: How do I take action on the Warnings presented in the Peer Assess Pro™ Teacher’s Dashboard? What if I ignore the Warnings?

FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?

FAQ: What is a valid assessed team?

FAQ: How is an outlier peer assessment rating identified?

FAQ: How is the Index of Realistic Self Assessment (IRSA) calculated?

Kind regards,

Peer Assess Pro

https://www.peerassesspro.com/support/

2006 - ADVISORY - Facilitator - Due Date imminent

SUBJECT - DUE DATE IMMINENT: Review your peer assessment <<Activity Title>>. Due by <<Due Date>>

Dear << Teacher Fullname >>,

You scheduled the peer assessment activity << Activity Title>> due for completion by <<Due Date>>. However, students can continue submitting responses beyond the Due Date until you personally FINALISE the activity.

View your Teacher's Peer Assess Pro Dashboard here <<Teacher's Activity URL>>.

Review Active Warnings

Prior to publishing and finalising the peer assessment, we suggest you review carefully the Active Warnings on your Peer Assess Pro dashboard. In particular:

1. Review the Class Statistics, particularly for students rated with the lowest Peer Assessed Scores. The Personal Snapshots for these students may identify absent students or students at risk of course failure.

2. Review the Qualitative Feedback report, examining especially students who have been peer assessed with a low rating by other team member(s). There may be feedback comments upon which you wish to take proactive intervention with the assessor or assessed student.

3. Review students with an OVERCONFIDENT or UNDERCONFIDENT Index of Realistic Self Assessment (IRSA). Suggest they meet with you to discuss their peer assessment results.

Enter team results

At this stage you may enter your Team Results. Next, confirm the method for calculating the Personal Result that will be awarded to each student.

Prepare your students for interpreting the results of their peer assessment

These multi-media resources will help your students learn productively from their peer feedback and results. See https://www.peerassesspro.com/resources/introducing-students-peer-assessment/

Furthermore, the class may find these FAQs relevant from now on:

FAQ: How do I interpret the feedback results I've received from the peer assessment?

FAQ: I don't understand what my teammates are trying to tell me. How do I ask for better feedback?

FAQ: I believe I have been unfairly treated by the results of the peer assessment. How do I address my concern?

FAQ: How do I interpret measures of realistic self-assessment?

Students can find further information about peer assessment here https://www.peerassesspro.com/resources/resources-for-students/.

Further information

For additional advice on managing and finalising a Peer Assess Pro activity, review the Frequently Asked Questions in the section 'Manage the peer assessment activity' here: https://www.peerassesspro.com/frequently-asked-questions-2/

FAQ: How do I take action on the Warnings presented in the Peer Assess Pro™ Teacher’s Dashboard? What if I ignore the Warnings?

FAQ: What is a valid assessed team?

FAQ: How is an outlier peer assessment rating identified?

FAQ: What steps can I take to get a better personal result?

Kind regards,

Peer Assess Pro

https://www.peerassesspro.com/support/

2008 - ADVISORY - Facilitator - Due Date reached

SUBJECT - DUE DATE REACHED: Finalise your peer assessment <<Activity Title>>. Due by <<Due Date>>

Dear << Teacher Fullname >>,

You scheduled the peer assessment activity << Activity Title>> due for completion by <<Due Date>>. However, students can continue submitting responses beyond the Due Date until you personally FINALISE the activity.

View your Teacher's Peer Assess Pro Dashboard here <<Teacher's Activity URL>>.

Review Active Warnings

Prior to publishing and finalising the peer assessment, we suggest you review carefully the Active Warnings on your Peer Assess Pro dashboard. In particular:

1. Review the Class Statistics, particularly for students rated with the lowest Peer Assessed Scores. The Personal Snapshots for these students may identify absent students or students at risk of course failure.

2. Review the Qualitative Feedback report, examining especially students who have been peer assessed with a low rating by other team member(s). There may be feedback comments upon which you wish to take proactive intervention with the assessor or assessed student.

3. Review students with an OVERCONFIDENT or UNDERCONFIDENT Index of Realistic Self Assessment (IRSA). Suggest they meet with you to discuss their peer assessment results.

Enter team results

At this stage you should enter your Team Results and select the method for calculating the Personal Result that will be awarded to each student.

Prepare your students for interpreting the results of their peer assessment

These multi-media resources will help your students learn productively from their peer feedback and results. See https://www.peerassesspro.com/resources/introducing-students-peer-assessment/

Furthermore, the class may find these FAQs relevant from now on:

FAQ: How do I interpret the feedback results I've received from the peer assessment?

FAQ: I don’t understand what my teammates are trying to tell me. How do I ask for better feedback?

FAQ: I believe I have been unfairly treated by the results of the peer assessment. How do I address my concern?

FAQ: How do I interpret measures of realistic self-assessment?

Students can find further information about peer assessment here https://www.peerassesspro.com/resources/resources-for-students/.

Further information

For additional advice on managing and finalising a Peer Assess Pro activity, review the Frequently Asked Questions in the section 'Manage the peer assessment activity' here: https://www.peerassesspro.com/frequently-asked-questions-2/

FAQ: How do I take action on the Warnings presented in the Peer Assess Pro™ Teacher’s Dashboard? What if I ignore the Warnings?

FAQ: What is a valid assessed team?

FAQ: How is an outlier peer assessment rating identified?

FAQ: What steps can I take to get a better personal result?

Kind regards,

Peer Assess Pro

https://www.peerassesspro.com/support/


FAQ: How do I login to my peer assessment Activity URL

Activity URL

To access the Peer Assess Pro survey you require an activity-specific URL. In general, the format of the Activity URL is:

https://q.xorro.com/teacherid/activityid

Example

https://q.xorro.com/smup/23021

The teacherid is usually four letters, such as smup. The teacher is ALWAYS identified by these letters.

The activityid is usually several digits, such as 23021.

The Activity URL is provided to a student through the email notifications sent by Peer Assess Pro, and through the teacher's Participant URL, described next.

Participant URL

The Participant URL lists ALL the currently running activities that have been started by one teacher. The format is a truncated form of the Activity URL: no activityid, just the teacherid:

https://q.xorro.com/teacherid

Successful login through Activity URL

When everything is working correctly, you follow the link to the Activity URL. You should see the Login Page. Note the Activity title in the top left corner. That information should confirm you have the correct Activity URL for the peer assessment you are required to undertake.

Login to Peer Assess Pro Activity

Enter your ID. For students, this is usually your Student ID or Student Registration. Your teacher or facilitator will advise you if a different system of identification is being used.

Successful login confirms your name, and details about the institution and teacher that should be familiar to you!

Select ‘Next’ to proceed to the peer assessment.


Successful login to Peer Assess Pro Activity

Quick links and related information

FAQ: I am unable to login. My login failed


FAQ: I am unable to login. My login failed

There are several reasons why a student's login may fail. Remedies are detailed below.

  1. You entered your ID incorrectly.
  2. Your teacher or facilitator has entered your ID incorrectly
  3. The Xorro Activity related to the Activity URL has not yet reached its Start Date
  4. The Xorro Activity related to the Activity URL has been Finalised and Finished.
  5. The Xorro Activity related to the Activity URL has been Abandoned.
  6. The institution manager for Xorro has not maintained payment of the subscription to use Xorro and/or Peer Assess Pro
  7. An exceptional system fault has occurred with the Xorro participants database entry for your ID: duplicate identical ids
  8. Some other mysterious fault

This FAQ explains how to login correctly:

FAQ: How do I login to my Activity URL

Investigation and remedies for login failure

  1. You entered your ID incorrectly.

  2. Your teacher or facilitator has entered your ID incorrectly

FAQ: How do I correct the Team Composition in a running peer assessment activity?

  3. The Xorro Activity related to the Activity URL has not yet reached its Start Date

Select the required Activity Title from the list of teacher’s activities

  4. The Xorro Activity related to the Activity URL has been Finalised and Finished.

  5. The Xorro Activity related to the Activity URL has been Abandoned

  6. The institution manager for Xorro has not maintained payment of the subscription to use Xorro and/or Peer Assess Pro

  7. An exceptional system fault has occurred with the Xorro participants database entry for your ID: duplicate identical ids

There is a rare exception that can prevent a student’s login. This exception occurs when there are two or more identical IDs in the Xorro institution participants database.


Search All Participants by ID to identify duplicate ID matches

8. Some other mysterious fault

If none of the previous explanations or solutions resolve the issue, contact Peer Assess Pro providing full details of the messages shown, activity, activity URL, participants CSV, and institution.

Quick links and related information

FAQ: How do I login to my Activity URL


FAQ: Can I adjust the Start Date or Due Date for a running activity?

The short answer is No, you cannot adjust either of these dates. However, there are workarounds described below.

Adjusting the Start Date

When the Start Date and time is reached, a multiplicity of emails is sent to the students in the class advising them that the peer assessment is open for their responses.

Since there is NO MECHANISM to recall the despatched emails, the sole workaround to adjust the Start Date is to abandon the peer assessment, then launch a new peer assessment. The abandon process is detailed below under 'Worst case scenario: Abandon the peer assessment'.

Adjusting the Due Date

The teacher establishes the deadline for completing a peer assessment when the activity is first created then launched. The deadline is termed the Due Date.

You cannot extend the Due Date once the activity has been launched. However, the Due Date is advisory only. See later!

The Due Date is the date by which students are advised they should complete the peer assessment. The Due Date is communicated to students through the email notifications sent by Peer Assess Pro.

The Due Date is also used to prompt the teacher to conduct important administration and management activities during the activity, and prior to Finalisation.

The good news: The Due Date is advisory only

The Due Date is advisory only. Students can CONTINUE to submit responses beyond the Due Date UNTIL the teacher Finalises or Abandons the activity. After Finalisation, students have up to two weeks to review their results.

Advise students of your extended deadline

Given that the Due Date is advisory only, we suggest you announce in class, or by email that you have ‘extended’ the Due Date for peer assessment submissions until some arbitrary future date you select. On that date you can then choose to Finalise the peer assessment. Check the progress results first!

Worst case scenario: Abandon the peer assessment

Abandoning a peer assessment is a worst-case, last-ditch measure, which we advise against.

However, if you insist on changing the Start Date or the Due Date, then Abandon the activity from the Peer Assess Pro Dashboard.

The Activity URL will become invalid. The students will be advised the peer assessment has been abandoned. All survey results collected to date WILL BE nulled.

Now launch a new activity with the revised Start Date and/or Due Date. A few students will be confused by receiving the (old, abandoned) peer assessment Activity URL and the superseding peer assessment request. However, only the new activity will be available to students. Furthermore, the URL link to the abandoned activity will direct the student to a list of all Xorro running activities initiated by that teacher. Hopefully, you have clearly identified the title of the peer assessment activity so that students can select it correctly.

Quick links and related information

The importance of correctly selecting the Start Date and Due Date is detailed in the Reference Manual:

2.3 Launch and create the peer assessment activity

About Finalisation and Abandonment

4. Finalise the peer assessment activity


FAQ - I’m having problems importing my participants csv

“I am frustrated beyond reason when it comes to creating or editing my teamset csv file!”

What are the common problems when importing a participants file?

When you attempt to IMPORT or UPDATE a teamset csv into Peer Assess Pro, you may be unsuccessful for a variety of reasons. This table lists the most common issues.

Problem

Reason and solution

My specified file is rejected

The Participants CSV file specified is not a comma-separated variables (csv) file. You MUST supply a csv file. You cannot attempt to launch using an xls, tsv, pdf or other file type. Here is a sample.csv file from the Xorro site.

I get a message like 'missing group'

The first row of your Participants CSV file must contain these column headers: id, first, last, email, team, group_code. REQUIRED in any order AND separated with a comma or semicolon delimiter. Check that your file follows the rules here:

Is my Participants CSV in the correct format for launching a peer assessment activity?

I don’t understand the importance of column headers

In a Participants CSV file intended for use in a peer assessment, the first row must specify literally these column headers: id, first, last, email, team, group_code. Ensure there are NO leading or trailing spaces or hidden characters in the headers. The subsequent data, presented row by row in the csv, is defined in Xorro Help Importing Participants, Teams and Groups

I am completely mystified about the distinction between the group_code and team columns of data.

The team column designates the allocation of all the participants in the file who belong to the same team, such as Tiger in the sample.csv illustrated later.

The group_code designates the allocation of all the participants in the file who belong to a higher-order arrangement, such as the SAME class, a tutorial, an assignment. Explained exhaustively in Xorro Help Importing Participants, Teams and Groups

I receive several error messages during the peer assessment launch process

When you upload or revise your team composition participants csv there is extensive quality assurance before you commit to launch or update your Peer Assess Pro activity. RED-coloured errors will halt the launch until you correct your csv. Other errors are warnings that allow you to ‘proceed with care’.

Error notifications upon upload of a teamset CSV to a peer assessment

Other mystifying messages

Here is a comprehensive list of all the potential errors that can occur when you attempt to import your teamset csv file, and how to correct those errors:

Comprehensive list of potential errors when attempting to import participants csv

Give me confidence I’m following the correct steps

Practice downloading the sample.csv file from the Xorro site. Then launch the peer assessment using the same unchanged sample file.

After building confidence that process works, use your spreadsheet editor or text editor to make changes such as replacing the email addresses to those you know, and adding new team members. Practice saving as a csv. 

Confirm by opening your csv with a text editor that you have created a file with the extension and format csv. Now, launch the peer assessment using your adjusted Participants csv.
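
If you prefer to script this step, here is a minimal Python sketch using the standard csv module. It is our illustration only, not part of the platform; the output file name participants.csv and the sample rows (borrowed from the Xorro sample data) are assumptions.

# Minimal sketch: write a comma-delimited Participants CSV with the
# required headers. File name and rows are illustrative only.
import csv

HEADERS = ["id", "first", "last", "email", "team", "group_code"]

rows = [
    {"id": "AMTO01", "first": "Amanda", "last": "Tolley",
     "email": "Amanda.Tolley@xorroinstitution.com", "team": "Bear", "group_code": "123.101"},
    {"id": "ANWO08", "first": "Anna", "last": "Worth",
     "email": "Anna.Worth@xorroinstitution.com", "team": "Bear", "group_code": "123.101"},
    {"id": "HOBR03", "first": "Holly", "last": "Brown",
     "email": "Holly.Brown@xorroinstitution.com", "team": "Bear", "group_code": "123.101"},
]

with open("participants.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=HEADERS)
    writer.writeheader()    # first row: the required column headers
    writer.writerows(rows)  # one comma-delimited row per participant

Opening the resulting participants.csv in a text editor should show the comma delimiter and the header row exactly as described above.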

Is my Participants CSV in the correct format for launching a peer assessment activity?

When all else fails

You could

  1. Review our QUICK START GUIDE
  2. Review our How to videos, especially the Platform Overview and Launch a new peer assessment with your class list of teams: The teamset
  3. Contact Peer Assess Pro Support


Is my Participants CSV in the correct format for launching a peer assessment activity?

You create or edit a Participants CSV file using a spreadsheet editor (Google Sheets, Apple Numbers, Microsoft Excel) or a text editor (Apple Textedit, or Microsoft Windows Notepad).

After you have created your arrangement of participants into teams using your editor, you must create a CSV file. For example, see FAQ: How do I create a CSV file from a Google Sheet?

Download the Xorro Participants CSV sample file here https://www.xorro.com/wp-content/uploads/2020/10/Participant_CSV_Sample.csv

Sample of participants csv file opened using a text editor

id,first,last,group_code,team,email,

BOWI12,Bob,Wilson,123.101,Tiger,Bob.Wilson@xorroinstitution.com,

ALJO11,Alice,Jones,123.101,Panda,Alice.Jones@xorroinstitution.com,

JOSM13,John,Smith,123.101,Tiger,John.Smith@xorroinstitution.com,

JOSM13,John,Smith,123.202,,John.Smith@xorroinstitution.com,

GRGR15,Greta,Green,123.101,Panda,Greta.Green@xorroinstitution.com,

GRGR15,Greta,Green,123.204,,,

HEJO19,Henry,Jones,123.101,Tiger,Henry.Jones@xorroinstitution.com,

AMTO01,Amanda,Tolley,123.101,Bear,Amanda.Tolley@xorroinstitution.com,

JEWA06,Jeff,Wang,123.101,Panda,Jeff.Wang@xorroinstitution.com,

HOBR03,Holly,Brown,123.101,Bear,Holly.Brown@xorroinstitution.com,

HOBR03,Holly,Brown,123.202,,Holly.Brown@xorroinstitution.com,

THWI18,Thomas,Windsor,123.101,Tiger,Thomas.Windsor@xorroinstitution.com,

ANWO08,Anna,Worth,123.101,Bear,Anna.Worth@xorroinstitution.com,

ANWO08,Anna,Worth,123.202,,Anna.Worth@xorroinstitution.com,

ANWO08,Anna,Worth,123.204,,,


For the purposes of improving the explanation that follows, the Xorro sample file has been sorted in Google Sheets by group_code, then by team within group_code. Download the Sorted Participants CSV shown here.

Sorted Participants CSV viewed in Google Sheets

id

first

last

group_code

team

email

AMTO01

Amanda

Tolley

123.101

Bear

Amanda.Tolley@xorroinstitution.com

ANWO08

Anna

Worth

123.101

Bear

Anna.Worth@xorroinstitution.com

HOBR03

Holly

Brown

123.101

Bear

Holly.Brown@xorroinstitution.com

ALJO11

Alice

Jones

123.101

Panda

Alice.Jones@xorroinstitution.com

GRGR15

Greta

Green

123.101

Panda

Greta.Green@xorroinstitution.com

JEWA06

Jeff

Wang

123.101

Panda

Jeff.Wang@xorroinstitution.com

BOWI12

Bob

Wilson

123.101

Tiger

Bob.Wilson@xorroinstitution.com

HEJO19

Henry

Jones

123.101

Tiger

Henry.Jones@xorroinstitution.com

JOSM13

John

Smith

123.101

Tiger

John.Smith@xorroinstitution.com

THWI18

Thomas

Windsor

123.101

Tiger

Thomas.Windsor@xorroinstitution.com

ANWO08

Anna

Worth

123.202

Anna.Worth@xorroinstitution.com

ANWO08

Anna

Worth

123.204

GRGR15

Greta

Green

123.204

HOBR03

Holly

Brown

123.202

Holly.Brown@xorroinstitution.com

JOSM13

John

Smith

123.202

John.Smith@xorroinstitution.com

Note these features of the Participants CSV, illustrated by the two technically identical samples (a Python validation sketch follows this list):

  1. The first row, containing the column headers, must be stated as id, first, last, group_code, team, email. REQUIRED in any order.
  2. The definitions and rules for what data may be listed within each of the subsequent rows are explained in Xorro help Importing Participants, Teams and Groups
  3. The data on each row is separated by the delimiter comma (,). Alternatively, a semicolon (;) may be used. The delimiter is illustrated above in the Sample of participants csv file opened using a text editor. You cannot observe the delimiter in the Sorted Participants CSV viewed in Google Sheets. To check, see FAQ: How do I create a CSV file from a Google Sheet?
  4. This example shows three unique group_codes, corresponding to three distinct classes: 123.101, 123.202, 123.204. Good practice hint! Keep it simple for your first trials with Peer Assess Pro. Use just ONE group_code, corresponding typically to one class, one tutorial group, or one assignment.
  5. Just one group_code, 123.101, is associated with team membership data. The three teams associated with group_code 123.101 are Bear, Panda and Tiger.
  6. The following four participants are members of the team Tiger associated with group_code (class) 123.101: BOWI12 Bob Wilson, JOSM13 John Smith, HEJO19 Henry Jones, THWI18 Thomas Windsor.
  7. The subset of all data rows comprising all unique ids associated with group_code 123.101 is called a teamset. In this case, the teamset for group_code 123.101 runs from AMTO01 Amanda Tolley (Team Bear) through THWI18 Thomas Windsor (Team Tiger).
  8. It is never permissible to allocate the same unique individual to more than one team within the same group_code. Conceptually, an individual can’t be in two places at once!
  9. In this example file, only participants with group_code 123.101 form a valid teamset for use with Peer Assess Pro. Reason: group_codes 123.202 and 123.204 do not present team allocation data for the participants in those groups.
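
Here is a minimal Python validation sketch of some of these rules. It is our illustration, not the platform's import checker, and the file name participants.csv is an assumption. It checks the required headers, flags an id allocated to more than one team within the same group_code (rule 8), and warns about teams of fewer than three members (per the error table later in this FAQ).

# Hypothetical validation sketch; Xorro performs its own checks at import.
import csv
from collections import defaultdict

REQUIRED = {"id", "first", "last", "email", "team", "group_code"}

with open("participants.csv", newline="") as f:
    reader = csv.DictReader(f)
    headers = set(reader.fieldnames or [])
    if not REQUIRED <= headers:
        raise SystemExit(f"Missing column headers: {REQUIRED - headers}")
    teams = defaultdict(set)   # (group_code, team) -> member ids
    seen = defaultdict(set)    # group_code -> ids already allocated to a team
    for row in reader:
        gc, team, pid = row["group_code"], row["team"], row["id"]
        if not team:
            continue           # rows without team data are not part of a teamset
        if pid in seen[gc]:
            print(f"ERROR: id {pid} allocated to two teams within group_code {gc}")
        seen[gc].add(pid)
        teams[(gc, team)].add(pid)

for (gc, team), ids in teams.items():
    if len(ids) < 3:
        print(f"CAUTION: team {team} in group_code {gc} has only {len(ids)} members")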

Extra for experts

It is permissible in the file to have a unique individual allocated to several group_codes (classes). The participant ANWO08 Anna Worth is allocated to all three group_codes: 123.101, 123.202, 123.204.

When imported into Xorro during the launch of a peer assessment activity, this example file gives you the option of selecting which group_code will be used for the peer assessment: 123.101, 123.202, or 123.204. This capability enables tremendous flexibility for peer assessment management when you have, for example, large classes with many arrangements and team assignments. Discuss your requirements with Peer Assess Pro Ltd.

Key points!

The team column designates the allocation of all the participants in the file who belong to the same team, such as Tiger in the example above.

The group_code designates the allocation of all the participants in the file who belong to a higher-order arrangement, such as the SAME class, a tutorial, an assignment.

More detail on Xorro help Importing Participants, Teams and Groups

Error notifications upon upload of a participants CSV to a peer assessment

When you upload or revise your participants csv, the Xorro Import operation provides extensive quality assurance before you commit to launch or update your peer assessment activity. You'll receive RED-coloured error warnings, and the launch will not proceed, when fatal errors are detected in your csv.

You also receive advisory ORANGE-coloured error warnings. In these cases, the launch can proceed, but take care!

You receive Advisory notifications when you make a change to a student's data, such as their id, team or email. The following example shows the different classes of Fatal, non-critical and Advisory notifications displayed to a Facilitator attempting to import a Participants CSV that updates previous data.

Examples of error notifications upon upload of a participants CSV to a peer assessment

Good practice hint!

Confirm the notifications make sense to you before committing to the launch or update. Did you really want to change that email? Did you really want to change someone’s id or name? Did you really mean to have two or more people share the same email? (That is permitted!)


Comprehensive list of potential errors when attempting to import participants csv

Items in red will prevent the peer assessment activity from launching. Correct the identified errors.

Items in orange will allow you to proceed to launch. Proceed with care!

Items in green are not notified during the launch process. The launch will proceed. However, you might check your source csv file for these symptoms of a potential problem.

Potential errors in a participants csv

Short Descriptor

Active Warning Message

Recommended Action

Import file must be csv

The file you supplied is not in the required format of csv, comma separated variables.

Create a comma separated variable (csv) file from your source data. Attempt re-import.

Column headers incorrect

The csv file does not include the required data description header row.

Ensure the Participants csv header row includes id, first, last. Optionally, include email, team, group_code.

Column headers incorrect for teamset

The csv file does not specify the required data description headers for a peer assessment teamset.

A Participants csv that includes teamset data requires the header row to include id, first, last, email, team, group_code. Remove all trailing or leading spaces or hidden characters.

Missing values in csv

Values must always be provided for id, first and last.

Supply the required data into the csv.

Missing values in teamset

In a teamset, values must always be provided for id, first, last, email, team and group_code.

Supply the required data into the csv.

Id repeated within teamset

An id is restricted to one use within a teamset.

Remove or correct the duplicate ids in the csv.

Id contains invalid characters

An id contains invalid characters.

Remove invalid characters from the id data such as blanks, tilde (~), ampersand (&).

Best practice:

- Begin the id with a letter

- Contains no spaces

- Only use alphanumeric characters

- Optionally, use these special characters @ $ # _ (a pattern encoding these rules is sketched after this table)

Email is invalid

An email address is not valid.

Correct the email address.

Data record includes invalid characters

A data record contains invalid characters.

Remove invalid characters from the data such as tilde (~), ampersand (&).

Best practice:

- Use only alphanumeric characters

- Optionally, use only these special characters @ $ # _

- Blanks may be used within the data

Group_code data missing for teamset

A data record containing team data has a missing group_code.

Supply a group_code for all rows of data containing team data.

Team size is too small

Team size is too small for effective peer assessment. A minimum of three participants per team is required.

Check to confirm you have allocated participants to the correct teams.

Caution: teams of less than three participants will receive limited feedback results.

Email data is missing

A participant who is a member of a teamset has no email specified.

Check to confirm that the participant has a valid email address.

Caution: It is not good practice to proceed with missing emails in a teamset.

Email repeated within teamset

An email has been used multiple times within a teamset.

Good practice: Ensure that each participant within a teamset has a different email.

Caution: It is not good practice to proceed with duplicated emails in a teamset.

Group_code data is missing in csv

A data record has missing group_code.

Caution: Good practice suggests that a group_code is provided for all rows of data.

Name repeated within teamset

A first and last name pair has been used more than once within the same teamset.

Check to confirm that participants named similarly are different people.

Redundant header descriptions

The CSV header row contains column headers and data that will be ignored by Xorro.

Check to confirm the column headers are correct. If you proceed now, the data in the additional columns will be ignored.

Team repeated within csv

The same team name is associated with several group_codes (teamsets).

Confirm your use of the same team name within several group_codes is correct.

Id repeated within csv

The same id is associated with several group_codes.

Check to confirm your multiple use of ids within the Participants csv.

Name repeated within csv

The same name is associated with several group_codes.

Check to confirm your multiple use of similar names within the Participants csv.
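
As an illustration of the id guidance above, this short Python sketch (ours; the platform's actual validator may differ) encodes the stated best practice: begin with a letter, no spaces, alphanumeric characters plus the optional specials @ $ # _.

# Hypothetical pattern encoding the 'best practice' id rules from the table.
import re

ID_PATTERN = re.compile(r"^[A-Za-z][A-Za-z0-9@$#_]*$")

for candidate in ["AMTO01", "1BAD", "BAD ID", "ok_id@x"]:
    status = "ok" if ID_PATTERN.match(candidate) else "invalid"
    print(candidate, "->", status)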

Quick links and related information

FAQ: How do I create a CSV file from a Google Sheet?

FAQ - Problems editing and creating participants CSV files

Reference Guide Section 2.2 Create the peer assessment Participants CSV

About Comma-separated values (CSV)

Comma-separated values. (2020). In Wikipedia. https://en.wikipedia.org/w/index.php?title=Comma-separated_values&oldid=940422860

decimal point—Wiktionary. (n.d.). Retrieved 25 February 2020, from https://en.wiktionary.org/wiki/decimal_point

Delimiter-separated values. (2019). In Wikipedia. https://en.wikipedia.org/w/index.php?title=Delimiter-separated_values&oldid=916302992

List of text editors. (2020). In Wikipedia. https://en.wikipedia.org/w/index.php?title=List_of_text_editors&oldid=93574916

Zobel, D. (2010, March 11). I have trouble opening CSV files with Microsoft Excel. Is there a quick way to fix this? Paessler Knowledge Base. https://kb.paessler.com/en/topic/2293-i-have-trouble-opening-csv-files-with-microsoft-excel-is-there-a-quick-way-to-fix-this

Zobel, D., & Schoch, G. (2014, November 4). Trouble With Opening CSV Files With Excel? The Comma and Semicolon Issue in Excel Due to Regional Settings for Europe. https://kb.paessler.com/en/topic/2293-i-have-trouble-opening-csv-files-with-microsoft-excel-is-there-a-quick-way-to-fix-this#reply-5193


FAQ - Problems editing and creating participants CSV files

“I am frustrated beyond reason when it comes to creating or editing my teamset csv file!”

This technical note is mostly obsolete. We think we have fixed this problem. The Xorro teamset csv import now accepts either comma (,) or semicolon (;) delimited CSV files.

This FAQ should address most issues:

FAQ - I’m having problems importing my participants csv

Trouble opening csv files with Excel due to regional settings

This technical note refers to difficulties related to

  1. Opening the sample.csv Comma Separated Variable (CSV) file correctly in Microsoft Excel so that the correct teamset headers and columns of data are shown. In some cases, the csv opens so that all imported data is presented as a single text string for each row.
  2. Saving your teamset.csv so that the required comma (,) is used as the column delimiter for the header variables and all data
  3. Importing your teamset.csv file into the Launch Peer Assessment operation so that the column headers, correct number of columns, and all data are recognised.

The foregoing difficulties are associated with factors including

  1. Microsoft Excel failing to recognise the comma (,) delimiter in a CSV file when the file is OPENED in Excel
  2. Microsoft Excel and Apple Numbers EXPORTING a CSV file with an alternative delimiter, such as semicolon (;) rather than comma (,)

Explanation

One reason for these mysterious behaviours is that Xorro and Peer Assess Pro previously required that a CSV file has its contents STRICTLY delimited using the comma (,). However, in regions such as continental Europe and South Africa the semicolon (;) is used as the delimiter in a so-called Comma Separated Values file. It’s complicated! This behaviour is determined by the Language and Region setting of your computer.

Workaround solution

Peer Assess Pro has provided a practical solution to the issue of regional language and display settings, such as CSV delimiters. The Xorro teamset csv import now accepts either comma (,) or semicolon (;) delimited CSV files.

Here are two alternative workaround solutions that applied before this problem was resolved.

Solution 1: Use a simple text editor to find and replace semicolons with commas

After you have edited and EXPORTED your CSV from Numbers or Microsoft Excel

  1. Open the resultant CSV file with a simple text editor app such as Apple TextEdit or Microsoft Windows Notepad
  2. Identify whether the delimiter used is comma (,) or semicolon (;)
  3. If the semicolon is used as the delimiter, use a Find and Replace command to replace all the semicolons with commas
  4. Save the file as unformatted text.
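If you prefer to script this find-and-replace, here is a minimal Python sketch under the same assumptions (the file names teamset_semicolon.csv and teamset.csv are hypothetical). Using the standard library’s csv module, rather than a blind text replacement, protects any semicolons that appear inside quoted fields.

import csv

def normalise_to_commas(src, dst):
    """Rewrite a semicolon-delimited CSV export as a comma-delimited file."""
    with open(src, newline='', encoding='utf-8') as f:
        sample = f.read(2048)   # Sniffer guesses the delimiter from this sample
        f.seek(0)
        dialect = csv.Sniffer().sniff(sample, delimiters=';,')
        rows = list(csv.reader(f, dialect))
    with open(dst, 'w', newline='', encoding='utf-8') as f:
        csv.writer(f).writerows(rows)   # csv.writer uses the comma delimiter by default

normalise_to_commas('teamset_semicolon.csv', 'teamset.csv')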

Illustration using Mac TextEdit

  1. Original sample.csv file opened with Apple Numbers, where the comma (,) delimiter has been recognised correctly upon File Open.

  2. Teamset.csv file Exported from Apple Numbers then opened in Apple TextEdit with European or South African Language and Region settings. Note the semicolon (;) as delimiter. This csv file will NOT be interpreted correctly by the Xorro Import CSV facility when you launch a peer assessment.

id;first;last;group_code;team;email

BOWI12;Bob;Wilson;123.101;Tiger;Bob.Wilson@xorroinstitution.com

ALJO11;Alice;Jones;123.101;Panda;Alice.Jones@xorroinstitution.com

JOSM13;John;Smith;123.101;Tiger;John.Smith@xorroinstitution.com

JOSM13;John;Smith;123.202;;John.Smith@xorroinstitution.com

GRGR15;Greta;Green;123.101;Panda;Greta.Green@xorroinstitution.com

GRGR15;Greta;Green;123.204;;

HEJO19;Henry;Jones;123.101;Tiger;Henry.Jones@xorroinstitution.com

AMTO01;Amanda;Tolley;123.101;Bear;Amanda.Tolley@xorroinstitution.com

  3. Adjust the csv using the TextEdit app’s Find and Replace command

  4. Replace all semicolons (;) with commas (,)

  5. Here, the Replace is completed OK. Commas (,) are now the data delimiters

  6. Get ready to Save the edits. Ensure the file save Format is Make Plain Text

  7. File Save the corrected csv file

Solution 2: Adjust Language and Region to use point as decimal marker

Illustration using Mac OS on an Apple computer

The following three screenshots illustrate how to adjust your Apple Mac current system region to the UK region. The UK (and several former British colonies such as the US, Canada and New Zealand) use the period, full stop or point (.) as the decimal mark (radix or separatrix) in a number such as 12,345.678. That is, twelve thousand three hundred forty-five point six seven eight.

Consequently, the comma (,) is the system default data delimiter in a CSV file. In contrast, South Africa and continental European countries such as France and Germany use the comma (,) as the decimal mark in a number such as 12 345,678. Consequently, the semicolon (;) is the CSV data delimiter in those regions.

  1. Click on the Apple logo, top left of menu bar, to locate System Preferences: Language and Region.

  2. Adjust the Region to UK (or NZ, Canada, or Australia). Note how the Decimal Marker near the bottom of the screen adjusts from comma (,) to point (.)

RESOLVED. Peer Assess Pro has implemented a practical solution to the issue of regional language and display settings, such as CSV delimiters. The Participants CSV import now accepts both comma (,) and semicolon (;) as the delimiter in a CSV file.

Quick links and related information

FAQ: How do I create a CSV file from a Google Sheet?

2.2 Create the peer assessment Participants CSV

About Comma-separated values (CSV)

Comma-separated values. (2020). In Wikipedia. https://en.wikipedia.org/w/index.php?title=Comma-separated_values&oldid=940422860

decimal point—Wiktionary. (n.d.). Retrieved 25 February 2020, from https://en.wiktionary.org/wiki/decimal_point

Delimiter-separated values. (2019). In Wikipedia. https://en.wikipedia.org/w/index.php?title=Delimiter-separated_values&oldid=916302992

List of text editors. (2020). In Wikipedia. https://en.wikipedia.org/w/index.php?title=List_of_text_editors&oldid=935749166

Zobel, D. (2010, March 11). I have trouble opening CSV files with Microsoft Excel. Is there a quick way to fix this? Paessler Knowledge Base. https://kb.paessler.com/en/topic/2293-i-have-trouble-opening-csv-files-with-microsoft-excel-is-there-a-quick-way-to-fix-this

Zobel, D., & Schoch, G. (2014, November 4). Trouble With Opening CSV Files With Excel? The Comma and Semicolon Issue in Excel Due to Regional Settings for Europe. https://kb.paessler.com/en/topic/2293-i-have-trouble-opening-csv-files-with-microsoft-excel-is-there-a-quick-way-to-fix-this#reply-5193


FAQ - How do I fix an invalid, missing or failed email delivery? WARNING 0026

The Active Warning ‘Email rejected, missing or invalid - Notifications will FAIL to be delivered’ is generated when the email specified in the teamset used to create the peer assessment activity is one of the following:

  1. Rejected by the recipient’s email service because the service or recipient has determined that Peer Assess Pro has delivered spam.
  2. Missing - no email address was supplied in the teamset.csv used to create the peer assessment
  3. Invalid in that the @ sign is missing in the email or a domain address is missing
  4. Invalid in that it contains characters not acceptable in an email
  5. Invalid or rejected in that the email address does not belong to a recognised recipient anywhere on the internet

For ANY of the above conditions, the Survey Notifications History will subsequently report the email as having FAILED to deliver to that recipient.

We have received reports that emails sent to email addresses hosted by certain Microsoft services may be rejected, and then reported by Peer Assess Pro as ‘rejected, missing, or invalid’. The message may be generated even when the email has been confirmed as correct by the student or teacher.

Importantly, when an email address is reported by a Peer Assess Pro Active Warning as ‘invalid’ because it has been blocked by the recipient’s email service, emails sent from the Peer Assess Pro platform to that address WILL NOT BE DELIVERED, even if the address is correct.

Similarly, if an email to a recipient is reported in the Survey Notifications Log as having FAILED to be delivered, that is a symptom that the recipient’s email service has rejected or blocked emails from the Peer Assess Pro platform.

Corrective action: avoid these emails

Consequently, please avoid using email addresses at the following domains:

@yahoo.com - sometimes successful

@live.com - sometimes successful

@hotmail.com

@msn.com

@outlook.com

Corrective action: use these emails

We recommend that, in your team composition teamset.csv, you use one of the following for each participant:

Your institution’s designated email address for your students

@gmail.com

@me.com

@qq.com
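A quick screening of the email column before launch can catch most of these problems. This Python sketch is illustrative only (the file name teamset.csv is hypothetical); the simple @-and-domain test covers the missing and malformed cases above, not full RFC validation, and the domain list mirrors the recommendations in this FAQ.

import csv

DISCOURAGED = {'yahoo.com', 'live.com', 'hotmail.com', 'msn.com', 'outlook.com'}

def screen_emails(path):
    """Flag missing, malformed, or frequently-rejected email addresses."""
    with open(path, newline='', encoding='utf-8') as f:
        for row in csv.DictReader(f):
            email = (row.get('email') or '').strip()
            if not email:
                print(f"{row.get('id')}: email MISSING")
            elif '@' not in email or '.' not in email.split('@')[-1]:
                print(f"{row.get('id')}: email INVALID: {email}")
            elif email.split('@')[-1].lower() in DISCOURAGED:
                print(f"{row.get('id')}: {email} uses a domain that often rejects delivery")

screen_emails('teamset.csv')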

Peer Assess Pro is progressing work to overcome emails being rejected by Microsoft’s email services.

In the meantime, the following table provides recommended actions when you receive the warning ‘Email rejected, missing or invalid’. The Survey Notifications History provides further detail about the status of an email’s delivery to a recipient.

Email status

Explanation

Facilitator’s action

Sent

Peer Assess Pro has attempted to send the notification to a recipient. The email is still percolating through the internet attempting to find the recipient’s mailbox.

Wait a few minutes. Reload the dashboard then review the Survey Notification History to update the status.

Delivered

Peer Assess Pro has successfully delivered the email notification to a recipient… but the recipient may not have seen the message yet.

Opened

The recipient appears to have opened the email… but you can’t be completely sure!

This advisory depends on settings enabled by the recipient. The OPENED email status is not an entirely reliable indicator.

Failed - temporary

Peer Assess Pro has made at least one attempt to deliver an email but has not yet made a successful delivery. Further attempts to deliver will be made up to certain limits.

Wait patiently. Check again after 24 hours.

Failed - permanent

Peer Assess Pro has made several attempts to deliver an email but has not made a successful delivery. No further attempts to deliver will be made. The email may have failed to deliver because it was (a) rejected by the recipient’s mail server, (b) invalid in some aspect, or (c) missing.

Try one of the actions as for Rejected, Missing, or Invalid below.

Avoid email addresses at domains such as @yahoo.com, @hotmail.com, @msn.com, @outlook.com and @live.com.

Quick links and related information

This FAQ reports the status of emails sent from the platform, such as SENT, DELIVERED, or FAILED:

FAQ - What emails have been sent by the platform?

Adjust the emails for a recipient using this FAQ:

FAQ: How do I correct the Team Composition in a running peer assessment activity?


FAQ - How do I resolve an unsynchronised team arrangement? ACTIVE WARNING 0021

Context: LMS implementations of Peer Assess Pro.

This Active Warning arises when the arrangement of students and teams on the learning management system fails to match the teams in the launched peer assessment. Peer Assess Pro has effective mechanisms for resolving such issues as late student enrolments, withdrawals and inactive students.

A mismatch arises because one or more of the following events has happened.

1. One or more new class members enrolled on the LMS have not yet been arranged by the teacher into a team on the LMS.

2. Since launching the peer assessment, team membership has been adjusted by the teacher on the LMS for some teams, and no longer matches the team arrangement in the peer assessment.

3. Previously-known class members have been removed from the class on the LMS, but still remain registered within the peer assessment arrangement.

4. Teams have been added, deleted, or renamed on the LMS. In general, the teacher should avoid renaming teams in a running activity, as the team’s survey responses may be deleted.

Peer Assess Pro LMS regularly checks that the team membership arrangement on the LMS matches the team arrangement (number of teams, membership of teams) in the currently active peer assessment. When there is a mismatch, Peer Assess Pro

  1. Reveals differences in the Team Composition display selected from the Teacher’s Dashboard
  2. Presents details to explain the Active Warning 0021 Team arrangement unsynchronised.

Active Warning 0021 Team arrangement unsynchronised

The arrangement of students and teams on the learning management system fails to match the teams in the launched peer assessment.

Team arrangement unsynchronised

Id 0021

New class members enrolled on the LMS have not yet been arranged into a team on the LMS. Jill ROBERTSON, Jason SMITH.

Team membership has been adjusted on the LMS for some teams that no longer match the peer assessment arrangement. Teams Black Robins, Brown Kiwis, Red Rooks (renamed from Red Ruru)

New team(s) created on LMS: Teams Waxeyes

New class members enrolled on the LMS into new teams. Kael BRIDGES, Jonathan CHANG, Kyleigh COHEN into team Waxeyes.

Previously-known class members have been removed from the class on the LMS: Estrella HAWKINS in Team Brown Kiwis.

Corrective Action

Peer Assess Pro provides several actions to resolve the mismatches identified:

  1. Review Team Composition
  2. Synchronise All
  3. Postpone

What is a team arrangement?

A team arrangement refers to the memberships of teams (groups) by participants (teammates, students). Depending on your Learning Management System (LMS), these terms may be used to describe a team arrangement: teamset, grouping (Moodle), or group set (Canvas).

Review team composition

The teacher can review the team arrangement as known by Peer Assess Pro LMS in the current running survey. Additionally, the teacher can selectively choose which teams should be synchronised between the LMS and the Peer Assess Pro platform.

In the Team Composition view, symbols indicate teams that fail to match the arrangement configured on the LMS. The symbol (-) signifies that a team member has been withdrawn from the team on the LMS. In general, it is safe to synchronise such teams.

Most crucially, if a student has been allocated to a team by the teacher on the LMS subsequent to the peer assessment launch, the symbol (+) will indicate a discrepancy that must be resolved. Students on the LMS not yet arranged into teams will also be listed in a ghost team named Unassigned, again with the (+) symbol.

The design principle of Peer Assess Pro is that adjustments to team arrangements must be made on the LMS using the standard LMS participant grouping modules. Then the teacher, optionally, commits those adjustments to a running peer assessment, thereby achieving synchronisation.

The teacher should use the standard LMS participants, groups, or groupings modules to view, compare, and adjust the team membership to that which is required, such as adding or removing team members to teams. Following the LMS adjustment, the teacher returns to complete the Synchronise All action to update the corrected team(s) arrangement in the peer assessment. Alternatively, individual teams can be selectively synchronised using Review Team Composition.

Synchronise all team arrangements

The Synchronise All action will update the team arrangement for all teams in the currently-running peer assessment to match the corresponding arrangement of the teams specified on the LMS.

Peer Assess Pro will conserve all responses for the team members as far as possible. If a team gains additional team member(s), then the existing team members will be prompted to submit additional response(s) for the additional team member(s). Specifically, the Email CRITICAL 0013: RESUBMIT peer assessment due to TEAM CHANGE will be dispatched to relevant team members. New class members now synchronised from the LMS into Peer Assess Pro will receive Email CRITICAL 0011: Request to COMPLETE peer assessment. Class members transferred from one team to another will also receive CRITICAL 0011: Request to COMPLETE peer assessment.

If a team loses one or more team members, then the remaining team members do not receive any communication as their responses can still be applied to the revised team arrangement.

A student must not be allocated to two or more teams within one Peer Assess Pro activity. Consequently, if a student in the Peer Assess Pro activity is allocated on the LMS into two or more teams, then the exclamation symbol (!) will display against all instances when you Review Team Composition. Team synchronisation will be forbidden until the student is allocated to one team within the proposed team arrangement.
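The (+), (-) and (!) symbols amount to a set comparison between the team arrangement on the LMS and the arrangement in the running activity. The following Python sketch illustrates that logic only; it is not the platform’s implementation, and the team data is invented for the example.

def diff_team_arrangements(lms, activity):
    """Compare two team arrangements, given as dicts of team name -> set of members."""
    seen = {}
    for team, members in lms.items():
        for member in members:
            if member in seen:
                # (!) a student in two or more LMS teams blocks synchronisation
                print(f"(!) {member} appears in both {seen[member]} and {team}")
            seen[member] = team
    for team in sorted(set(lms) | set(activity)):
        for member in sorted(lms.get(team, set()) - activity.get(team, set())):
            print(f"{team}: {member} (+)")   # to be added on synchronisation
        for member in sorted(activity.get(team, set()) - lms.get(team, set())):
            print(f"{team}: {member} (-)")   # to be dropped on synchronisation

diff_team_arrangements(
    lms={'Brown Kiwis': {'August D', 'Nehemiah M', 'Kamryn M'}},
    activity={'Brown Kiwis': {'Estrella H', 'August D', 'Nehemiah M'}},
)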

In general, avoid renaming teams in a running activity! In the special case that a team has been renamed AND the team composition remains the same, then the renamed team can be applied with no further action. Previous responses can be applied to the renamed team.

If a renamed team loses one or more team members, then the remaining team members do not receive any communication as their responses can still be applied to the revised team arrangement.

In all other cases, Peer Assess Pro may regard a renamed team as a new team. Whilst some responses may be recoverable, the existing team members will be prompted to resubmit their peer assessments, as per CRITICAL 0013: RESUBMIT peer assessment due to TEAM CHANGE.

Here there be dragons!

The teacher should postpone synchronising the team arrangement until after they have checked and confirmed carefully that the arrangement of students into teams on the LMS is complete and accurate. Otherwise, there is the danger of an abundance of alerts from students or the platform about incorrect or unsynchronised team arrangements.

Postpone

If the teacher anticipates receiving notification of additional changes to the team arrangement, they might postpone committing to synchronising the updates. However, note that Peer Assess Pro LMS will continue to receive survey responses from members of teams as currently arranged on the survey platform and shown in the Team Composition view.

An example scenario

This example scenario presents the case of a Peer Assess Pro activity launched early in a course. A week later, new students have been enrolled in the class, whilst others have withdrawn. To accommodate these changes, the teacher creates a new team, reassigns some existing team members, and allocates new team members to existing teams. The example shows how the mismatch between the LMS team arrangement and the running activity is reported to the teacher through Peer Assess Pro’s Active Warning system. Finally, the teacher is supported to make an accurate update to the team arrangement required for their peer assessment activity delivered through the Peer Assess Pro platform.

A peer assessment has been launched with 19 students in 5 teams. The Team Composition status for Peer Assess Pro LMS at the point of launch shows

Black Robins

Kamryn MILLER, Alexander SAMPSON, Mikaela RAY, Ramon MCKNIGHT

Brown Kiwis

Estrella HAWKINS, August DAUGHERTY, Nehemiah MCCONNELL

Grey Warblers

Joslyn HOOVER, Alyvia YANG, Mariyah POLLARD, Arianna SCHROEDER

Pukekos

Dorian SULLIVAN, Elisha NUNEZ, Muhammad HOLT, Skylar MCCLURE

Red Ruru

Alberto UNDERWOOD, Annika KLINE, June MCKINNEY, Jaylee MURRAY

One week later

One week later, Estrella HAWKINS has withdrawn from the class (-) and has been removed automatically from the LMS participants’ schedule. The teacher also chose to reallocate Kamryn MILLER from Team Black Robins into Team Brown Kiwis on the LMS.

Two new students have been enrolled into the course LMS, but remain unassigned to teams. Furthermore, the teacher has been alerted by a member of Team Pukekos that their team member Dorian SULLIVAN is inactive (?).

The teacher has renamed Team Red Ruru to Red Rooks, and added a new team, Waxeyes, comprising new class members Kael BRIDGES, Jonathan CHANG and Kyleigh COHEN.

Alert to unsynchronised team arrangement

Immediately upon recognising the mismatch, Peer Assess Pro will alert the teacher to the difference between the LMS version of the team arrangement and the team arrangement underway on the Peer Assess Pro platform. The alert is made through the Message Notification feature of the LMS, and through this Active Warning presented on the Teacher’s Dashboard.

Active Warning 0021:

Team arrangement unsynchronised

The arrangement of students and teams on the learning management system fails to match the teams in the launched peer assessment. 

New class members enrolled on the LMS have not yet been arranged into a team on the LMS. Jill ROBERTSON, Jason SMITH.

Team membership has been adjusted on the LMS for some teams that no longer match the peer assessment arrangement. Teams Black Robins, Brown Kiwis, Red Rooks (renamed from Red Ruru)

New team(s) created on LMS: Teams Waxeyes

New class members enrolled on the LMS into new teams. Kael BRIDGES, Jonathan CHANG, Kyleigh COHEN into team Waxeyes.

Previously-known class members have been removed from the class on the LMS: Estrella HAWKINS in Team Brown Kiwis.

Unsynchronised Team Composition

In support of the Active Warning, Peer Assess Pro reports the following Team Composition view, highlighting the lack of synchronisation between the LMS and the initial launch. The symbol (-) signifies that, following synchronisation, a student would be dropped from a team. The symbol (+) indicates that a student will be added. Adjustments, contingent upon synchronisation, are highlighted in red.

The (?) symbol denotes a team member or team for which either of the Active Warnings CRITICAL 0048: Inactive team member or CRITICAL 0006: Adjusted teamset request has been raised.

Black Robins

Kamryn MILLER (-), Alexander SAMPSON, Mikaela RAY, Ramon MCKNIGHT

Brown Kiwis

Estrella HAWKINS (-), August DAUGHERTY, Nehemiah MCCONNELL, Kamryn MILLER (+)

Grey Warblers

Joslyn HOOVER, Alyvia YANG, Mariyah POLLARD, Arianna SCHROEDER

Pukekos

Dorian SULLIVAN (?), Elisha NUNEZ, Muhammad HOLT, Skylar MCCLURE

Red Rooks < Red Ruru

Alberto UNDERWOOD, Annika KLINE, June MCKINNEY, Jaylee MURRAY

Waxeyes (+)

Kael BRIDGES (+), Jonathan CHANG (+), Kyleigh COHEN (+)

Unassigned (?)

Jill ROBERTSON (+), Jason SMITH (+)

Teacher’s actions

Having examined the previous Team Composition view, the teacher, on the LMS, assigns Jill ROBERTSON to Team Black Robins, and Jason SMITH to Brown Kiwis. The teacher leaves Dorian SULLIVAN in Team Pukekos pending their further investigation. In response to the teacher’s actions, Peer Assess Pro will show an updated view of the Team Composition and mismatches.

Before synchronisation

Before the teacher initiates synchronisation, the refreshed Team Composition view for the Peer Assess Pro activity will now show the updated team arrangement, combining the intentions newly-stated by the teacher on the LMS, and the current status of the peer assessment.

Black Robins

Kamryn MILLER (-), Alexander SAMPSON, Mikaela RAY, Ramon MCKNIGHT, Jill ROBERTSON (+)

Brown Kiwis

Estrella HAWKINS (-), August DAUGHERTY, Nehemiah MCCONNELL, Kamryn MILLER (+), Jason SMITH (+)

Grey Warblers

Joslyn HOOVER, Alyvia YANG, Mariyah POLLARD, Arianna SCHROEDER

Pukekos

Dorian SULLIVAN (?), Elisha NUNEZ, Muhammad HOLT, Skylar MCCLURE

Red Rooks < Red Ruru

Alberto UNDERWOOD, Annika KLINE, June MCKINNEY, Jaylee MURRAY

Waxeyes (+)

Kael BRIDGES (+), Jonathan CHANG (+), Kyleigh COHEN (+)

Successful synchronisation of the corrected team arrangement

The Teacher views the foregoing Team Composition, and is satisfied with the proposed team arrangement. The teacher commits to Synchronise All teams.

After the synchronisation completes successfully, the resulting refreshed Team Composition is revealed to the teacher.

Black Robins

Alexander SAMPSON, Mikaela RAY, Ramon MCKNIGHT, Jill ROBERTSON

Brown Kiwis

August DAUGHERTY, Nehemiah MCCONNELL, Kamryn MILLER, Jason SMITH

Grey Warblers

Joslyn HOOVER, Alyvia YANG, Mariyah POLLARD, Arianna SCHROEDER

Pukekos

Dorian SULLIVAN (?), Elisha NUNEZ, Muhammad HOLT, Skylar MCCLURE

Red Rooks

Alberto UNDERWOOD, Annika KLINE, June MCKINNEY, Jaylee MURRAY

Waxeyes

Kael BRIDGES, Jonathan CHANG, Kyleigh COHEN

The two teams Black Robins and Brown Kiwis now include the new classmates Jill ROBERTSON and Jason SMITH, and the relocated Kamryn MILLER. The founding members of teams Black Robins and Brown Kiwis will be alerted to the requirement to submit an updated survey response for their re-configured teams, as will the relocated Kamryn MILLER. Email CRITICAL 0013: RESUBMIT peer assessment due to TEAM CHANGE.

The newly-enrolled class members Kael BRIDGES, Jonathan CHANG and Kyleigh COHEN will be advised to submit the peer assessment. Email CRITICAL 0011: Request to COMPLETE peer assessment.

The Teacher will continue their investigations to determine what to do with Dorian SULLIVAN, denoted by (?) in Team Pukekos.

Quick links and related information

FAQ - What is an adjusted team arrangement request? WARNING 0006

FAQ - What is an inactive team member? WARNING 0048

FAQ - What emails have been sent by the platform?

FAQ: How do I take action on the Warnings presented in the Peer Assess Pro™ Teacher’s Dashboard?’ What if I ignore the Warnings?


FAQ - What is the benefit of a standardized peer assessment rubric?

Peer Assess Pro restricts the Teacher to using a standard rubric that yields several benefits.

Feature

Benefit

Authoritative

The questions used in the survey are based on long-established research about the teamwork capabilities (a) required for effective teamwork by students and (b) sought by employers.

In-class progress

Using the same rubric within a class for both formative and summative teammate peer assessment enables progress within the class to be measured.

Calibration

A standard rubric enables at-risk students and teams to be readily identified, as the rubric provides for comparison of peer assessment results, especially the Peer Assessed Score, against calibrated benchmarks.

Time-saving

Reduces the time needed to make decisions about what questions to deploy.

Capacity development

Self-directed learning resources for students and teachers are developed more efficiently when a standardised set of teamwork and leadership capabilities are surveyed.

Institutional progress

Results from one class can be compared with the results of another class or institution, according to a standard basis of measurement.

Validation

A standardised, authoritative rubric supports claims that a course and/or academic programme delivers teamwork and leadership learning outcomes sought by accreditation agencies, such as the Washington Accord Graduate Profile.

Scholarship

Insights drawn from scholarly research using a standardised assessment rubric can inform creative development of teaching and learning practices in several institutions.

The ten questions used in the Peer Assess Pro survey, used as the basis for calculating the Peer Assessed Score, are adapted from:

Deacon Carr, S., Herman, E. D., Keldsen, S. Z., Miller, J. G., & Wakefield, P. A. (2005). Peer feedback. In The Team Learning Assistant Workbook. New York: McGraw Hill Irwin.

Quick links and related information

FAQ: What questions are asked in the peer assessment survey?

FAQ: How are peer assessment and personal results calculated and defined mathematically?


FAQ - What if a student mistakenly advises they are in an incorrect team?

Context: Xorro-based survey

If a student mistakenly advises they are in an incorrect team, then the student simply resumes the survey and selects ‘Actually they are correct (Proceed)’

Team membership confirmation by student

At the start of the Peer Assess Pro survey, a student is given the option to

  1. Confirm they are in the correct team (team name, team members), OR
  2. Advise they have been placed in an incorrect team.

In the second case, a notification email is immediately despatched to the teacher stating

In the activity <<Activity Name>>, <<Student Name>> claims that he/she is in an incorrect team. Please check and update teamset.

Example

In the activity "Ornithologists 101 Formative", Peter MELLALOO claims that he/she is in an incorrect team. Please check and update teamset.

In the normal workflow, the teacher will adjust the team composition according to

FAQ: How do I correct the Team Composition in a running peer assessment activity?

Mistaken team membership notification

A student may mistakenly advise they are in an incorrect team.

In this case

  1. The teacher should confirm to the student their correct team membership is as stated in the survey.
  2. At any time, the student can simply log in to the Activity URL for the survey. The Activity URL is provided in the email inviting the student to complete the Peer Assess Pro survey. The Activity URL is also stated on the Teacher’s Dashboard.
  3. The survey will present the following screen to the student. The student should select

‘Actually they are correct (Proceed)’

  4. The survey will proceed with the original team as specified.

Quick links and related information

FAQ: How do I correct the Team Composition in a running peer assessment activity?


CONTACT US

Please do not hesitate to ask us for help.

Patrick Dodd patrick@peerassesspro.com +64 21 183 6315

Peter Mellalieu peter@peerassesspro.com +64 21 42 0118 Skype myndSurfer

We especially welcome your advice on how the app and Reference Guide could be improved. We’d also like to know which features you expect to value highly in your teaching and use of Peer Assess Pro™.

Thank you for your participation.

Frequently Asked Questions

FAQs on the web at https://www.peerassesspro.com/frequently-asked-questions-2/

Quickstart Guide for Peer Assess Pro

https://www.peerassesspro.com/quickstart-guide-for-teachers/

Home/Table of Contents for Reference Manual

http://tinyurl.com/papRefWeb2

Website

https://www.peerassesspro.com/



[1] Hyperlinked to the Xorro site. Teacher must be a registered Xorro user.

[2] Hyperlinked to the web version of the Reference Guide.

[3] Internal links within the pdf version of Reference Guide.

[4] Conditions of use apply to a free Xorro Account. See Discover Xorro-Q