Peer Assess Pro Reference Guide /

Manage a Peer Assessment Activity using Xorro

Reference Guide for Teachers and Students

Version 2.0 2019-09-18

Peter Mellalieu peter@peerassesspro.com 

+64 21 42 0118 Skype myndSurfer

Patrick Dodd patrick@peerassesspro.com 

+64 21 183 6315


QUICK START GUIDE

This table shows the sequence of steps required to register, launch, manage, and download the final gradebook for a Peer Assess Pro peer assessment using Xorro.

Step / Xorro[1] | Task / Reference Manual[2] | Section[3]

REGISTER once for your free[4] Xorro Teacher’s account. (Section 1.1)

TEAMSET: Create a TeamSet CSV (Comma-Separated Values) file containing your class list organised into teams. Adapt the sample format. (Section 2.1)

LOGIN to your Xorro Teacher’s account dashboard at https://qf.xorro.com/ (Section 1.2)

ORIENT yourself to launching and managing a peer assessment activity using Xorro. (Section 1.3)

LAUNCH a peer assessment activity. You will be presented with the option to import your TeamSet CSV. (Section 2.3)

MANAGE your running (launched) peer assessment activity by carrying out the following tasks. (Section 3.0)

WARNING: Action your responses to Warning Alerts presented in the Peer Assess Pro Teacher’s Dashboard. (Section 3.1)

TEAM RESULTS may be entered; not required in certain cases. (Section 3.3)

SELECT the Personal Result Calculation Method. (Section 3.4)

REVIEW class, team, and individual results, charts, statistics, and qualitative feedback. (Section 3.5)

PUBLISH provisional (optional) then final official results and feedback for viewing by team members. (Section 3.6)

FINALISE the peer assessment activity to prevent further responses from students. (Section 4.0)

DOWNLOAD the Finalised Teacher’s Gradebook, Qualitative and Teacher’s Feedback as CSV files. (Section 4.3)

Hyperlinked at www.peerassesspro.com/quickstart-guide-for-teachers


Most Frequently Asked Questions (FAQs)

>>> More FAQs at www.peerassesspro.com/frequently-asked-questions-2 

FAQ: Show me a quick video overview of the whole Peer Assess Pro system

FAQ: What is the purpose of peer assessment?

FAQ: What questions are asked in the peer assessment survey?

FAQ: How do I decide which Personal Result method to apply in my peer assessment activity

FAQ: How do I correct the Team Composition in a running peer assessment activity?

Quick Link Map

Everyone

For teachers

For team members

Xorro-Q help

Login to Xorro-Q

Join peer assessment activity

www.peerassesspro.com

The peer assessment survey

The purpose of peer assessment

Reference guide: Table of contents

Login and orientation

Undertake the peer assessment

FAQs on the web at http://tinyurl.com/papFAQ

Launch peer assessment activity

Use peer assessment results for better performance

Videos

Manage the peer assessment activity

Quickstart guide for teachers

Definitions, calculations, and examples

Contact us

Miscellaneous

Launching Peer Assess Pro™ using Xorro-Q

>>> Reference Guide at http://tinyurl.com/papRefPdf 

This guide helps teachers familiar with peer assessment[5] to use our Peer Assess Pro™ team peer assessment platform through the Xorro-Q interface. Once logged in to Xorro-Q, you launch a peer assessment activity. During the launch process, you upload a Peer Assessment Teamset that specifies each team member's name, team, login id and, optionally, email. The Teamset is a comma-separated values (CSV) file, illustrated below. The peer assessment activity then emails each team member a unique, student-specific survey URL through which they assess their fellow team members. Timely reminders and final result announcements are generated automatically and sent to students by Peer Assess Pro via the email addresses you provided in the Teamset.

Teachers Process Flowchart: Overview

>>> Hyperlinked chart at http://tinyurl.com/papChart

Questions, Feedback and Contact

Ask us for help, give us feedback, and request additional features.

                https://www.peerassesspro.com/contact/

Patrick Dodd         patrick@peerassesspro.com +64 21 183 6315

Peter Mellalieu        peter@peerassesspro.com         +64 21 42 0118 Skype myndsurfer

Example Peer Assessment TeamSet CSV File

id,first,last,group_code,team,email
ALJO11,Alice,Jones,BUS123.101/PMell/TutB/2020-05-28/SUM,Panda,Alice.Jones@noreply.com
AMTO01,Amanda,Tolley,BUS123.101/PMell/TutB/2020-05-28/SUM,Bear,Amanda.Tolley@noreply.com
ANWO08,Anna,Worth,BUS123.101/PMell/TutB/2020-05-28/SUM,Bear,Anna.Worth@noreply.com
BOWI12,Bob,Wilson,BUS123.101/PMell/TutB/2020-05-28/SUM,Tiger,Bob.Wilson@noreply.com
GRGR15,Greta,Green,BUS123.101/PMell/TutB/2020-05-28/SUM,Panda,Greta.Green@noreply.com
HEJO19,Henry,Jones,BUS123.101/PMell/TutB/2020-05-28/SUM,Tiger,Henry.Jones@noreply.com
HOBR03,Holly,Brown,BUS123.101/PMell/TutB/2020-05-28/SUM,Bear,Holly.Brown@noreply.com
JEWA06,Jeff,Wang,BUS123.101/PMell/TutB/2020-05-28/SUM,Panda,Jeff.Wang@noreply.com
JOSM13,John,Smith,BUS123.101/PMell/TutB/2020-05-28/SUM,Tiger,John.Smith@noreply.com

>>> Download CSV, EXCEL, or Google Sheet

Example Survey Questions for a Team Member

Screenshots of the peer assessment activity


PEER ASSESS PRO REFERENCE GUIDE

QUICK START GUIDE        1

Most Frequently Asked Questions (FAQs)        2

Quick Link Map        2

Teachers Process Flowchart: Overview        3

Questions, Feedback and Contact        4

Example Peer Assessment TeamSet CSV File        4

Example Survey Questions for a Team Member        4

Screenshots of the peer assessment activity        5

PEER ASSESS PRO REFERENCE GUIDE        6

1. Login to your Xorro HOME page        18

1.1 First time users: Register        19

Register a new Xorro Teacher’s Account as a Free Facilitator        19

Getting started with Xorro Q        19

Extended free trial for New Zealand higher education institutions        19

1.2 Login from your registered Xorro Account        20

1.3 Orient yourself to the Xorro HOME Dashboard        21

1.4 Orient yourself to the Peer Assess Pro Dashboard        23

1.5 Peer Assess Pro system flowchart detail        24

2. Launch Peer Assessment activity        26

2.1 Quick start launch        27

2.2 Create the peer assessment teamset CSV        28

Alternative TeamSet CSV templates        28

Instructions and column explanations for the Peer assessment TeamSet CSV        29

Requirements for a peer assessment teamset CSV file        31

Create a CSV version of your teamset file        31

Why won’t Xorro load my Teamset file?        32

Good practice hint: Create distinctive group codes for every peer assessment activity you launch        32

Large, multi-cohort streams in a class        33

Here There Be Dragons!        33

2.3 Launch and create the peer assessment activity        34

Select ACTIVITIES from the top menu bar        34

Launch Peer Assessment        35

Good practice hint: Avoid using the Xorro default Due Date        35

The Due Date date is advisory only        36

Initiate Create Activity        36

Here There Be Dragons        36

View the Peer Assess Pro Teacher’s dashboard        37

Invite team members to respond and other automated activities        38

2.4 Use a Teamset Group to launch a peer assessment        41

From the Xorro HOME page select the PARTICIPANTS page        41

Select ‘Import Participants’        41

Browse to your Team Members Group CSV file        42

Load, check and confirm correct team membership, then Import        42

Check class and team membership        43

3. Manage the Peer Assessment Activity        45

3.1 Action responses to warnings        46

Adjusting team composition        47

3.2 Automated and manual notifications        48

3.3 Enter Team Results        48

3.4 Select the Personal Result Calculation Method        50

3.5 Review class, team, and individual statistics        51

Review Class Results        51

Good practice hint: How to identify at risk students        52

The Individual Personal Snapshot        52

Four possible views of the Individual Personal Snapshot        55

Team Statistics        57

Qualitative Feedback        58

Teacher’s Feedback        58

Advanced Statistics        59

3.6 Publish provisional Personal Results to team members        60

Unpublished status        60

Published status        61

Results hidden when insufficient responses        61

4. Finalise the peer assessment activity        63

4.1 Why Finalise?        64

4.2 Publish Finalised Results to students        64

4.3 Download Teacher’s Gradebook of Results        65

4.4 Finalise the Activity … irrevocably!        66

FREQUENTLY ASKED QUESTIONS        67

Quick Link Map        67

FAQs for teachers        68

The peer assessment survey        68

Login and orientation        68

Launch peer assessment activity        69

Manage the peer assessment activity        69

Responding to Active Warnings        70

Definitions, calculations, and examples        70

Miscellaneous        71

FAQs for team members        72

The purpose of peer assessment        72

Undertaking the peer assessment        72

Using the results from peer assessment for better performance        73

How peer assessment affects personal results        73

FAQ: What is the purpose of peer assessment?        74

Defining peer assessment        74

Developmental feedback        74

Determination of course personal result        74

Criteria for peer assessment in Peer Assess Pro™        75

Peer Assess Pro assesses competencies valued by employers        75

FAQ: When and how is the peer assessment conducted?        77

Formative assessment: optional but valuable        77

No surprises!        78

FAQ: How do I provide useful feedback to my team members?        80

FAQ: I believe I have been unfairly treated by the results of the peer assessment. How do I address my concern?        82

Symptoms of an unfair assessment        82

Steps to address an unfair peer assessment        82

A note on appealing a peer assessment result        83

Prevention is better than cure        84

FAQ: How do I interpret the feedback results I've received from the peer assessment?        86

FAQ: I don’t understand what my teammates are trying to tell me. How do I ask for better feedback?        87

FAQ: What steps can I take to get a better personal result?        88

Raise your Team Result        88

Use your institution’s academic support services        88

Raise your Peer Assessed Score        89

How do I address proactively the challenges of team work?        89

Learning constructively from mid-course peer assessment feedback        90

FAQ: What happens if I try to 'game' (fool? play? disrupt?) the peer assessment process?        92

The tricks we know!        92

Examples: Highly specific and individualized information        93

1. Low quality assessor rating        93

2. Low quality team rating        93

3. Outlier individual rating        94

4. Mismatched self-assessment        94

5. At risk team member        95

Example: Better feedback. Better teams        95

Which teams will raise the Active Warning: Low quality team rating?        96

Which teams tend to have a higher team result?        97

Which teams have worked most productively as a team?        97

Active Warnings, thresholds parameters, and program logic        99

FAQ: Give me a quick overview of how to launch a Peer Assess Pro™ activity through Xorro        100

FAQ: What are the design objectives, key features, and benefits of the Peer Assess Pro development?        102

Design objectives        102

Benefits for students        102

Benefits for teachers        103

Peer Assess Pro™ is a work in progress        103

Where’s the latest?        104

FAQ: How do I find the Peer Assess Pro Xorro Teacher’s dashboard?        105

HOME: Running Activities        105

Alternative method: ACTIVITIES: Running Activities        105

FAQ: How do I navigate the PARTICIPANTS page for Peer Assess Pro?        108

Select PARTICIPANTS Tab        108

Orientation note: Select an existing Group        109

Inactive functions in PARTICIPANTS page        110

FAQ: How do I correct the Team Composition in a running peer assessment activity?        111

Take care! Here there be dragons!!        111

Key check points        111

View the team composition        111

Correct the team composition        112

Subtle technical note        113

FAQ: Can I create a peer assessment activity without having all my teams correctly identified by team name and/or team membership?        114

FAQ: How do I create a CSV file from a Google Sheet?        115

FAQ: How do I view a demonstration version of Peer Assess Pro?        116

FAQ: How do I correct the participants (team members) in a group already uploaded to Xorro?        117

View an existing imported Group        117

Correct the team members associated with an existing Xorro TeamSet Group        118

FAQ: Where may I view the most recent version of the user guides?        119

Quickstart Guide        119

Video guides        119

Latest reference guide        119

Work in progress Google DOCS development version        119

Frequently Asked Questions for teachers and team members        120

Teachers Process Flowchart        120

FAQ: How do I decide which Personal Result method to apply in my peer assessment activity        121

FAQ: How are peer assessment and personal results calculated and defined mathematically?        127

Calculation methods that exclude a team result        128

Calculation methods that incorporate a team result from team outputs        128

FAQ: How do students know where and when to complete the peer assessment activity then review their results?        129

Automated communications to students        129

Standard operating mode        129

Alternative mode for student access to assessment and results        130

FAQ: How do I view and experience what the students experience?        131

View your student’s personal results directly from your Teacher’s Dashboard        131

View your students’ experience of the Peer Assess Pro™ survey        131

Enter your Participants’ URL into your browser        131

Select the activity you wish to experience        132

Login in using the Identification (id) of a student in the Team List Group used to create the activity        132

View a survey ready and waiting for responses        133

View a sample question        133

View a student’s published results        134

View the peer assessment survey for a demonstration class        136

FAQ: Why are different terms used to display peer assessment results in the Xorro and previous Google versions of Peer Assess Pro™?        137

FAQ: How do I take action on the Warnings presented in the Peer Assess Pro™ Teacher’s Dashboard?’ What if I ignore the Warnings?        138

Critical and catastrophic warnings!        138

Important warnings        138

Informational warnings        139

Optional emails generated for team members        139

FAQ: When, why, and how do I Refresh and Update Results?        140

When to recalculate        140

Why recalculate?        140

How to recalculate        141

FAQ: What questions are asked in the peer assessment survey?        142

Example Peer Assessment Survey: Quantitative        143

Example Peer Assessment Survey: Qualitative        144

FAQ: How is the Peer Assessed (PA) Score calculated?        145

The self-assessment is excluded from calculating PA Score        145

Mathematical definition of Peer Assessed Score, PA Score        146

Example calculations of Peer Assessed Score        149

Alternative mathematical formulations of PA Score        151

Calculation from Average Rating        151

Calculation from Average Team and Leadership Contributions        152

FAQ: How is the self-assessment used to calculate Peer Assessed Score?        153

Spider chart of individual and averaged team peer ratings        153

Index of Realistic Self-Assessment (IRSA)        154

FAQ: How is the Peer Assessed Index (PA Index) calculated?        155

Mathematical definition of Peer Assessed Index        155

Example calculations of Peer Assessed Index        156

FAQ: How is the Indexed Personal Result (IPR) calculated?        158

Mathematical definition of Indexed Personal Result        158

Example calculations of Indexed Personal Result        159

FAQ: How is the Normalised Personal Result (NPR) calculated?        161

Mathematical definition of Normalised Personal Result        162

Example calculations of Normalised Personal Result        163

Impact of adjusting the Spread Factor on Normalised Personal result        165

FAQ: How is the Rank Based Personal Result (RPR) calculated?        167

Mathematical definition of Rank-Based Personal Result        168

Example calculations of Rank-Based Personal Result        169

Example calculation with tied ranks        171

FAQ: How is Standard Peer Assessed Score (SPAS) calculated?        172

Design features of Standard Peer Assessed Score        173

Mathematical calculation        174

Example calculations of Standard Peer Assessed Score        176

Example charts for Standard Peer Assessed Score        178

Assumptions about Standard Peer Assessed Score        179

The impact of gaming peer assessment        180

FAQ: What is the influence on Standard Peer Assessed Score (SPAS) if a team rates ALL its members with a Peer Assessed Score of 100?        180

FAQ: Would a student receive the same Standard Peer Assessed Score (SPAS) if rated in another class?        181

FAQ: What is Employability? How is it calculated?        182

Mathematical calculation of Employability        182

Conditioning transformations to de-emphasise unsubstantiated precision        183

Example calculations of Employability        183

FAQ: How is the Index of Realistic Self Assessment (IRSA) calculated?        186

Mathematical definition of the Index of Realistic Self Assessment        186

Example calculations of the Index of Realistic Self Assessment        187

Why an IRSA of 100 is not a perfect score!        188

FAQ: How do I interpret measures of realistic self-assessment?        190

Interpreting the Index of Realistic Self Assessment (IRSA)        190

Typical IRSA        190

Overconfident IRSA        190

Underconfident IRSA        190

Developing an exceptionally realistic self image, ERSA        191

What are the benefits of having an Exceptionally Realistic Self-Image?        191

What can get in the way of having an Exceptionally Realistic Self-Image?        191

How do I develop my Exceptionally Realistic Self-Image, ERSI?        192

FAQ: How is an outlier peer assessment rating identified? WARNING 0042        194

Warning detail        194

Example calculations        194

Threshold for warning of outlier individual peer rating        196

Alternative mathematical calculation of Assessor Impact        196

Alternative example calculations        197

FAQ: What is a mismatched self-assessment (IRSA)? WARNING 0040        199

Warning detail        199

Threshold for warning of mismatched self-assessment        200

Example calculations        200

Recommended action for facilitator        201

FAQ: What is a low quality team rating? WARNING 0050        202

Warning detail        202

Threshold for warning of low quality team rating        203

Example calculations        203

Recommended action for facilitator        204

High performing teams        204

Example case        204

FAQ: What is a low quality assessor rating? WARNING 0300        206

Warning detail        206

Threshold for warning of low quality assessor rating        206

Example calculations        207

Recommended action for facilitator        208

High performing teams        208

FAQ: What is a valid assessed team? WARNING 0022        209

Warning detail        209

Results not displayed to members of non-valid assessed teams        209

How many valid and invalid teams do I have?        210

Recommended action for facilitator        210

Mathematical definition        210

Example calculations        211

FAQ: What is an ‘at risk’ team member? WARNING 0044        212

Warning detail        212

Recommended action for facilitator        212

Threshold for warning of ‘at risk’ team member        213

Example calculation        213

Limitation        214

Alternative approaches to identifying at risk students        214

FAQ: What is the content of emails sent by Peer Assess Pro?        218

FAQ: How do I login to my peer assessment Activity URL        228

Activity URL        228

Participant URL        228

Successful login through Activity URL        229

FAQ: I am unable to login. My login failed        231

Investigation and remedies for login failure        231

You entered your ID incorrectly.        231

Your teacher or facilitator has entered your ID incorrectly        232

The Xorro Activity related to the Activity URL has not yet reached its Start Date        232

The Xorro Activity related to the Activity URL has been Finalised and Finished.        233

The Xorro Activity related to the Activity URL has been Abandoned        234

The institution manager for Xorro has not maintained payment of the subscription to use Xorro and/or Peer Assess Pro        234

An exceptional system fault has occurred with the Xorro participants database entry for your ID: duplicate identical ids        235

FAQ: Can I adjust the Start Date or Due Date for a running activity?        237

Adjusting the Start Date        237

Adjusting the Due Date        237

The good news: The Due Date date is advisory only        238

Advise students of your extended deadline        238

Worst case scenario: Abandon the peer assessment        238

CONTACT US        240


1. Login to your Xorro HOME page


1.1 First time users: Register

Register a new Xorro Teacher’s Account as a Free Facilitator

Sign up as a Free Facilitator to trial the use of Peer Assess Pro using the Xorro-Q interface:

Sign up as a Free Facilitator 

https://www.xorro.com/free_accounts/pap/new

Getting started with Xorro Q

For related information on registering as a new facilitator:

Getting Started with Xorro Q

Extended free trial for New Zealand higher education institutions

New Zealand higher education institutions have an extended free trial period for the use of Peer Assess Pro. This free access is under an arrangement with Te Ako Aotearoa.

For further details contact Patrick Dodd at the offices of Peer Assess Pro.


1.2 Login from your registered Xorro Account

After you login, your Xorro HOME Dashboard page will display, as shown in Section 1.3 Orient yourself to the Xorro HOME Dashboard.

Now follow the steps in the Quickstart Guide, or the detailed explanations in Section 2. Launch Peer Assessment activity.

Quick links and related information

VIDEO: Login and orientation 

View: Quick Start Guide

Section 1.3 Orient yourself to the Xorro HOME Dashboard

FAQ: How do I find the Peer Assess Pro Teacher’s dashboard?


1.3 Orient yourself to the Xorro HOME Dashboard

Your Xorro HOME Dashboard page is shown below.

Quick links and related information

VIDEO: Login and orientation 

View: Quick Start Guide

FAQ: How do I find the Peer Assess Pro Teacher’s dashboard?

FAQ: How do I view a demonstration version of Peer Assess Pro?

Peer Assess Pro system flowchart detail

Hyperlinked PDF at http://tinyurl.com/papChart

Each process box in the PDF version of the flowchart links directly to the specific page in this Reference Guide that explains that step in the process.


1.4 Orient yourself to the Peer Assess Pro Dashboard

(To come)


1.5 Peer Assess Pro system flowchart detail

Hyperlinked PDF: Xorro Peer Assess Pro™ Teachers Process Flowchart, http://tinyurl.com/papChart

(To come)


2. Launch Peer Assessment activity


2.1 Quick start launch

Create a Comma-Separated Values (CSV) file containing your class list, with every team member organised into teams. This is your TeamSet CSV file. A sample of the file format is shown in Section 2.2 Create the peer assessment teamset CSV.

Use any of the following templates to create your TeamSet CSV file, adapting it in your preferred editor.

After editing the template, remember to save it as a CSV file using SAVE AS CSV, DOWNLOAD AS CSV, or EXPORT AS CSV, depending on your spreadsheet editor.

Excel sheet

Google Sheet

CSV file

If you are a registered Xorro user, use this link to launch a new peer assessment activity. You will be presented with the option to import your TeamSet CSV directly.

https://qf.xorro.com/pap/launches/new

If your CSV refuses to load, or the activity fails to create, review the detailed steps in the next sections to ensure your CSV is correctly formatted.

Check carefully that the specifications detailed in the INSTRUCTIONS and COLUMN EXPLANATIONS presented within the template are followed strictly.

One cause of catastrophic failure is a student specified in more than one team. This can happen when you have two similarly named, but not identical, teams and the same ids are repeated in both.
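That duplicate-team check can be automated before you attempt an upload. The following is a minimal sketch in Python (the function name and file path are ours for illustration, not part of Xorro or Peer Assess Pro) that reports any id assigned to more than one team within the same group:

```python
import csv
from collections import defaultdict

REQUIRED_HEADERS = {"id", "first", "last", "group_code", "team", "email"}

def check_teamset(path):
    """Return {(group_code, id): set_of_teams} for every id that appears
    in more than one team within the same group."""
    teams_by_member = defaultdict(set)
    with open(path, newline="") as f:
        reader = csv.DictReader(f)
        missing = REQUIRED_HEADERS - set(reader.fieldnames or [])
        if missing:
            raise ValueError(f"missing column headers: {sorted(missing)}")
        for row in reader:
            if row["team"]:  # the team column is optional, so skip blanks
                teams_by_member[(row["group_code"], row["id"])].add(row["team"])
    # Keep only the offenders: members claimed by two or more teams
    return {key: teams for key, teams in teams_by_member.items()
            if len(teams) > 1}
```

An empty result means no id is claimed by two teams in the same group; any entries returned should be corrected in your spreadsheet before launching the activity.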


2.2 Create the peer assessment teamset CSV

Use a spreadsheet editor, such as Google Sheets, Excel, or Numbers, to produce a file that contains columns of data with these column headers: id, first, last, group_code, team, and email. Precise INSTRUCTIONS and COLUMN EXPLANATIONS for each column are detailed below.

Alternative TeamSet CSV templates

Use any of these templates to create your TeamSet CSV (comma-separated values) file, adapting it in your preferred editor. The templates contain the example data and instructions shown below.

CSV file

Excel sheet

Google Sheet

In the sample files, only the group BUS123.101/PMell/TutB/2020-05-28/SUM is a valid teamset suitable for processing by Peer Assess Pro. This is the only group that specifies membership of teams by the students in the class, the teams being Panda, Bear and Tiger.

Sample peer assessment teamset csv

id,first,last,group_code,team,email
ANWO08,Anna,Worth,ARTS123.204/WShak/2021-02-28,,
GRGR15,Greta,Green,ARTS123.204/WShak/2021-02-28,,
ALJO11,Alice,Jones,BUS123.101/PMell/TutB/2020-05-28/SUM,Panda,Alice.Jones@noreply.com
AMTO01,Amanda,Tolley,BUS123.101/PMell/TutB/2020-05-28/SUM,Bear,Amanda.Tolley@noreply.com
ANWO08,Anna,Worth,BUS123.101/PMell/TutB/2020-05-28/SUM,Bear,Anna.Worth@noreply.com
BOWI12,Bob,Wilson,BUS123.101/PMell/TutB/2020-05-28/SUM,Tiger,Bob.Wilson@noreply.com
GRGR15,Greta,Green,BUS123.101/PMell/TutB/2020-05-28/SUM,Panda,Greta.Green@noreply.com
HEJO19,Henry,Jones,BUS123.101/PMell/TutB/2020-05-28/SUM,Tiger,Henry.Jones@noreply.com
HOBR03,Holly,Brown,BUS123.101/PMell/TutB/2020-05-28/SUM,Bear,Holly.Brown@noreply.com
JEWA06,Jeff,Wang,BUS123.101/PMell/TutB/2020-05-28/SUM,Panda,Jeff.Wang@noreply.com
JOSM13,John,Smith,BUS123.101/PMell/TutB/2020-05-28/SUM,Tiger,John.Smith@noreply.com
THWI18,Thomas,Windsor,BUS123.101/PMell/TutB/2020-05-28/SUM,Tiger,Thomas.Windsor@noreply.com
ANWO08,Anna,Worth,COMP123.201/PDod/TutA/2020-10-01,,Anna.Worth@noreply.com
HOBR03,Holly,Brown,COMP123.201/PDod/TutA/2020-10-01,,Holly.Brown@noreply.com
JOSM13,John,Smith,COMP123.201/PDod/TutA/2020-10-01,,John.Smith@noreply.com

Instructions and column explanations for the Peer assessment TeamSet CSV

INSTRUCTIONS

1. Organise your participants data into the columns corresponding to those shown in columns A to F, the first 6 columns headed 'id' through 'email'.

You might find it helpful to paste your data from row 17, below the sample data provided in rows 2 through 16.

The sample data provided demonstrates ten unique individuals (ids), organised into three different groups.

A group might comprise all members of a class, or subdivisions such as streams, cohorts, sections, or tutorial groups.

A group is not a team. A group may contain several teams, in which case that's a Xorro teamset.

In the group called BUS123.101/PMell/TutB/2020-05-28/SUM the participants are additionally subdivided into three different teams, Bear, Panda and Tiger.

Only group BUS123.101... is a Xorro teamset suitable for a peer assessment activity.

2. If you are preparing a separate file, ensure you use exactly the same column headers for your list as shown in row 1.

That is, 'id', 'first', 'last', 'group_code', 'team', 'email'.

These headers are case sensitive. NO CAPITALS.

The column header sequence is NOT IMPORTANT.

You may supply additional headers and columns of data. This data will be ignored by Xorro.

3. Read carefully the COLUMN EXPLANATIONS, below, for each type of data.

Some data is optional, and can be skipped, as shown for group_code ARTS123.204/WShak/2021-02-28

4. If you have used this page as your template, DELETE this 'instructions' column.

That is, delete anything not part of your data.

You still need the column headers. The headers must be on row 1 of your file.

5. Delete the sample data, immediately below the header row.

That is, everything between row 2 and row 16.

CHECK you do not have duplicate ids in the same group.

CHECK you do have all the ids in your class allocated to a group, and, optionally, a team.

6. Save (Download, Export, Save As) the file as a CSV, giving it an appropriate filename.

7. From Xorro-Q, browse to PARTICIPANTS, then upload the CSV file.

Alternatively, when you Launch a Peer Assessment Activity, you can IMPORT directly the CSV to create or update the activity.

From this sample file, three groups would be created in Xorro upon upload: ARTS..., BUS..., and COMP...

Only one of the groups is a teamset containing the three teams Panda, Bear, and Tiger.

8. COLUMN EXPLANATIONS

id - Compulsory field.

Identifier for this participant, must be unique for the entire institution.

For a peer assessment activity, this is the participant's login id.

No blanks or characters such as #@$%&*()+

first - Compulsory field.

Participant's first name

last - Compulsory field.

Participant's last name

group_code - Optional field. Required for a peer assessment activity.

The code for the group (i.e. course, class, stream, cohort) into which the participant is being enrolled.

If the participant is in multiple groups, supply a separate line for each group in which the participant is a member.

Good practice. Append to your root code, such as BUS123.101, abbreviations that indicate the teacher, activity date (start or due), subdivision (stream, cohort), summative or formative.

Note that Anna Worth is enrolled in three groups and in one team.

team - Optional field. Required for peer assessment activity.

The name of the team in which the participant is a member.

A participant can be a member of only one team within the same group.

A participant may belong to different teams in different groups.

email - Optional field.

The participant's email.

Required for a peer assessment activity if you want autogenerated warnings and notifications from Peer Assess Pro.
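If your class roster already lives in another system, a teamset CSV with exactly these headers can be generated in a few lines. A minimal sketch in Python (the roster data and output filename here are invented for illustration):

```python
import csv

# Hypothetical roster for one group: (id, first, last, team) per member.
GROUP = "BUS123.101/PMell/TutB/2020-05-28/SUM"
roster = [
    ("ALJO11", "Alice", "Jones", "Panda"),
    ("AMTO01", "Amanda", "Tolley", "Bear"),
]

with open("teamset.csv", "w", newline="") as f:
    writer = csv.writer(f)
    # Headers must be lowercase and spelled exactly as Xorro expects.
    writer.writerow(["id", "first", "last", "group_code", "team", "email"])
    for sid, first, last, team in roster:
        writer.writerow([sid, first, last, GROUP, team,
                         f"{first}.{last}@noreply.com"])
```

Using the csv module, rather than joining strings by hand, ensures any commas inside names or group codes are quoted correctly.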


Requirements for a peer assessment teamset CSV file

Create a CSV version of your teamset file

After editing the template, remember to create a CSV version of  your file. Depending on your editor, the appropriate command is:

FILE… SAVE AS … TEXT CSV

FILE… DOWNLOAD AS … Comma-separated values (.csv)

FILE… EXPORT AS CSV

FILE… EXPORT TO… CSV

Why won’t Xorro load my Teamset file?

Using the FILE… SAVE command in your spreadsheet editor will produce a file in an incorrect file format, such as .xls, .sheet, or .numbers.

Xorro will reject those file formats. Xorro accepts and loads only .csv.

See, for example,

FAQ: How do I create a CSV file from a Google Sheet?
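If you prefer to script the conversion rather than use your spreadsheet's menus, the re-save can be done in one line of Python. A sketch, assuming the pandas and openpyxl packages are installed (the function name and filenames are ours):

```python
import pandas as pd  # assumes pandas and openpyxl are installed

def to_xorro_csv(spreadsheet_path, csv_path):
    """Re-save a spreadsheet export as the plain .csv format Xorro accepts."""
    pd.read_excel(spreadsheet_path).to_csv(csv_path, index=False)
```

The result is a plain-text .csv file that Xorro will load, regardless of which spreadsheet editor produced the original.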

Good practice hint: Create distinctive group codes for every peer assessment activity you launch

We advise creating a new, unique group_code for each Xorro Activity you create, even for repeat peer assessments within the same class term or semester.

Use a group_code like this

BT123.101/PJM/2020-03-28/FORM 

We suggest your group_code include these elements, as in the example above:

  • the institutional class code (BT123.101)
  • the teacher's initials (PJM)
  • the activity start or due date (2020-03-28)
  • whether the activity is formative (FORM) or summative (SUM)

Your resulting group_code should uniquely distinguish this semester's mid-semester formative peer assessment(s) from, say, last semester's end-of-class summative assessment, where the same institutional class code would cover a different set of student names.
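One way to keep such codes consistent across activities is to assemble them from their parts. A hypothetical helper (the function name and argument names are our invention, not a Peer Assess Pro feature):

```python
def make_group_code(class_code, teacher, date, kind, stream=None):
    """Compose a distinctive group_code from its recommended elements."""
    parts = [class_code, teacher]
    if stream:                     # e.g. "TutB" for a tutorial stream or cohort
        parts.append(stream)
    parts += [date, kind]          # kind: "FORM" (formative) or "SUM" (summative)
    return "/".join(parts)

make_group_code("BT123.101", "PJM", "2020-03-28", "FORM")
# → "BT123.101/PJM/2020-03-28/FORM"
```

The optional stream element produces codes like BT123.101/PJM/TutB/2020-05-28/SUM for large, multi-cohort classes.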

The group_code is specified in the Team Members Group CSV file you import prior to launching a Peer Assess Pro™ Activity.


Large, multi-cohort streams in a class

In the general case, a very large class could comprise several cohorts, streams or tutorial sets, each subclass containing several teams conducting one or more peer assessment activities. Consequently, your group_code should help distinguish these separate peer assessment activities. For example,

BT123.101/PJM/TutB/2020-05-28/SUM

Here There Be Dragons! 

Consider two teachers at the same institution teaching the same course but with different tutorial groups. If they use the same group_code, such as BT101, they will load their own team sets into the same Xorro Participants' Group, additively, thereby causing mutual confusion and dismay. Similarly, a teacher who reuses the same group_code from term to term, semester to semester, and year to year will experience similar grief.

Quick links and related information

FAQ: How do I correct the participants (team members) in a group I uploaded?

FAQ: How do I correct the Team Composition in a running peer assessment activity?


2.3 Launch and create the peer assessment activity

In summary

Select ACTIVITIES from the top menu bar


Launch Peer Assessment

Enter the following details, in this sequence

Good practice hint: Avoid using the Xorro default Due Date

Set a realistic Due Date: your target for when you expect and want most students to have completed the peer assessment. In practice, typical Due Dates are set four to seven days after the Start Date.

The Due Date is used by Peer Assess Pro to generate automatically:

If you accept the Xorro default Due Date, which is currently NOW (that is, the Start Date), you will not receive the benefits of the automated processes that Peer Assess Pro triggers from a practical Due Date.


The Due Date is advisory only

The Due Date is advisory only. Students can CONTINUE to submit responses beyond the Due Date UNTIL the teacher Finalises the activity. After the Finalisation Date, students have no more than two weeks to review their results.

FAQ: How do I adjust the Due Date or deadline?

The short answer is ‘You can’t adjust the Due Date!’

Initiate Create Activity

After setting the Start At and Due Dates, select Create Activity.

Here There Be Dragons

Double check your Start Date and Due Date carefully!

Once you Create Activity you cannot adjust the Start Date. The peer assessment Survey and the email notifications requesting students' responses are created immediately, and the email advises students of the Start Date and Due Date.

An adjustment to the Start Date would confuse students because the Participant Activity URLs have already been announced to them. Those Activity URLs could become unavailable to students if the dates were adjustable.

For a similar reason, you cannot adjust the Due Date. However, the Due Date is advisory only: students can CONTINUE to submit responses beyond the Due Date UNTIL the teacher Finalises the activity.

FAQ: Can I adjust the Start Date or Due Date for a running activity?

In short, No! But you can abandon the activity and start again. Review the foregoing FAQ for details.

View the Peer Assess Pro Teacher’s dashboard

Peer Assess Pro Teacher’s Dashboard

Invite team members to respond and other automated activities

When the Start Date occurs, Peer Assess Pro automates several activities:

A unique peer assessment survey is created for every team and team member


Quick links and related information

FAQ: How do I correct the Team Composition in a running peer assessment activity?

FAQ: Can I adjust the Start Date or Due Date for a running activity?

FAQ: How do I view a list of the participants (team members) in the group I uploaded?

FAQ: How do students know where and when to complete the peer assessment activity then review their results?

FAQ: How do I view and experience what the students experience?

FAQ: How do I find the the Peer Assess Pro Teacher’s dashboard?


2.4 Use a Teamset Group to launch a peer assessment

This is an alternative approach to launching a peer assessment activity: a two-stage process in which you first import a Teamset Group of participants, then launch the activity using that group.

From the Xorro HOME page select the PARTICIPANTS page

(Image to come)

Select ‘Import Participants’

This uploads your Teamset CSV within which you have classified your students into teams, as detailed in Section 2.2 Create the peer assessment teamset CSV file

Note that multiple teamset groups may be created using this import process. This is potentially useful for managing peer assessment in large, multi-stream classes.

Browse to your Team Members Group CSV file

Load, check and confirm correct team membership, then Import

You should see a list of all the students belonging to the class for whom you wish to run the peer assessment activity.

Note: The message ‘Exists’ or ‘Conflict’ means that the id (identification) code is already registered within your institution, or within a previous Group you have uploaded. Carry on!

Check class and team membership

At this point you cannot yet confirm the team membership of your class. You must first launch a peer assessment activity, selecting (one of) the Group Codes that existed within the original Teamset CSV.

Quick links and related information

FAQ: How do I view a list of the participants (team members) in the group I uploaded?

FAQ: How do I view or change the participants (team members) in a group I uploaded?

FAQ: How do I correct  the Team Composition in a running peer assessment activity?

FAQ: Can I adjust the Start Date or Due Date for a running activity?

In short, No! Please check carefully your Start Date and Due Dates.

3. Manage the Peer Assessment Activity


3.1 Action responses to warnings

Active Warnings show when you need to take action to remedy an issue during execution of the peer assessment activity.

Every email is copied to the email account you used to register for Peer Assess Pro.

In the following example, one member of Team Brazilia has completed the assessment of their four team members. Consequently, a warning is generated for Team Brazilia that the number of responses from the team is insufficient for presenting valid results. In contrast, all four team members of Team Kublas have completed the assessment.

The warnings displayed in this case are

Click through the warning to gain advice on how to remedy the situation. For example, you can remind the students to complete the survey. Emails are automatically generated and sent on your behalf to all or selected students.

Adjusting team composition

Upon commencing the peer assessment survey, team members are first asked to confirm that the team members identified for their team are correct. If not, the student initiates a request notification to the teacher to adjust their team's membership.

Once the peer assessment activity has been launched, you can modify the team composition only as described in the following FAQ. Changes to the Xorro Group have NO EFFECT on a currently running activity unless you cancel the activity and re-launch a new one with the revised Group. That is an extreme response, and should not generally be required if you follow the FAQ.

Quick links and related information

FAQ: How do I correct the Team Composition in a running peer assessment activity?

FAQ: How do I take action on the Warnings presented in the Peer Assess Pro™ Teacher’s Dashboard?’ What if I ignore the Warnings?

3.2 Automated and manual notifications

Students who have NOT completed the survey are sent an email reminder 72 hours, 24 hours and 12 hours before the Due Date.
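The reminder schedule can be pictured as fixed offsets from the Due Date. A sketch, illustrative only (the actual sending is done automatically by Peer Assess Pro, not by you):

```python
from datetime import datetime, timedelta

def reminder_times(due_date):
    """Reminders go to non-responders 72, 24 and 12 hours before the Due Date."""
    return [due_date - timedelta(hours=h) for h in (72, 24, 12)]

# For a hypothetical Due Date of 5 pm on 28 March 2020:
for t in reminder_times(datetime(2020, 3, 28, 17, 0)):
    print(t)
# → 2020-03-25 17:00:00
# → 2020-03-27 17:00:00
# → 2020-03-28 05:00:00
```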

Similarly, if a student is required to resubmit a response because a team has been reconstituted, an automatic reminder is sent.

Quick links and related information

FAQ: What is the content of emails sent by Peer Assess Pro?

3.3 Enter Team Results

The Team Results for each team must be entered if you intend to select a Personal Result calculation method that incorporates the Team Result.

You can later revise the Team Results anytime before you publish the results to your students.

Whenever you enter or revise Team Results, you may select the Publish or ‘Update’ button to update and communicate the revised Personal Results to your class.


Team Results are not used to calculate:


3.4 Select the Personal Result Calculation Method

This is the method you choose to calculate the Personal Result you will award to each team member.

Quick links and related information

FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?


3.5 Review class, team, and individual statistics

You can explore progress and final results at the class, team, and individual level.

Review Class Results

In the Class Results, select a Bucket Range to identify the specific students lying within the range of a histogram bar chart.

Before reviewing results, see:

FAQ: When, why, and how do I ‘Recalculate Results’?

Example class statistics


In any of the tables, you may

Good practice hint: How to identify at risk students

The Individual Personal Snapshot

The Individual Personal Snapshot enables you to view all data related to one student. The Student View version of the Personal Snapshot shows exactly the report the student will receive when the teacher Publishes the results of the current Peer Assess Pro activity.

However, the teacher may wish to view how the results will appear to students BEFORE they are Published. Consequently, there are four possible views of an Individual Personal Snapshot. They are variations on the following example. The four views are explained later.

  1. Teacher’s Live View
  2. Student’s Live View
  3. Student’s Published View
  4. Teacher’s Published View


Example Individual Personal Snapshot (1 of 3)


Example Individual Personal Snapshot (2 of 3)


Example Individual Personal Snapshot (3 of 3)

Four possible views of the Individual Personal Snapshot

Note there are four possible views of an Individual Personal Snapshot.

  1. Teacher’s Live View. Shows the feedback the student would view once the current (live) results are Published or Updated. Furthermore, this snapshot includes qualitative feedback in a transparent form, where the teacher can view specifically ‘who said what’ and ‘who rated who’.
  2. Student’s Live View. The view not yet made available to the student: what the student would view once the current results are Published or Updated. This snapshot includes qualitative feedback (‘who said what’) in anonymised form, just as the student would see the report.
  3. Student’s Published View. The view you may have already Published to the student, and available for their viewing. This snapshot includes qualitative feedback (‘who said what’) in anonymised form, just as the student sees the report.
  4. Teacher’s Published View. This view is similar to the view Published to the student. Additionally, like the Teacher’s Live View, this snapshot shows ‘who said what’ and ‘who rated who’.

If the view is not yet Published, the student will see this remark.

Results unpublished

The same message will also be displayed if the team is not a valid assessed team, even if the results have been Published to the class as a whole.


Team Statistics

Select an individual team to probe the results of its team members. Sort by Peer Assessed Score or Index of Realistic Self Assessment. You can then quickly review the Individual Personal Snapshot of each team member as part of your diagnosis to identify ‘star performers’, ‘at risk’ team members, and those with outlier degrees of overconfidence or underconfidence.

Example Team Statistics


Qualitative Feedback

(To come)

Teacher’s Feedback

(To come)


Advanced Statistics

There are many advanced statistics and charts you can view. Furthermore, from ‘Available Actions’ you can Download Full Statistics to conduct more detailed investigations beyond the scope of what we have conceived.

Quick links and related information

FAQ: How is the Index of Realistic Self Assessment (IRSA) calculated?


3.6 Publish provisional Personal Results to team members

Results of the peer assessment are hidden from team members until you initiate Publish Survey on the Peer Assess Pro Teacher’s dashboard.

Before Publishing, see:

FAQ: When, why, and how do I ‘Recalculate Results’?

Unpublished status


Published status

The foregoing ‘Refresh and Recalculate’ steps give you the opportunity to quality-review results before publishing and republishing personal results and qualitative peer feedback comments. In short, as the peer assessment activity progresses towards the Due Date, results are NOT automatically updated and made available for viewing by the students.

Take Care! Once an activity is Published, the results can never be unpublished. However, you may re-publish results if new responses are submitted and/or you make adjustments to Team Results, Team Composition, and so on. To reiterate: even after interim results have been published to students, as the peer assessment activity continues towards the Due Date, results are NOT automatically updated and made available for viewing by the students.

Results hidden when insufficient responses

Results will be hidden from the teacher and ALL team members in teams where fewer than one-half of the team members have submitted the peer assessment, because at that stage the results may not be valid and representative. For small teams, at least 3 team members must have submitted a response: team sizes of 3, 4, 5, and 6 require at least three submissions, while a team of 7 or 8 requires a minimum of 4 responses. Team members who have already submitted a response will ALSO be advised that their results are hidden until more of their team members have responded.
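The threshold described above can be summarised as ‘at least half the team, and never fewer than three’. A sketch of that rule (our reading of the text, not Peer Assess Pro's actual code):

```python
import math

def min_responses(team_size):
    """Minimum submissions before results are shown for a team:
    at least half the team, rounded up, and never fewer than three."""
    return max(3, math.ceil(team_size / 2))

# Teams of 3-6 need 3 responses; teams of 7 or 8 need 4.
[min_responses(n) for n in range(3, 9)]
# → [3, 3, 3, 3, 4, 4]
```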

Quick links and related information

FAQ: How do students know where and when to complete the peer assessment activity then review their results?

FAQ: How do I view and experience what the students experience?


4. Finalise the peer assessment activity


4.1 Why Finalise?

Survey responses from team members are received and available for incorporation into the peer assessment activity UNTIL you explicitly Finalise the survey. Even responses submitted after the Due Date announced to students at the launch of the activity will be incorporated, UNTIL the survey is deliberately Finalised by the teacher. Until Finalisation, you can also request a student to reconsider and, optionally, resubmit their responses.

4.2 Publish Finalised Results to students


4.3 Download Teacher’s Gradebook of Results

From the Peer Assess Pro Teacher’s Dashboard, select either

Example Gradebook Summary Statistics

Example Gradebook Full Statistics


4.4 Finalise the Activity … irrevocably!

Quick links and related information

FAQ: How do students know where and when to complete the peer assessment activity then review their results?


FREQUENTLY ASKED QUESTIONS

Quick Link Map

Everyone

For teachers

For team members

Xorro-Q help

Login to Xorro-Q

Join peer assessment activity

www.peerassesspro.com

The peer assessment survey

The purpose of peer assessment

Table of contents: Reference guide

Login and orientation

Undertake the peer assessment

FAQs on the web at http://tinyurl.com/papFAQ

Launch peer assessment activity

Use peer assessment results for better performance

Videos

Manage the peer assessment activity

Quickstart guide for teachers

Definitions, calculations, and examples

Contact us

Miscellaneous


FAQs for teachers

Quickstart Guide for teachers

The peer assessment survey

Login and orientation

Launch peer assessment activity

Manage the peer assessment activity

Responding to Active Warnings

Definitions, calculations, and examples

Miscellaneous

The peer assessment survey

FAQ: When and how is the peer assessment conducted?

FAQ: What is the purpose of peer assessment?

FAQ: What questions are asked in the peer assessment survey?

FAQ: How do students know where and when to complete the peer assessment activity then review their results?

FAQ: How are peer assessment and personal results calculated and defined mathematically?

FAQ: Is the self-assessment used to calculate Peer Assessed Score?

Login and orientation

FAQ: Give me a quick overview of how to launch a Peer Assess Pro™ activity through Xorro 

FAQ: How do I navigate the PARTICIPANTS page for Peer Assess Pro?

FAQ: How do I view and experience what the students experience?

FAQ: How do I view a demonstration version of Peer Assess Pro?

Launch peer assessment activity

FAQ: How do I create a CSV file from a Google Sheet?   

FAQ: Can I create a peer assessment activity without having all my teams correctly identified by team name and/or team membership?

FAQ: How do I correct the participants (team members) in a group already uploaded to Xorro?

FAQ: How do students know where and when to complete the peer assessment activity then review their results? 

FAQ: Can I adjust the Start Date or Due Date for a running activity?

Manage the peer assessment activity

FAQ: How do I correct the Team Composition in a running peer assessment activity?  

FAQ: What is the content of emails sent by Peer Assess Pro?

FAQ: What is a valid assessed team?

FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?

FAQ: When, why, and how do I ‘Update and Recalculate Results’?

FAQ: How do I take action on the Warnings presented in the Peer Assess Pro™ Teacher’s Dashboard?’ What if I ignore the Warnings?

FAQ: What happens if [a student tries] to 'game' (fool? play? disrupt?) the peer assessment process?

FAQ: Can I adjust the Start Date or Due Date for a running activity?

FAQ: How do I advise a student who feels they have been unfairly treated?

Responding to Active Warnings

FAQ: How do I take action on the Warnings presented in the Peer Assess Pro™ Teacher’s Dashboard?’ What if I ignore the Warnings?

FAQ: What is a valid assessed team? WARNING 0022

FAQ: How is an outlier peer assessment rating identified? WARNING 0042 

FAQ: What is a mismatched self-assessment (IRSA)? WARNING 0040 

FAQ: What is a low quality team rating? WARNING 0050

FAQ: What is a low quality assessor rating? WARNING 0300

Definitions, calculations, and examples

FAQ: How are peer assessment and personal results calculated and defined mathematically?

FAQ: How is the Peer Assessed (PA) Score calculated?

FAQ: Is the self-assessment used to calculate Peer Assessed Score?

FAQ: How is the Peer Assessed Index (PA Index) calculated?

FAQ: How is the Indexed Personal Result (IPR) calculated?

FAQ: How is the Normalised Personal Result (NPR) calculated?

FAQ: How is the Rank Based Personal Result (RPR) calculated?

FAQ: How is Standard Peer Assessed Score (SPAS) calculated?

FAQ: What is Employability? How is it calculated?

FAQ: How is the Index of Realistic Self Assessment (IRSA) calculated?

FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?

Miscellaneous

FAQ: Where may I view the most recent version of the user guides?

 FAQ: What is the content of emails sent by Peer Assess Pro?

FAQ: Why are different terms used to display peer assessment results in the Xorro and previous Google versions of Peer Assess Pro™?  

FAQ: What are the design objectives, key features, and benefits of the Peer Assess Pro development? 

FAQ: How do I contact people at Peer Assess Pro?


FAQs for team members

The purpose of peer assessment

Undertaking the peer assessment

Using peer assessment results for better performance

How peer assessment affects personal results

The purpose of peer assessment

FAQ: What is the purpose of peer assessment?

FAQ: How are peer assessment and personal results calculated and defined mathematically?

Undertaking the peer assessment

FAQ: What questions are asked in the peer assessment survey?

FAQ: I am unable to login. My login failed

FAQ: How do I login to my peer assessment Activity URL

FAQ: When and how is the peer assessment conducted?

FAQ: How do I provide useful feedback to my team members?

FAQ: How do students know where and when to complete the peer assessment activity then review their results?

FAQ: How do I view and experience what the students experience?

FAQ: Is the self-assessment used to calculate Peer Assessed Score?

FAQ: What happens if I try to 'game' (fool? play? disrupt?) the peer assessment process?


Using the results from peer assessment for better performance

FAQ: How do I interpret the feedback results I've received from the peer assessment?

FAQ: How do I interpret measures of realistic self-assessment?

FAQ: What steps can I take to get a better personal result?

FAQ: Is the self-assessment used to calculate Peer Assessed Score?

FAQ: I don’t understand what my teammates are trying to tell me. How do I ask for better feedback?

FAQ: I believe I have been unfairly treated by the results of the peer assessment. How do I address my concern?

FAQ: What is Employability? How is it calculated?

How peer assessment affects personal results

FAQ: How are peer assessment and personal results calculated and defined mathematically?

FAQ: Is the self-assessment used to calculate Peer Assessed Score?

FAQ: What steps can I take to get a better personal result?

FAQ: What happens if I try to 'game' (fool? play? disrupt?) the peer assessment process?

FAQ: I believe I have been unfairly treated by the results of the peer assessment. How do I address my concern?


FAQ: What is the purpose of peer assessment?

Defining peer assessment

Peer assessment is an educational activity in which students judge the performance of their peers, typically their teammates. Peer assessment takes several forms including

Developmental feedback

The ability to give and receive constructive feedback is an essential skill for team members, leaders, and managers.

Consequently, your teacher has chosen to use Peer Assess Pro™ to help you provide developmental feedback to your team members, for formative and/or summative purposes.

The goal of developmental feedback is to highlight both positive aspects of performance and areas for performance improvement. The result of such feedback is increased individual and team performance (Carr, Herman, Keldsen, Miller, & Wakefield, 2005).

Determination of course personal result

Additionally, your teacher may use the quantitative results calculated by Peer Assess Pro™ to determine your Personal Result for the team work conducted by your team. Your Personal Result may contribute to the final (summative) assessment grade you gain for the course in which Peer Assess Pro™ is applied.

In general, your Personal Result is calculated from two factors:

Criteria for peer assessment in Peer Assess Pro™

There are many possible criteria for assessing your contribution to your team's work. Peer Assess Pro places equal weight on two groups of factors, Task Accomplishment and Contribution to Leadership and Team Processes, based on a well-established instrument devised by Deacon Carr, Herman, Keldsen, Miller, & Wakefield (2005):

Peer Assess Pro assesses competencies valued by employers

The selection of criteria used in Peer Assess Pro is reinforced by results from a recent survey that asked employers to rate the importance of several competencies they expect to see in new graduates from higher education. The figure shows that teamwork, collaboration, professionalism, and oral communications rate amongst the most highly needed Career Readiness Competencies (CRCs) sought by employers. All these CRC competencies rate at least ‘Essential’, with Teamwork and Collaboration rating almost ‘Absolutely Essential’ (National Association of Colleges and Employers, 2018).


Employers rate their essential need for Career Readiness Competencies

Source: National Association of Colleges and Employers (NACE). (2018). Figure 42, p. 33.

Quick links and related information

FAQ: What questions are asked in the peer assessment survey?

FAQ: How are peer assessment and personal results calculated and defined mathematically?

FAQ: How is the Peer Assessed (PA) Score calculated?

References

Deacon Carr, S., Herman, E. D., Keldsen, S. Z., Miller, J. G., & Wakefield, P. A. (2005). Peer feedback. In The Team Learning Assistant Workbook. New York: McGraw Hill Irwin.

National Association of Colleges and Employers (NACE). (2018). Job Outlook 2019. Bethlehem, PA. https://www.naceweb.org/

FAQ: When and how is the peer assessment conducted?

The best practice for conducting peer assessment in an academic course follows several stages.

  1. Introduction. The teacher introduces the team activity and related course assignments
  2. Peer assessment purpose. The teacher explains the role, purpose, and process of peer assessment
  3. The team activity commences
  4. A formative peer assessment is conducted using Peer Assess Pro™ early or mid-way through the team activity.
  5. Team members receive formative feedback generated by Peer Assess Pro™  that indicates their provisional peer assessment rating and (optionally) their indicative end of class personal result. More importantly, they receive qualitative information that provides guidance on what behaviors are required to improve their contribution towards the results the team is seeking.
  6. The team activity continues towards its conclusion. Team members confirm informally with each other that they are correctly applying the more productive behaviours identified through the formative peer assessment.
  7. The team activity concludes. 
  8. The summative peer assessment using Peer Assess Pro™ is conducted at the conclusion of the team activity and/or before the conclusion of the course.

Formative assessment: optional but valuable

The midpoint formative peer assessment is an optional element of peer assessment within the classroom. As a minimum, the formative peer assessment gives the team members experience of the Peer Assess Pro™ mechanism including the questions that will be used to conduct the final, summative peer assessment.

More importantly, the midpoint formative assessment helps ensure that team members have the opportunity to respond proactively to the peer feedback they receive generated immediately following the conclusion of the peer assessment activity. Through undertaking appropriate corrective action mid-way through the course, team members have the opportunity to raise their peer assessment rating, their team’s results, and, therefore, their end of course personal results.

No surprises!

The intention of formative assessment is that, ideally, a team member should face no surprises when they receive their final personal result and peer assessment feedback at the conclusion of the course. For instance, a free-rider should receive clear feedback that the rest of their team observes they are free-riding. Consequently, the free-rider should learn in a timely manner that they will be penalised at the concluding summative assessment unless they remediate their behaviour. It is equally important that an overachieving student who does most of the work is given timely feedback that they need to learn to involve and engage the other team members in the team's planning and execution of tasks. The Peer Assess Pro™ survey specifically targets these aspects of leadership and team process contributions, and this particular style of overachieving student should be identified through the peer assessment ratings they receive.

To minimise the risk of surprises, it is important, therefore, that the peer assessment you provide to your team members at the midpoint of a team activity is


Quick links and related information

FAQ: What questions are asked in the peer assessment survey?

FAQ: How do I provide useful feedback to my team members?

FAQ: How do students know where and when to complete the peer assessment activity then review their results?

FAQ: How do I view and experience what the students experience?

FAQ: How do I interpret the feedback results I've received from the peer assessment?

FAQ: How are peer assessment and personal results calculated and defined mathematically?

FAQ: Is the self-assessment used to calculate Peer Assessed Score?


FAQ: How do I provide useful feedback to my team members?

It is essential that the peer assessment a team member provides to their team members through peer assessment is:

Ohland et al. (2012) provide a table of Behaviorally Anchored Ratings covering high and low contributions to team effectiveness. The table offers guidance to team members on how they might give accurate, effective, and productive feedback to their team members through peer assessment.

Examples of high and low contributions to team effectiveness

HIGH

CONTRIBUTION

LOW

  • Does more or higher quality work than expected.
  • Makes important contributions that improve the team's work.
  • Helps to complete the work of teammates who are having difficulty.
  • Completes a fair share of the team's work with acceptable quality.
  • Does not do a fair share of the team's work.
  • Delivers sloppy or incomplete work.
  • Misses deadlines. Is late, unprepared, or absent for team meetings.
  • Does not assist teammates. Quits if the work becomes difficult.
  • Asks for and shows an interest in teammates' ideas and contributions.
  • Improves communication among teammates.
  • Provides encouragement or enthusiasm to the team.
  • Asks teammates for feedback and uses their suggestions to improve.
  • Listens to teammates and respects their contributions.
  • Communicates clearly.
  • Shares information with teammates.
  • Participates fully in team activities.
  • Respects and responds to feedback from teammates.

INTERACTION

  • Interrupts, ignores, bosses, or makes fun of teammates.
  • Takes actions that affect teammates without their input.
  • Does not share information.
  • Complains, makes excuses, or does not interact with teammates.
  • Accepts no help or advice.
  • Watches conditions affecting the team and monitors the team's progress.
  • Makes sure that teammates are making appropriate progress.
  • Gives teammates specific, timely, and constructive feedback.
  • Notices changes that influence the team's success.
  • Knows what everyone on the team should be doing and notices problems.
  • Alerts teammates or suggests solutions when the team's success is threatened.

KEEPING FOCUS

  • Is unaware of whether the team is meeting its goals.
  • Does not pay attention to teammates' progress.
  • Avoids discussing team problems, even when they are obvious.

  • Demonstrates the knowledge, skills, and abilities to do excellent work.
  • Acquires new knowledge or skills to improve the team's performance.
  • Able to perform the role of any team member if necessary.
  • Has sufficient knowledge, skills, and abilities to contribute to the team's work.
  • Acquires knowledge or skills needed to meet requirements.
  • Able to perform some of the tasks normally done by other team members.

CAPABLE

  • Missing basic qualifications needed to be a member of the team.
  • Unable to perform any of the duties of other team members.
  • Unable or unwilling to develop knowledge or skills to contribute to the team.

Source: Ohland et al. (2012)

Adapted by Mellalieu (2017) from Ohland, M. W., Loughry, M. L., Woehr, D. J., Bullard, L. G., Felder, R. M., Finelli, C. J., … Schmucker, D. G. (2012). APPENDIX B: Behaviorally Anchored Rating Scale (BARS) Version, from Comprehensive Assessment of Team Member Effectiveness. Academy of Management Learning & Education, 11(4), 609–630. Retrieved from http://amle.aom.org/content/11/4/609.short

FAQ: I believe I have been unfairly treated by the results of the peer assessment. How do I address my concern?

For teachers: How do I advise a student who feels they have been unfairly treated?

Symptoms of an unfair assessment

Here are some symptoms that you may have been treated unfairly by one or more teammates in their peer assessment of you:

Steps to address an unfair peer assessment

If you believe you may have been unfairly treated, pursue these steps, in this order:

  1. Ensure that you understand the results of your peer assessment, and how the results are calculated. See the Quick Links below for suggestions.
  2. Email, then arrange to discuss, your concerns with your teacher. Your teacher can check the peer assessments provided by your team members. For example, perhaps one or more extreme peer assessments have negatively affected your result. The teacher can explore that possibility with you and the team member concerned. You should be able to present evidence to the teacher of the work you agreed to complete, your record of attending meetings, the work you produced, and relevant communications from your team members expressing their satisfaction or otherwise with your contributions to team outputs and/or working processes.
  3. Meet with your team members to ensure you fully understand WHY they have awarded you the ratings they gave. In particular, ensure you understand accurately what you must do to achieve a better peer assessment rating in the future. In addition, when you meet your teacher in Step 2, you might request the teacher’s attendance at this meeting as a meeting observer or facilitator ‘to keep the peace’.
  4. If there remains a dispute between your self-assessment and your team’s assessment, the teacher MAY with discretion request that one or more of your team members resubmit their peer assessment. This step can only be undertaken if the peer assessment activity is not finalised.
  5. If you continue to dispute your peer assessment and/or personal result then you should pursue your institution’s policy for appealing an assessment result. This policy is usually mentioned in your course syllabus, assignment specification, programme overview, and/or learning management system.

A note on appealing a peer assessment result

An appeal against a peer assessment result is likely to fail if one or more of the following circumstances have prevailed:

Prevention is better than cure

Take these steps to avoid a mismatch between the peer assessment result you expect, and the result you receive.


Quick links and related information

How peer assessment affects personal results

FAQ: How are peer assessment and personal results calculated and defined mathematically?

FAQ: Is the self-assessment used to calculate Peer Assessed Score?

FAQ: What happens if I try to 'game' (fool? play? disrupt?) the peer assessment process?

Using the results from peer assessment for better performance

FAQ: How do I interpret the feedback results I've received from the peer assessment?

FAQ: I don’t understand what my teammates are trying to tell me. How do I ask for better feedback?

FAQ: How do I interpret measures of realistic self-assessment?

FAQ: What steps can I take to get a better personal result?


FAQ: How do I interpret the feedback results I've received from the peer assessment?

(To be published)

Quick links and related information

FAQ: How do I interpret measures of realistic self-assessment?


FAQ: I don’t understand what my teammates are trying to tell me. How do I ask for better feedback?

Begin by viewing this video. Watch especially for the question that is introduced soon after minute 15 by Harvard University professor Sheila Heen.

Heen, S. (2015). How to use others’ feedback to learn and grow. TEDx. Retrieved from https://www.youtube.com/watch?v=FQNbaKkYk_Q

As Heen and Stone observe

“Feedback is less likely to set off your emotional triggers if you request it and direct it. So donʼt wait until your annual performance review. Find opportunities to get bite-size pieces of coaching from a variety of people throughout the year. Donʼt invite criticism with a big, unfocused question like “Do you have any feedback for me?” Make the process more manageable by asking a colleague, a boss, or a direct report,

“Whatʼs one thing you see me doing (or failing to do) that holds me back?”

That person may name the first behavior that comes to mind or the most important one on his or her list. Either way, youʼll get concrete information and can tease out more specifics at your own pace.” (Heen & Stone, 2014)

Quick links and related information

Heen, S., & Stone, D. (2014). Find the Coaching in Criticism. Harvard Business Review, 9. Retrieved from https://medschool.duke.edu/sites/medschool.duke.edu/files/field/attachments/find-the-coaching-in-criticism.pdf


FAQ: What steps can I take to get a better personal result?

Your Personal Result is determined from a combination of your Team Result and your Peer Assessed Score. Consequently, to raise your Personal Result you need to apply balanced effort to raising both these contributing factors.

Raise your Team Result

Typically, your Team Result is earned from your team's assignment outputs, such as a report and/or a presentation. Consequently, the grade for the Team Result is determined by the teacher, based on the rubric (marking guideline) they apply to assess your team’s outputs. Ensure you understand the assignment elements and how each will be assessed. Seek out exemplars of good practice. Pursue the guidance found in:

Mellalieu, P. (2013, March 15). Creating The A Plus Assignment: A Project Management Approach (Audio). Innovation & chaos ... in search of optimality website: http://pogus.tumblr.com/post/45403052813/this-audio-tutorial-helps-you-plan-out-the-time

Use your institution’s academic support services

In addition to your teacher and their assistant tutors, your academic institution will offer personal and group coaching to guide you on the specific success factors related to the type of assignment you are pursuing. Schedule appointments to make use of these support facilities early in your project. Locate the online resources these coaching support services have curated for your guidance.


Raise your Peer Assessed Score

Group and team projects present special challenges of coordination, motivation, communication, and leadership. These challenges are normal! Furthermore, an essential part of your job as a team member is to proactively overcome these challenges as part of your academic learning journey.

As you overcome these challenges you will achieve several benefits directly instrumental in raising your Personal Result:

You will also develop team work and leadership competencies that will both raise your future employability, and your effectiveness in future teamwork, as discussed in:

FAQ: What is the purpose of peer assessment?

How do I address proactively the challenges of team work?

Whilst there are many resources to help address the challenges of team work in academic settings, we suggest you familiarise yourself with these resources early in your team project. Since “Any fool can learn from their own mistakes. It takes genius to learn from the mistakes of others” (Einstein), be proactive rather than foolish in learning effective team working skills from:

Turner, K., Ireland, L., Krenus, B., & Pointon, L. (2011). Collaborative learning: Working in groups. In Essential Academic Skills (2nd ed., pp. 193–217).

Carr, S. D., Herman, E. D., Keldsen, S. Z., Miller, J. G., & Wakefield, P. A. (2005). The Team Learning Assistant Workbook.

Learning constructively from mid-course peer assessment feedback

Good practice peer assessment management by your teacher will provide you with two opportunities for peer assessment and peer feedback through your course: formative and summative.

Your first, mid-course, formative assessment provides you with early advice about your strengths and opportunities for development as perceived by your team members. Make use of this formative feedback at the earliest opportunity as you proceed towards the conclusion of your team work, and your final, summative peer assessment. Usually, this final, summative assessment is where you earn the significant contribution to your course grade from the Personal Result earned from your Peer Assessed Score awarded by your team members.

Consequently, take proactive action following the mid-course formative assessment through referring to:

FAQ: How do I interpret the feedback results I've received from the peer assessment?

Maybe you don’t understand or don’t agree with the feedback your teammates are providing. In that case, refer to

FAQ: I don’t understand what my teammates are trying to tell me. How do I ask for better feedback?


Quick links and related information

The purpose of peer assessment

FAQ: What is the purpose of peer assessment?

Undertaking the peer assessment

Using peer assessment results for better performance

How peer assessment affects personal results

FAQ: How are peer assessment and personal results calculated and defined mathematically?

FAQ: What steps can I take to get a better personal result?

FAQ: What happens if I try to 'game' (fool? play? disrupt?) the peer assessment process?

FAQ: Is the self-assessment used to calculate Peer Assessed Score?


FAQ: What happens if I try to 'game' (fool? play? disrupt?) the peer assessment process?

What happens if a team member attempts to 'game' the peer assessment process?

The designers of Peer Assess Pro have many decades’ experience working with students. We know the tricks that students attempt to play with peer assessment. We have anticipated these tricks, so Peer Assess Pro warns the teacher that a trick may be being played. Furthermore, the teacher receives highly specific, individualised information about each incident and student. The teacher may then undertake overt or covert action to address the issue to which they have been alerted. For example, the trick-playing student or team may then receive a request to reconsider and resubmit their peer assessment. In more extreme incidents, the student or team may receive an invitation to visit the teacher for a counselling consultation.

The tricks we know!

Here follow a few of the ‘tricks’ that Peer Assess Pro identifies and warns the teacher about during the survey process. Examples follow later.

  1. Low quality assessor rating: An assessor may have engaged unconstructively with peer assessment. Also known as “I can’t be bothered with this. I’ll give everyone the same rating.”
  2. Low quality team rating: A team may have engaged unconstructively with peer assessment. Also known as “If we give everyone a high mark, we’ll get a better personal result, won’t we?”
  3. Outlier individual rating: A team member has assessed another team member very differently than the other team members have. Also known as “This is my friend, I’ll rate her well” or “We’ve never got on well together. Now this is my chance for revenge!”
  4. Mismatched self-assessment: A team member's self-assessment is materially different from the peer assessment given by their team. Also known as “If I give myself a high rating, I’ll get a better personal result, won’t I?”
  5. At risk team member: A team member has been rated amongst the lowest in the class. Also known as “Darn! I thought I could hide under the radar.”
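The detection logic behind tricks 1 and 2 in the list above can be sketched in a few lines. This is a minimal illustration only: the function name and the threshold values are our assumptions, not the actual parameters or program logic used by Peer Assess Pro.

```python
# Illustrative sketch: flag a set of awarded scores that looks
# unconstructively uniform (high average combined with a narrow range).
# Both thresholds below are hypothetical, chosen for illustration.

AVG_THRESHOLD = 85    # hypothetical: what counts as a "high" average
RANGE_THRESHOLD = 10  # hypothetical: what counts as a "narrow" range

def low_quality_rating(scores_awarded):
    """Return True if the scores suggest an unconstructive assessor or team."""
    avg = sum(scores_awarded) / len(scores_awarded)
    rng = max(scores_awarded) - min(scores_awarded)
    return avg >= AVG_THRESHOLD and rng <= RANGE_THRESHOLD

# Mirrors the "Madison" warning below: average 100 (here 99), low range 3
print(low_quality_rating([100, 99, 100, 97]))  # True
# A genuinely discriminating assessor is not flagged
print(low_quality_rating([55, 74, 82, 90]))    # False
```

The same check applies at two levels: per assessor (the scores one team member awards) and per team (all scores awarded within the team).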

Examples: Highly specific and individualized information

Here are some examples of the highly specific and individualized Active Warnings a teacher receives about each incident.

1. Low quality assessor rating

Madison may have engaged unconstructively with peer assessment. High Peer Assessment Scores awarded. Average 100 and low range 3. Team Alpha.

Ben may have engaged unconstructively with peer assessment. High Peer Assessment Scores awarded. Average 86 and low range 5. Team Bravo.

This message warns the teacher that the team member has given everyone a near-perfect Peer Assessed Score, or a similar score (narrow range). Practically, from the student’s point of view, they are ‘wasting their votes’. If everyone is scored with the same or a similar score, then students who have contributed substantially to the team’s result will not be adequately recompensed. Furthermore, if EVERY team member pursued this same approach, then every team member would simply be awarded the Team Result. In this case, the team member just looks stupid in the eyes of the teacher. Furthermore, the team member fails to gain practice at being a leader, where giving accurate assessments of team members’ contributions is a valued management competency.

2. Low quality team rating

Team Bravo may have engaged unconstructively with peer assessment. High Peer Assessment Scores awarded. Average 98 and low range 8.

Team Echo may have engaged unconstructively with peer assessment. High Peer Assessment Scores awarded. Average 94 and low range 10.

These messages warn the teacher that the team collectively may have arranged to give everyone a near-perfect Peer Assessed Score or the same score. Practically, from the students’ point of view, this trick is a waste of time. If everyone is scored with the same score, or a perfect Peer Assessed Score of 100, then every team member will be awarded the Team Result … which is usually not 100. The team members just look stupid in the eyes of the teacher. Furthermore, they may not receive useful qualitative feedback and ratings that help guide focussed development of their future productivity in team assignments and their future professional work in teams.

3. Outlier individual rating

The warning highlights a situation where the team members appear to be inconsistent in rating high, medium and low contributors to the team’s process and results.

This example is a symptom that there may be some disruptive team dynamics or bullying within the team.

Harrison Ford assessed Steven Spielberg awarding a PA Subscore of 38. Compared with the average rating by the other team members of 70, this subscore DEPRESSED the PA Score to 64 by 7 PA Score Units. Team Alpha.

This message warns the teacher that there may be some favouritism between friends or allies.

Donald Trump assessed Vladimir Putin awarding a PA Subscore of 90. Compared with the average rating by the other team members of 57, this subscore RAISED the PA Score to 64 by 7 PA Score Units. Team Charlie.

4. Mismatched self-assessment

Anna's self-assessment of 68 is OVERCONFIDENT compared with the peer assessment of 34 given by others in team Charlie. IRSA = 51.

This message warns the teacher that the team member has a very much higher opinion of their performance than is evidenced by the rating provided by their peers. The teacher may request an interview with the student to explore the reasons for this divergence, and how the student can develop a more realistic self-assessment.

Alternatively, the team member may be being scapegoated by the rest of the team, and that possibility should be discussed with the team member for whom this warning is raised.

5. At risk team member

Anna has been rated amongst the lowest in the class. Low Recommendation 2.3 and/or low Peer Assessment Score 34. Team Alpha.

This message warns the teacher that the team member is rated very poorly when compared with most of the class. It’s often a symptom of little or no attendance or contribution by the team member, which the teacher will verify through examining the qualitative feedback provided by the team members. Again, the teacher may request an interview with the student to explore the reasons for this low rating, and how the student can improve their contribution.

Example: Better feedback. Better teams

Examine the following teacher’s dashboard graphic revealing a real class that undertook a peer assessment.


Teacher's dashboard: visible identification of teams with low quality team rating

Which teams will raise the Active Warning: Low quality team rating?

Teams 1, 14, 13, 2, 7, 5, and 11. Over half of the teams in the class!

Observation: This class was poorly briefed on how to make the best use of peer assessment and feedback. With a better briefing, we expect fewer than 10% of teams to raise this warning.

Which teams tend to have a higher team result?

The lower Team Results are associated with teams that had a low quality team rating. Apart from Team 15, all teams with an adequate quality team rating had a Team Result equal to or greater than the Class median Team Result of 73.3. For example, Team 10 (Team Result 90) through to Team 4 (Team Result 76.7) according to the sort by Range in the foregoing table.

Which teams have worked most productively as a team?

Team 10, with a Team Result of 90 is clearly a high performing team. The moderately low Range of PA Scores (10) across the team suggests IN THIS CASE that everyone contributed relatively equally and effortfully towards a great Team Result. Reminder: The Team Result is awarded by the teacher: it is independent of the Peer Assessed Scores of the team.

However, Team 3 is also a good candidate for being a fair and productive team. They engaged honestly with peer assessment, awarding a high spread of Peer Assessed Scores (Range 18.8) and a team average PA Score of 78.3. This team average was not outrageously high, in contrast to teams 1 (100!), 14 (100), 13, 2, 7, and 5. Furthermore, Team 3 earned the class median Team Result of 73.3, which then appears to have been allocated according to the peer assessed contribution of the team members. This fair distribution is illustrated in the following graph and table. Team member Charlie earned the highest Personal Result of 81, whilst Able earned 65.3. Similar reasoning applies to Team 6, to a slightly lesser degree, since its Range is not so wide.

Note from the following graph how teams 14, 5, 13, 1, 2 and 7 are again glaringly identified in the Teachers Dashboard as outlier teams poorly engaged with the peer assessment process: the low vertical spread in the graph. This low vertical spread in the Personal Result (NPR in this case) derives from the low range of Peer Assessed Scores across each team.

With this admittedly small sample size, we advance the proposition that ‘Better feedback leads to better teams’. And/or ‘Better teams give better feedback!’ In conclusion, let’s say: Better feedback. Better teams.


Teacher's dashboard: a fairly productive team

Quick links and related information

FAQ: What is the purpose of peer assessment?

FAQ: How do I provide useful feedback to my team members?

FAQ: How do I interpret the feedback results I've received from the peer assessment?

FAQ: I don’t understand what my teammates are trying to tell me. How do I ask for better feedback?

FAQ: I believe I have been unfairly treated by the results of the peer assessment. How do I address my concern?

Active Warnings, threshold parameters, and program logic

The following section explains how the teacher should respond to the Active Warnings displayed on their dashboard. The threshold parameters and program logic for raising the warnings are also provided.

Responding to Active Warnings


FAQ: Give me a quick overview of how to launch a Peer Assess Pro™ activity through Xorro

If this is your first time using Peer Assess Pro, we strongly recommend that you briefly review our Frequently Asked Questions so you are prepared to answer your own and your students' concerns - https://www.peerassesspro.com/frequently-asked-questions-2/

Download a pdf of the Quickstart Guide and this Reference Guide here - http://tinyurl.com/papRefPdf

Contact

Patrick Dodd - https://www.peerassesspro.com/contact/


Quick links and related information

View the web Quickstart Guide at tinyurl.com/pdfQuickWeb

FAQ: How do I contact people at Peer Assess Pro?

FAQ: Where may I view the most recent version of the User Guide?

FAQ: What are the design objectives, key features, and benefits of the Peer Assess Pro development?


FAQ: What are the design objectives, key features, and benefits of the Peer Assess Pro development?

Design objectives

Our overall objectives for Peer Assess Pro™ are

Benefits for students

Benefits for teachers

Peer Assess Pro™ is a work in progress

We appreciate your participation in this pre-market release of our substantially revised Peer Assess Pro™ in conjunction with the Xorro advanced quiz and survey platform.

As we proceed through this pre-market refinement phase, we respond almost daily to your suggestions for improving both the software applications and the user documentation. These improvements are implemented at any time whilst we undergo our Beta Development phase. We anticipate that our implementations are robust enough to prevent loss of your data and wasting of your time. We crave your forgiveness if we have been over-optimistic in keeping Murphy’s Law at a distance.

Where’s the latest?

You need not take any action to use the latest versions of the Peer Assess Pro™ Xorro Teacher’s Dashboard. Those updates happen in the background and will automatically use any data and activities you have initiated. However, if you use the PDF version of this user guide, you will need to update regularly to the latest version here.

Quick links and related information

FAQ: Where may I view the most recent version of the Reference Guide?


FAQ: How do I find the Peer Assess Pro Xorro Teacher’s dashboard?

If you quit your browser and then wish to return to the Teacher's Dashboard, navigate via either:

HOME: Running Activities

Alternative method: ACTIVITIES: Running Activities


From HOME Tab


From Activities Tab: Running Activities

Quick links and related information


FAQ: How do I navigate the PARTICIPANTS page for Peer Assess Pro?

Select PARTICIPANTS Tab

 

Note the list of ‘All participants’ currently known to Xorro in your institution.

Note a list of all other Groups uploaded by other Teachers in your institution. A group is a list of participants, such as students in a class. The minimum requirement for a Group is id, first name, last name.

However, for a peer assessment activity, a Group must include team membership for all team members. This team membership is not required for most other Xorro activities. Accordingly, Groups set up by other teachers, or for other activities, will rarely contain the correct team membership data required for your Peer Assess Pro™ activity.
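Before uploading a Group intended for peer assessment, it is worth checking that every participant carries a team assignment. The sketch below is illustrative only: the column names (id, first_name, last_name, team) are hypothetical, so adapt them to the sample TeamSet CSV format referenced in Section 2.1.

```python
import csv
import io

# Hypothetical column names; adapt to the actual sample TeamSet format.
REQUIRED = {"id", "first_name", "last_name", "team"}

def check_teamset(csv_text):
    """Raise ValueError if required columns are missing or any
    participant lacks a team; otherwise return the participant count."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    if not rows:
        raise ValueError("no participants found")
    missing = REQUIRED - set(rows[0].keys())
    if missing:
        raise ValueError(f"missing columns: {sorted(missing)}")
    for row in rows:
        if not row["team"].strip():
            raise ValueError(f"no team for participant {row['id']}")
    return len(rows)

sample = """id,first_name,last_name,team
s001,Bridget,Jones,Alpha
s002,Julian,Barnes,Alpha
"""
print(check_teamset(sample))  # 2
```

Running a check like this before upload avoids the costly team-composition corrections described later in this guide.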

Orientation note: Select an existing Group

Select Group ClassAM101.6. This group selection displays a list of about 25 students in the class titled AM101.6.


Inactive functions in PARTICIPANTS page

Quick links and related information

FAQ: How do I correct the participants (team members) in a group I uploaded?

FAQ: How do I correct  the Team Composition in a running peer assessment activity?


FAQ: How do I correct the Team Composition in a running peer assessment activity?

Take care! Here there be dragons!!

Ensure you read ALL of this FAQ before proceeding.

If you make a mistake in this process, the consequence may be unrecoverable, complete loss of all responses received to date.

Apply this process when, for a launched, running peer assessment activity, you need to make these adjustments:

Key check points

View the team composition

Select the ‘Team Composition’ button for the running Peer Assessment Activity for which you wish to adjust the team composition.

Correct the team composition

During the re-import, the changes to the teamset will be presented to you so that you can check and confirm the adjustment process. Take care!

Upon completion of the re-import process, the running Peer Assess Pro Activity will continue.

All students in teams affected by a change in composition are now required to resubmit their peer assessment responses. Reason: they now have different team members to rate. The remaining teams of the class will be unaffected. Their responses remain submitted and evident within Peer Assess Pro.

Team members will be notified of their need to re-submit by an automatically generated email from Peer Assess Pro.

Subtle technical note

You cannot change the participants in the Xorro Group used to create the running activity, as explained in the FAQ:

FAQ: How do I correct the participants (team members) in a group already uploaded to Xorro?

Reason: whenever a Xorro activity is created a snapshot is taken of the Group used to create the activity. From that moment this snapshot, known as a Xorro Teamset, is inextricably connected with the activity. That activity-specific teamset can be updated only during a running activity through the FAQ detailed above, through the Team Composition section of the Peer Assess Pro dashboard.

In the image above, the Group used to create the peer assessment activity is BT101. Any changes made to that Group WILL NOT affect the running activity. The teamset created from the Group BT101 is denoted 2019-02-24 BT101 by Beta Beta. That name indicates the date on which the teamset was created, the Group from which it was created, and by whom.


FAQ: Can I create a peer assessment activity without having all my teams correctly identified by team name and/or team membership?

You can add, swap, or delete team members at any time before launching the activity, and at any time before the peer assessment activity is finalised.

Good Practice Hint. Get your team composition list absolutely correct before the activity is launched and made available for response by your students. Reason: All students in teams affected by a change in composition will be required to resubmit their peer assessment responses. The students now have different team members to rate. However, the remaining teams of the class will be unaffected.

Quick links and related information

FAQ: How do I correct the Team Composition in a running peer assessment activity?

FAQ: How do I create a CSV file from a Google Sheet?

Quick links and related information


FAQ: How do I view a demonstration version of Peer Assess Pro?

A Beta Test demonstration site has been established with these credentials:

Browse to: https://qf.staging.xorro.com/

Enter: Username BetaTest, Password Secret

This Beta Test User is established for you to view. But don't touch too hard!

View


FAQ: How do I correct the participants (team members) in a group already uploaded to Xorro?

View an existing imported Group

Select the Group name to view a list of all students in your class (Group).


Correct the team members associated with an existing Xorro TeamSet Group

Section 2. Launch Peer Assessment Activity

Note that the data loaded from the TeamSet CSV operates according to these rules:

If a peer assessment activity is launched and running, then you cannot update team membership details here; you must use the FAQ below. Reason: whenever a Xorro activity is created, a snapshot is taken of the Group used when creating the activity. From that moment this snapshot, known as a Xorro Teamset, is inextricably connected with the activity. That activity-specific teamset can be updated only during a running activity, through the following FAQ.

Quick links and related information

FAQ: How do I correct the Team Composition in a running peer assessment activity?

FAQ: Where may I view the most recent version of the user guides?

Quickstart Guide

Quickstart Guide for Peer Assess Pro: Xorro. (2019, March 6). Peer Assess Pro. http://tinyurl.com/pdfQuickWeb

Pdf version: http://tinyurl.com/pdfQuick

Video guides

Login and orientation. (2019). Auckland: Peer Assess Pro.

Launch a Peer Assess Pro Activity. (2019). Auckland: Peer Assess Pro.

Student survey experience. (2019). Auckland: Peer Assess Pro.

Latest reference guide

Peer Assess Pro. (2019, March 5). Manage a Peer Assessment Activity using Xorro: Reference Guide for Teachers. Auckland: Peer Assess Pro.

Web version http://tinyurl.com/papRefWeb2

Pdf version http://tinyurl.com/papRefPdf

Work in progress Google DOCS development version

Google Docs version.

Feel welcome to make suggestions or ask questions using the Comment feature of the Google Docs development version, which shows work-in-progress improvements.

Frequently Asked Questions for teachers and team members

Frequently Asked Questions (FAQs) (2019). In Manage a Peer Assessment Activity using Xorro: Reference Guide for Teachers [web]. Auckland, New Zealand: Peer Assess Pro. http://tinyurl.com/papFAQ

Teachers Process Flowchart

Peer Assess Pro. (2019). Xorro Peer Assess Pro™ Teachers Process Flowchart: Overview and Detail. http://tinyurl.com/papChart

Quick links and related information


FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?

The choice of calculation method for determining a team member’s Personal Result is determined by the teacher's preference for how strongly to reward team members who have contributed significantly to their teams, and to under-reward team members who are peer assessed as weak contributors. The figure illustrates the statistical features, such as team average, range, and standard deviation, associated with each method.

Alternative calculation methods for Personal Result (PR) illustrating effect on team average and spread for a given Team Result

The teacher can select either the Peer Assessed Score (PA Score) or the Peer Assessed Index (PA Index) if they wish to exclude the Team Result in calculating the Personal Result (PR).

More usually, the Peer Assessed Score and Team Result (TR) are combined mathematically to produce a Personal Result. There are three alternative methods. As the figure illustrates, the Indexed Personal Result (IPR) is the least discriminating method, whilst the Rank-Based Personal Result (RPR) is the most discriminating in terms of favouring significant team contributors and penalising weak contributors. Most teachers select the Normalised Personal Result, often with a spread factor of 1.5 to 2.0.

Complementing the graphical illustration earlier, the following table summarises the example calculations presented in a series of FAQs that give the mathematical definition and example calculations for each method.

Comparison of Personal Results calculated by several methods in a team of four members

ASSESSEE                                   Bridget  Julian  Lydia  Nigella   Mean  Range

Rank Reversed                                 1        2       4       3
Peer Assessed Score, PA Score                54       74      82      78      72     28
Peer Assessed Index, PA Index                66       90     100      95      88     34
Team Result, TR                              50       50      50      50      50      0
Indexed Personal Result, IPR                 33       45      50      48      44     17
Normalised Personal Result, NPR
  (SpreadFactor = 1)                         39       51      56      54      50     17
Normalised Personal Result, NPR
  (SpreadFactor = 2)                         28       52      62      58      50     34
Rank-Based Personal Result, RPR              20       40      80      60      50     60

Source: FAQ: How are peer assessment and personal results calculated and defined mathematically?
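The calculation methods compared in the table can be sketched in a few lines of code. The formulas below are our inference, reverse-engineered from the worked example in the table (they reproduce its values exactly); consult the FAQ on mathematical definitions for the authoritative versions.

```python
# Inferred sketch of the Personal Result calculation methods.
# Values are the worked example: Bridget, Julian, Lydia, Nigella.

pa_scores = [54, 74, 82, 78]   # Peer Assessed Scores
team_result = 50               # TR awarded by the teacher

# Peer Assessed Index: highest scorer indexed up to 100
pa_index = [round(100 * s / max(pa_scores)) for s in pa_scores]

# Indexed Personal Result: PA Index scaled by the Team Result
ipr = [round(i * team_result / 100) for i in pa_index]

def npr(spread_factor):
    """Normalised Personal Result: team mean pinned to the Team Result,
    with the spread of IPR about that mean scaled by the spread factor."""
    mean_ipr = sum(ipr) / len(ipr)
    return [round(team_result + spread_factor * (x - mean_ipr)) for x in ipr]

# Rank-Based Personal Result: reversed rank (strongest contributor ranked
# highest), scaled so that the team mean equals the Team Result
ranks_reversed = [1, 2, 4, 3]
rpr = [round(2 * team_result * r / (len(ranks_reversed) + 1))
       for r in ranks_reversed]

print(pa_index)        # [66, 90, 100, 95]
print(ipr)             # [33, 45, 50, 48]
print(npr(1), npr(2))  # [39, 51, 56, 54] [28, 52, 62, 58]
print(rpr)             # [20, 40, 80, 60]
```

Note how the spread factor widens the NPR range (17 versus 34) without moving the team mean away from the Team Result, which is why teachers favouring stronger discrimination choose a spread factor of 1.5 to 2.0.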


Definitions and features of calculation methods used in Peer Assess Pro

Attribute

Abbreviation

Definition

Peer Assessed Score

PA Score

A relative measure of the degree to which a team member has contributed to their team's overall achievement, team processes, and leadership. The Peer Assessed Score (PA Score) is calculated for each team member directly from their Average Team Contribution (ATC) and Average Leadership Contribution (ALC). That is, from the ten components of Team and Leadership contribution survey in the peer assessment.

A Peer Assessed Score is generally used to compare the relative contribution of students WITHIN the same team, rather than BETWEEN teams. The Team Result has NO impact on the value of the Peer Assessed Score. Values for the PA Score range from zero through 100.

Peer Assessed Index (PA Index)

The Peer Assessed Score (PA Score) is indexed upwards so that the person in the team with the highest Peer Assessed Score is awarded a Peer Assessed Index of 100. All other team members receive a proportionally lower PA Index in the ratio PA Score / max(PA Score). The Team Result has NO impact on the value of the Peer Assessed Index.

Team Result (TR)

The result awarded to the team for the outputs of their work. The teacher typically derives the Team Result (TR) from grades for team reports, presentations, and results of Team Readiness Assurance Tests.

The teacher may choose to combine a student's Peer Assessed Index (PA Index) with their team's Team Result (TR) to calculate a Personal Result (PR) for each student, reflecting their relative contribution to the Team Result as assessed by their peer team members. Peer Assess Pro enables the teacher to select from several methods to combine the Team Result and Peer Assessed Index (PA Index) to produce a Personal Result: the Indexed Personal Result (IPR), the Normalised Personal Result (NPR), and the Rank-Based Personal Result (RPR).

Measures of a student's personal result

Personal Result (PR)

A student's personal result gained from combining their Peer Assessed Index (PA Index) and, optionally, their Team Result (TR).

The teacher selects from one of several Calculation Methods to calculate the Personal Result that incorporates the Team Result. These methods are Indexed Personal Result (IPR), Normalised Personal Result (NPR), and Rank-Based Personal Result (RPR).

The choice of method is determined by the teacher's preference for compensating more strongly students who have contributed significantly to their teams, and under-rewarding students who are peer assessed as weak contributors. Figure 1 illustrates the statistical features, such as team average, range, and standard deviation, associated with each method. The IPR is the least discriminating method, whilst the RPR is the most discriminating in terms of favouring significant team contributors and penalising weak contributors, as the figure illustrates.

Indexed Personal Result (IPR)

The Indexed Personal Result is calculated from the Team Result (TR) combined with the student's specific Peer Assessed Index (PA Index). The Indexed Personal Result method awards the Team Result to the TOP RATED student in the team, since, by definition, their Peer Assessed Index is 100. All remaining students in the same team earn the Team Result downwards, directly proportional to their PA Index.

The Indexed Personal Result calculation means that NO team member can earn an Indexed Personal Result greater than the Team Result. That is, values for the Indexed Personal Result range from zero up to the Team Result.

Normalised Personal Result (NPR)

The Normalised Personal Result is calculated from the Team Result combined with the student's specific Indexed Personal Result (IPR). However, in contrast to the IPR method, the Normalised Personal Result method awards the AVERAGE student in the team the Team Result (TR). All remaining students are awarded a Personal Result ABOVE or BELOW the Team Result depending on whether their IPR is above or below that team's average.

Features of the Normalised Personal Result are that (a) in contrast to the IPR method, the Normalised Personal Result method calculates a Personal Result ABOVE the Team Result for the above-average peer rated students in the team; (b) the average of the team's Normalised Personal Results matches the Team Result; (c) the spread of the team's Normalised Personal Results matches the spread of the Indexed Personal Results (IPR) calculated for that team. Spread is measured by the standard deviation statistic.

Optional feature: To enhance the effect of rewarding high contributors and penalising weak contributors, the teacher can increase the Spread Factor (SF) from the default value of 1.0. Increasing the Spread Factor increases the spread of the results centred around the Team Result, while maintaining a team average NPR that matches that team's Team Result. A Spread Factor of 1.5 to 2.0 is recommended, especially in classes where team members are reluctant to penalise weak contributors and/or reward the highest contributors through their peer assessment ratings.

Values for the NPR range from zero to 100. Calculations that exceed this range are clipped to fit within zero to 100.
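As a concrete sketch of the Spread Factor's effect, the following Python fragment applies a formula consistent with the worked example in this guide: NPR = TR + SF x (IPR - team mean IPR), clipped to zero through 100. The formula is inferred from that example, not taken from the Peer Assess Pro implementation.

```python
# Illustrative sketch (assumption: formula inferred from this guide's
# worked example, using whole-number IPRs for a four-member team).

def normalised_personal_result(ipr_values, team_result, spread_factor=1.0):
    """Spread each member's IPR around the Team Result; the team mean stays at TR."""
    mean_ipr = sum(ipr_values.values()) / len(ipr_values)
    return {member: round(min(100, max(0, team_result + spread_factor * (v - mean_ipr))))
            for member, v in ipr_values.items()}

iprs = {"Bridget": 33, "Julian": 45, "Lydia": 50, "Nigella": 48}   # team mean IPR = 44
normalised_personal_result(iprs, 50, spread_factor=1.0)
# Bridget 39, Julian 51, Lydia 56, Nigella 54 -- team mean remains 50
normalised_personal_result(iprs, 50, spread_factor=2.0)
# Bridget 28, Julian 52, Lydia 62, Nigella 58 -- wider spread, mean still 50
```

Doubling the Spread Factor doubles each member's distance from the Team Result while the team average stays pinned at the Team Result.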

Rank-Based Personal Result (RPR)

The Rank-Based Personal Result is calculated from the Team Result combined with the student's specific Rank Within Team, based on that student's Peer Assessed Score. Like the Normalised Personal Result, the RPR method awards the AVERAGE student in the team the Team Result. All remaining students are awarded a personal result above or below the Team Result depending on whether their Rank Within Team is above or below that team's middle-ranked student.

Features of the Rank-Based Personal Result (RPR) calculation method are that (a) a team's RPR values are spread over a MUCH WIDER range than under the NPR and IPR methods, so small differences in PA Scores within a team are amplified significantly; (b) in contrast to the IPR method, the RPR method calculates a Personal Result significantly ABOVE the Team Result for the top-ranked student in the team; (c) like the NPR method, the average of the team's RPR values matches the Team Result. Values for the Rank-Based Personal Result range from zero to 100. Calculations that exceed this range are clipped to fit within zero to 100.
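A compact formula consistent with the four-member worked example in this guide (an inferred sketch, not the official definition) assigns each team member a reversed rank k_s (1 for the lowest PA Score in the team, t for the highest) and scales the Team Result:

```latex
RPR_s = \min\!\left(100,\; \max\!\left(0,\; \frac{2\,TR\,k_s}{t + 1}\right)\right)
```

For the example team with TR = 50 and t = 4, this gives 20, 40, 60 and 80 for reversed ranks 1 to 4; the team mean equals TR because the reversed ranks average (t + 1)/2.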

Note that in the Xorro version of Peer Assess Pro, we have renamed the following Personal Result Methods from those used in the Google Docs version of Peer Assess Pro.

Renaming of terms for Peer Assess Pro

Peer Assess Pro        Abbreviation   Google Peer Assess Pro       Abbreviation
Peer Assessed Score    PA Score       Team Based Learning Score    TBL Score
Peer Assessed Index    PA Index       Team Based Learning Index    TBL Index

Quick links and related information

FAQ: How are peer assessment and personal results calculated and defined mathematically?


FAQ: How are peer assessment and personal results calculated and defined mathematically?

A teacher has several alternative calculation methods to determine a personal result from a team member’s Peer Assess Pro assessment. The teacher will usually advise team members about the method they have chosen.

The teacher's choice of calculation method for a personal result is determined by their preference for compensating more strongly students who have contributed significantly to their teams, and under-rewarding students who are peer assessed as weak contributors.

These choices are illustrated in this figure.

A student’s Personal Result emerges from the Teacher’s choice of Calculation Method, relative Peer Assessed Score, and Team Result

Calculation methods that exclude a team result

The teacher can select either the Peer Assessed Score (PA Score) or Peer Assessed Index (PA Index) if they wish to exclude the team result in calculating the personal result.

Quick links and related information

FAQ: How is the Peer Assessed (PA) Score calculated?

FAQ: How is the Peer Assessed Index (PA Index) calculated?

Calculation methods that incorporate a team result from team outputs

More usually, the Peer Assessed Score (PA Score) and team result are combined through one of three methods. The following methods are listed in order of increasing impact for compensating more strongly students who have contributed significantly to their teams, and under-rewarding students who are peer assessed as weak contributors:

FAQ: How is the Indexed Personal Result (IPR) calculated?

FAQ: How is the Normalised Personal Result (NPR) calculated?

FAQ: How is the Rank Based Personal Result (RPR) calculated?

Quick links and related information

FAQ: What factors are measured in the peer assessment survey?

FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?

FAQ: How do students know where and when to complete the peer assessment activity then review their results?

Automated communications to students

The Teachers Process Flowchart: Detail illustrates the points throughout the peer assessment process where emails are sent to students to advise them of the activity's progress and any actions required of them.

In most cases, the emails are generated automatically by the Peer Assess Pro system. In the case of warnings, the teacher has the option of initiating an email request to a student, or ignoring that warning.

Copies of all emails are sent to the teacher whose Xorro account was used to launch the activity.

Standard operating mode

When you create and launch a Peer Assess Pro™ Peer Assessment activity in Xorro AND the Start Date has been reached:

Alternative mode for student access to assessment and results

Alternatively, the teacher can direct students to the Participant URL shown at the top left of the Xorro HOME page. The student must then select from a list the correct peer assessment activity for their response. The teacher may deliver other Xorro-based test activities from which the student must select the correct Peer Assess Pro™ activity distinguished by the Activity Title specified by the teacher.

FAQ: How do I view and experience what the students experience?

FAQ: What questions are asked in the peer assessment survey?


FAQ: How do I view and experience what the students experience?

View your student’s personal results directly from your Teacher’s Dashboard

xxx UNDER DEVELOPMENT xxx

View your students’ experience of the Peer Assess Pro™ survey

Enter your Participants’ URL into your browser


Select the activity you wish to experience

Log in using the Identification (id) of a student in the Team List Group used to create the activity


View a survey ready and waiting for responses

The student will see this view when all of the following are TRUE:

Note that students can continue to submit responses AFTER the Due Date UNTIL the teacher has Finalised the activity.

View a sample question

View a student’s published results

The student will be able to see their Personal Results when all the following are true:

A student with a Xorro Plus account may view their results any time after the Activity is Finalised by the Teacher.

The student views

Example results for a student

View the peer assessment survey for a demonstration class

Xxx TO DO xxx

Quick links and related information

FAQ: What questions are asked in the peer assessment survey?

FAQ: How is the Peer Assessed (PA) Score calculated?

FAQ: How do students know where and when to complete the peer assessment activity then review their results?


FAQ: Why are different terms used to display peer assessment results in the Xorro and previous Google versions of Peer Assess Pro™?

The following terms used in the Google version of Peer Assess Pro have been renamed in the Xorro version.

Renaming of terms for Peer Assess Pro

Peer Assess Pro        Abbreviation   Google Peer Assess Pro       Abbreviation
Peer Assessed Score    PA Score       Team Based Learning Score    TBL Score
Peer Assessed Index    PA Index       Team Based Learning Index    TBL Index

Quick links and related information

FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?


FAQ: How do I take action on the Warnings presented in the Peer Assess Pro™ Teacher’s Dashboard? What if I ignore the Warnings?

In general, see the sections Action responses to warnings and Responding to Active Warnings.

Warning messages are under constant development and refinement as we respond to facilitators’ and team members’ experience of Peer Assess Pro.

Critical and catastrophic warnings!

These warnings must be resolved, otherwise utterly invalid results will arise, and students’ time will be wasted completing incorrect surveys.

Example: The composition of a team needs adjusting. See Adjusting team composition.

Important warnings

Peer Assess Pro will not be able to present results for all teams unless these warnings are resolved.

Example: Insufficient responses from a team are received

See Results hidden from team members and teacher

Example: Team Results need to be entered. See Enter Team Results.

Informational warnings

Advisory warnings do not critically affect the operation of Peer Assess Pro. However, the teacher would be prudent to review the details to ensure that peer assessments have been conducted fairly and honestly.

Example: Overgenerous or parsimonious ratings by a team member.

FAQ: How is an outlier peer assessment rating identified?

Optional emails generated for team members

Several warnings give the facilitator the option to despatch an email to students advising them of exceptional conditions and requesting their action.

The criteria used to generate these warnings, and the recommended response by the facilitator, are detailed in this section:

Responding to Active Warnings

For example, in the case of a Mismatched self-assessment, the team member is invited to meet with the teacher to explore the reasons for the mismatch, and develop approaches to narrow the gap.

Quick links and related information

FAQ: What is the content of emails sent by Peer Assess Pro?

Responding to Active Warnings


FAQ: When, why, and how do I Refresh and Update Results?

When to recalculate

You select Refresh and Update Results when new responses have arrived, or when you have adjusted Team Results or changed the Personal Result Calculation Method.

Why recalculate?

The most important reason is that you as a teacher MUST be able to review results BEFORE displaying (publishing) results to students. After examining the results to date, you might publish an interim snapshot of the results for view by students.

Students may review the interim results and raise issues such as a questionable peer assessment rating, such as scapegoating. Alternatively, you may need to adjust a Team Result, or experiment with another method of Personal Result Calculation.

In this situation, we have presumed you do not want new responses, nor adjustments, to be immediately viewable by students. In particular, you need the opportunity to review the effect of adjustments before explicitly publishing revised results to students.

How to recalculate

Quick links and related information


FAQ: What questions are asked in the peer assessment survey?

The Peer Assess Pro survey measures one overall assessment, Recommendation, followed by ten quantitative ratings, then several qualitative questions.

The ten quantitative ratings are used to calculate the Peer Assessed Score (PA Score). The ten ratings are categorised into two classes: Contribution to Task, and Contribution to Leadership and Teamwork, as shown in the example survey below.

In addition, two qualitative questions are asked that request examples of behaviours supporting the quantitative ratings in relation to Contribution to Task, and Contribution to Leadership and Teamwork. Finally, the assessor is asked to provide Development Feedback. That is, advice that would help the team member improve their future contribution to the team.

Quick links and related information

FAQ: How is the Peer Assessed (PA) Score calculated?

FAQ: How do students know where and when to complete the peer assessment activity then review their results?

FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?

The ten questions used as the basis for calculating the Peer Assessed Score are adapted from:

Deacon Carr, S., Herman, E. D., Keldsen, S. Z., Miller, J. G., & Wakefield, P. A. (2005). Peer feedback. In The Team Learning Assistant Workbook. New York: McGraw Hill Irwin.


Example Peer Assessment Survey: Quantitative

My name is:

I am rating my team member:

My Team name is:

Team Member A

Team Member B

Team Member C

Self

Recommendation

How likely is it that you would recommend this team member to a friend, colleague or employer?

1 = Highly unlikely, 5 = Extremely likely

Contribution to Task Accomplishment

Rate the team member on a 5-point scale.

Rating scale:

1 = Almost never, 2 = Seldom, 3 = Average, 4 = Better than most, 5 = Outstanding

Rate your typical or average team member a mid-level rating of 3.

Initiative

Shows initiative by doing research and analysis.

Takes on relevant tasks with little prompting or suggestion.

Attendance

Prepares for, and attends scheduled team and class meetings.

Contribution

Makes positive contributions to meetings.

Helps the team achieve its objectives.

Professionalism

Reliably fulfils assigned tasks.

Work is of professional quality.

Ideas and learning

Contributes ideas to the team's analysis.

Helps my learning of course and team project concepts.

Contribution to Leadership and Team Processes

Focus and task allocation

Keeps team focused on priorities.

Facilitates goal setting, problem solving, and task allocation to team members.

Encourages contribution

Supports, coaches, or encourages all team members to contribute productively.

Listens and welcomes

Listens carefully and welcomes the contributions of others.

Conflict management and harmony

Manages conflict effectively.

Helps the team work in a harmonious manner.

Chairmanship

Demonstrates effective leadership for the team.

Chairs meetings productively.


Example Peer Assessment Survey: Qualitative

Peer Assessment Survey:

Feedback to the team member

Submit one copy of this form for each team member

My name is:

I am a member of Team Number and Name:

I am assessing (student’s name):

Contribution to Task Accomplishment

For the team member you have assessed, provide specific examples of productive or ineffective behaviours related to your ratings of Contribution to Task Accomplishment. For example, shows initiative; attends meetings; makes positive contributions; helps team achieve objectives; is reliable; contributes quality work; contributes to learning of course concepts. Further examples here http://tinyurl.com/BARSOhland

Contribution to Leadership and Team Processes

For the team member you have assessed, provide specific examples of productive or ineffective behaviours related to your ratings of Contribution to Leadership and Team Processes. For example: keeps team focused on priorities; supports, coaches and encourages team members; listens carefully; manages conflict effectively; demonstrates effective leadership.

Development feedback

What specific behaviours or attitudes would help your team member contribute more effectively towards your team's accomplishments, leadership, and processes? Please provide specific positive or constructive feedback that could enable the team member to improve their behaviour productively. Considering your team member's strengths, how could that person coach other team members to acquire similar strengths for Task Accomplishment, Team Processes, and Leadership?

Source: Peer Assess Pro (2019).


FAQ: How is the Peer Assessed (PA) Score calculated?

The Peer Assessed Score, PA Score, is a relative measure of the degree to which a team member has contributed to their team's overall achievement, team processes, and leadership.

A Peer Assessed Score is generally used to compare the relative contribution of students WITHIN the same team, rather than BETWEEN teams. The Team Result has NO impact on the value of the Peer Assessed Score.

The PA Score is calculated for each team member directly from summing the ten ratings of Team and Leadership Contribution surveyed in the peer assessment. The sum of ratings is adjusted by scale factors to give values for the PA Score that range from zero through 100.

The Peer Assessed Score is an essential factor used as the basis for calculating several alternative measures of Personal Result including the Peer Assessed Index (PA Index), Indexed Personal Result (IPR), Normalised Personal Result (NPR), and Rank Based Personal Result (RPR).

The self-assessment is excluded from calculating PA Score

The self-assessment conducted by a team member is EXCLUDED from the calculation of their Peer Assessed Score. The self-assessment, PA (self), is used to enable the student to compare their self-perception with that of their team members, and the class as a whole. One method of comparison, the IRSA, is based on the ratio PA Score / PA (self), as detailed in the FAQ:

FAQ: How is the Index of Realistic Self Assessment (IRSA) calculated?


Mathematical definition of Peer Assessed Score, PA Score

There are ten Peer Rating components awarded by each Assessor, a, to each Assessee, s, in a team of t members. The mathematical task is to combine all these ratings into one Peer Assessed Score for each team member.

The Peer Assessed SubScore, PA SubScore (a, s), is defined as the peer assessment score awarded by Assessor a to Assessee s:

PA SubScore (a, s) = 2.5 x [sum over the ten components r of Rating (a, s, r)] - 25

Where

Rating (a, s, r) = the Peer Rating for each of the ten peer assessment components, r, submitted by the Assessor a for the assessed team member, the Assessee, s. The student’s self-assessment is excluded from the calculation of the PA Score. The Recommendation rating is also excluded from the calculation of the PA Score.

To ensure the PA Score ranges from zero through 100, the formula applies the scale factor 2.5 (= 100/40) and the offset -25: ten minimum ratings of 1 sum to 10, giving 2.5 x 10 - 25 = 0, while ten maximum ratings of 5 sum to 50, giving 2.5 x 50 - 25 = 100.

The Peer Assessed Score, PA Score (s), for team member s is the mean of the PA SubScores awarded by the other (t - 1) team members to the team member s:

PA Score (s) = [sum over assessors a, a ≠ s, of PA SubScore (a, s)] / (t - 1)

Where

t = the number of team members in the team in which s is a team member.

PA SubScore (a, s) = the peer assessment score awarded by Assessor a to Assessee s, mathematically defined earlier.

Note that Peer Assessed Score takes NO account of the team’s Team Result. The Team Result is accounted for in the Indexed Personal Result (IPR), Normalised Personal Result (NPR) and Rank-Based Personal Result (RPR) methods discussed elsewhere.

An example calculation is shown below. In the first table, the team member Bridget (ASSESSEE) is rated by her three team members (ASSESSORS), plus her own self-rating. The subsequent tables show the calculation of the Peer Assessed Score for all four team members based on all team members’ assessment ratings, with the long-form arithmetic shown in detail.

Quick links and related information

FAQ: What questions are asked in the peer assessment survey?

FAQ: How do students know where and when to complete the peer assessment activity then review their results?

Alternative but equivalent methods for calculating the Peer Assessed Score are detailed below in the section:

Alternative mathematical formulations of PA Score

 

Example table of assessments for assessed team member Bridget

ASSESSEE: Bridget. Team Name: Kubla. Ratings by each ASSESSOR:

Rating scale: 1 = Almost never, 2 = Seldom, 3 = Average, 4 = Better than most, 5 = Outstanding

                                       Bridget (Self)  Julian  Lydia  Nigella   Mean Rating
Contribution to Task Accomplishment
  Initiative                                        5       5      3        1           9/3
  Attendance                                        4       4      4        1           9/3
  Contribution                                      4       5      5        1          11/3
  Professionalism                                   4       3      4        1           8/3
  Ideas and learning                                5       5      5        1          11/3
Contribution to Leadership and Team Processes
  Focus and task allocation                         5       5      3        1           9/3
  Encourages contribution                           4       4      4        1           9/3
  Listens and welcomes                              5       5      3        1           9/3
  Conflict management and harmony                   4       4      4        1           9/3
  Chairmanship                                      5       5      5        1          11/3
SubTotal (Task + Leadership)                       45      45     40       10  # 95/30 (3.167)
Peer Assessed Score
  PA Score = (2.5 x SubTotal) - 25             * 87.5    87.5     75        0          54.2

* The self-assessment ratings are excluded from calculation of the PA Score. So, 54.2 = (87.5 + 75 + 0) / 3

# Alternatively, PA Score = (25 x Mean Rating) - 25. So, 54.2  = 25 x 95/30 - 25 = (25 x 3.167) - 25
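The long-form arithmetic above can be sketched in Python. The component ratings for Lydia's assessment of Bridget are read from the table above; the function names are illustrative, not part of Peer Assess Pro.

```python
# Illustrative sketch of the PA SubScore and PA Score formulas above.

def pa_subscore(ratings):
    """PA SubScore from one assessor: (2.5 x sum of the ten ratings) - 25."""
    assert len(ratings) == 10 and all(1 <= r <= 5 for r in ratings)
    return 2.5 * sum(ratings) - 25

def pa_score(subscores, assessee):
    """Mean of the PA SubScores from the (t - 1) other team members."""
    others = [v for assessor, v in subscores.items() if assessor != assessee]
    return sum(others) / len(others)

# Lydia's ten ratings of Bridget, from the table above (they total 40):
lydia_rates_bridget = [3, 4, 5, 4, 5, 3, 4, 3, 4, 5]
pa_subscore(lydia_rates_bridget)          # 75.0

# Bridget's PA Score; her own self-assessment (87.5) is excluded:
pa_score({"Bridget": 87.5, "Julian": 87.5, "Lydia": 75.0, "Nigella": 0.0},
         "Bridget")                        # 54.16..., reported as 54.2
```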


Example calculations of Peer Assessed Score

Suppose that the Peer Assessed SubScores determined from all four team members rating each other appear as follows. The PA SubScores awarded to Bridget are copied from the previous table, forming the Bridget column here.

Since PA SubScore (a, s) = (2.5 x SubTotal) - 25, Bridget's self-assessment SubTotal of 45 gives PA SubScore (Bridget, Bridget) = (2.5 x 45) - 25 = 87.5.

Now consider the assessment by Lydia of Bridget. Lydia's ten ratings of Bridget total 40, so PA SubScore (Lydia, Bridget) = (2.5 x 40) - 25 = 75.

In the previous table, note how Nigella rated Bridget with the minimum possible rating of one for all ten components. By definition, that gives a PA Score of zero. Similarly, if an assessor had rated a team member the maximum rating of five across all ten components, then a PA Score of 100 would have resulted.

Peer Assessed Sub-Scores for a team of four members

ASSESSOR \ ASSESSEE    Bridget   Julian   Lydia   Nigella
Bridget                   87.5     62.5      75      72.5
Julian                    87.5     92.5    87.5      82.5
Lydia                       75     82.5    77.5        80
Nigella                      0     77.5    82.5      82.5

The diagonal entries are each team member's self-assessment, which is excluded from the PA Score calculation.


Now the PA Score for each ASSESSEE team member is calculated from the mean of the PA SubScores provided by the other ASSESSORS in their team, as shown in the following table. The self-assessments of each ASSESSOR are excluded from the calculation. For example, the PA Score for Nigella is determined from the ratings by her three teammates Bridget, Julian and Lydia.

Since PA Score (s) is the mean of the (t - 1) PA SubScores from the other team members,

then for Nigella: PA Score (Nigella) = (72.5 + 82.5 + 80) / 3 = 78.3

Calculation of Peer Assessed (PA) Scores for a team of four members

ASSESSOR \ ASSESSEE    Bridget   Julian   Lydia   Nigella
Bridget                      -     62.5      75      72.5
Julian                    87.5        -    87.5      82.5
Lydia                       75     82.5       -        80
Nigella                      0     77.5    82.5         -
Peer Assessed Score       54.2     74.2    81.7      78.3


Note how Nigella’s rating of Bridget (PA SubScore = 0) seems an outlier when compared with the much higher ratings given by Julian and Lydia (87.5 and 75). Peer Assess Pro warns the teacher when outlier ratings like this occur.

This outlier issue is discussed in

FAQ: How is an outlier peer assessment rating identified?                                                        

Alternative mathematical formulations of PA Score

The following equations provide the identical mathematical result for the calculation of PA Score.

Calculation from Average Rating

PA Score (s) = 25 x (Average Rating (s) - 1)

Where:

Average Rating (s) is the average rating of an assessed student s, averaged over all ten components of the rating for that student, by their team members. The Average Rating lies between 1 and 5.

The factor (-1) adjusts the Average Rating value to the range zero through 4. The scale factor 100/4 (= 25) adjusts the PA Score to lie between zero and 100.

Notice from the first table showing ratings of Bridget that the average rating across all ten components, given by her three team members, was 95/30 = 3.167.

Therefore, the PA Score is calculated directly from the average rating: PA Score = 25 x (3.167 - 1) = 54.2.

Calculation from Average Team and Leadership Contributions

Finally,

PA Score (s) = 12.5 x [(ATC - 1) + (ALC - 1)]

Where:

ATC and ALC are the average ratings for the five components that comprise the Task and Leadership contributions, respectively, averaged over the (t - 1) assessors. Mathematically, ATC is the mean of the five Contribution to Task Accomplishment ratings, and ALC is the mean of the five Contribution to Leadership and Team Processes ratings.

ATC and ALC range over the values 1 through 5. The factor (-1) adjusts those values to the range zero through 4. The scale factor 50/4 (= 12.5) ensures that the PA Score achieves a range from zero to 100.

Quick links and related information

FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?

FAQ: How are peer assessment and personal results calculated and defined mathematically?


FAQ: How is the self-assessment used to calculate Peer Assessed Score?

The self-assessment conducted by a team member when they rate their team members is EXCLUDED from the calculation of that team member’s Peer Assessed Score. Instead, their self-assessment, PA (self), is used to enable the team member to compare their self-perception with that of their team members, and the class as a whole. This comparison is provided to the team member through a Spider Chart and the calculation of their Index of Realistic Self Assessment (IRSA).

Spider chart of individual and averaged team peer ratings

The Spider Chart shows each of the eleven ratings provided by the team member themself, compared with the average of the ratings provided to them by their peer team members. The class average ratings for each of the eleven factors are also provided. In this example, the team member has significantly UNDERRATED themself on nearly all factors (innermost plots), when compared with the ratings provided by their team members (orange).

Spider Chart comparison of self and other team members’ contribution ratings

Index of Realistic Self-Assessment (IRSA)

Another method of comparison, the IRSA, is based on the ratio of the Peer Assessed Score to the self-assessed score, IRSA = 100 x PA Score / PA (self), as detailed in the FAQ:

FAQ: How is the Index of Realistic Self Assessment (IRSA) calculated?

For the team member illustrated in the foregoing Spider Chart, their Peer Assessed Score, PA Score, is 92 and their self-assessed score, PA (self), is 75. The ratio yields an Index of Realistic Self Assessment (IRSA) of 122 = 100 x 92 / 75.

An IRSA between 75 and 95 is typical of about two-thirds of team members in a class. About one-sixth of team members achieve an IRSA below 75. Such people appear excessively OVERCONFIDENT in their abilities, rating themselves well above the assessment perceived by their team members. In contrast, an IRSA above 95 suggests the team member has a tendency to UNDERESTIMATE their team contribution when contrasted with the assessment perceived by their team members.
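A minimal Python sketch of the IRSA ratio, using the example values above (the function name is illustrative):

```python
def irsa(pa_score, pa_self):
    """Index of Realistic Self Assessment: 100 x PA Score / PA (self)."""
    return 100 * pa_score / pa_self

irsa(92, 75)   # 122.66..., reported as 122 in the example above
```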

Quick links and related information

FAQ: How are peer assessment and personal results calculated and defined mathematically?


FAQ: How is the Peer Assessed Index (PA Index) calculated?

The Peer Assessed Index is defined such that the team member with the maximum PA Score for each team is assigned a PA Index of 100. All other team members in the same team are scaled in relation to the maximum PA Score for that group.

In a gradebook of results, the PA Index is useful for identifying the team members most highly rated by their peers, as they have PA Indexes in the 90 to 100 range. In combination with the Team Result, the PA Index is used to calculate the Indexed Personal Result (IPR), Normalised Personal Result (NPR), and Rank-Based Personal Result (RPR).

Mathematical definition of Peer Assessed Index

For each team member s in team t:

PA Index (s) = 100 x PA Score (s) / max PA Score (t)

Where

PA Score (s) = the Peer Assessed Score for a team member s in team t, as defined in: FAQ: How is the Peer Assessed (PA) Score calculated?

max PA Score (t) = the maximum value of PA Score found across all members in team t.


Example calculations of Peer Assessed Index

Consider a team of four team members, whose PA Scores are shown in the following table. Lydia has a PA Score of 82, the highest for the team. Therefore, Lydia’s PA Index is 100, by definition.

Calculation of Peer Assessed Index (PA Index) for a team of four members

ASSESSOR \ ASSESSEE    Bridget   Julian   Lydia   Nigella
Bridget                      -     62.5      75      72.5
Julian                    87.5        -    87.5      82.5
Lydia                       75     82.5       -        80
Nigella                      0     77.5    82.5         -
Peer Assessed Score         54       74      82        78
Peer Assessed Index         66       90     100        95

Bridget has a PA Score of 54, the lowest for the team. Therefore, PA Index (Bridget) = 100 x 54 / 82 = 66, rounded to the nearest whole number.

Note that, as expected, Lydia, who holds the team's maximum PA Score, receives a PA Index of exactly 100.

The data for the previous table is drawn from

FAQ: How is the Peer Assessed (PA) Score calculated?
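The indexing step can be sketched in a few lines of Python. The rounding to whole numbers matches the table above; the function name is illustrative.

```python
def pa_index(pa_scores):
    """Scale each PA Score against the team maximum: the top scorer gets 100."""
    top = max(pa_scores.values())
    return {member: round(100 * score / top) for member, score in pa_scores.items()}

pa_index({"Bridget": 54, "Julian": 74, "Lydia": 82, "Nigella": 78})
# {'Bridget': 66, 'Julian': 90, 'Lydia': 100, 'Nigella': 95}
```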

Quick links and related information

FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?

FAQ: How is the Peer Assessed (PA) Score calculated?


FAQ: How is the Indexed Personal Result (IPR) calculated?

The Indexed Personal Result (IPR) is calculated from the Team Result (TR) combined with the team member’s specific Peer Assessed Index (PA Index). The Indexed Personal Result method awards the Team Result to the TOP RATED team member in the team, since, by definition, their Peer Assessed Index is 100. All remaining team members in the same team earn the Team Result downwards, directly proportional to their PA Index.

The definition of Indexed Personal Result means that NO team member can earn an Indexed Personal Result greater than the Team Result. That is, values for the Indexed Personal Result range from zero up to the Team Result. Consequently, the IPR disadvantages team members who have been rated unfavourably by their peers, yet offers no reward above the Team Result for the team member(s) rated as contributing most. In contrast, the Normalised Personal Result and Rank-Based Personal Result do award a Personal Result above the Team Result for those team members who contribute above average to the team’s outputs, as assessed by their peers.

Mathematical definition of Indexed Personal Result

For each team member s, in their team, t:

IPR(s, t) = TR(t) × PA_Index(s, t) / 100

Where

TR(t) = the team result awarded by the teacher for the outputs of team t

PA_Index(s, t) = the Peer Assessed Index for the team member s, as defined in

FAQ: How is the Peer Assessed Index (PA Index) calculated?

Example calculations of Indexed Personal Result

Suppose that the following team has a Team Result, TR, of 50 and Peer Assessed Indexes previously calculated as follows. The example data is taken from:

FAQ: How is the Peer Assessed Index (PA Index) calculated?

Calculation of Indexed Personal Result in a team of four members

ASSESSEE                          Bridget   Julian   Lydia   Nigella
Peer Assessed Score, PA Score       54        74       82      78
Peer Assessed Index, PA Index       66        90      100      95
Team Result, TR                     50        50       50      50
Indexed Personal Result, IPR        33        45       50      47.5

Bridget has a PA Index of 66, the lowest for the team. Therefore, Bridget earns the lowest IPR in the team:

IPR(Bridget) = 50 × 66 / 100 = 33

In contrast, Lydia has the highest PA Score in the team, and hence a PA Index of 100. Therefore:

IPR(Lydia) = 50 × 100 / 100 = 50

The IPR for Lydia is equivalent to the Team Result, 50, as defined.
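A minimal sketch of this calculation in Python follows. The function name indexed_personal_result and its interface are our assumptions for illustration, not the Peer Assess Pro implementation.

```python
def indexed_personal_result(team_result, pa_indexes):
    """Award each member the Team Result scaled by their PA Index / 100.

    Illustrative sketch; not the Peer Assess Pro implementation.
    """
    return {name: team_result * idx / 100 for name, idx in pa_indexes.items()}


# Worked example: Team Result 50, PA Indexes from the table above.
ipr = indexed_personal_result(
    50, {"Bridget": 66, "Julian": 90, "Lydia": 100, "Nigella": 95})
print(ipr)  # {'Bridget': 33.0, 'Julian': 45.0, 'Lydia': 50.0, 'Nigella': 47.5}
```

Note that, as defined, the top-rated member (PA Index 100) receives exactly the Team Result.
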

Quick links and related information

FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?

FAQ: How is the Peer Assessed Index (PA Index) calculated?

FAQ: How is the Peer Assessed (PA) Score calculated?


FAQ: How is the Normalised Personal Result (NPR) calculated?

The Normalised Personal Result, NPR, is calculated from the Team Result combined with the team member’s specific Indexed Personal Result (IPR). The Normalised Personal Result method awards the average student in the team the Team Result (TR). All remaining students are awarded a Personal Result above or below the Team Result depending on whether their IPR is above or below that team's average IPR.

Features of the Normalised Personal Result method are that

Use the Normalised Personal Result method with a high Spread Factor if you

Mathematical definition of Normalised Personal Result

For each team member s, in their team, t:

NPR(s, t) = TR(t) + SpreadFactor × (IPR(s, t) − IPR_mean(t))

Where

TR(t) = the team result awarded by the teacher for the outputs of team t

IPR_mean(t) = (1/n) × Σ IPR(s, t). That is, the mean value of the IPR values found for team t, containing n team members.

SpreadFactor = a factor chosen optionally by the teacher that will stretch each team’s intrinsic spread of NPRs, as measured by the team’s standard deviation of NPR results. The default Spread Factor is 1.0. However, a Spread Factor of between 1.5 and 2.0 is recommended.

Values of NPR are trimmed to be within the range zero to 100.


Example calculations of Normalised Personal Result

Suppose that the following team has a Team Result, TR, of 50 and Indexed Personal Results previously calculated as follows. This first example illustrates a Spread Factor of 2.0. The example data is taken from:

FAQ: How is the Indexed Personal Result (IPR) calculated?

Calculation of Normalised Personal Result in a team of four members

Spread Factor = 2.0

                                  Bridget   Julian   Lydia   Nigella   Mean
Peer Assessed Score, PA Score       54        74       82      78
Peer Assessed Index, PA Index       66        90      100      95
Team Result, TR                     50        50       50      50
Indexed Personal Result, IPR        33        45       50      48        44
Correction Factor                  -22        +2      +12      +8         0
Normalised Personal Result, NPR     28        52       62      58        50

Bridget has a PA Index of 66, the lowest for the team.

The mean IPR for the four-member team is 44, calculated from ¼ × (33 + 45 + 50 + 48).

Since Bridget’s IPR is 33, her Normalised Personal Result with a Spread Factor of 2.0 is:

NPR(Bridget) = 50 + 2.0 × (33 − 44) = 28

In contrast, the Normalised Personal Result for Lydia, with her IPR of 50, is calculated as follows:

NPR(Lydia) = 50 + 2.0 × (50 − 44) = 62

Note how Lydia’s NPR of 62 is above the Team Result of 50. Note also how the mean of the NPR values across the team is 50 = (28 + 52 + 62 + 58)/4, identical to the Team Result of 50.
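The calculation can be sketched in Python as follows. This is an illustrative sketch under our own naming assumptions (normalised_personal_result, a dict of IPRs), not the Peer Assess Pro implementation.

```python
def normalised_personal_result(team_result, iprs, spread_factor=1.0):
    """Centre results on the Team Result, stretching each member's deviation
    from the team's mean IPR by the Spread Factor. Trimmed to 0..100.

    Illustrative sketch; not the Peer Assess Pro implementation.
    """
    mean_ipr = sum(iprs.values()) / len(iprs)
    return {name: min(100.0, max(0.0, team_result + spread_factor * (v - mean_ipr)))
            for name, v in iprs.items()}


# Worked example: Team Result 50, Spread Factor 2.0, mean IPR = 44.
npr = normalised_personal_result(
    50, {"Bridget": 33, "Julian": 45, "Lydia": 50, "Nigella": 48},
    spread_factor=2.0)
print(npr)  # {'Bridget': 28.0, 'Julian': 52.0, 'Lydia': 62.0, 'Nigella': 58.0}
```

Because each member's deviation from the mean IPR sums to zero, the mean NPR always equals the Team Result, whatever Spread Factor is chosen (before trimming).
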


Impact of adjusting the Spread Factor on Normalised Personal result

The previous example showed calculations of NPR using a Spread Factor of 2.0. The following table shows the results of calculating the Normalised Personal Result for the team using a more modest Spread Factor of 1.0.

Note the following:

The default Spread Factor is 1.0. However, a Spread Factor of between 1.5 and 2.0 is recommended.

Calculation of Normalised Personal Result in a team of four members

Spread Factor = 1.0

                                  Bridget   Julian   Lydia   Nigella   Mean
Peer Assessed Score, PA Score       54        74       82      78
Peer Assessed Index, PA Index       66        90      100      95
Team Result, TR                     50        50       50      50        50
Indexed Personal Result, IPR        33        45       50      48        44
Correction Factor                  -11        +1       +6      +4         0
Normalised Personal Result, NPR     39        51       56      54        50

Quick links and related information

FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?

FAQ: How is the Peer Assessed Index (PA Index) calculated?

FAQ: How is the Peer Assessed (PA) Score calculated?

FAQ: How are peer assessment and personal results calculated and defined mathematically?


FAQ: How is the Rank Based Personal Result (RPR) calculated?

The Rank-Based Personal Result is calculated from the Team Result combined with the student's specific Rank Within Team based on that student's Peer Assessed Score. Like the Normalised Personal Result, the RPR method awards the AVERAGE student in the team the Team Result. All remaining students are awarded a personal result above or below the Team Result depending on whether their Rank Within Team is above or below that team's middle-ranked student.

Features of the Rank Based Personal Result (RPR) calculation method are that


Mathematical definition of Rank-Based Personal Result

For student s in their team t with n team members:

RPR(s, t) = TR(t) × n × ShareFraction(s, t)

where ShareFraction(s, t) = Rank(s, t) / Σ Rank(i, t), the sum being taken over all members i of team t

Where

TR(t) = the team result awarded by the teacher for the outputs of team t

Rank(s, t) = the reversed rank of the team member s in team t, where the team member with the lowest Peer Assessed Score in that team is defined as 1. Equal ranks are permitted.

n = number of members in team t

Values of RPR are trimmed to be within the range zero to 100.


Example calculations of Rank-Based Personal Result

Suppose that the following team has a Team Result, TR, of 50 and Peer Assessed Scores previously calculated as follows. The example data is taken from:

FAQ: How is the Peer Assessed (PA) Score calculated?

Calculation of Rank-Based Personal Result in a team of four members

                                  Bridget   Julian   Lydia   Nigella   Mean
Peer Assessed Score, PA Score       54        74       82      78
Rank (Reversed)                      1         2        4       3
Share Fraction                      1/10      2/10     4/10    3/10
Team Result, TR                     50        50       50      50        50
Rank-Based Personal Result, RPR     20        40       80      60        50

First calculate the sum of ranks for the team of four members, n = 4. This number is the denominator for calculating the ShareFraction for each team member.

Σ Rank = 1 + 2 + 3 + 4 = 10

Consequently, there are 10 ‘pieces of cake’ to be shared amongst the 4 team members, in proportion to their reversed rank.

Bridget has a PA Score of 54, the lowest for the team. Her reversed rank in the team is therefore 1, so her RPR = 50 × 4 × 1/10 = 20.

Note how the second-ranked student, Julian, receives double the ShareFraction, and, consequently, double the RPR of Bridget: RPR = 50 × 4 × 2/10 = 40.

Lydia, the top-ranked student, with reversed rank 4, receives four times the RPR that Bridget received: RPR = 50 × 4 × 4/10 = 80.

Note how the mean of the RPR values matches the Team Result for the team of 50:

mean RPR = (20 + 40 + 80 + 60) / 4 = 200 / 4 = 50

Note that, by definition, the sum of the ShareFractions across the team is exactly 100%: 1/10 + 2/10 + 4/10 + 3/10 = 1.

Example calculation with tied ranks

The following example shows a case where two team members have the same Peer Assessed Score of 74. Note how Lydia has a reverse rank of 4, not 3. The Google Sheets RANK function, for example, with the optional is_ascending flag set to 1, demonstrates this ranking behaviour.

Calculation of Rank-Based Personal Result with tied scores

                                  Bridget   Julian   Lydia   Nigella   Mean
Peer Assessed Score, PA Score       54        74       82      74
Rank (Reversed)                      1        2=        4      2=
Share Fraction                      1/9       2/9      4/9     2/9
Team Result, TR                     50        50       50      50        50
Rank-Based Personal Result, RPR     22        44       89      44        50
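Both the untied and the tied cases can be sketched in Python. This is an illustrative sketch under our own naming assumptions (rank_based_personal_result), not the Peer Assess Pro implementation; the tie rule mirrors the spreadsheet RANK behaviour described above.

```python
def rank_based_personal_result(team_result, pa_scores):
    """Share n x Team Result among members in proportion to reversed rank.

    Reversed rank: the lowest PA Score in the team ranks 1. Tied scores share
    the lower rank, matching spreadsheet RANK(value, data, 1) behaviour.
    Results are trimmed to 0..100. Illustrative sketch only.
    """
    values = list(pa_scores.values())
    ranks = {name: 1 + sum(1 for v in values if v < score)
             for name, score in pa_scores.items()}
    total_ranks = sum(ranks.values())
    n = len(pa_scores)
    return {name: min(100, max(0, round(team_result * n * rank / total_ranks)))
            for name, rank in ranks.items()}


# Untied example: ranks 1, 2, 4, 3 share 10 'pieces of cake'.
print(rank_based_personal_result(
    50, {"Bridget": 54, "Julian": 74, "Lydia": 82, "Nigella": 78}))
# {'Bridget': 20, 'Julian': 40, 'Lydia': 80, 'Nigella': 60}

# Tied example: Julian and Nigella share rank 2, so the divisor becomes 9.
print(rank_based_personal_result(
    50, {"Bridget": 54, "Julian": 74, "Lydia": 82, "Nigella": 74}))
# {'Bridget': 22, 'Julian': 44, 'Lydia': 89, 'Nigella': 44}
```
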

Quick links and related information

FAQ: How do I decide which Personal Result method to apply in my peer assessment activity?

FAQ: How is the Peer Assessed Index (PA Index) calculated?

FAQ: How are peer assessment and personal results calculated and defined mathematically?

FAQ: How is Standard Peer Assessed Score (SPAS) calculated?

How do we compare students within a class, and between classes based on their Peer Assessed Scores?

The short answer is: We can use the Peer Assessed Score to compare students ONLY within their team. A higher PA Score within a team suggests that team member has contributed more than a team member with a lower PA score.

A Peer Assessed Score of 90 clearly indicates that a student has contributed more to their team’s outcomes than a student in the same team with a Peer Assessed Score of 30. However, a Peer Assessed Score achieved by a student in one team does not meaningfully compare with the Peer Assessed Score of a student in another team. A Peer Assessed Score of 60 is neither better nor worse than a PA Score of 90 achieved by a student in another team. We cannot conclude from comparing Peer Assessed Scores which is the better student in terms of team contribution and/or leadership when the students are from different teams. Why? Some students and teams diligently commit to rating each other so the average student in their team does rate ⅗ on each of the ten items in the peer assessment survey, as intended. Meanwhile, other teams believe they are all above average, having come from their local equivalent of Lake Wobegon. By chance and/or good team functioning, some teams achieve that desired state where all members work productively and effectively together: the Holy Grail of the Dream Team. Other teams comprising high performers can conversely fall into the desolation of dismal performance characterised by the Apollo Syndrome (Belbin).

The long answer is that through applying appropriate data analytics, we can develop three related numbers that enable comparison of peer assessed team members both within and between classes, and over time. These measures are Standard Peer Assessed Score (SPAS), Employability, and Net Employability. In essence, the data analytic processes can be likened to a forensic photoanalyst attempting to read an automobile’s number plate. Imagine the original photo image has been photographed through smog, on a dark night, from a far distance, with a low resolution setting, using a poor quality lens and a poor imaging sensor. But through advanced algorithms that remove background noise, amplify relevant signals, and enhance clarity, a readable, useful image can be discerned, as illustrated in the example from Acclaim Software.

Source: Acclaim Software. (2015). Forensics - Recovering the Most Detail from Your Image - Focus Magic. http://www.focusmagic.com/forensics-tutorial.htm

The Standard Peer Assessed Score (SPAS) is our first measure designed to enable a more realistic relative comparison of peer assessment ratings between members of a whole class. The Standard Peer Assessed Score combines normalised values of the Recommendation and Peer Assessed Score for each team member. The normalisation applies several data analytic processes to correct for the biases introduced by some students and teams in their ratings. The SPAS approach is not perfect, but it’s a start. Furthermore, the determination of Standard Peer Assessed Score is a necessary precursor to the calculation of Employability and Net Employability, discussed elsewhere.

Design features of Standard Peer Assessed Score

The whole-of-class values of Standard Peer Assessed Score for a particular class response dataset are targeted to have these features:

Mean: 50

Standard Deviation: 20

Maximum possible range: from 0 to 100

By virtue of the definition of the Standard Peer Assessed Score, the following effects occur by design:

One half of the class values of Standard Peer Assessed Score will fall in the range zero to 50 (below the target average). Naturally, the remaining one half of values will fall in the range 50 to 100 (above average).

Approximately ⅔ of Standard Peer Assessed Scores in the class will lie between 30 and 70, that is, within one standard deviation of the mean value of 50. More accurately, if SPAS were normally distributed, then 68.3 percent of the class dataset values of SPAS would lie within plus or minus one standard deviation of the mean.

Approximately ⅙ of students in the class will receive a Standard Peer Assessed Score value of either greater than 70, or less than 30. More precisely, 15.9 percent of values will lie in each of these ranges.

Finally, given the wonders of the normal distribution, 95% of all class members will lie in the SPAS range 10 through 90. That implies that a student with a SPAS above 90 is in the top 2.5% of members of the class. Conversely, a student with a SPAS below 10 is in the bottom 2.5% of the class. This knowledge allows the teacher to identify more reliably their star students, and students at risk, rather than relying simply on the Peer Assessed Score.

Mathematical calculation

The general approach to creating the Standard Peer Assessed Score is to apply z-score normalisation to a student’s (raw) Recommendation, R, and Peer Assessed Score, PAS. The two z-scores (z_R and z_PAS) are added, then re-scaled to achieve, for the class dataset as a whole, the target mean, μ_T, of 50, and target standard deviation, σ_T, of 20 required for the SPAS statistic. Note that z-score normalisation of any dataset yields normalised data with a mean of zero and standard deviation of 1.0, as detailed later.

The Standard Peer Assessed Score for student s is defined as

SPAS(s) = μ_T + k × σ_T × z_combined(s)

where z_combined(s) = (z_R(s) + z_PAS(s)) / 2

Where

μ_T = target mean for the SPAS statistic, by definition a constant of 50

σ_T = target standard deviation for the SPAS statistic, by definition a constant of 20

k = a correction factor to ensure the standardisation process achieves the target standard deviation, σ_T. The factor is required because in practice the distributions of the raw data are not normally distributed, but tend to have strong negative skew, due to such factors as the Lake Wobegon effect mentioned earlier. A factor of 1.2 has been found appropriate in practice.

z_R(s) = (R(s, t) − μ_R) / σ_R, the z-score normalisation of the Recommendation rating for student s by their team members.

z_PAS(s) = (PAS(s, t) − μ_PAS(t)) / σ_PAS(t), the z-score normalisation of the Peer Assessed Score rating for student s by their team members.

R(s, t) = Recommendation rating awarded to student s by their team members in team t

PAS(s, t) = Peer Assessed Score awarded to student s by their team members in team t

μ_R = estimate of the class mean Recommendation rating derived over all valid assessed teams in the class responses dataset

σ_R = estimate of the class standard deviation of the class Recommendation ratings derived over all valid assessed teams in the class responses dataset

μ_PAS(t) = the population mean of the Peer Assessed Scores derived over all team members in the team t in which student s is a member.

σ_PAS(t) = the population standard deviation of the Peer Assessed Scores derived over all team members in the team t in which student s is a member.

Notes

Divisor of 2. The sum of the two z-score normalised values, each with unit standard deviation, has a standard deviation of up to 2.0 (exactly 2.0 when the two values are perfectly correlated). Consequently, the divisor of 2 is applied in the calculation of SPAS so that the combined z-score has a mean of zero and a standard deviation of approximately 1.

Trimming. Values of Standard Peer Assessed Score that calculate above +100 are trimmed down to +100. Similarly, values of Standard Peer Assessed Score that calculate below 0 are trimmed up to 0.

Example calculations of Standard Peer Assessed Score

The following table shows example calculations of Standard Peer Assessed Score for three students in two teams, A and B. Note how Michael Mass (Team A) and Lydia Loaded (Team B) have both been awarded the same Peer Assessed Score of 50 by their team members. However, because of their different team means and standard deviations, the z-score normalisations realise +1 and +2 respectively.

As part of the journey towards calculating SPAS, the intermediate calculation of the combined z-score provides the basis for calculating the percentage proportion of the entire class who would fall below that combined z-score. This can be interpreted as the percentage of the class who would recommend the specific team member to an employer, a colleague, or another team. This percentage is rounded conservatively to produce the student’s Employability rating, the methodology for which is detailed in the FAQ

FAQ: What is Employability? How is it calculated?


Example calculations for Standard Peer Assessed Score (SPAS) and Employability

Student, s                               Peter Johns         Michael Mass        Lydia Loaded
Recommendation, R                            2.0                 4.5                 3.0
Mean of class Recommendation, μ_R            3.0                 3.0                 3.0
Std dev of class Recommendation, σ_R         0.5                 0.5                 0.5
Normalised Recommendation, z_R         (2-3)/0.5 = -2      (4.5-3)/0.5 = +3    (3-3)/0.5 = 0
Peer Assessed Score, PAS                      30                  50                  50
Team, t                                        A                   A                   B
Mean of team PA Score, μ_PAS(t)               40                  40                  20
Std dev of team PA Score, σ_PAS(t)            10                  10                  15
Normalised PA Score, z_PAS             (30-40)/10 = -1     (50-40)/10 = +1     (50-20)/15 = +2
Combined z-score, z                    (-2-1)/2 = -1.5     (+3+1)/2 = +2       (0+2)/2 = +1
Target standard deviation, σ_T                20                  20                  20
Correction factor, k                         1.2                 1.2                 1.2
Target mean, μ_T                              50                  50                  50
Standard Peer Assessed Score, SPAS   50 + 1.2x20x(-1.5)    50 + 1.2x20x2       50 + 1.2x20x1
                                       = 50 - 36 = 14      = 50 + 48 = 98      = 50 + 24 = 74
Proportion of class below z          0.5 + GAUSS(-1.5)     0.5 + GAUSS(+2)     0.5 + GAUSS(+1)
                                     = 0.5 - 0.4332        = 0.5 + 0.4772      = 0.5 + 0.34
                                     = 6.7%                = 97%               = 84%
Employability                                 10                  90                  80
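The SPAS arithmetic for a single student can be sketched as follows. This is an illustrative sketch under our own naming assumptions (the function spas and its parameter names), not the Peer Assess Pro implementation.

```python
def spas(rec, rec_class_mean, rec_class_sd, pas, pas_team_mean, pas_team_sd,
         target_mean=50, target_sd=20, correction=1.2):
    """Combine a class-normalised Recommendation with a team-normalised
    PA Score, then rescale to the SPAS target mean and standard deviation.

    Illustrative sketch; not the Peer Assess Pro implementation.
    """
    z_rec = (rec - rec_class_mean) / rec_class_sd          # class-wide z-score
    z_pas = (pas - pas_team_mean) / pas_team_sd            # team-wide z-score
    z_combined = (z_rec + z_pas) / 2
    raw = target_mean + correction * target_sd * z_combined
    return min(100.0, max(0.0, raw))                       # trim to 0..100


# Worked examples from the table above.
print(spas(2.0, 3.0, 0.5, 30, 40, 10))   # Peter Johns  -> 14.0
print(spas(4.5, 3.0, 0.5, 50, 40, 10))   # Michael Mass -> 98.0
print(spas(3.0, 3.0, 0.5, 50, 20, 15))   # Lydia Loaded -> 74.0
```
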

Example charts for Standard Peer Assessed Score

The following figures show a Standard Peer Assessed Score histogram, and the histograms for the Recommendation and Peer Assessed Score data that contribute to the Standard Peer Assessed Score chart.

Figure 1: Histogram of Recommendation

Mean = 3.7, standard deviation = 0.53

Figure 2: Histogram of Peer Assessed Score

Mean = 67, standard deviation = 11.3

Figure 3: Standard Peer Assessed Score histogram

Mean = 0, Standard deviation = 20

Assumptions about Standard Peer Assessed Score

The calculation of Standard Peer Assessed Score assumes several conditions, described as follows.

The statistical distributions of the Recommendation and Peer Assessed Scores (PA_Score) are assumed to be normally distributed. In practice, the distributions are typically asymmetric with negative skew. See Figures 1 and 2 earlier.

The Recommendation score awarded to a student s1 in team t1 is assumed to be absolutely comparable to a similar Recommendation score awarded to another student s2 in another team t2. In other words, a Recommendation score of 3.5 awarded to student s1 in team t1 means exactly the same as a Recommendation score of 3.5 awarded to student s2 in team t2. Similarly, a difference in Recommendation ratings of 1.0 unit means the same in any team. In practice, the Recommendations made by one team may not be consistent with the Recommendation values assigned by another team. However, given that Recommendation is a ‘top of mind’ peer assessment done at the start of the Peer Assess Pro survey, we think this is a reasonable approximation. Consequently, the Recommendation values are z-score normalised using the mean and standard deviation of the entire class of responses.

In contrast, in normalising the Peer Assessed Score it is well recognised that different teams award quite different Peer Assessed Scores to students who would achieve the same Peer Assessed Score in an ideal world of perfect raters. Consequently, it is assumed that each team possesses a uniform, random mix of student capabilities drawn from the entire class. Therefore, all things being equal, one would expect that the mean and standard deviation of each team’s Peer Assessed Scores would be equivalent. However, in practice, this equivalence is rarely observed. Consequently, the need arises to z-score normalise the Peer Assessed Score for each team to achieve a set of normalised Peer Assessed Scores with mean zero and standard deviation 1 FOR EACH TEAM.

The impact of gaming peer assessment

The Peer Assessed Score awarded to a student s1 in team t1 is assumed NOT to be comparable to a similar Peer Assessed Score that might be awarded to another student x in team y. Why? Some teams honestly peer assess each other, whilst others attempt to ‘game’ the peer assessment process, such as awarding everyone above average, or even the full 5/5 rating for each of the team peer assessment factors. Consequently, it is assumed that the Peer Assessed Score of the average student sa in team t1 should be adjusted to match the peer rating of the average student rated in another team t2, even though the arithmetic values of the (original) Peer Assessed Scores usually differ. The same reasoning applies to the spread of Peer Assessed Score values within teams, namely, that the best team member in team t1 should be rated comparably with the best team member in team t2, even if their Peer Assessed Scores differ. Consequently, the Peer Assessed Scores WITHIN a team are scaled to match the relative values within other teams through normalisation using each team’s mean and standard deviation.

FAQ: What is the influence on Standard Peer Assessed Score (SPAS) if a team rates ALL its members with a Peer Assessed Score of 100?

In that case, the z-score normalised Peer Assessed Score  for every team member is set to 0.5.

A Future option to consider: Exclude students from consideration for receiving calculation of their SPAS in the case of a ‘misguided team’, identified as

FAQ: Would a student received the same Standard Peer Assessed Score (SPAS) if rated in another class?

In general, ‘NO’. A student is motivated differently in each of the classes they take. By the luck of the draw, they may work with a superior or inferior team, whose members will rate them relatively differently.

Quick links and related information

FAQ: What is Employability? How is it calculated?

FAQ: How are peer assessment and personal results calculated and defined mathematically?


FAQ: What is Employability? How is it calculated?

For a specific Peer Assess Pro assessment, Employability is the statistical probability that team members from the class would recommend the specific team member to an employer, a colleague, or another team.

Employability is a proprietary measure defined by Peer Assess Pro™ drawn from the calculation of a student’s Standard Peer Assessed Score (SPAS). SPAS combines a student’s Peer Assessed Score and their Recommendation score, through various statistical treatments such as z-score normalisation. The resulting Employability score is a statistical probability, ranging from 5 to 95 percent. Employability is the best available estimate of the degree to which team members from the class in which the student has participated in a team project would recommend that specific team member to an employer, a colleague, or another team.

Mathematical calculation of Employability

Employability(s) = MROUND(95 × (0.5 + GAUSS(z_combined(s))) + 2.5, 5)

Where

Employability(s) = the employability for student s, ranging over values from 5 to 95 in steps of 5.

GAUSS(z) = the Gaussian distribution function: the statistical probability that a random variable, z, drawn from a normal distribution, will lie between the mean and z standard deviations above (or below) the mean. The GAUSS function returns values between -0.5 and +0.5.

z_combined(s) = the combined z-score resulting from combining the z-score normalisation of the Recommendation and Peer Assessed Scores for student s, as explained in the mathematical calculations for the Standard Peer Assessed Score. Through the process of normalisation, z_combined has a mean of zero and standard deviation of 1, which is the required input for the GAUSS function.

MROUND(x, m) = a mathematical function that rounds one number to the nearest integer multiple of another. In the case of Employability, m = 5. For example, MROUND(82.4, 5) = 80 and MROUND(8.8, 5) = 10. The MROUND function coupled with the attenuation factor of 95 achieves a step interval of 5 units.

The constant 0.5 adds the probability that a z-score lies between minus infinity and the mean, which is, by definition, 50%.

Conditioning transformations to de-emphasise unsubstantiated precision

The following transformations are applied to remove the impression of an over-precise measure of Employability, and to reduce the possibilities of elation or despair in response to extreme values of Employability. Specifically, we apply a Principle of Conservatism, with the result that Employability is conditioned to lie between 5 and 95, and rounded to increase in steps of 5, rather than taking the theoretically possible values of zero to 100 with apparently infinite precision!

The MROUND to the closest multiple of 5 coupled with attenuation by 95 achieves the step interval of 5 units.

The constant 2.5 is a translation factor that compensates for the shift downwards in mean values on account of the 95 attenuation factor.

Example calculations of Employability

The following table shows example calculations of Employability based on the most likely range of possible values for combined z-scores arising from the generation of Standardised Peer Assessed Score, SPAS.

The subsequent graph shows the data from the calculations of Employability charted against Combined z-scores.

As an example, consider a student achieving a SPAS of zero, arising from their combined z-score of -3. According to the normal distribution, less than 1 in 1000 students would recommend this student, as indicated by the proportion of the class who would fall below a combined z-score of -3. The calculation of Employability generously raises the assessment of the student, suggesting that 5% of the class would recommend them! The same conservatism happens at the other extreme, where a brilliantly contributing student (e.g. above a combined z-score of +2) achieves an Employability of 95%, whereas if the normal distribution were to be believed, they might expect 98% of the class to recommend them.

Example calculations of Employability from Combined z-scores

Combined z-score, z      -3    -1.5    -1   -0.5     0    0.5     1    1.5     2      3
GAUSS(z)               -0.50  -0.43  -0.34 -0.19     0   0.19   0.34  0.43   0.48   0.50
SPAS (untrimmed)        -22     14     26    38     50    62     74    86     98    122
SPAS (trimmed 0-100)      0     14     26    38     50    62     74    86     98    100
Proportion below z      0.1%    7%    16%   31%    50%   69%    84%   93%    98%   99.9%
Employability             5     10     20    30     50    70     80    90     95     95
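The conditioning steps above can be sketched in Python. GAUSS and MROUND are re-implemented here from their spreadsheet definitions; the function name employability and its single-argument interface are our assumptions for illustration, not the Peer Assess Pro implementation.

```python
from math import erf, sqrt


def gauss(z):
    """Spreadsheet-style GAUSS(z): P(0 < Z < z), in the range -0.5 to +0.5."""
    return 0.5 * erf(z / sqrt(2))


def mround(x, multiple):
    """Round x to the nearest integer multiple of `multiple`."""
    return multiple * round(x / multiple)


def employability(z_combined):
    """Attenuate the cumulative probability to 95, translate by 2.5,
    and round to steps of 5, per the conditioning transformations above.

    Illustrative sketch; not the Peer Assess Pro implementation.
    """
    return mround(95 * (0.5 + gauss(z_combined)) + 2.5, 5)


for z in (-3, -1.5, 0, 1, 2):
    print(z, employability(z))  # 5, 10, 50, 80, 95
```

Note how the attenuation-plus-translation keeps every result inside 5 to 95: even z = -3 yields 5 rather than 0, matching the Principle of Conservatism described earlier.
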

Quick links and related information

FAQ: How is Standard Peer Assessed Score (SPAS) calculated?

FAQ: How are peer assessment and personal results calculated and defined mathematically?


FAQ: How is the Index of Realistic Self Assessment (IRSA) calculated?

Having a good sense of who you are enables you to build upon your strengths and correct your weaknesses. In turn, that can make you more successful at work and in your personal life. You are able to better understand, predict and cope with others more effectively. You can better distinguish valid and invalid informal and formal feedback from others. You are more likely to select (and achieve!) realistic personal goals. (‘ERSI: Exceptionally Realistic Self-Image’, 2012)

The Index of Realistic Self Assessment (IRSA) is a first step in providing data upon which to develop an Exceptionally Realistic Self-Image (ERSI).

Mathematical definition of the Index of Realistic Self Assessment

The Index of Realistic Self Assessment (IRSA) is a ratio-based measure of the extent to which a team member’s SELF assessment is matched by the assessment of the OTHER members of their team.

IRSA(s) = 100 × PA_Score_others(s) / PA_Score_self(s)

Where

PA_Score_others(s) = the Peer Assessed Score assigned to that student by their team members

PA_Score_self(s) = the Peer Assessed Score the student has assessed themself

IRSA typically lies in the range 50 to 120. However, theoretically, IRSA could lie between zero and infinity. IRSA values generally calculate as:

IRSA is calculated only when these two conditions occur:

Extreme values of IRSA are notified in the teacher’s Active Warnings, as detailed in the FAQ

FAQ: What is a mismatched self-assessment (IRSA)?

Example calculations of the Index of Realistic Self Assessment

The data for the following table is drawn from

FAQ: How is the Peer Assessed (PA) Score calculated?

Calculations of the Index of Realistic Self Assessment for four team members

                                   ASSESSEE
ASSESSOR                      Bridget   Julian   Lydia   Nigella
Bridget                          87      62.5     75      72.5
Julian                          87.5      93      87.5    82.5
Lydia                            75      82.5     78      80
Nigella                           0      77.5     82.5    82
Peer Assessed Score (others)     54       74      82      78
Peer Assessed Score (self)       87       93      78      82
Index of Realistic
Self Assessment, IRSA            62       80     105      95
Indication               Overconfident  Typical  Underconfident  Typical (borderline underconfident)

(Self-assessments appear on the diagonal of the table.)

Lydia has been assessed by others with a PA Score of 82. Her self-assessment has produced a self-assessed PA Score of 78. Therefore:

IRSA(Lydia) = 100 × 82 / 78 = 105

Lydia’s IRSA of 105 indicates that she is an outlier when compared with most team members in a typical class. Specifically, she is underconfident in assessing her strengths when compared with how others perceive her.
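The ratio can be sketched in one line of Python. The function name irsa is our own; this is an illustrative sketch, not the Peer Assess Pro implementation.

```python
def irsa(pa_score_others, pa_score_self):
    """Index of Realistic Self Assessment: others' rating relative to
    the team member's own self-rating, scaled to 100.

    Illustrative sketch; not the Peer Assess Pro implementation.
    """
    return round(100 * pa_score_others / pa_score_self)


print(irsa(82, 78))  # Lydia   -> 105 (underconfident)
print(irsa(54, 87))  # Bridget -> 62  (overconfident)
```
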

Why an IRSA of 100 is not a perfect score!

From our experience using Peer Assess Pro in many classes, we find most team members overrate themselves when compared with how their team members rate them. This overrating results in a self-assessed Peer Assessed Score typically 7 to 10 points higher than the Peer Assessed Score awarded by the other members of that same team. This phenomenon of overrating one’s self-assessment is well established in the literature, termed self-enhancement bias (see, for instance, Loughnan et al., 2011). Informally, self-enhancement bias is also known as the Lake Wobegon Effect, a phenomenon observed in a fictional town “where all the women are strong, all the men are good looking, and all the children are above average." (‘Lake Wobegon effect’, n.d.; ‘Lake Wobegon: The Lake Wobegon Effect’, 2017).

Quick links and related information

FAQ: What is a mismatched self-assessment (IRSA)?

FAQ: What is a valid assessed team?

FAQ: How do I interpret measures of realistic self-assessment?

Lake Wobegon effect. (n.d.). Retrieved 25 July 2017, from http://psychology.wikia.com/wiki/Lake_Wobegon_effect

Lake Wobegon: The Lake Wobegon Effect. (2017). In Wikipedia. Retrieved from https://en.wikipedia.org/w/index.php?title=Lake_Wobegon&oldid=787029148#The_Lake_Wobegon_effect

Loughnan, S., Kuppens, P., Allik, J., Balazs, K., de Lemus, S., Dumont, K., … Haslam, N. (2011). Economic Inequality Is Linked to Biased Self-Perception. Psychological Science, 22(10), 1254–1258. https://doi.org/10.1177/0956797611417003


FAQ: How do I interpret measures of realistic self-assessment?

From our experience using Peer Assess Pro in many classes, we find most team members overrate themselves when compared with how their team members rate them. This overrating results in a self-assessed Peer Assessed Score typically 7 to 10 points higher than the Peer Assessed Score awarded by the other members of that same team. This phenomenon of overrating one’s self-assessment is well established in the literature, termed self-enhancement bias (see, for instance, Loughnan et al., 2011). Informally, self-enhancement bias is also known as the Lake Wobegon Effect, a phenomenon observed in a fictional town “where all the women are strong, all the men are good looking, and all the children are above average." (‘Lake Wobegon effect’, n.d.; ‘Lake Wobegon: The Lake Wobegon Effect’, 2017).

Interpreting the Index of Realistic Self Assessment (IRSA)

The usual tendency of team members is to apply a self-enhancement bias when rating themselves using Peer Assess Pro. Consequently, we can interpret Index of Realistic Self Assessment (IRSA) scores in one of three ways: typical, overconfident, and underconfident.

Typical IRSA

An IRSA score between 75 and 95 suggests the assessed team member understands their team contribution realistically when contrasted with the assessment perceived by other team members. A score between 75 and 95 is typical of about 2/3 of team members in a class.

Overconfident IRSA

An IRSA below 75 suggests the assessed team member OVERESTIMATES their team contribution as perceived by other team members. An index below 75 suggests the team member should proactively seek to understand their areas for development by informally soliciting further feedback and guidance from their team members. About ⅙ of team members achieve an index below 75.

Underconfident IRSA

An IRSA above 95 suggests the assessed team member has a tendency to UNDERESTIMATE their team contribution when contrasted with the assessment perceived by other team members. The team member should consider developing more confidence in applying and displaying their strengths. About ⅙ of team members achieve an index above 95.
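The three interpretation bands above can be sketched as a small classification function. This is an illustrative sketch only; the function name and labels are not part of Peer Assess Pro.

```python
def interpret_irsa(irsa):
    """Classify an Index of Realistic Self Assessment (IRSA) score
    into the three interpretation bands described above."""
    if irsa < 75:
        return "overconfident"   # tends to OVERESTIMATE own contribution
    if irsa > 95:
        return "underconfident"  # tends to UNDERESTIMATE own contribution
    return "typical"             # realistic self-assessment, about 2/3 of a class

print(interpret_irsa(85))   # typical
print(interpret_irsa(60))   # overconfident
print(interpret_irsa(110))  # underconfident
```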

Developing an Exceptionally Realistic Self-Image, ERSI

An Index of Realistic Self Assessment that is not in the ‘typical’ range of 75 to 95 suggests that the team member take active steps to develop a more realistic self-image.

What are the benefits of having an Exceptionally Realistic Self-Image?

What can get in the way of having an Exceptionally Realistic Self-Image?

How do I develop my Exceptionally Realistic Self-Image, ERSI?

A three-step programme for developing an Exceptionally Realistic Self-Image is outlined in the ERSI reference listed below.

Quick links and related information

FAQ: What is a mismatched self-assessment (IRSA)?

FAQ: How is the Index of Realistic Self Assessment (IRSA) calculated?

ERSI: Exceptionally Realistic Self-Image. (2012). Orange County Human Resource Services Portal. Retrieved from http://bos.ocgov.com/hr/hrportal/docs/docs_hr_leadership_forum/minutes_2012/minutes_030812/ersi.doc

Lake Wobegon effect. (n.d.). Retrieved 25 July 2017, from http://psychology.wikia.com/wiki/Lake_Wobegon_effect

Lake Wobegon: The Lake Wobegon Effect. (2017). In Wikipedia. Retrieved from https://en.wikipedia.org/w/index.php?title=Lake_Wobegon&oldid=787029148#The_Lake_Wobegon_effect

Loughnan, S., Kuppens, P., Allik, J., Balazs, K., de Lemus, S., Dumont, K., … Haslam, N. (2011). Economic Inequality Is Linked to Biased Self-Perception. Psychological Science, 22(10), 1254–1258. https://doi.org/10.1177/0956797611417003


FAQ: How is an outlier peer assessment rating identified? WARNING 0042

If one member of a team submits a peer assessment for an assessee that is ‘materially different’ from the assessments given by the other team members, this difference gives rise to an Active Warning in Peer Assess Pro titled

Critical Warning 0042 Outlier individual rating

A team member has assessed another team member very differently than the other team members.

Warning detail

The extended detail for the Active Warning displays one or more messages such as:

Harris  assessed  Michael  awarding a PA Subscore of 38. Compared with the average rating by the other team members of 70 this subscore DEPRESSED the PA Score to 64 by 7 PA Score Units. Team Alpha

Josef  assessed  Alvin  awarding a PA Subscore of 100. Compared with the average rating by the other team members of 66 this subscore RAISED the PA Score to 73 by 7 PA Score Units. Team Alpha

An Outlier individual rating warning will be raised ONLY if the impact on the assessee’s Peer Assessed Score is raised or lowered by more than 5 PA units outside the average rating given by the other members of the team.

The warning will be generated only for members of a valid assessed team, as detailed in

FAQ: What is a valid assessed team?


Example calculations

Consider team Alpha containing 5 members where Adam has been assessed with the following Peer Assessed Subscores by the other four team members.

Impact of removing one Assessor from the calculation of Peer Assessed Score for Adam

Assessee | Assessor | PA Subscore | Team Size | PA Score | PA Score Exclusive | Assessor Impact | Impact Direction
Adam | Edward | 53 | 5 | 73 | 80 | -7 | DEPRESSED
Adam | Mary | 63 | 5 | 73 | 77 | -4 |
Adam | Stephanie | 78 | 5 | 73 | 72 | 1 |
Adam | Josef | 100 | 5 | 73 | 64 | 9 | RAISED

Adam has a Peer Assessed Score of 73, calculated from his four team members’ subscores as follows:

= (Edward + Mary + Stephanie + Josef) / (Team size - 1)

= (53 + 63 + 78 + 100) / (5 - 1)

= 294 / 4

= 73.5

To determine the impact of Edward’s assessment of Adam, we can calculate the Peer Assessed Score Adam would receive from just the other three members as follows:

PA Score Exclusive         = (Mary + Stephanie + Josef) / (Team size - 2)

= (63 + 78 + 100) / 3

= 241 / 3

= 80.3

Therefore, the Assessor’s impact is the difference between the whole-of-team’s originally-calculated PA Score and the PA Score Exclusive

Impact        = PA Score - PA Score Exclusive

                        = 73.5 - 80.3

                                = - 6.8

We observe that Edward’s relatively low assessment DEPRESSED Adam’s overall Peer Assessed Score by about 7 PA Units.

Peer Assess Pro presents the following detail:

Edward  assessed  Adam  awarding a PA Subscore of 53. Compared with the average rating by the other team members of 80 this subscore DEPRESSED the PA Score to 73 by 7 PA Score Units. Team Alpha

In contrast, we see Josef’s rating of 100 had an impact that raised Adam’s Peer Assessment score. The following detailed outlier warning is presented:

Josef  assessed  Adam  awarding a PA Subscore of 100. Compared with the average rating by the other team members of 64 this subscore RAISED the PA Score to 73 by 9 PA Score Units. Team Alpha

Threshold for warning of outlier individual peer rating

The threshold for raising this Warning in Peer Assess Pro is +/- 5 PA Score units, the ThresholdOutlier constant. That is, if one assessor’s rating would affect the PA Score awarded to an assessee by more than 5 units, then the Outlier Warning will be raised.

In the previous example, the impact on Adam by assessors Mary and Stephanie is within the ThresholdOutlier constant of 5 PA Score units, so no outlier warning message is generated for these two assessors.
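The worked example and threshold test above can be sketched in Python. The helper names are illustrative; Peer Assess Pro’s internal implementation may differ.

```python
THRESHOLD_OUTLIER = 5  # PA Score units, the ThresholdOutlier constant

def pa_score(subscores):
    """Peer Assessed Score: mean of the subscores awarded by the other team members."""
    return sum(subscores) / len(subscores)

def assessor_impact(subscores, assessor_index):
    """Change in PA Score attributable to one assessor's subscore."""
    full = pa_score(subscores)
    exclusive = pa_score([s for i, s in enumerate(subscores) if i != assessor_index])
    return full - exclusive

# Adam's subscores from Edward, Mary, Stephanie and Josef:
scores = [53, 63, 78, 100]
for i, who in enumerate(["Edward", "Mary", "Stephanie", "Josef"]):
    impact = assessor_impact(scores, i)
    if abs(impact) > THRESHOLD_OUTLIER:
        direction = "RAISED" if impact > 0 else "DEPRESSED"
        print(f"{who}: {direction} by {abs(impact):.1f} PA units")
```

Only Edward (about 7 units) and Josef (about 9 units) breach the threshold, matching the warnings shown earlier; Mary and Stephanie fall within it.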

Alternative mathematical calculation of Assessor Impact

A more elegant method for calculating the Assessor Impact follows. First, calculate the Peer Assessed Score for Assessed student s, excluding the PA Subscore awarded by Assessor a:

PAX(s, a) = ((t - 1) × PA(s) - PA(a, s)) / (t - 2)

Where

t = the number of team members in the team in which s is a team member

PA(s) = the Peer Assessed Score for assessed student s

PA(a, s) = the peer assessment subscore awarded by Assessor a to Assessee s

The Assessor Impact of removing Assessor a’s assessment of Assessee s is

Impact(a, s) = PA(s) - PAX(s, a)

Alternative example calculations

Consider Edward’s assessment of Adam using the data from the table above, where the team size is 5 and Adam’s Peer Assessed Score is 73.5:

PA Score Exclusive = (4 × 73.5 - 53) / 3 = 241 / 3 = 80.3

Assessor Impact = 73.5 - 80.3 = -6.8

This reproduces the PA Score Exclusive of 80 and the Assessor Impact of about -7 shown in the table.

Quick links and related information

FAQ: How is the Peer Assessed (PA) Score calculated?

FAQ: What is a valid assessed team?


FAQ: What is a mismatched self-assessment (IRSA)? WARNING 0040

If a team member submits a self-assessment that is ‘materially different’ than the assessments given by their other team members, this difference gives rise to an Active Warning in Peer Assess Pro titled

Critical Warning 0040 Mismatched self-assessment

A team member's self assessment is materially different to the peer assessment given by their team

Warning detail

The extended detail for the Active Warning displays one or more messages such as:

Gregor’s self-assessment of 63 is UNDERCONFIDENT compared with the peer assessment of 93 given by others in team  Charlie.  IRSA = 148

Daphne’s self-assessment of 68 is OVERCONFIDENT compared with the peer assessment of 34 given by others in team  Alpha.  IRSA = 51

The warning will be generated only for members of a valid assessed team, as detailed in

FAQ: What is a valid assessed team?

Furthermore, the warning will only be generated when a student has completed their self-assessment as part of their peer assessment submission.


Threshold for warning of mismatched self-assessment

The Peer Assess Pro system constant ThresholdIrsaUnderconfident is defined as 115. IRSA values greater than or equal to ThresholdIrsaUnderconfident will raise the UNDERCONFIDENT active warning. In general, about 7% to 16% of students will be flagged with this warning.

The Peer Assess Pro system constant ThresholdIrsaOverconfident is defined as 75. IRSA values less than ThresholdIrsaOverconfident will raise the OVERCONFIDENT active warning. In general, about 7% to 16% of students will be flagged with this warning.

Example calculations

The Mismatched self-assessment warning is raised from the value of the Index of Realistic Self Assessment (IRSA) calculated for each student.

See FAQ

FAQ: How is the Index of Realistic Self Assessment (IRSA) calculated?

The warning of UNDERCONFIDENT is raised when IRSA for a student is greater than 115, the ThresholdIrsaUnderconfident.

The warning of OVERCONFIDENT is raised when IRSA for a student is less than 75, the ThresholdIrsaOverconfident.

Sample of several Peer Assessed Scores and self-assessments

Name | PA Score | PA Self | IRSA | Confidence
Abel | 96.7 | 100 | 96.7 |
Baker | 100.0 | 82.5 | 121.2 | UNDERCONFIDENT
Charlie | 82.1 | 70 | 117.3 | UNDERCONFIDENT
Daphne | 34.2 | 67.5 | 50.6 | OVERCONFIDENT
Edward | 95.8 | 87.5 | 109.5 |

Consider the case of Daphne, whose Peer Assessed Score of 34.2 and self-assessment of 67.5 yield an IRSA of 50.6. Since 50.6 is less than 75, Daphne’s self-assessment is regarded as OVERCONFIDENT. Consequently, the Mismatched self-assessment warning is raised.

The extended detail message is:

Daphne’s self-assessment of 68 is OVERCONFIDENT compared with the peer assessment of 34 given by others in team  Alpha.  IRSA = 51
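The warning logic can be sketched as follows, assuming IRSA = 100 × PA Score ÷ PA Self, consistent with the values in the table above. Function names are illustrative, not part of Peer Assess Pro.

```python
THRESHOLD_IRSA_UNDERCONFIDENT = 115
THRESHOLD_IRSA_OVERCONFIDENT = 75

def irsa(pa_score, pa_self):
    """Index of Realistic Self Assessment."""
    return 100 * pa_score / pa_self

def confidence_flag(pa_score, pa_self):
    """Return the mismatched self-assessment warning label, if any."""
    index = irsa(pa_score, pa_self)
    if index >= THRESHOLD_IRSA_UNDERCONFIDENT:
        return "UNDERCONFIDENT"
    if index < THRESHOLD_IRSA_OVERCONFIDENT:
        return "OVERCONFIDENT"
    return ""  # no warning raised

print(confidence_flag(34.2, 67.5))   # OVERCONFIDENT (Daphne, IRSA about 51)
print(confidence_flag(100.0, 82.5))  # UNDERCONFIDENT (Baker, IRSA about 121)
```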

Recommended action for facilitator

For UNDERCONFIDENT and OVERCONFIDENT students, Peer Assess Pro generates an email that the teacher can optionally send. The email recommends the student arrange an appointment with the teacher to explore the reasons for the variation between their self-assessment and the peer assessment given by others.

Quick links and related information

FAQ: How is the Index of Realistic Self Assessment (IRSA) calculated?

FAQ: How do I interpret measures of realistic self-assessment?

FAQ: How is the Peer Assessed (PA) Score calculated?

FAQ: What is a valid assessed team?


FAQ: What is a low quality team rating? WARNING 0050

Suppose a team collectively submits a set of peer assessments that are both high on average and spread over a narrow range of scores. This pattern is an indication that the team may have engaged unconstructively with the peer assessment process. When both conditions are fulfilled, an Active Warning in Peer Assess Pro is generated:

Critical Warning 0050 Low quality team rating

A team may have engaged unconstructively with peer assessment

Warning detail

The extended detail for the Active Warning displays one or more messages such as:

Team Alpha  may have engaged unconstructively with peer assessment. High Peer Assessment Scores awarded. Average 100 and low range 0.

Team Bravo  may have engaged unconstructively with peer assessment. High Peer Assessment Scores awarded. Average 96 and low range 8.

Team Charlie  may have engaged unconstructively with peer assessment. High Peer Assessment Scores awarded. Average 96 and low range 8.

The warning will be generated only for members of a valid assessed team, as detailed in

FAQ: What is a valid assessed team?


Threshold for warning of low quality team rating

The Peer Assess Pro system constant ThresholdTeamAverage is defined as 90.

The Peer Assess Pro system constant ThresholdTeamRange is defined as 11.

The Active Warning, YES, is generated for a team when both conditions are true:

- The Team Average Peer Assessed Score is greater than or equal to ThresholdTeamAverage (90)

- The Team Range of Peer Assessed Scores is less than ThresholdTeamRange (11)

Example calculations

Suppose Team Mike contains 6 members, whose Peer Assessed Scores are shown below. The Average Peer Assessed Score and Range of Peer Assessed Scores are calculated.

Peer Assessed Scores for members of Team Mike

Name | Peer Assessed Score
Annie | 93.75
Emma | 92.55
Joe | 90.85
Freddie | 92.50
Tammy | 95.88
Tilly | 88.32
Team Average | 92 = 553 / 6
Team Range | 8 = 95.88 - 88.32

The Team Average Peer Assessed Score and Team Range are examined for every team. A low quality team rating is identified for those teams that breach the threshold parameters defined above.

Identification of low quality team ratings

Team | Team Average Peer Assessed Score | Team Range | Low Quality Team Rating
Alpha | 100 | 0 | YES
Mike | 92 | 8 | YES
November | 87 | 8 | NO
Oscar | 95 | 9 | YES
Papa | 85 | 9 | NO
Quebec | 95 | 10 | YES
Romeo | 92 | 12 | NO
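The low quality team rating test can be sketched as follows. The boundary handling (at or above the average threshold, below the range threshold) is inferred from the identification table above, so treat this as an illustrative sketch.

```python
THRESHOLD_TEAM_AVERAGE = 90
THRESHOLD_TEAM_RANGE = 11

def low_quality_team_rating(pa_scores):
    """True when a team's Peer Assessed Scores are both high on average
    and spread over a narrow range."""
    average = sum(pa_scores) / len(pa_scores)
    spread = max(pa_scores) - min(pa_scores)
    return average >= THRESHOLD_TEAM_AVERAGE and spread < THRESHOLD_TEAM_RANGE

# Team Mike from the example above (average about 92, range about 8):
print(low_quality_team_rating([93.75, 92.55, 90.85, 92.50, 95.88, 88.32]))  # True
```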

Recommended action for facilitator

In general, a team that has this warning may have engaged unconstructively with peer assessment. Most team members have not entered the spirit of the peer assessment process. They may have attempted to ‘game’ the peer assessment by giving everyone well above typical or average ratings.

Peer Assess Pro provides the facilitator with the option to send out an email to all members of the team suggesting they may wish to reconsider their ratings. Furthermore, the students are encouraged to provide qualitative evidence in support of the ratings they have provided.

High performing teams

In a small proportion of teams, it is possible that a high performing team will ALSO have this Active Warning generated. In a high performing team all team members contribute effectively to the results and team processes. This outcome will be evident to the teacher through the team gaining a high Team Result for their submitted work.


Example case

The following graph shows a large first year university class of 848 students who have undertaken their first, formative experience of Peer Assess Pro. Of the 84 valid teams, 24 teams are identified as potentially having a low quality team rating.

 

In a case like this, the teacher might consider guiding the class of students towards more constructive and discriminating peer assessment before undertaking the final, summative peer assessment. For example, remind students of the purpose of peer feedback, and how to provide useful feedback.

Quick links and related information

FAQ: What is a valid assessed team?

FAQ: What happens if I try to 'game' (fool? play? disrupt?) the peer assessment process?

FAQ: What is the purpose of peer assessment?

FAQ: How do I provide useful feedback to my team members?

FAQ: What is a low quality assessor rating?


FAQ: What is a low quality assessor rating? WARNING 0300

Suppose a team member submits a set of peer assessments that are both high on average and spread over a narrow range of scores. This pattern is an indication that the team member may have engaged unconstructively with the peer assessment process. When both conditions are fulfilled, an Active Warning in Peer Assess Pro is generated:

Critical Warning 0300 Low quality assessor rating

An assessor may have engaged unconstructively with peer assessment.

Warning detail

The extended detail for the Active Warning displays one or more messages such as:

Tony  may have engaged unconstructively with peer assessment. High Peer Assessment Scores awarded. Average 100 and low range 0.  Team Alpha

Kathy  may have engaged unconstructively with peer assessment. High Peer Assessment Scores awarded. Average 85 and low range 8. Team Bravo

The warning will be generated only for members of a valid assessed team, as detailed in

FAQ: What is a valid assessed team?

Threshold for warning of low quality assessor rating

The Peer Assess Pro system constant ThresholdAssessorAverage is defined as 85.

The Peer Assess Pro system constant ThresholdAssessorRange is defined as 9.

The Active Warning is generated for an assessor when BOTH conditions are true:

- The average Peer Assessed Subscore awarded is greater than or equal to ThresholdAssessorAverage (85)

- The range of Peer Assessed Subscores awarded is less than ThresholdAssessorRange (9)

Example calculations

Suppose Kathy, a member of Team Bravo, assesses all her fellow team members as follows:

Peer Assessed Subscores awarded by Kathy in Team Bravo

Name | Peer Assessed Subscore
Garry | 90
Dan | 87.5
Sunny | 82.5
Freddie | 82.5
Robby | 82.5
Average | 85 = 425 / 5
Range | 7.5 = 90 - 82.5

The Average Peer Assessed Subscore and Range are examined for every assessor. A low quality assessor rating is identified for those individuals who breach the threshold parameters defined above.

Identification of low quality individual assessor ratings

Assessor | Average PA Score (awarded) | Range (PAS Units) | Low quality assessor rating
Tony | 100 | 0.0 | Y
Andy | 93 | 0.0 | Y
Jess | 75 | 0.0 | N
Johnny | 93 | 2.5 | Y
Chance | 82 | 2.5 | N
Kathy | 85 | 7.5 | Y
Zara | 83 | 7.5 | N
Riley | 97 | 10.0 | N
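The analogous test for an individual assessor can be sketched as follows, using the thresholds defined above. The boundary handling is inferred from the identification table (Kathy, with an average of exactly 85, is flagged), so treat this as illustrative.

```python
THRESHOLD_ASSESSOR_AVERAGE = 85
THRESHOLD_ASSESSOR_RANGE = 9

def low_quality_assessor_rating(awarded):
    """True when one assessor's awarded subscores are both high on
    average and spread over a narrow range."""
    average = sum(awarded) / len(awarded)
    spread = max(awarded) - min(awarded)
    return average >= THRESHOLD_ASSESSOR_AVERAGE and spread < THRESHOLD_ASSESSOR_RANGE

# Kathy's subscores awarded in Team Bravo (average 85, range 7.5):
print(low_quality_assessor_rating([90, 87.5, 82.5, 82.5, 82.5]))  # True
```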

Recommended action for facilitator

In general, an individual with this warning may (or may not!) have engaged unconstructively with peer assessment. The team member may not have entered the spirit of the peer assessment process. They may have attempted to ‘game’ the peer assessment by giving everyone well above typical or average ratings.

Peer Assess Pro provides the facilitator with the option to send out an email to the assessor suggesting they may wish to reconsider their ratings. Furthermore, the student is encouraged to provide qualitative evidence in support of the ratings they have provided.

High performing teams

In a small proportion of teams, it is possible that a member of a high performing team will ALSO have this Active Warning generated. In a high performing team all team members contribute effectively to the results and team processes. Consequently, it is reasonable to expect a high average Peer Assessed Score to be awarded most members, with a concurrent low range. This outcome will be evident to the teacher through the assessor’s team gaining a high Team Result for their submitted work.

Quick links and related information

FAQ: What is a valid assessed team?

FAQ: What happens if I try to 'game' (fool? play? disrupt?) the peer assessment process?

FAQ: What is the purpose of peer assessment?

FAQ: How do I provide useful feedback to my team members?

FAQ: What is a low quality team rating?


FAQ: What is a valid assessed team? WARNING 0022

If too few members of a team submit their peer assessments, the shortfall gives rise to an Active Warning in Peer Assess Pro titled

Critical Warning 0022 Insufficient team responses

The number of responses from a team is insufficient for presenting valid results.

Warning detail

The following extended detail is provided

Alpha  has received 2 team member responses. Minimum required 3 from team size 4

Bravo  has received 3 team member responses. Minimum required 4 from team size 6

Results not displayed to members of non-valid assessed teams

Peer Assess Pro restricts the display of results to valid assessed teams. The notion of a valid assessed team prevents the display of results to students (and facilitators) when only a small number of peer assessments from a team have been submitted. Such a low-response situation could distort the reliability and accuracy of the team’s peer assessment and personal result calculations, and the ACTIVE WARNING messages for a team. Consequently, class statistics such as mean, maximum, range, and standard deviation are calculated only for team members designated as part of a valid assessed team.

Students can only view results if they belong to a valid assessed team.

A facilitator may only view results from valid assessed teams.


How many valid and invalid teams do I have?

The Teacher’s Dashboard Active Warnings and (i) Information button inform you of the number of valid teams and valid assessments throughout the progress of managing the peer assessment responses. The Active Warning enables you to ‘hunt down’ the teams that have not yet achieved valid status.

Recommended action for facilitator

Peer Assess Pro generates an email that the teacher can optionally send to members of a non-valid team who have not yet responded. The email reminds students to respond by the Due Date.

Mathematical definition

For teams with five or fewer members, a valid assessed team must have peer ratings from at least three members of the team. For teams with six or more team members, ‘just over’ half the team members must peer assess. The required minimum number of team members m who must rate within a particular team of size n members is defined as:

m = max(3, int(n/2 + 1))

Where

m = the minimum number of team members required to rate within a particular team

max is a function that selects the maximum of the calculated values

int is a function that calculates the integer value of the result

For teams of size two or fewer, peer assessment results are not calculated. The default Personal Result in these circumstances is the Team Result.

Example calculations

Team size, n | Required minimum assessors, m | Proportion of whole team
3 | 3 | 100%
4 | 3 | 75%
5 | 3 | 60%
6 | 4 | 66%
7 | 4 | 57%
8 | 5 | 62%
9 | 5 | 56%
10 | 6 | 60%
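The minimum-raters rule illustrated by the table above can be sketched as a one-line function:

```python
def min_raters(team_size):
    """Minimum responses required for a valid assessed team of the given
    size: at least 3 raters, and 'just over half' for 6 or more members."""
    return max(3, int(team_size / 2 + 1))

for n in range(3, 11):
    print(n, min_raters(n))  # reproduces the table above
```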

Quick links and related information

FAQ: How is the Peer Assessed (PA) Score calculated?

FAQ: How are peer assessment and personal results calculated and defined mathematically?


FAQ: What is an ‘at risk’ team member? WARNING 0044

Future feature: to be implemented.

An ‘at risk’ team member has been rated amongst the bottom of the class as measured by either the Peer Assessed Score OR Recommendation. A low rating on either of these assessments gives rise to an Active Warning in Peer Assess Pro titled

Critical Warning 0044 At risk team member

A team member has been rated amongst the lowest in class.

Warning detail

The following extended detail is provided

Anna Smith has been rated amongst the lowest in class. Low Recommendation 2.3 and/or low Peer Assessment Score 34. Team Alpha.

Recommended action for facilitator

Peer Assess Pro generates an email that the teacher can send optionally to team members with a low rating. The email requests that the team member make an appointment to meet promptly with the teacher to discuss their peer assessment results so they can develop a more productive contribution to the team's future outputs, processes and leadership.

The facilitator should view the personal snapshot of low rated team members examining the qualitative feedback given. You should expect that the qualitative feedback will confirm the low Recommendation or Peer Assessed Scores. It will be helpful to have reviewed these snapshots prior to your interviewing and counselling the at risk students who visit you.


Threshold for warning of ‘at risk’ team member

The Peer Assess Pro system constant ThresholdPaScore is defined as 55.

The Peer Assess Pro system constant ThresholdRecommendation is defined as 3.75.

The Active Warning is generated for a team member when either condition is true:

- The Recommendation is less than ThresholdRecommendation (3.75)

- The Peer Assessed Score is less than ThresholdPaScore (55)

The test is conducted only when the team has sufficient assessments to qualify as a valid assessed team.

Example calculation

Identification of at risk students

Team | Team Member | Valid team? | Recommendation | Peer Assessed Score | At Risk?
Alpha | Anna | YES | 2.3 | 34.2 | YES
Alpha | Ashok | YES | 3.6 | 80 | NO
Alpha | Prasad | YES | 2.7 | 60 | YES
Bravo | Tony | NO | 3.0 | 95 | NO
Bravo | Nick | NO | 2.3 | 34 | NO

The foregoing data will generate the following WARNING messages.

Anna has been rated amongst the lowest in class. Low Recommendation 2.3 and/or low Peer Assessment Score 34. Team Alpha.

Prasad has been rated amongst the lowest in class. Low Recommendation 2.7 and/or low Peer Assessment Score 60. Team  Alpha

Note that Nick in team Bravo is not yet flagged as an at risk student because the team has not yet received sufficient peer assessment responses to be qualified as a valid team.
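The at-risk test can be sketched as follows. This sketch assumes a member of a valid assessed team is flagged when either value falls below its threshold, consistent with the ‘and/or’ wording of the warning messages; the live system’s exact boundary handling may differ, so treat the function as illustrative only.

```python
THRESHOLD_PA_SCORE = 55
THRESHOLD_RECOMMENDATION = 3.75

def at_risk(valid_team, recommendation, pa_score):
    """True when a member of a valid assessed team is rated amongst the lowest in class."""
    if not valid_team:
        return False  # the test is conducted only for valid assessed teams
    return recommendation < THRESHOLD_RECOMMENDATION or pa_score < THRESHOLD_PA_SCORE

print(at_risk(True, 2.3, 34.2))   # True  (Anna)
print(at_risk(True, 2.7, 60.0))   # True  (Prasad: low Recommendation only)
print(at_risk(False, 2.3, 34.0))  # False (Nick: team not yet valid)
```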

Limitation

The choice of these threshold criteria is admittedly crude. In future versions of Peer Assess Pro, a more sophisticated criterion based on the Standard Peer Assessed Score will be employed.

If the thresholds seem somewhat high, recall that in many classes the Lake Wobegon (self-enhancement) effect applies: the class tends to rate a higher than expected proportion of its members above average. For further discussion of the Lake Wobegon Effect, see the section ‘Why an IRSA of 100 is not a perfect score!’ in

FAQ: How is the Index of Realistic Self Assessment (IRSA) calculated?

Alternative approaches to identifying at risk students

The teacher has several graphical approaches to identifying the most at-risk students in their class:

FAQ: What is a mismatched self-assessment (IRSA)? WARNING 0040


Identifying at risk students from sorted table of Recommendations


Identifying at risk students from sorted table of Peer Assessed Scores with concurrent examination of Self Assessment

Quick links and related information

FAQ: What steps can I take to get a better personal result?

FAQ: What is a valid assessed team?

FAQ: How is the Peer Assessed (PA) Score calculated?

FAQ: How is the Index of Realistic Self Assessment (IRSA) calculated?

FAQ: What is a mismatched self-assessment (IRSA)? WARNING 0040


FAQ: What is the content of emails sent by Peer Assess Pro?

The content of the emails generated by Peer Assess Pro undergoes regular review and improvement. The details of the emails you receive may not match exactly those presented here.

The first table shows the SUBJECT title of the email generated in response to various events and warning triggers.

The second table shows the detailed content of each email. The teacher can, of course, copy this template text, modify it, and send their own email.

Table listed by Email ID, Short descriptor, and Email SUBJECT line

Priority Email ID: Short Descriptor

Subject

CATASTROPHIC 1: Request to CORRECT TEAM membership

CRITICAL! Please correct team membership for peer assessment due by << Due Date >>. <<Activity Title>>. Alert from team member << team member>>

CRITICAL 11: Request to COMPLETE peer assessment

Please complete peer assessment due by << Due Date >>. <<Activity Title>>

CRITICAL 12: REMINDER to complete peer assessment

REMINDER! Please complete peer assessment due by << Due Date >>. <<Activity Title>>

CRITICAL 13: Request to RESUBMIT peer assessment

RESUBMIT! Please complete peer assessment due by << Due Date >>. <<Activity Title>>

CRITICAL 20: ABANDONED Peer Assessment activity.

ABANDONED peer assessment for peer assessment due by << Due Date >>. <<Activity Title>>

WARNING 101: Request to RECONSIDER peer assessment

Request to reconsider peer assessment due by << Due Date >>. <<Activity Title>>

ADVISORY 1001: Personal results PUBLISHED and available to view

Please view your personal results for peer assessment due by << Due Date >>. <<Activity Title>>

ADVISORY 1002: REVISED personal results published and available to view

REVISED RESULTS! Please view your personal results for peer assessment due by << Due Date >>. <<Activity Title>>

ADVISORY 1003: FINALISED personal results published and available to view

FINALISED RESULTS! Please view your personal results for peer assessment <<Activity Title>>. Available until << finalisation date + 2 weeks >>

ADVISORY 1004: Personal results PUBLISHED but NOT available to view

Incomplete submissions from your team for peer assessment due by << Due Date >>. <<Activity Title>>

ADVISORY 1005: FINALISED personal results published but NOT available to view

FINALISED RESULTS: Incomplete submissions from your team for peer assessment due by << Due Date >>. <<Activity Title>>

ADVISORY 1006: Due Date imminent

DUE DATE IMMINENT: Review your peer assessment <<Activity Title>> . Due by <<Due Date>>


Table of email body text, listed by Short descriptor, Email ID, Email SUBJECT

Short Descriptor (Priority Email ID)

Subject

Detail

Request to CORRECT TEAM membership (CATASTROPHIC 1)

CRITICAL! Please correct team membership for peer assessment due by << Due Date >>. <<Activity Title>>. Alert from team member << team member>>

Dear << Teacher Fullname >>,

The team member << team member>> identified as a team member in team << Team >> has noted that there is an error in the composition of their team. Peer Assess Pro currently has the team membership as:

<< Team Name>>

<<List of Team members>>

You may view the current membership of teams for your activity as follows:

Follow this Running Activity URL https://qf.xorro.com/activities/<<Activity Number>>/results?view=2

Alternatively,

- Select the Activities tab from the Xorro Teachers dashboard

- Select Running Activities under the Activities menu

- Select the peer assessment activity <<Activity Title>>

- Select the Team Composition button for the Activity <<Activity Title>>

Please carry out these corrective actions to update the teamset for this activity as detailed in the Frequently Asked Question:

FAQ: How do I correct the Team Composition in a running peer assessment activity?

Find this FAQ in the section 'Manage the peer assessment activity' here: https://www.peerassesspro.com/frequently-asked-questions-2/

Upon a successful re-import of the revised teamset, Peer Assess Pro will proceed automatically to:

- Remove the existing peer assessment submissions from the analysis for ALL members of the affected team(s)

- Send an email to the affected team members requesting them to resubmit their responses.

Notes:

Submissions from team members whose team membership is unchanged will NOT be affected.

Do not use the Participants tab to adjust the team membership of a running activity. Any changes you make will NOT be transferred into an existing running activity.

Kind regards,

Peer Assess Pro

Request to COMPLETE peer assessment (CRITICAL 11)

Please complete peer assessment due by << Due Date >>. <<Activity Title>>

Dear <<team member>>,

Please complete the Peer Assess Pro peer assessment activity << Activity Title>> for your team before << Due Time>> << Due Date >>.

To complete the activity, please visit the Activity URL << Activity Specific URL>>.

The peer assessment requires a Login ID. Usually, the Login ID will be your student id, unless your teacher has advised an alternative.

The Activity URL will become available for your responses from << Activity Start Time>> << Activity Start Date >>.

For further information about preparing for, and using the results of the peer assessment, please visit http://peerassesspro.com/resources/students.

Team membership check

The following are your team members. If there is a mistake in this list please urgently advise your teacher the correct composition, using the email below.

<< Team Name>>

<<List of Team members>>

Do not reply to this email. Rather, contact your teacher whose email is listed below.

Kind regards,

<< Teacher fullname >>

<< Teacher email >>

REMINDER to complete peer assessment (CRITICAL 12)

REMINDER! Please complete peer assessment due by << Due Date >>. <<Activity Title>>

Dear <<team member>>,

The peer assessment activity for << Activity Title>> will soon become unavailable for you to complete. Therefore, please complete the Peer Assess Pro peer assessment activity for your team before << Due Time>> << Due Date >>.

To complete the activity, please visit the Activity URL << Activity Specific URL>>.

The peer assessment requires a Login ID. Usually, the Login ID will be your student id, unless your teacher has advised an alternative.

The Activity URL became available for your responses from << Activity Start Time>> << Activity Start Date >>.

For further information about preparing for, and using the results of the peer assessment, please visit http://peerassesspro.com/resources/students.

Do not reply to this email. Rather, contact your teacher whose email is listed below.

Kind regards,

<< Teacher fullname >>

<< Teacher email >>

Request to RESUBMIT peer assessment (CRITICAL 13)

RESUBMIT! Please complete peer assessment due by << Due Date >>. <<Activity Title>>

Dear <<team member>>,

You may have already completed the Peer Assess Pro peer assessment for <<Activity Title>> due before << Due Time>> << Due Date >>.

I regret to advise that I require you to resubmit your survey. Your response submitted to date has been deleted from the analysis. The reason for this request may be a change to the membership of your team, such as the deletion or addition of a team member.

To complete the activity, please visit the Activity URL << Activity Specific URL>>.

Please resubmit your peer assessment for <<Activity Title>> due before << Due Time>> << Due Date >>.

We apologise for the inconvenience.

For further information about preparing for, and using the results of the peer assessment, please visit http://peerassesspro.com/resources/students.

Do not reply to this email. Rather, contact your teacher whose email is listed below.

Kind regards,

<< Teacher fullname >>

<< Teacher email >>

ABANDONED Peer Assessment activity. (CRITICAL 20)

ABANDONED peer assessment due by << Due Date >>. <<Activity Title>>

Dear <<team member>>,

You and your team members were recently invited to participate in the peer assessment for << Activity Title>> due << Due Date >>.

The Teacher ABANDONED the activity on << Abandoned Date >> due to exceptional circumstances. Please disregard any previous interim published results.

I apologise for the inconvenience.

Kind regards,

<< Teacher fullname >>

<< Teacher email >>

Request to RECONSIDER peer assessment (WARNING 101)

Request to reconsider peer assessment due by << Due Date >>. <<Activity Title>>

Dear <<team member>>,

You recently completed the peer assessment of << Activity Title>>. However, your Teacher noted that your responses were significantly different when compared with other respondents. Specifically, you may have:

- Rated all team members the same, or over a narrow range

- Rated a team member significantly more generously or less generously than other members of your team did

- Provided unhelpful, unconstructive or inconsiderate qualitative comments in your feedback to one or more team members.

If you feel that your ratings and feedback are entirely justified, you need take no further action.

Alternatively, if you wish to review and resubmit a more accurate survey, please use the URLs below to submit a replacement peer assessment survey. Please take special care to provide useful and accurate qualitative feedback that will help your team member(s) and Teacher understand the ratings you have provided.

Complete the revised Peer Assess Pro peer assessment activity for your team before << Due Time>> << Due Date >>.

To complete the activity, please visit the Activity URL << Activity Specific URL>>.

For further information about preparing for, and using the results of the peer assessment, please visit http://peerassesspro.com/resources/students.

Do not reply to this email. Rather, contact your teacher whose email is listed below.

Kind regards,

<< Teacher fullname >>

<< Teacher email >>

Personal results PUBLISHED and available to view (ADVISORY 1001)

Please view your personal results for peer assessment due by << Due Date >>. <<Activity Title>>

Dear <<team member>>,

You recently completed the peer assessment of << Activity Title>>. You may now view your Personal Result and feedback. Please visit the Activity URL << Activity Specific URL>>.

Your results will be available for you to view for a period of two weeks following the finalisation of the activity.

If you have specific questions or concerns about your Personal Results please contact me promptly so that I can determine a remedy.

For further information about preparing for, and using the results of the peer assessment, please visit http://peerassesspro.com/resources/students.

Do not reply to this email. Rather, contact your teacher whose email is listed below.

Kind regards,

<< Teacher fullname >>

<< Teacher email >>

REVISED personal results published and available to view (ADVISORY 1002)

REVISED RESULTS! Please view your personal results for peer assessment due by << Due Date >>. <<Activity Title>>

Dear <<team member>>,

You recently completed the peer assessment of << Activity Title>>. You may now view your revised Personal Result and feedback. Please visit the Activity URL << Activity Specific URL>>.

Your results may have been revised from those previously made available to you. Reasons for revisions include:

- A change in Team Results

- Late peer assessment responses

- An adjustment to the method the teacher has used to calculate your personal result.

Your results will be available for you to view for a period of two weeks following the finalisation of the activity.

If you have specific questions or concerns about your Personal Results please contact me promptly so that I can determine a remedy.

For further information about preparing for, and using the results of the peer assessment, please visit http://peerassesspro.com/resources/students.

Do not reply to this email. Rather, contact your teacher whose email is listed below.

Kind regards,

<< Teacher fullname >>

<< Teacher email >>

FINALISED personal results published and available to view (ADVISORY 1003)

FINALISED RESULTS! Please view your personal results for peer assessment <<Activity Title>>. Available until << finalisation date + 2 weeks >>

Dear <<team member>>,

You recently completed the peer assessment of << Activity Title>>. You may now view your final Personal Result and feedback. Please visit the Activity URL << Activity Specific URL>>.

Your results are available for you to view for a period of two weeks following the finalisation of the activity. That is, from now until << Finalisation date + two weeks >>.

Your results may have been revised from those previously made available to you. The revisions may have been due to:

- A change in Team Results

- Late peer assessment responses

- An adjustment to the method the teacher has used to calculate your personal result.

If you have specific questions or concerns about your Personal Results please contact me promptly so that I can determine a remedy.

For further information about preparing for, and using the results of the peer assessment, please visit http://peerassesspro.com/resources/students.

Do not reply to this email. Rather, contact your teacher whose email is listed below.

Kind regards,

<< Teacher fullname >>

<< Teacher email >>

Personal results PUBLISHED but NOT available to view (ADVISORY 1004)

Incomplete submissions from your team for peer assessment due by << Due Date >>. <<Activity Title>>

Dear <<team member>>,

You recently completed the peer assessment << Activity Title>>. However, several of your team members have yet to complete their peer assessment. Consequently, you are restricted from viewing the results of the peer assessment as the results would be incomplete.

Once the remainder of your team have completed their peer assessments, you will be able to view your final Personal Result and feedback. Please visit the Activity URL << Activity Specific URL>>.

If you have specific questions or concerns about your Personal Results please contact me promptly so that I can determine a remedy.

For further information about preparing for, and using the results of the peer assessment, please visit http://peerassesspro.com/resources/students.

Do not reply to this email. Rather, contact your teacher whose email is listed below.

Kind regards,

<< Teacher fullname >>

<< Teacher email >>

FINALISED personal results published but NOT available to view (ADVISORY 1005)

FINALISED RESULTS: Incomplete submissions from your team for peer assessment due by << Due Date >>. <<Activity Title>>

Dear <<team member>>,

You and your team members were recently invited to participate in the peer assessment for << Activity Title>> due << Due Date >>.

The Teacher finalised the results on << Finalisation Date >>. However, several of your team members failed to complete their peer assessment. Consequently, you are restricted from viewing the results of the peer assessment as the results would be incomplete.

Since the activity has been finalised, there is now no option for further peer assessments to be submitted from your team.

For further information about preparing for, and using the results of the peer assessment, please visit http://peerassesspro.com/resources/students.

Do not reply to this email. Rather, contact your teacher whose email is listed below.

Kind regards,

<< Teacher fullname >>

<< Teacher email >>

Due Date imminent (ADVISORY 1006)

DUE DATE IMMINENT: Review your peer assessment <<Activity Title>>. Due by <<Due Date>>

Dear << Teacher Fullname >>,

You scheduled the peer assessment activity << Activity Title>> due for completion by <<Due Date>>. However, students can continue submitting responses beyond the Due Date until you personally FINALISE the activity.

Prior to publishing and finalising the peer assessment, review carefully the Active Warnings on your Peer Assess Pro dashboard. In particular, we suggest you review

- The Qualitative Feedback report, examining especially students who have been peer assessed with a low rating by other team member(s)

- Students with an OVERCONFIDENT Index of Realistic Self Assessment (IRSA)

- Students who have rated a team member significantly differently than the other team members.

View your Teacher's Peer Assess Pro Dashboard here <<Teacher's Activity URL>>.

For additional advice on managing and finalising a Peer Assess Pro activity, review the Frequently Asked Questions in the section 'Manage the peer assessment activity' here: https://www.peerassesspro.com/frequently-asked-questions-2/

Kind regards,

Peer Assess Pro

Quick links and related information


FAQ: How do I login to my peer assessment Activity URL

Activity URL

To access the Peer Assess Pro survey you require an activity-specific URL. In general, the format of the Activity URL is:

https://q.xorro.com/teacherid/activityid

Example

https://q.xorro.com/smup/23021

The teacherid is usually four letters, such as smup. The teacher is ALWAYS identified by these letters.

The activityid is usually several digits, such as 23021.

The Activity URL is provided to a student through:

Participant URL

The Participant URL lists ALL the activities currently running that have been started by one teacher. The format is a truncated form of the Activity URL. That is, no activityid, just the teacherid:

https://q.xorro.com/teacherid
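The two URL formats above can be sketched as simple helpers. This is an illustrative sketch only: the function names are not part of Xorro or Peer Assess Pro; they merely mirror the URL patterns described above.

```python
# Illustrative helpers mirroring the URL formats described above.
# These names are hypothetical; they are not a Xorro or Peer Assess Pro API.

def activity_url(teacher_id: str, activity_id: str) -> str:
    """Activity-specific URL: https://q.xorro.com/<teacherid>/<activityid>"""
    return f"https://q.xorro.com/{teacher_id}/{activity_id}"

def participant_url(teacher_id: str) -> str:
    """Participant URL: the truncated form, teacherid only."""
    return f"https://q.xorro.com/{teacher_id}"

print(activity_url("smup", "23021"))  # https://q.xorro.com/smup/23021
print(participant_url("smup"))        # https://q.xorro.com/smup
```

The example values match the sample URL given above: teacherid `smup`, activityid `23021`.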

Successful login through Activity URL

When everything is working correctly, you follow the link to the Activity URL. You should see the Login Page. Note the Activity title in the top left corner. That information should confirm you have the correct Activity URL for the peer assessment you are required to undertake.

Login to Peer Assess Pro Activity

Enter your ID. For students, this is usually your Student ID or Student Registration. Your teacher or facilitator will advise you if a different system of identification is being used.

Successful login confirms your name, and details about the institution and teacher that should be familiar to you!

Select ‘Next’ to proceed to the peer assessment.


Successful login to Peer Assess Pro Activity

Quick links and related information

FAQ: I am unable to login. My login failed


FAQ: I am unable to login. My login failed

There are several reasons why a student’s login may fail. Steps to remedy each are detailed below.

  1. You entered your ID incorrectly.
  2. Your teacher or facilitator has entered your ID incorrectly
  3. The Xorro Activity related to the Activity URL has not yet reached its Start Date
  4. The Xorro Activity related to the Activity URL has been Finalised and Finished.
  5. The Xorro Activity related to the Activity URL has been Abandoned.
  6. The institution manager for Xorro has not maintained payment of the subscription to use Xorro and/or Peer Assess Pro
  7. An exceptional system fault has occurred with the Xorro participants database entry for your ID: duplicate identical ids
  8. Some other mysterious fault

This FAQ explains how to login correctly:

FAQ: How do I login to my Activity URL

Investigation and remedies for login failure

  1. You entered your ID incorrectly.

  2. Your teacher or facilitator has entered your ID incorrectly

FAQ: How do I correct the Team Composition in a running peer assessment activity?

  3. The Xorro Activity related to the Activity URL has not yet reached its Start Date

Select the required Activity Title from the list of teacher’s activities

  4. The Xorro Activity related to the Activity URL has been Finalised and Finished.

  5. The Xorro Activity related to the Activity URL has been Abandoned

  6. The institution manager for Xorro has not maintained payment of the subscription to use Xorro and/or Peer Assess Pro

  7. An exceptional system fault has occurred with the Xorro participants database entry for your ID: duplicate identical ids

There is a rare exception that can prevent a student’s login. This exception occurs when there are two or more identical IDs in the Xorro institution participants database.


Search All Participants by ID to identify duplicate ID matches
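The duplicate-ID check above can also be performed offline against an exported participants file. This is an illustrative sketch only: the file name and the "id" column header are assumptions; adjust them to match your institution's actual export format.

```python
# Illustrative sketch: scan an exported participants CSV for duplicate IDs.
# The "id" column name is an assumption, not a documented Xorro export field.
import csv
from collections import Counter

def find_duplicate_ids(csv_path: str, id_column: str = "id") -> list:
    """Return the IDs that appear more than once in the participants file."""
    with open(csv_path, newline="") as f:
        counts = Counter(row[id_column].strip() for row in csv.DictReader(f))
    return [pid for pid, n in counts.items() if n > 1]
```

Any ID returned by this check corresponds to the rare duplicate-ID fault described above, and should be corrected in the Xorro participants database.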

8. Some other mysterious fault

If none of the previous explanations or solutions resolve the issue, contact Peer Assess Pro, providing full details of the messages shown, the activity, the Activity URL, your classlist Teamset.csv, and your institution.

Quick links and related information

FAQ: How do I login to my Activity URL


FAQ: Can I adjust the Start Date or Due Date for a running activity?

The short answer is No, you cannot adjust either of these dates. However, there are workarounds described below.

Adjusting the Start Date

When the Start Date and time is reached, a multiplicity of emails is sent to the students in the class advising them that the activity is available for their responses.

Since there is NO MECHANISM to recall the despatched emails, the sole workaround to adjust the Start Date is to abandon the peer assessment, then launch a new peer assessment. The abandon process is detailed below in 'Worst case scenario: Abandon the peer assessment'.

Adjusting the Due Date

The teacher establishes the deadline for completing a peer assessment when the activity is first created then launched. The deadline is termed the Due Date.

You cannot extend the Due Date once the activity has been launched. However, the Due Date is advisory only. See below.

The Due Date is the date by which students are advised they should complete the peer assessment. The Due Date is advised to students through the activity's emails, such as the invitation and reminder messages.

The Due Date is also used to prompt the teacher to conduct important administration and management activities during the activity, and prior to Finalisation.

The good news: The Due Date is advisory only

The ‘Due Date’ is advisory only. Students can CONTINUE to submit responses beyond the Due Date UNTIL the teacher Finalises or Abandons the activity. After Finalisation, students have up to two weeks to review their results.
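The rule above can be summarised in a short sketch. This is illustrative only: the function and status names are assumptions, not part of Peer Assess Pro; they simply model the behaviour described in this FAQ.

```python
# Illustrative sketch of the advisory Due Date rule described above.
# Status names are hypothetical, not Peer Assess Pro states.
def accepts_submissions(status: str) -> bool:
    """Students may submit until the teacher Finalises or Abandons the
    activity; the Due Date itself never blocks a submission."""
    return status not in ("FINALISED", "ABANDONED")

print(accepts_submissions("RUNNING"))    # True, even past the Due Date
print(accepts_submissions("FINALISED"))  # False
```

In other words, only Finalisation or Abandonment closes the activity to further responses.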

Advise students of your extended deadline

Given that the Due Date is advisory only, we suggest you announce in class, or by email, that you have ‘extended’ the Due Date for peer assessment submissions until a future date you select. On that date, you can then choose to Finalise the peer assessment. Check the progress results first!

Worst case scenario: Abandon the peer assessment

Abandoning a peer assessment is a worst case, last ditch attempt which we advise against.

However, if you insist on changing the Start Date or the Due Date, then abandon the activity from the Peer Assess Pro Dashboard.

The Activity URL will become invalid. The students will be advised that the peer assessment has been abandoned. All survey results collected to date WILL BE nulled.

Now launch a new activity with the revised Start Date and/or Due Date. A few students may be confused by receiving both the (old, abandoned) peer assessment Activity URL and the superseding peer assessment request. However, only the new activity will be available to students. Furthermore, the URL link to the abandoned activity will direct the student to a list of all Xorro running activities initiated by that teacher. Hopefully, you have clearly identified the title of the peer assessment activity so that students can select it correctly.


Quick links and related information

The importance of correctly selecting the Start Date and Due Date is detailed in the Reference Manual:

2.3 Launch and create the peer assessment activity

About Finalisation and Abandonment

4. Finalise the peer assessment activity


CONTACT US

Please do not hesitate to ask us for help.

Patrick Dodd patrick@peerassesspro.com +64 21 183 6315

Peter Mellalieu peter@peerassesspro.com +64 21 42 0118 Skype myndSurfer

We especially welcome your advice on how the app and Reference Guide could be improved. We’d also like to know which features you expect to value highly in your teaching and use of Peer Assess Pro™.

Thank you for your participation.

Frequently Asked Questions

FAQs on the web at https://www.peerassesspro.com/frequently-asked-questions-2/

Quickstart Guide for Peer Assess Pro

https://www.peerassesspro.com/quickstart-guide-for-teachers/

Home/Table of Contents for Reference Manual

http://tinyurl.com/papRefWeb2

Website

https://www.peerassesspro.com/

Quick links and related information

TOP                                                                                        FAQs


[1] Hyperlinked to Xorro site. Teacher must be a registered Xorro user.

[2] Hyperlinked to web version of Reference Guide.

[3] Internal links within pdf version of Reference Guide.

[4] Conditions of use apply to a free Xorro Account. See Discover Xorro-Q

[5] New to peer assessment? See our FAQ: What is the purpose of peer assessment?