[Image: Peer Assess Pro logo]

  User Manual v15.3.3

Peer Assess Pro™ provides a comprehensive approach for a teacher to arrange for team members to rate each other and provide constructive feedback for the team’s development.

This document provides instructions for setting up, running, and analysing a Peer Assess Pro survey, and for distributing personalised results to students.

Printable PDF versions of the User Manual and Quick Reference Guide are listed in Section 14. Print versions in PDF format.

Table of Contents

0. Overview
Key elements
On-line support and registration of interest
1. Official download links
11. Production versions
Class List Template
Peer Assess Pro Survey
Peer Assess Pro Analyser
12. Video overview and instructions
Video guides for teachers
YouTube playlist
0. Introduction and prerequisites
I. Set up the Class List and Survey
II. Prepare the Analyser for receiving Survey (Responses)
III. Preview and send the Survey to students
IV. Review and correct Analyser warnings concerning incorrect responses in the Survey
V. Review team, class, and individual results in Peer Assess Pro Analyser
VI. Distribute personal results using the Yet Another Mail Merge (YAMM) add-on
VII. Guidelines for good practice file management when using Peer Assess Pro
Video guides for students
0. YouTube playlist
I. Overview of team work and peer assessment
II. The peer assessment survey process
III. Calculation of personal result from team result and peer assessment ratings
13. Demonstration versions
Demonstration Peer Assess Pro - Survey - Respondents’ view
Demonstration Peer Assess Pro - Reports
Demonstration Peer Assess Pro - Analyser
Demonstration Survey (Responses) dataset
Demonstration Peer Assess Pro - Survey - Teacher’s view
Demonstration Class list
14. Print versions in PDF format
Quick Reference Guide
User Manual
Workflow
Peer Assess Pro - Survey
Website
15. New features
Peer Assess Pro Survey and Analyser Version 15.0 February 2018
Analyser
Documentation
Peer Assess Pro Survey and Analyser Version 14.0 September 2017
Survey
Analyser
Peer Assess Pro Survey and Analyser Version 12.0 June 2017
16. Workshop and interest group registration
2. Assignment Briefing for Calculation of Personal Result
21. Introduction
22. Overview
23. Peer Assess Pro: A demonstration
24. Your Personal Result (PR) and Peer Assess Pro
25. Example personal snapshot report
26. Factors used to determine your Personal Result
27. Further details
28. References
29. Cite this document
APPENDIX A: Calculation of the Personal Result, PR
A1. Team Based Learning Score, TBL
Example calculation for TBL
A2. Team Based Learning Index, TBLI
Example calculation for TBLI
A3. Indexed Personal Result, IPR
Example calculation for IPR
Bonus for best team performer added to Indexed Personal Result
A4. Normalised Personal Result, NPR
Example calculation for NPR
Adjusting the NPR Spread Factor
A5. Rank Based Personal Result, RPR
Example calculation for RPR
A6. Advantages and disadvantages of the five methods
Team Based Learning Score, TBL
Team Based Learning Index, TBLI
Indexed Personal Result, IPR
Normalised Personal Result, NPR
Rank Based Personal Result, RPR
A7. Illustrative data for calculations and reports
Comprehensive example reports
APPENDIX B: Instructions for teacher’s deployment of Peer Assess Pro Survey
B1. Before you start: Getting Oriented
B2. Create the Class List
B3. Prepare your version of the Survey form
Avoid adjusting the number or content of questions
B4. Prepare the (Responses) spreadsheet for receiving responses from the survey form
B5. Send the Survey form to your respondents
APPENDIX C: Instructions for using Peer Assess Pro Analyser
C1: Set up the Analyser Dashboard
C2: Link the Analyser to the Class List
C3: Link the Analyser to the Survey Responses
C4: Review and correct Analyser WARNINGS concerning incorrect Survey responses
C5. Reports produced by Peer Assess Pro
C6: Dispatch personalised result summaries to students using SNAPSHOT
APPENDIX D: Mass distribution of personal snapshots to students using the YAMM mailmerge addon
D1: Install the Google Sheets add-on YAMM
D2: Ensure the Class list with emails is linked in the Dashboard
D3: Create a draft Gmail email copying MailMergeDraft
D4: Link the draft Gmail to the Mailout subsheet
D5. Mass mail the Gmail to all students in the class using the YAMM addon
APPENDIX E: Features and calculation of Net Recommender
E1. Design features of Net Recommender
E2. Mathematical calculation
E3. Example calculations of Net Recommender
E4. Feedback provided to students
E5. Example charts for Net Recommender
E6. Assumptions about Net Recommender
E7. Frequently Asked Questions about Net Recommender
APPENDIX F: Frequently Asked Questions, known errors and limitations
FAQ 1. May I select a new destination for my Peer Assess Pro Survey?
FAQ 2. May I adjust the number or content of questions in the survey?
FAQ 3. May I remove rows or columns of Peer Assess Pro spreadsheet?
FAQ 4. Why are the members of a team not appearing correctly in some parts of the Analyser?
Problem:
User defined
Symptoms
Environment
Solution
Prevention
Systems development: Bug fixes and enhancements
Systems development: Future requirements
APPENDIX G: Future features requested
APPENDIX H: Workshop and interest group registration
Peer Assess Pro: enhancing the effectiveness of student team peer feedback
Next workshop
Abstract
Register interest
Pre-workshop preparation
User documentation
APPENDIX I: Previous download versions
I1. History of key improvements to Peer Assess Pro
I2. Production version 10 released 2016-07-10 to 2017-05-31
I3. Demonstration versions v.11
I4. Production version 12 released 2016-06-01 to 2017-08-31
Peer Assess Pro Survey v12
Peer Assess Pro Analyser v12
Demonstration Peer Assess Pro - Analyser v12
User Manual v12
I5. Production version 14 released 2017-09-01 to 2018-01-31
Peer Assess Pro Survey v14
Peer Assess Pro Analyser v14
Demonstration Peer Assess Pro - Analyser v14
Class List Template v14
User Manual v14
APPENDIX J: Definitions of terms


0. Overview

There has been a long-standing call from employers to develop graduates’ soft skills in addition to the traditional academic competencies. In recent years, that call is being answered through wider adoption of team-based pedagogies. Several challenges related to formative and summative assessment now arise as teachers increasingly adopt team-based pedagogies such as project-based learning and action learning. Specifically, in terms of summative assessment, how can a teacher quantify an appropriate contribution mark for each member of the team? Equally important, as a team proceeds to work together, how can the team be supported to improve team members’ contribution to team effectiveness in a proactive manner that will contribute to an improved end-of-semester outcome?

The Peer Assess Pro User Manual presents instructions for using a decision support system designed to enhance the effectiveness of student team peer feedback. Peer Assess Pro provides teachers and team members with quantitative and qualitative information that enables timely, constructive conversations focussed on precise pinpointing of team members’ strengths, weaknesses, and opportunities to improve their contribution to their team’s achievement. Specifically, Peer Assess Pro comprises several elements, described under Key elements below.

Figure 0.1 provides an overview of the system components of Peer Assess Pro. Figure 0.2 provides the detailed process workflow of the steps that a teacher will undertake to create, process, and distribute the results of a Peer Assess Pro survey conducted for a specific class.

Section 14 provides links to a Quick Reference Guide.

Figure 0.1 Peer Assess Pro Overview

Key elements

In reviewing Figure 0.2, the key elements of Peer Assess Pro are:

  1. A Class List of student names, student IDs, and emails, used as input to modifying a ...

  2. Google Form template, thereby creating the ...

  3. Peer Assess Pro Survey, a Google Form the teacher despatches to each student in the specific class.

  4. The class-specific Peer Assess Pro Survey collates responses from class students as they rate each of their teammates. These responses automatically collate into the Peer Assess Pro Survey (Responses) spreadsheet specific to that survey instance. Optionally, students can self-assess their own performance as a basis for comparing their OTHERS rating with their SELF rating.

  5. The Peer Assess Pro Analyser spreadsheet analyses the survey responses found in the Peer Assess Pro Survey (Responses) spreadsheet. The analysis is conducted for each individual, team, and the class as a whole. The Analyser also produces warnings for the teacher about students who failed to submit ratings, and other discrepancies. The Analyser produces individual student snapshots including the quantitative and qualitative feedback they received, and the feedback they gave to their team mates. The snapshots help each student to engage in a constructive conversation with their team mates about how to improve their team’s performance.

  6. The Peer Assess Pro Analyser creates a draft memo from the teacher, for copy/pasting into a Google Mail draft document. Optionally, the Google Mail draft is used as a mail merge template automating the despatch of individual snapshots to each student. The student receives their results, their self-assessment, and comparisons with class statistics including average and range. The mail merge is generated from Peer Assess Pro Analyser in conjunction with ...

  7. The Google Sheets Add-on Yet Another Mail Merge (YAMM). YAMM requires that the Class List containing the students’ email addresses has been linked into the Peer Assess Pro Analyser.

On-line support and registration of interest

PeerAssessPro.com


Figure 0.2 Workflow for Peer Assess Pro

Peer Feedback Workflow-PartAB.png


Peer Feedback Workflow-PartCD.png

Source: Workflow for Peer Assess Pro [PDF]. http://tinyurl.com/peerassesswflpdf


1. Official download links

Please always download versions of the Google Form survey, demonstrations, and Peer Assess Pro Analyser spreadsheet from this section. All elements of Peer Assess Pro are in constant development.

For instructions on how to apply these forms and spreadsheets correctly, see

APPENDIX B: Instructions for teacher’s deployment of Peer Assess Pro Survey

APPENDIX C: Instructions for using Peer Assess Pro Analyser

Advice: these Production versions are subject to improvement. Return to this chapter to obtain the latest download version. The links will change!

Previous major version releases are available from APPENDIX I: Previous download versions.

Website (Registration of interest, Community, Help):  www.peerassesspro.com

11. Production versions

Class List Template

Class List Template - Peer Assess Pro 15. (2018). (Version 15) [Google Sheets]. Peer Assess Pro.

 https://docs.google.com/spreadsheets/d/1-51CJMiAhlxVk07yjakg568zkxLrBknT1guJ9kVJsIE/copy

This optional template is structured in the correct format for Peer Assess Pro. The template includes a utility feature that (a) generates Full Names appended with the Team ID, and (b) generates a list of the unique Team Names in Sheet2.
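
If you prefer to prepare the Class List outside Google Sheets, the transformation the template performs is simple. The following Python sketch is illustrative only: the names, team names, email addresses, and the formatting of the appended Team ID are assumed, and the template itself uses spreadsheet formulas rather than code.

    # Sketch only: mimics the Class List Template utility (assumed column layout).
    class_list = [
        ("Ritchie", "Chor", "Brazilia", "ritchie.chor@example.com"),
        ("Karl", "Marc", "Brazilia", "karl.marc@example.com"),
        ("Jo", "Smith", "Canada", "jo.smith@example.com"),
    ]
    # (a) Full Names appended with the Team ID
    full_names = ["{} {} ({})".format(first, last, team) for first, last, team, _ in class_list]
    # (b) Unique Team Names, as the template lists in Sheet2
    unique_teams = sorted({team for _, _, team, _ in class_list})
    print(full_names)
    print(unique_teams)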

Refer B2. Create the Class List

Video: Create the Peer Assess Pro Class List of names, student IDs and emails. https://www.youtube.com/watch?v=cNTUrbX_riY&feature=youtu.be&t=1m9s

Peer Assess Pro Survey

Survey - Peer Assess Pro v15. (2018). (Version 15.0) [Google Form]. Auckland: Peer Assess Pro.

https://docs.google.com/forms/d/1bWFbOnUXZ4yPlc2SET3fKjGRZmUm39PR6jXLsP-i4TY/copy

The above link creates your personal copy of the Peer Assess Pro Survey. You adapt the survey form to include your class names and student IDs, typically presented in your Class List.

Refer: APPENDIX B: Instructions for teacher’s deployment of Peer Assess Pro Survey.

Video: Peer Assess Pro Comprehensive Operations Guide for Teachers - YouTube. Auckland: Peer Assess Pro. https://www.youtube.com/watch?v=cNTUrbX_riY&index=4&list=PLrrxmJ7TOi8NIIUUq9lMi88enry92-ovx

Peer Assess Pro Analyser

Analyser - Peer Assess Pro v15.3 (Version 15.3) [Google Sheets]. Auckland: Peer Assess Pro.

https://docs.google.com/spreadsheets/d/1EtaFLNEoDoqWxbZBA5Wvlqy6vEDLE-uzPQfo_W0IgZw/copy

The above link creates your copy of the Peer Assess Pro Analyser. In the Analyser, you link to the (Responses) spreadsheet generated by students’ responses to the Peer Assess Pro Survey you created and launched. It is good practice, but optional, to link the Analyser to the Class List.

Refer: APPENDIX C: Instructions for using Peer Assess Pro Analyser.

Video: Prepare the Analyser for receiving Survey (Responses)

12. Video overview and instructions

This  video and slideshow explain and demonstrate the essential components of Peer Assess Pro.

Peer Assess Pro: Demonstration (2016). Peer Assess Pro Ltd, https://www.youtube.com/watch?v=oGzH2kVsD7A&list=PLrrxmJ7TOi8NIIUUq9lMi88enry92-ovx&t=1s

Peer Assess Pro Analyser: Introduction [Google Slides]. https://docs.google.com/presentation/d/17IEvZ1-HhqW9NbG5rwSq74WPPHImtl9c_WuRUuGyUgE/edit?usp=sharing

Video guides for teachers

The following playlist and links take you to the specific sections of the video playlist Peer Assess Pro for Teachers that you might need to refresh your memory about. The videos are listed in the order of the workflow for setting up and using Peer Assess Pro.

YouTube playlist

Peer Assess Pro for Teachers [Playlist]. Auckland. https://www.youtube.com/playlist?list=PLrrxmJ7TOi8NIIUUq9lMi88enry92-ovx

0. Introduction and prerequisites

0. Comprehensive Guide for Teachers - Introduction and Prerequisites

https://www.youtube.com/watch?v=iAsDGgSCdFk&list=PLrrxmJ7TOi8NIIUUq9lMi88enry92-ovx&t=01s

I. Set up the Class List and Survey

1. Survey Preparation and Distribution - Create the class list

https://www.youtube.com/watch?v=3VmiX_QL_fo&list=PLrrxmJ7TOi8NIIUUq9lMi88enry92-ovx&t=01s

2. Create the Survey

https://www.youtube.com/watch?v=g_zdVhonZ0g&list=PLrrxmJ7TOi8NIIUUq9lMi88enry92-ovx&t=01s

You may now optionally proceed directly to: Phase III. Preview and send the Survey to students.

Good practice Hint: complete Phase II Prepare the Analyser immediately after you despatch the Survey to respondents!

II. Prepare the Analyser for receiving Survey (Responses)

The following two steps may be postponed until immediately before you are ready for Phase IV. Review and correct Analyser warnings.

3. Download the Analyser

https://www.youtube.com/watch?v=tv7vuqItE1I&list=PLrrxmJ7TOi8NIIUUq9lMi88enry92-ovx&t=01s

4. Link Analyser to class list

https://www.youtube.com/watch?v=b7ohDi8UFas&list=PLrrxmJ7TOi8NIIUUq9lMi88enry92-ovx&t=01s

5. Link Analyser to Survey (Responses)

https://www.youtube.com/watch?v=1xN3MTfc4kw&list=PLrrxmJ7TOi8NIIUUq9lMi88enry92-ovx&t=02s

III. Preview and send the Survey to students

6. Preview the survey: What the students experience.

https://www.youtube.com/watch?v=O3BgB8okXu8&list=PLrrxmJ7TOi8NIIUUq9lMi88enry92-ovx&t=1s

7. Send Survey to assessors (students)

https://www.youtube.com/watch?v=FSz64O-8rrY&list=PLrrxmJ7TOi8NIIUUq9lMi88enry92-ovx&t=01s

IV. Review and correct Analyser warnings concerning incorrect responses in the Survey

Ensure Phase II Prepare the Analyser for receiving Survey (Responses) is completed before you continue with the following steps.

8. Response checking, correction, and analysis

https://www.youtube.com/watch?v=g54vTGtcCDs&t=01s&list=PLrrxmJ7TOi8NIIUUq9lMi88enry92-ovx

V. Review team, class, and individual results in Peer Assess Pro Analyser

9. Review all results in the Analyser 

https://www.youtube.com/watch?v=uHUI8ZDQb_Y&t=01s&list=PLrrxmJ7TOi8NIIUUq9lMi88enry92-ovx

9.1  Enter Team Results into the Peer Assess Pro Analyser Dashboard

https://www.youtube.com/watch?v=uHUI8ZDQb_Y&t=1m01s&list=PLrrxmJ7TOi8NIIUUq9lMi88enry92-ovx

9.2. Review Class Charts in the Analyser

https://www.youtube.com/watch?v=uHUI8ZDQb_Y&t=1m51s&list=PLrrxmJ7TOi8NIIUUq9lMi88enry92-ovx

9.3. Review and sort Analysis Gradebook

https://www.youtube.com/watch?v=uHUI8ZDQb_Y&t=2m40s&list=PLrrxmJ7TOi8NIIUUq9lMi88enry92-ovx

10. Snapshot review of a specific student https://www.youtube.com/watch?v=BNQk2WMnvEw&t=01s&list=PLrrxmJ7TOi8NIIUUq9lMi88enry92-ovx

VI. Distribute personal results using the Yet Another Mail Merge (YAMM) add-on

11. Distribution of results - Create mail merge draft email

https://www.youtube.com/watch?v=1TDR_YanVcE&t=01s&list=PLrrxmJ7TOi8NIIUUq9lMi88enry92-ovx

12. Install Yet Another Mail Merge Addon: First time installation of YAMM

https://www.youtube.com/watch?v=NAwilxOqwF8&t=01s&list=PLrrxmJ7TOi8NIIUUq9lMi88enry92-ovx

13. Receive a test email using a draft Gmail template using the YAMM add-on

https://www.youtube.com/watch?v=roiWfZKGPIg&list=PLrrxmJ7TOi8NIIUUq9lMi88enry92-ovx&t=1s

14. Initiate mailmerge personalised emails using YAMM add-on https://www.youtube.com/watch?v=4TbTf7tnrM8&t=01s&list=PLrrxmJ7TOi8NIIUUq9lMi88enry92-ovx

VII. Guidelines for good practice file management when using Peer Assess Pro

15. Good practice - gather your assets

https://www.youtube.com/watch?v=jh5Snyjvx2s&t=01s&list=PLrrxmJ7TOi8NIIUUq9lMi88enry92-ovx

Video guides for students

0. YouTube playlist

Peer Assess Pro for Students [Playlist]. Auckland: Peer Assess Pro.

https://www.youtube.com/playlist?list=PLrrxmJ7TOi8PjeJWUzUjFxlg2MZ6Wbo0J&t=01s

I. Overview of team work and peer assessment

0. Peer assessment for students - Introduction and overview.

https://www.youtube.com/watch?v=ExxkNdxwFt8&t=01s&list=PLrrxmJ7TOi8PjeJWUzUjFxlg2MZ6Wbo0J

1. Why do team work?

https://www.youtube.com/watch?v=1C2TyhYcPJ4&list=PLrrxmJ7TOi8PjeJWUzUjFxlg2MZ6Wbo0J&t=01s

2. Why do peer assessment?

https://www.youtube.com/watch?v=mub1Uy0VO7g&t=01s&list=PLrrxmJ7TOi8PjeJWUzUjFxlg2MZ6Wbo0J

II. The peer assessment survey process

3. How is peer assessment conducted?

https://www.youtube.com/watch?v=ORhTqUcEODM&list=PLrrxmJ7TOi8PjeJWUzUjFxlg2MZ6Wbo0J&t=01s

4. What questions are asked in the survey? https://www.youtube.com/watch?v=-abKLOzDnwA&list=PLrrxmJ7TOi8PjeJWUzUjFxlg2MZ6Wbo0J&t=01s

5. What are examples of qualitative feedback? https://www.youtube.com/watch?v=-sFuHjsMGwI&list=PLrrxmJ7TOi8PjeJWUzUjFxlg2MZ6Wbo0J&t=01s

6. Why do I self assess? https://www.youtube.com/watch?v=jyfwpbMOYpo&t=01s&list=PLrrxmJ7TOi8PjeJWUzUjFxlg2MZ6Wbo0J

III. Calculation of personal result from team result and peer assessment ratings

7.1 How is my Personal Result calculated? Block Overview

https://www.youtube.com/watch?v=oGO2RnUiqsg&list=PLrrxmJ7TOi8PjeJWUzUjFxlg2MZ6Wbo0J&t=01s

7.2 Calculation of Team Based Learning Score (TBL) and Team Based Learning Index (TBLI)

https://www.youtube.com/watch?v=snRfFFw7tds&t=01s&list=PLrrxmJ7TOi8PjeJWUzUjFxlg2MZ6Wbo0J

7.3.1 Calculation of Indexed Personal Result (IPR).

https://www.youtube.com/watch?v=iwLF77v1VjU&list=PLrrxmJ7TOi8PjeJWUzUjFxlg2MZ6Wbo0J&t=01s

7.3.2 Calculation of Normalised Personal Result (NPR).

https://www.youtube.com/watch?v=iwLF77v1VjU&list=PLrrxmJ7TOi8PjeJWUzUjFxlg2MZ6Wbo0J&t=1m56s

7.4 Comparison of the features of Indexed Personal Result (IPR) and Normalised Personal Result (NPR). https://www.youtube.com/watch?v=NfRQLrO-Hv0&t=01s&list=PLrrxmJ7TOi8PjeJWUzUjFxlg2MZ6Wbo0J

13. Demonstration versions

These demonstration versions enable you to explore the functioning of the components of Peer Assess Pro.

Demonstration Peer Assess Pro - Survey - Respondents’ view

This view shows the Peer Assess Pro Survey ‘in use’ as it would be seen by a respondent (student).

https://goo.gl/forms/hjzqcsCZWWmE4KQ62

Demonstration Peer Assess Pro - Reports

Class 123 ALPHA Analyser Reports - Peer Assess Pro [pdf]

https://drive.google.com/file/d/1xQZHH_eDYq9cXrgeSVwhUvsaIW5fsriN/view?usp=sharing

The PDF document shows examples of the reports, such as Dashboard, Analysis, and Charts, produced from the demonstration Analyser and Survey (Responses).

Demonstration Peer Assess Pro - Analyser

Class 123 ALPHA Analyser - Peer Assess Pro v15.3

https://docs.google.com/spreadsheets/d/1tJ2anWm_tZIE9iNAbCUSQZx2QY5xqe0RaAU_MoI1upA/copy

The demonstration version creates a copy of the Analyser linked to a demonstration dataset of Survey responses (Class 123 ALPHA) and email addresses ready for a mailmerge.

Demonstration Survey (Responses) dataset

Class 123 ALPHA Survey (Responses) v15 - Peer Assess Pro [Google Sheets]. https://docs.google.com/spreadsheets/d/1ENcU9iqVEhJNZrf9qz1QQhVu091p7ECgzfm-eOFvd8g/copy

Practice linking this Peer Assess Pro Survey (Responses) data into your copy of the PRODUCTION version of Peer Assess Pro. BE PATIENT! It takes several minutes for the data to be analysed and percolate through the several sub-sheets.

Check out the WARNINGS subsheet. There are several deliberate faults in this TEST Responses dataset that are raised in the WARNINGS subsheet.

This Demonstration (Responses) dataset is also linked into the Class 123 Demonstration Peer Assess Pro - Analyser, listed above.

Note there are three sheets in this spreadsheet from which you can select in the Analyser Dashboard. The demonstration will open with Form Responses 2. Select Form Responses 3 to analyse a correct dataset of Responses.

Form Responses 1 - About 60 responses up to mid way through the survey schedule. There are several deliberate errors in this dataset that are detected and identified in the Analyser WARNINGS. Note the (Responses) dataset is colour coded so you can see the nature of the detected WARNINGS.

Form Responses 2 - The full response dataset, about 100 responses. The errors presented in the Form Responses 1 dataset have not here been corrected. For example, a student has incorrectly stated their team membership, and another student has supplied two different Student IDs.

Form Responses 3 - A corrected version of Form Responses 2.

Demonstration Peer Assess Pro - Survey - Teacher’s view

Class 123 Survey - Peer Assess Pro [Google Form].

https://docs.google.com/forms/d/1NId1Sus6mTFnzsGttG1xdKhDb98a21fYMHJGPiVL48A/copy

This demonstration survey generates (Responses) for the Class 123 demonstration. The form will have almost no responses until you trial the Survey in Preview mode, for example.

Demonstration Class list

Class 123 ALPHA Class List v14.2. https://docs.google.com/spreadsheets/d/1IcEv7VFtQRnahMvnbjbUlXZlnfR8CWdte7rW-GKJQ4k/copy

This dataset is the list of student names and Student IDs that you may need to practice the demonstration. Note the Team Names are listed uniquely in the second subsheet, ClassList.

14. Print versions in PDF format

Quick Reference Guide

Quick Reference Guide [Web]. (2018). Peer Assess Pro.

http://tinyurl.com/peerassessquickstart

Quick Reference Guide [pdf]. (2018). Peer Assess Pro.

http://tinyurl.com/peerassessquickpdf

The Quick Reference Guide presumes you have completed successfully one cycle using the Peer Assess Pro™ system. The Actions listed should be followed in strict order. The online version of the Guide provides links to the corresponding documentation in the Video Playlist and User Manual.

User Manual

User Manual: Peer Assess Pro [PDF]. (2018). Version 15. Auckland: Peer Assess Pro Ltd. http://tinyurl.com/peerassessmanualpdf

User Manual: Peer Assess Pro [WEB]. Auckland, New Zealand: Peer Assess Pro Ltd. http://tinyurl.com/pfmanual

Workflow

Workflow for Peer Assess Pro [PDF]. Peer Assess Pro. http://tinyurl.com/peerassesswflpdf

This is a high resolution wall chart for Figure 0.2 Workflow for Peer Assess Pro

Peer Assess Pro - Survey

Survey PRODUCTION - Peer Assess Pro [Sample printout] (Version 14.0). Auckland, New Zealand. https://drive.google.com/file/d/1uFFZHqETZFWz3Hn_Kj5tj9_6yjZ3enAifv1g99QhwR6iyIi6FcEDAtswB3zWbqWfz-RxNqU9REvgs0-9/view?usp=sharing

This PDF is created from the Survey PRODUCTION Google Form. The original source Google form for downloading is here: Peer Assess Pro Survey

Website

Support, Registration of interest, Community, Help:  www.peerassesspro.com


15. New features

Peer Assess Pro Survey and Analyser Version 15.0 February 2018

Analyser

Team Based Learning Index (TBLI). A new measure, the Team Based Learning Index (TBLI), enables a better comparison of peer assessment ratings across the entire class. The TBLI is defined so that the team member with the maximum achieved Team Based Learning Score (TBL) in each team receives a TBLI of 100. The use of TBLI simplifies the task of explaining how the Personal Result (PR) is calculated from the Team Result, TR. For example, IPR = TBLI x TR / 100.
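
For example (illustrative figures only): if the highest TBL achieved in a team is 80, that student receives TBLI = 100; a teammate with TBL = 60 receives TBLI = 100 x 60 / 80 = 75. With a Team Result of 70, that teammate’s IPR = 75 x 70 / 100 = 52.5.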

The calculations of Indexed Personal Result (IPR) and Normalised Personal Result (NPR) are redefined to align with the Team Based Learning Index (TBLI). However, these redefinitions have NO IMPACT on the calculated values of IPR and NPR compared with version 14. See Appendix A, Section A2. Team Based Learning Index, TBLI.

Net Recommender: The calculation of Net Recommender is relocated to the Analysis subsheet from the SelfAnalysis sheet. The calculation has been adjusted and calibrated so that the target standard deviation for the distribution of Net Recommender is 40. See APPENDIX E: Features and calculation of Net Recommender.

Correlation with selected sort column: In the Analysis and SelfAnalysis sheets, an additional row of statistics shows the correlation of every column variable (eg Recommendation, ATC, ALC, TBL) with the variable you have selected for sorting the analysis. This correlation helps you identify patterns and associations in the results. For an example, see Figure A.6 in Section A7. Illustrative calculations and reports.
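
The correlation row behaves like a spreadsheet CORREL against the chosen sort column. Purely as a rough illustration (the values below are invented, and this is not the Analyser’s internal formula), the same idea in Python with pandas:

    import pandas as pd
    # Illustrative Analysis columns; the values are invented for this sketch.
    analysis = pd.DataFrame({
        "Recommendation": [9, 7, 4, 8],
        "ATC": [3.9, 2.6, 3.1, 4.2],
        "ALC": [3.7, 3.1, 2.8, 4.0],
        "TBL": [70, 47, 49, 78],
    })
    sort_column = "TBL"  # the variable selected for sorting the Analysis sheet
    print(analysis.corr()[sort_column])  # Pearson correlation of every column with TBL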

Documentation

Quick Reference Guide: The guide presumes you have completed successfully one cycle using the Peer Assess Pro™ system. The Actions listed should be followed in strict order. The online version of the Guide provides links to the corresponding documentation in the Video Playlist and User Manual. See Section 14 Quick Reference Guide.

Video playlists: The video guides for both teachers and students have been revised so that each element of the instructions is placed in a separate video. The videos are organised into two playlists, for teachers and students. See Section 12. Video overview and instructions.


Peer Assess Pro Survey and Analyser Version 14.0 September 2017

Survey

Survey redesign: The SURVEY has been redesigned to improve the user experience. For example, the qualitative question requesting examples of behaviors related to task accomplishment immediately follows the five quantitative questions on this topic.

Recommendation: A new measure, Recommendation, has been added at the start of the survey to gain the Assessor’s quick assessment of the student they are assessing. The question is “How likely is it that you would recommend this team member to a friend, colleague, or employer?” In our experience, there is a high correlation between the answer to this question and measures of the Team Based Learning Score.

Analyser

Net Recommender: A new aggregate measure, Net Recommender, has been added, analogous to the Net Promoter Score used in assessing the reputation of a company. The value is an aggregate of the Team Based Learning Score and the Recommendation score, normalised such that (a) the average student in a particular class achieves zero, and (b) values range from -100 to +100. See APPENDIX E: Features and calculation of Net Recommender.
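
The exact formula is documented in APPENDIX E: Features and calculation of Net Recommender. Purely as a sketch of the normalisation idea described here (class average of zero, a target standard deviation of 40 as noted under Version 15, and a range capped at -100 to +100), and not the production calculation:

    import statistics
    def net_recommender_sketch(raw_scores, target_sd=40):
        # Centre on the class mean, rescale towards the target standard deviation,
        # and clip to the range -100..+100. Sketch only; see APPENDIX E.
        mean = statistics.mean(raw_scores)
        sd = statistics.pstdev(raw_scores) or 1  # guard against a zero-variance class
        return [max(-100, min(100, (score - mean) / sd * target_sd)) for score in raw_scores]
    print(net_recommender_sketch([62, 55, 48, 71, 40]))  # illustrative raw scores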

Self assessment: Respondents may now complete a self-assessment. The self-assessment results are shown in a new sub-sheet ‘SelfAssessment’. The self assessments are ALWAYS EXCLUDED from the calculation of an assessor’s ratings, such as the Team Based Learning Score.

Alphanumeric team names: Teams may be designated with alphanumeric designations in the SURVEY. Note that the Team Result should not be applied to a team in the Analyser DASHBOARD until AT LEAST ONE Assessor from each team has responded. That is because the automatic numbering and sequencing of the teams updates as additional teams are discovered from the (Responses) dataset.

Renaming of several variables. Variables have been renamed in the Analyser to align with Team Based Learning practice. Table 1.1 below shows a list of the new and old names.

Average Task Contribution: The variable Average Task Contribution (ATC) replaces Total Team Contribution. Average Task Contribution shows more readily how the aggregate measure of contribution to task is derived from its five task attributes since ATC ranges from 1 through 5, the same as the component attributes.

Average Leadership Contribution: The variable Average Leadership Contribution (ALC) replaces Total Leadership Contribution. Average Leadership Contribution shows more readily how the aggregate measure of contribution to leadership is derived from its five leadership attributes, since ALC ranges from 1 through 5, the same as the component attributes.

Re-specification of Team Based Learning Score (TBL Score). The formerly-designated Total Peer Score (TPS) effectively covered a range of 20 through 100 since the ten component attributes were rated in the Peer Assess Pro Survey on a scale from 1 through 5, a range of 4. The Team Based Learning Score (TBL) is now adjusted to span the entire range from zero to 100. Consequently, an expected mid point of 50 corresponds to average ratings of 3 across the ten attributes in the survey, where 3 is the midpoint of the ratings between 1 and 5.

TBL = 12.5 x (ATC + ALC - 2)
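
For example, a student rated 3 on every one of the ten attributes has ATC = ALC = 3, giving TBL = 12.5 x (3 + 3 - 2) = 50, the expected midpoint; ratings of 5 throughout give TBL = 12.5 x (5 + 5 - 2) = 100.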

Indexes of realistic self-assessment. The self-assessment by each student is compared with the assessments by the other members of the team, using several INDEXES of 100 x (OTHER/SELF). An index close to 100 indicates the student is assessing themselves realistically when considered by their team members. An Index of 50, for example, suggests the student thinks rather too highly of themself compared to what their team members consider.

Class charts: Several charts show measures of Team Based Learning Score, Average Task Contribution, and Average Leadership Contribution as measured for (a) OTHER students and (b) SELF assessment. The charts typically highlight how students tend to assess themselves somewhat higher than they rate everyone else: the so-called Lake Wobegon Effect, also known as Self-Enhancement Bias.

Mailmerge: Major redesign of the draft mail merge letter, to improve readability and include the new calculations of self-assessment and recommendation. For example, respondents can view their self assessment across all ten attributes against their averaged rating by team peers, and class average.

More error checking of the (Responses), including:


Table 1.1 Redefined Measures in Peer Assess Pro Version 14

Measure (Version 12, 13) | Measure (Version 14 and above) | Abbreviation (Version 14)
Total Team Contribution (%) | Average Task Contribution | ATC
Total Leadership Contribution (%) | Average Leadership Contribution | ALC
Total Peer Score (TPS) | Team Based Learning Score | TBL
(Introduced Version 15) | Team Based Learning Index | TBLI
Team Result | Team Result | TR
Individual Team Contribution Mark (ITCM) | Personal Result | PR
Indexed Peer Mark (IPM) | Indexed Personal Result | IPR
Normalised Peer Mark (NPM) | Normalised Personal Result | NPR
Rank-based Peer Mark (RBM) | Rank Based Personal Result | RPR
Rank within team | Rank Within Team | RWT
Per Cent Rank (PCR) | Percentile Rank Within Team | PCR
— | Recommendation | Recommendation

Peer Assess Pro Survey and Analyser Version 12.0 June 2017

Up to 15 teams may be processed, up from ten teams in earlier versions.

Additional charts presented in the ‘Charts-CLASS’ sub-sheet, including:

Improved template for the draft mail merge message in ‘MailMergeDraftLetter’ sub-sheet.

‘Snapshot’ sub-sheet shows the quantitative marks a student gave their team mates. Good Practice Hint: BE ABSOLUTELY STRICT in avoiding showing this snapshot to anyone other than the student being displayed.

Dear Teacher: A ‘Dear Teacher’ question in the Peer Assess Pro Survey has replaced the previous version’s Question 14, ‘Other Comments’. The ‘Dear Teacher’ remarks are anonymised and placed into a new sub-sheet ‘DearTeacher’. The ‘Dear Teacher’ question asks: “Please provide advice for the teacher about improving the effectiveness of teamwork in this course. Explain any issues or concerns you have about the Peer Assess Pro survey and feedback. Provide any other feedback to the course teacher.”

“We are all above average in our team, just like all the children of Lake Wobegon” (Keillor): A slight rewording of the Peer Assess Pro Survey to encourage students to give a rating of 3 out of 5 for average performance, rather than 5 out of 5. We can but try! The Part A introduction reads: “In the five questions that follow, rate the team member on a 5-point scale. Rate your typical or average team member a mid-level rating of 3.” The rating scale presented is: 1 = Almost never, 2 = Seldom, 3 = Average, 4 = Better than most, 5 = Outstanding.

Hints for qualitative feedback: In Section C: Qualitative Remarks in the Peer Assess Pro Survey there is a link to ‘Examples of high and low contributions to team effectiveness’ (Ohland et al., 2012, adapted Mellalieu, 2017). This table gives students hints about specific words they could use to describe the performance of their team members. See http://tinyurl.com/BARSOhland.

Choice of Email lookup sub-sheet: You can choose the sub-sheet from which to extract the emails for the ‘EmailLookup’ sub-Sheet. This matches the style of selecting the (Responses) sub-sheet for a Google sheet with multiple tabs or sub-sheets.

Additional error checks of the (Responses) and EmailLookup sheets are displayed in the ‘WARNINGS’ sub-sheet, such as:


16. Workshop and interest group registration

Please register your interest in either a workshop or the Peer Assess Pro interest group by completing this Google form.

See APPENDIX H for workshop details.


2. Assignment Briefing for Calculation of Personal Result

21. Introduction

[Teacher: Remove this section from students’ view] This is a generic set of instructions that teachers may adapt to provide an explanation for the use of Peer Assess Pro. These tools provide the basis for the teacher to facilitate formative, developmental performance improvement and/or a summative calculation of a team member’s Personal Result. The Personal Result is calculated from a combination of the overall team project mark and the average of the peer ratings the team member receives from the other team members.

22. Overview

The ability to give and receive constructive feedback is an essential skill for managers and team members. This course uses Peer Assess Pro to help you provide developmental feedback to your team members. The goal of developmental feedback is to highlight both positive aspects of performance and areas for performance improvement. The result of feedback is to increase both individual and team performance (Carr, Herman, Keldsen, Miller, & Wakefield, 2005). Additionally, your teacher may use the quantitative marks summarised from Peer Assess Pro to determine each student’s Personal Result (PR). Your PR may contribute to the summative assessment grade you gain for this course.

Rating the relative contributions of your team members: Peer Assess Pro Survey

At several points throughout the semester, your teacher will require you to complete the Peer Assess Pro Survey. The Peer Assess Pro Survey is implemented as a Google Form survey that submits your data entries to the teacher and amalgamates the data with the ratings from other students in your team.

Anonymity: Your teacher will provide you with summary results from which the names of your other team members are removed. You will not know who rated you high or low on a particular factor.

The Peer Assess Pro Survey asks ten questions about your rating of each team member’s contribution to the factors detailed below: Contribution to Task Accomplishment (Part A), and Contribution to Leadership and Team Processes (Part B). Each of your team members’ ratings is input to a Google Form survey. The aggregated results from your team will determine the Personal Result (PR) ultimately allocated to each team member. The Peer Assess Pro Survey also requests that you give examples and explanations to support the quantitative ratings you provided (Part C). Finally, Peer Assess Pro requests that you suggest specific advice that the student you are assessing could pursue to maintain or improve their contribution to team productivity and leadership.

23. Peer Assess Pro: A demonstration

Follow this link to a DEMONSTRATION version of the Peer Assess Pro Survey form. Your teacher may vary the questions and factors used in their version of Peer Assess Pro depending on their personal teaching requirements. Demonstration Peer Assess Pro - Survey

These are the general instructions you will view when you begin the Peer Assess Pro Survey.

The ability to give and receive constructive feedback is an essential skill for managers and team members. This worksheet helps you provide developmental feedback to your team members. The goal of developmental feedback is to highlight both positive aspects of performance as well as areas for your performance improvement. The result is to increase both individual and team performance.

In the questions that follow, you will rate each team member on a 5-point scale. Rate your typical or average team member a mid-level rating of 3. Please ensure your ratings distinguish between higher and lower levels of performance within your team. For example, if a team member is a good listener, yet another member is a better listener, the latter should receive a higher rating on the 5-point scale.

The teacher DOES NOT expect to see every team member rated with the same score, or a high score!

The rating scale is:

1 = Almost never

2 = Seldom

3 = Average

4 = Better than most

5 = Outstanding

Anonymity: Your teacher will provide your team with summary results from which your name has been removed. Around week 4 or 5 you will be provided in class with an opportunity to give and receive oral feedback from your team members focussed on improving both your contributions and the team’s overall productivity. Worksheet adapted from Carr, Herman, Keldsen, Miller, & Wakefield (2005).

24. Your Personal Result (PR) and Peer Assess Pro

A proportion of the marks you earn in this course are gained from your participation in a team project that delivers several outputs. Your participation mark is termed the Personal Result (PR). This section explains how your PR is calculated.

As you will know from past team assignment experience, you can expect each team member to contribute a different amount of effort to the team's output depending on their ambition, capability, and commitment. The Peer Assess Pro Survey, described earlier, enables you to rate the relative contribution of each of your team members according to your assessment of their contribution to your team's performance.

Subsequently, the teacher uses Peer Assess Pro to calculate your Personal Result based on a mathematical formula including:

  1. The Team Result (TR): the overall mark your team earns from its team project outputs,

  2. The Team Based Learning Score (TBL) you receive for your personal contribution to the team’s results. Your Team Based Learning Score is calculated from the average of the ten ratings made by your other team members using the Peer Assess Pro Survey.

If your team members collectively rank your contribution the best, then you could achieve 100 per cent for your PR. However, a 100 % mark is possible IF and ONLY IF your team achieves a high Team Result, typically above 75 marks. In contrast, if you free-load or disrupt the team's achievement, then you may achieve a zero mark for your Personal Result. Furthermore, if your team delivers poor results for your Team Assignments then you will also achieve a lower PR. It is in your interests  to help your team achieve a high team result!
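
For example (illustrative numbers), under the Indexed Personal Result method explained in APPENDIX A: if your team earns a Team Result of 75 and you hold the highest Team Based Learning Index in your team (TBLI = 100), your IPR equals 75; a teammate with TBLI = 80 receives IPR = 80 x 75 / 100 = 60.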

Your teacher will advise you which mathematical approach will be used to determine your Personal Result for this course. The teacher may choose from five alternative approaches: the Team Based Learning Score (TBL), the Team Based Learning Index (TBLI), the Indexed Personal Result (IPR), the Normalised Personal Result (NPR), or the Rank Based Personal Result (RPR), each explained in APPENDIX A: Calculation of the Personal Result, PR.

Figure 2.1 presents an illustration of the alternative approaches to determining students’ Personal Result (PR). Your teacher will advise you which alternative they are using, as explained earlier in this section.

Figure 2.1 Alternative approaches to determining students’ Personal Result (PR) 

The figure assumes TBL ratings spread around 50, and a Team Result of 65. The highest IPR the best team member can achieve is the Team Result. In contrast, the average of team members’ marks for both the NPR and RPR methods matches the Team Result. Note that the spread of marks for a team using the IPR approach matches the spread of marks for the NPR with a Spread Factor of 1. The NPR with a Spread Factor of 2 spreads the individual team members’ marks over twice the range of the IPR. The Spread Factor is user-selectable by the teacher.


25. Example personal snapshot report

Figure 2.2 presents a snapshot of the personal results for an individual student. This report can be printed or emailed to each student. As a formative assessment, typically by week five of the semester, the snapshot report can be used as a basis for the student meeting with their team to solicit precise feedback for developing a personal Action Plan to improve their contribution to the team’s productivity. The right part of the table shows the class statistics as a base for comparing the student’s specific results, shown in the left column. The cells are colour coded to show where the student has achieved above the class average (green) or below (pink, red).

26. Factors used to determine your Personal Result

Your Personal Result (PR) is based on measures related to your task accomplishment, attempted input, your leadership, and your positive contribution to group processes. The contributions are combined with the aggregated mark allocated by your teacher to the team project outputs.

The factors that your teacher might consider in assessing your contribution include:

Your Team Result

Your Contribution to Task Accomplishment

Your Contribution to Leadership and Team Processes

27. Further details

The precise technical details explaining the mathematics of how your PR is calculated are presented with examples in:

 APPENDIX A: Calculation of the Personal Result, PR, in User Manual: Peer Assess Pro (2018). Auckland: Peer Assess Pro, Retrieved from http://tinyurl.com/peerfeed

28. References

Peer Assess Pro is extended and adapted from ideas presented in:

Carr, S. D., Herman, E. D., Keldsen, S. Z., Miller, J. G., & Wakefield, P. A. (2005). Peer feedback. In The Team Learning Assistant Workbook. New York: McGraw Hill/Irwin

Ohland, M. W., Loughry, M. L., Woehr, D. J., Bullard, L. G., Felder, R. M., Finelli, C. J., … Schmucker, D. G. (2012). The comprehensive assessment of team member effectiveness: Development of a behaviorally anchored rating scale for self-and peer evaluation. Academy of Management Learning & Education, 11(4), 609–630. Retrieved from http://amle.aom.org/content/11/4/609.short

29. Cite this document

Peer Assess Pro (2018). User Manual: Peer Assess Pro. Version 15. Auckland: Peer Assess Pro Ltd. http://tinyurl.com/pfmanual


Figure 2.2: Snapshot of personal results for an individual student

These reports are created for, and emailed to, each student using a Google Sheet Addon, Yet Another Mail Merge, YAMM.

Snapshot Part 1.png


Snapshot Part 2.png


Snapshot Part 3.png

Snapshot Part 4.png

Appendices


APPENDIX A: Calculation of the Personal Result, PR

[Teacher: Optionally remove the following sections from the students’ view]

The Peer Assess Pro Analyser provides the teacher with several alternative methods for calculating your Personal Result, PR. Your teacher will advise which method they have chosen. Each method has advantages and disadvantages, which are explained later.

A series of videos demonstrating how each method calculates the Personal Result is shown in section 12: Calculation of personal result from team result and peer assessment ratings

The five alternative methods for calculating the Personal Result, PR, are the Team Based Learning Score (TBL), the Team Based Learning Index (TBLI), the Indexed Personal Result (IPR), the Normalised Personal Result (NPR), and the Rank Based Personal Result (RPR).

The advantages and disadvantages of each method are discussed later.

To understand the differences between the five methods, consider Figures A.1 and A.2. Figure A.1 presents TBL scores spread around 50, and a Team Result (TR) of 65. The highest IPR the best team member can achieve is the Team Result. In contrast, the average of team members’ marks using either the NPR or RPR method matches the Team Result.

Note the spread of marks for a team using the IPR approach matches the spread of marks for the TBL. The NPR spreads the individual team members’ marks over twice the range. (However, the spread factor is user-selectable by the teacher).

Figure A.2 overviews the process through which the various Personal results are calculated.

Figure A.1 Several methods for calculation of students’ Personal Result (PR) illustrating effect on team average and spread of results within a team


Figure A.2 Overview of the process sequence for calculating the various Personal Results

A1. Team Based Learning Score, TBL

In the explanation and discussion of the several methods for calculating Personal Result, the data for the example calculations is detailed in Section A7. Illustrative calculations and reports.

There are ten Peer Rating components, r, for each student. Each component is rated on a scale of 1 through 5, with a maximum rating of 5 and therefore a range of 4. Consequently, for each student in the class:

TBL = (100/40) x (Sum of the ten Peer Ratings - 10) = 2.5 x (Sum of the ten Peer Ratings - 10)

The Peer Rating for each component equals the average (mean) of the peer ratings submitted by all Assessors for that Assessee student. If a student submits more than one assessment for the same student, then the mean  of the multiple assessments by that student is used prior to its value being used in the calculation of TBL.

To ensure the TBL ranges from zero through 100, the formula subtracts the minimum possible sum of the ten ratings (10) and scales the remainder by 100/40, since the ten ratings together span a range of 40.

Alternatively, with the same result,

TBL = (50/4) x (Average Task Contribution  + Average Leadership Contribution - 2)

TBL = 12.5 x (ATC + ALC - 2)

Where:

ATC and ALC are the average ratings for the five components that comprise the Task and Leadership contributions, respectively. Mathematically, ATC = (Sum of the five Task Accomplishment ratings) / 5 and ALC = (Sum of the five Leadership and Team Process ratings) / 5.

Note that the Team Based Learning Score, TBL, takes NO account of the team’s Team Result. The Team Result is accounted for in the IPR, NPR, and RPR methods discussed later.

Example calculation for TBL

The following example calculations refer to the data presented in Section A7. Illustrative calculations and reports. The team members Ritchie Chor and Karl Marc are selected from Team 1 Brazilia, which has a Team Result of 50.

From Figure A.5, Ritchie Chor has achieved ATC = 3.9 and ALC = 3.7.

Therefore,

TBL Score (Ritchie Chor) = 12.5 x (3.9 + 3.7 - 2) = 12.5 x 5.6 = 70

Meanwhile, Karl Marc has achieved ATC = 2.6 and ALC of 3.13.

        TBL Score (Karl Marc) = 12.5 x (2.6 + 3.13 - 2) = 12.5 x 3.73 = 46.6, displayed as 47

Note that calculations are carried out in high-precision floating point. However, results are displayed rounded to the nearest 0.1 for ATC and ALC, and to the nearest integer for other measures.
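
The same arithmetic can be checked outside the Analyser. A minimal Python sketch follows, assuming illustrative averaged component ratings that reproduce Ritchie Chor’s ATC of 3.9 and ALC of 3.7 (his actual component ratings are not shown in this manual):

    # Each value is the mean of the peer ratings (1-5) received for one attribute.
    task_ratings = [4.0, 4.0, 4.0, 4.0, 3.5]        # five Task Accomplishment attributes (assumed)
    leadership_ratings = [4.0, 4.0, 3.5, 3.5, 3.5]  # five Leadership and Team Process attributes (assumed)
    atc = sum(task_ratings) / 5                     # Average Task Contribution
    alc = sum(leadership_ratings) / 5               # Average Leadership Contribution
    tbl = 12.5 * (atc + alc - 2)                    # Team Based Learning Score, 0..100
    print(atc, alc, round(tbl, 1))                  # 3.9 3.7 70.0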


A2. Team Based Learning Index, TBLI

The Team Based Learning Score for each student, s, is scaled so that the team member with the maximum TBL for that team, t, receives a TBLI of 100.

        TBLI = 100 x TBL / (Maximum TBL Score in team, t)

Example calculation for TBLI

Building on the example provided for the calculation of TBL, Ritchie Chor from Team 1 has a TBL of 70, the highest for that Team. Therefore,

        TBLI (Ritchie Chor) = 100 x (70 / 70) = 100, by definition

In contrast, Karl Marc has a TBL of 47, the lowest for Team 1. Therefore,

        TBLI (Karl Marc) = 100 x (47 / 70) = 67.1 = 67

Note that the Team Based Learning Index, TBLI, takes NO account of the team’s Team Result. The Team Result is accounted for in the IPR, NPR, and RPR methods discussed later.

A3. Indexed Personal Result, IPR

For each student, s, in a particular team, t

IPR = Team Result for team t x TBLI / 100

The IPR of the team member with the highest Team Based Learning Index (TBLI) IN THAT TEAM, t,  will, by definition, equal the Team Result for that team, t.

Example calculation for IPR

Building on the example provided for the calculation of TBLI, Ritchie Chor from Team 1 has a TBLI of 100. The Team Result is 50. Therefore,

 IPR (Ritchie Chor) = 50 x (100 / 100) = 50, the same as the Team Result.

In contrast, Karl Marc has a TBLI of 67.1,  the lowest for Team 1. Therefore,

 IPR (Karl Marc) = 50 x (67.1 / 100) = 33.5 = 34.

        

Note: The Team Based Learning Index (TBLI) was introduced in version 15. In earlier versions, IPR was calculated directly from the Team Based Learning Score (TBL). The resulting IPR is mathematically identical in both versions of Peer Assess Pro:

IPR = Team Result for team t x (TBL Score) / (Maximum TBL Score in team t)
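
A minimal Python sketch of the TBLI and IPR steps for one team. Ritchie Chor (TBL 70) and Karl Marc (TBL 47) come from the worked examples above; the other two team members’ TBL scores are assumed purely for illustration.

    team_result = 50                                 # Team Result for Team 1, Brazilia
    tbl = {"Ritchie Chor": 70, "Karl Marc": 47, "Member C": 60, "Member D": 58}
    max_tbl = max(tbl.values())                      # 70, the best TBL in the team
    tbli = {name: 100 * score / max_tbl for name, score in tbl.items()}
    ipr = {name: team_result * index / 100 for name, index in tbli.items()}
    print(round(tbli["Karl Marc"]), round(ipr["Karl Marc"]))   # 67 34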

Bonus for best team performer added to Indexed Personal Result

Optionally, in the Dashboard, the teacher may apply a bonus to the top-ranked student in the team. Our experience is that students do not like this option, and it encourages them to game the system. They prefer the Normalised Personal Result (NPR) approach, since the team, on average, earns the Team Result, and above-average contributors are rewarded in proportion to their relative effort.

A4. Normalised Personal Result, NPR

NPR = Team Result + Correction Factor(IPR)

Where

Correction Factor(IPR) = Spread Factor x (IPR - Team Average IPR)

Team Result = the teacher’s mark assigned to the team’s results.

Spread Factor = a factor chosen by the teacher that will  S T R E T C H  each team’s intrinsic spread of marks, as measured by the team’s standard deviation of IPR marks. The default Spread Factor is 1.0. The effect is illustrated visually in Figure A.1.

IPR = the student’s specific IPR, calculated as above.

Team Average IPR = The average of the team’s IPR marks, for the specific team.

Values are trimmed to be within the range zero to 100.

By definition, the Team Average NPR in this method will match the Team Result for this team. The range of marks will be stretched so that the standard deviation of the team’s NPR marks will be a multiple of the standard deviation of the IPR marks according to the Spread Factor. If a Spread Factor of 1.0 is chosen, then the NPR marks will be such that the average NPR matches the Team Result and the standard deviation of the NPR marks matches the standard deviation of the IPR marks for that team.

A Spread Factor of 2.0 means that, approximately, the student with the best IPR will be boosted somewhat above the Team Result. In contrast, the student with the lowest IPR will often retain a score quite close to their IPR. An increase in Spread Factor will NOT increase the average of the team’s NPR results. The average NPR for the team will generally match the Team Result irrespective of the Spread Factor.

A Spread Factor greater than 2.0 will spread the range of marks more extremely, whilst still keeping the average NPR for the team matched with the Team Result for that team.

The default Spread Factor is 1.0; however, a Spread Factor of 2.0 is recommended.

Example calculation for NPR

Building on the example provided for the calculation of IPR, Ritchie Chor from Team 1 has an IPR of 50. The Team Result is 50. The average and the standard deviation of the IPRs calculated for Team 1 are 41.9 and 7.5 respectively (see Table A.2, Team statistics calculated for example Team 1, Brazilia). Select a Spread Factor of 1.0.

Since

NPR = Team Result + Spread Factor x (IPR - Team Average IPR)

 

Therefore,

NPR(Ritchie Chor) = 50 + 1.0 x (50 - 41.9) => 50 + 1.0 x 8.1 => 58.1 => 58

Conversely, for Karl Marc, his IPR shown in the earlier example calculation is 34.

Therefore,

NPR(Karl Marc) = 50 + 1.0 x (34 - 41.9) => 50 - 1.0 x 7.9 => 42.1 => 42

Note the standard deviation of the NPRs for Team 1 is 7.5 which is identical to the standard deviation of the IPRs (without Bonus). Note also how the average of the team’s NPR results equals the Team Result of 50.

Adjusting the NPR Spread Factor

The teacher can adjust the Spread Factor in the Peer Assess Pro Dashboard from the default value of 1.0. If the Spread Factor was adjusted from 1.0 to 2.0, then NPR(Ritchie Chor) increases from 58 to 66. Conversely, the NPR(Karl Marc) decreases from 42 to 33. The standard deviation of the NPR for Team 1 increases from 7.5 to 2 x 7.5 => 15. The average of the NPR results remains 50, the Team Result.
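
A minimal Python sketch of the NPR step, continuing the Team 1 illustration (the IPR values for the two members not covered by the worked examples are assumed); results are clipped to the range 0 to 100 as the method requires:

    team_result = 50
    spread_factor = 1.0                              # teacher-selectable; 2.0 is recommended
    ipr = {"Ritchie Chor": 50.0, "Karl Marc": 33.5, "Member C": 42.9, "Member D": 41.4}
    team_average_ipr = sum(ipr.values()) / len(ipr)  # about 41.9 for this team
    npr = {name: max(0.0, min(100.0, team_result + spread_factor * (value - team_average_ipr)))
           for name, value in ipr.items()}
    print(round(npr["Ritchie Chor"]), round(npr["Karl Marc"]))  # 58 42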

A5. Rank Based Personal Result, RPR

RPR = Share Fraction x Team Result x Number of Team Members

Calculation of the Rank Based Personal Result, RPR, is more complex. First, a Share Fraction for each student in the team is calculated as follows. The Team Based Learning Scores for a team are reverse rank ordered. That is, a rank of 1 is assigned to the lowest (worst) rated student, as measured by the team’s set of Peer Scores. Second, the student(s) with the worst rank (lowest Peer Score) is allocated a small Share Fraction of the marks available for the team. In contrast, the best ranked (highest Peer Score) student is allocated a large Share Fraction.

In a four-person team, for example, the Share Fraction for the lowest ranked team member is 10%, whilst that for the best ranked student is 40%. In the case of a four-member team the best-ranked student gains four times the resulting Rank Based Personal Result of the lowest ranked student. The intermediate-ranked students gain various multiples of the Share Fraction of the lowest ranked team member. See the results for the four-member team, Team 1: Brazilia in Figure A4 Example Team Analysis Marks for Personal Result (PR) calculated by several methods. The rows that show this calculation in the Team Analysis Marks subsheet are hidden from the normal user’s view of the Peer Assess Pro Analyser. The rows may be unhidden by the slightly advanced spreadsheet user.

In general, for a specific student, s, in a team with n members:

Share Fraction (student s) = Reverse Rank (student s) / (1 + 2 + … + n) = 2 x Reverse Rank (student s) / (n x (n + 1))

For instance, in a team with 5 team members, ranked 1 through 5, then the Share Fraction of the worst performing student is:

Share Fraction = 1 / (5 + 4 + 3 + 2 + 1) = 1 / 15 = 6.67%

The top ranked student, with Reverse Rank 5, will achieve a Share Fraction of 5/15 = 33.3%

Note that, by definition, the Share Fractions sum to EXACTLY 100% across the whole team.

Different Share Fractions result when a team contains two or more equally ranked team members.

Values are trimmed to be within the range zero to 100.

Example calculation for RPR

Ritchie Chor has a Reverse Rank of 4 in a team of 4 members. The Team Result is 50, and his Share Fraction is 40%.

RPR(Ritchie Chor) = 40% x 50 x 4 = 80

In contrast, Karl Marc has a Reverse Rank of 1, so his Share Fraction is 10%.  

RPR(Karl Marc) = 10% x 50 x 4 = 20.
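
A minimal Python sketch of the Share Fraction and RPR steps for a team with no tied ranks. The two middle TBL scores are assumed, and the handling of equal-ranked members (which produces different Share Fractions, as noted above) is omitted.

    team_result = 50
    tbl = {"Ritchie Chor": 70, "Member C": 60, "Member D": 58, "Karl Marc": 47}
    n = len(tbl)
    members_low_to_high = sorted(tbl, key=tbl.get)   # lowest TBL first
    reverse_rank = {name: position + 1 for position, name in enumerate(members_low_to_high)}
    denominator = n * (n + 1) / 2                    # 1 + 2 + ... + n
    share_fraction = {name: rank / denominator for name, rank in reverse_rank.items()}
    rpr = {name: max(0.0, min(100.0, share_fraction[name] * team_result * n)) for name in tbl}
    print(round(rpr["Ritchie Chor"]), round(rpr["Karl Marc"]))  # 80 20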


A6. Advantages and disadvantages of the five methods

Figure A.3 illustrates the effect of the five methods on a representative class of students, based on the data presented in Section A7. Illustrative calculations and reports. In this case, the Team Results range from 25 to 90 marks. Note how the IPR marks range from 15 to 90 marks, with no mark higher than the highest Team Result of 90. In contrast, the RPR transformation produces marks distributed from 10 through 100: almost the full range. The NPR transformation, using a Spread Factor of 1, presents a spread from 20 to 97, a range of 77 marks, in contrast to the slightly narrower range of 75 marks using the IPR approach. However, using a Spread Factor of 2 for the Normalised Personal Result calculations will yield a range from 15 through 100, that is, 85 marks.

Figure A.3 Graphical comparison of the methods of calculating the Personal Result


Team Based Learning Score, TBL

Advantage: Very easily calculated from the sum of the ten component Peer Ratings. Good face validity. This is useful information to provide to students to show how they are rated by their team members, irrespective of the Team Result.

Disadvantage: Takes no account of the Team Result. A TBL score in one team has no basis for comparison with the TBL score in another team. A score of 75 in one team may be awarded to the best team member, whilst the same TBL may be awarded to the lowest performing team member in another team.

Team Based Learning Index, TBLI

Advantage: Easily calculated from the Team Based Learning Score, TBL. Higher face validity than the Team Based Learning Score, since the best rated student in each team gains a TBLI of 100. The TBLI therefore enables a better comparison of team contribution across the entire class than comparing TBL scores.

Like the TBL score, the TBLI is useful information to provide to students to show how they are rated by their team members, and how they compare across the entire class, irrespective of the Team Result.

Since the TBLI is used in the calculation of IPR and NPR, the teacher can assure students that attempts to ‘game’ the calculations will have no effect: the best rated team member in each team will, by definition, receive a TBLI of 100.

Disadvantage: Takes no account of the Team Result.

Indexed Personal Result, IPR

Advantage: Easily calculated from the Team Result (TR) and Team Based Learning Index (TBLI). Consequently, IPR has high face validity.

Disadvantage: The best peer-scored student will achieve, at most, the Team Result, by definition. Furthermore, in our experience, the range of IPRs from lowest rated to highest rated is typically limited to 10 to 15 marks in practice. Such a narrow spread of marks appears unfairly to over-reward poorly performing team members and under-reward more highly contributing team members.

Students tend to complain that it seems unfair that only the student with the highest Team Based Learning Score should earn the Team Result when all members have contributed. The RPR and NPR methods overcome that feeling of unfairness: the average team member achieves the Team Result, whilst higher and lower contributing students gain higher or lower Personal Results.

Note that an optional bonus may be awarded to the best performing member of each team in the calculation of IPR. The bonus is established in the Analyser Dashboard.
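
By way of illustration, the following Python sketch reproduces the Brazilia IPR figures (without bonus) from Figure A.3. It is illustrative only and assumes, consistent with those figures, that the IPR scales the Team Result by the student's TBLI.

  # Illustrative sketch only: assumes IPR (without bonus) = Team Result x TBLI / 100,
  # which is consistent with the Team 1: Brazilia figures in Figure A.3.
  team_result = 50
  tbli = {"Karl Marc": 67, "Quinten Crisp": 92, "Ritchie Chor": 100, "Sandy Shore": 76}

  for name, index in tbli.items():
      ipr = team_result * index / 100
      print(f"IPR({name}) = {ipr:.0f}")
  # No student exceeds the Team Result of 50; only Ritchie Chor (TBLI = 100) reaches it.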

Normalised Personal Result, NPR

Advantage: The Normalised Personal Result (NPR) approach spreads the range of marks for a team more fairly, so that highly contributing team members can gain a much higher mark than a poorly performing team member. A key feature is that, in most cases, the average of a team's NPR marks will equal the Team Result. Furthermore, the standard deviation of the team's NPR marks will equal a multiple of the team's standard deviation calculated by the Indexed Personal Result (IPR) method. Specifically, the teacher has precise control over how much spread is applied to the entire class's marks, according to the Spread Factor selected.

Note that team and individual marks at the extreme ends (near zero or 100) will be clipped to no less than zero, or no more than 100. This clipping will affect the Target SD and Team Average IPR for the teams with clipped results. As the Spread Factor is increased upwards from the default value of 1.0, then the spread of marks tends towards approximating the marks calculated by the Rank Based Personal Result (RPR) method.

Disadvantage: a more complicated calculation to explain, although it remains intuitively attractive.

Rank Based Personal Result, RPR

Advantage: In contrast to the IPR approach, the RPR approach spreads the range of marks for a team more fairly, so that highly contributing team members gain a significantly higher mark than a poorly performing team member. The RPR method overcomes the situation where a team is rather insipid in rating each other. It enables a highly contributing team member to gain a mark well beyond the Team Result, even as high as 100 marks in the case of a high Team Result.

Disadvantage: a more complex calculation. When a team honestly rates everyone highly, the lowest ranked student may feel quite aggrieved that they receive a low score when the RPR method is chosen.


A7. Illustrative data for calculations and reports

The data in this section is used to demonstrate the calculations shown in the earlier Sections A1 through A6 for team members selected from Team 1: Brazilia. Table A.1 presents the Team Results for the six teams used to calculate the results presented later in Figures A.3, A.4 and A.5. Table A.2 shows statistics calculated for the illustrative team, Team 1: Brazilia. These statistics are required for some of the example calculations and for Figure A.3.

A pdf document of all reports is presented in Section 13 Demonstration Peer Assess Pro - Reports.

Table A.1 Team Results for six teams

Team Number | Team Name | Team Result (TR)
1 | Brazilia | 50
2 | Kubla | 75
3 | Patagonia | 85
4 | Stavros | 90
5 | Victoria | 30
6 | Whiskey | 25

Table A.2 Team statistics calculated for example Team 1, Brazilia

Team 1: Brazilia (4 members) | Sum | Average | Maximum | Minimum | Range | Standard Deviation
Team Based Learning Score (TBL) | 233.8 | 58.4 | 70.0 | 46.7 | 23.3 | 10.6
Team Based Learning Index (TBLI) | 335.0 | 83.8 | 100.0 | 67.0 | 33.0 | 15.0
Percentile Rank Within Team (PCR) | 200.0 | 50.0 | 100.0 | 0.0 | 100.0 | 43.0
Rank Within Team (RWT) | 10.0 | 2.5 | 4.0 | 1.0 | 3.0 | 1.3
Reverse rank | 10.0 | 2.5 | 4.0 | 1.0 | 3.0 | 1.3
CAKE Fraction | 100.0% | 25.0% | 40.0% | 10.0% | 30.0% | 12.9%
Indexed Personal Result (IPR) (WITHOUT BONUS) | 167.5 | 41.9 | 50.0 | 33.5 | 16.5 | 7.5
Indexed Personal Result (IPR) | 177.5 | 44.4 | 60.0 | 33.5 | 26.5 | 11.6
Rank Based Personal Result (RPR) | 200.0 | 50.0 | 80.0 | 20.0 | 60.0 | 25.8
Normalised Personal Result (NPR) | 200.0 | 50.0 | 58.1 | 41.6 | 16.5 | 7.5

Note: In Table A.2 and Figure A.3, the rows Reverse Rank and CAKE Fraction are not displayed in the PRODUCTION version of Peer Assess Pro. These rows are provided here to illustrate the intermediate calculations.
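
The row statistics in Table A.2 can be checked against the per-student values shown in Figure A.3 below. This Python sketch is illustrative only; the Figure A.3 values are rounded, so the results agree only approximately with Table A.2, and it assumes the Standard Deviation column is a sample standard deviation.

  # Illustrative sketch only: recomputes the TBL row of Table A.2 from the four
  # (rounded) Brazilia TBL scores shown in Figure A.3.
  from statistics import mean, stdev

  tbl = [47, 64, 70, 53]    # Karl, Quinten, Ritchie, Sandy: TBL scores from Figure A.3

  print("Sum:", sum(tbl))
  print("Average:", round(mean(tbl), 1))
  print("Maximum:", max(tbl))
  print("Minimum:", min(tbl))
  print("Range:", max(tbl) - min(tbl))
  print("Standard Deviation:", round(stdev(tbl), 1))    # sample standard deviation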

Section 25. Example personal snapshot report presents, in Figure 2.2, a snapshot of the complete personal results for an individual student. The left part of the table shows the class statistics as a base for comparing the student's specific results, shown in the far right column. The cells are colour coded to show where the student has achieved above the class average (green) or below (pink, red).

Figure A.3: Example Team Analysis Marks for Personal Result (PR) calculated by several methods (DASHBOARD subsheet)

Team 1: Brazilia | Karl MARC - B | Quinten CRISP - B | Ritchie CHOR - B | Sandy SHORE - B
Team Based Learning Score (TBL) | 47 | 64 | 70 | 53
Team Based Learning Index (TBLI) | 67 | 92 | 100 | 76
Percentile Rank Within Team (PCR) | 0 | 67 | 100 | 33
Rank Within Team (RWT) | 4 | 2 | 1 | 3
Reverse rank | 1 | 3 | 4 | 2
CAKE Fraction | 10.0% | 30.0% | 40.0% | 20.0%
Indexed Personal Result (IPR) (WITHOUT BONUS) | 34 | 46 | 50 | 38
Indexed Personal Result (IPR) | 34 | 46 | 60 | 38
Rank Based Personal Result (RPR) | 20 | 60 | 80 | 40
Normalised Personal Result (NPR) | 42 | 54 | 58 | 46

Team 2: Kubla | Bridget GNOME - K | Julian COLANDAR - K | Lydia LOADED - K | Nigella PIZZA - K
Team Based Learning Score (TBL) | 79 | 74 | 82 | 78
Team Based Learning Index (TBLI) | 97 | 91 | 100 | 96
Percentile Rank Within Team (PCR) | 67 | 0 | 100 | 33
Rank Within Team (RWT) | 2 | 4 | 1 | 3
Reverse rank | 3 | 1 | 4 | 2
CAKE Fraction | 30.0% | 10.0% | 40.0% | 20.0%
Indexed Personal Result (IPR) (WITHOUT BONUS) | 49 | 46 | 50 | 48
Indexed Personal Result (IPR) | 49 | 46 | 60 | 48
Rank Based Personal Result (RPR) | 60 | 20 | 80 | 40
Normalised Personal Result (NPR) | 51 | 48 | 52 | 50

Figure A.4 presents an example gradebook of the results, for a subset of the entire class, sorted by Team Number. The results are colour coded: green for best in class and red for lowest in class. You may also view a pdf of the entire Analysis sheet, including all the subsidiary calculations based on the ten quantitative questions in the survey form.

Figure A.4: Example gradebook showing selected results, sorted by Team Number (Analysis subsheet)

Note: Click here to view the entire sheet as a pdf.

The Analysis subsheet enables you to sort the results according to any of the columns. Figure A.5 presents an example subset of results sorted by Normalised Personal Result (NPR). Statistics are shown, including the average, maximum, and minimum values for each column. In addition, a row shows the correlation of the selected sort column with each of the other columns.

The sort feature enables you to easily locate low and high performing individual students, and diagnose their strengths and weaknesses as assessed by their team members.

Click below the image to view a pdf for the entire Analysis sheet including all the subsidiary calculations.


Figure A.5: Example gradebook showing selected results, sorted by Normalised Personal Result (NPR) (Analysis subsheet)

Note: Click here to view the entire sheet as a pdf.

Comprehensive example reports

An example of all reports including Dashboard, Analysis, SelfAnalysis, Charts, and the input Responses data may be viewed as a single pdf in Section 13 Demonstration Peer Assess Pro - Reports.


APPENDIX B: Instructions for teacher’s deployment of Peer Assess Pro Survey

B1. Before you start: Getting Oriented

  1. Print the one page Section 14 Quick Reference Guide.
  2. Review the system components and process workflow in Section 0. Overview
  3. View the schedule of Official download links
  4. View the video demonstration, Section 12. Video overview and instructions.
  5. Note the related videos for each specific step of the workflow, Video guides for teachers
  6. Download and circumnavigate the Demonstration Peer Assess Pro Analyser spreadsheet in Section 13. Demonstration versions.

B2. Create the Class List

  1. Ensure you have created a Class List spreadsheet of student IDs, Full Names, and Email addresses that you can sort by the columns headed Student ID, Student Name, and (optionally) Team Name or Team Number.
  2. Relevant video for this instruction: I. Set up the Class List and Survey
  3. GOOD PRACTICE HINT: Add a one-character suffix to the Student Name indicating the team to which that student belongs. For example: Able Archer - A; Amanda Anderson - B; Barbara Baker - A; … Hannah Hudson - D. When you create the Survey Form later, list the names sorted by team. The team members then receive a reminder of which team they belong to, and who their team members are. A sketch of one way to generate this suffixed name is shown below.
  4. A Class List Template to generate the Good Practice format is presented in the schedule of Official Download Links, here: Class List Template
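
If you maintain your class list outside Google Sheets, the following illustrative Python sketch shows one way to generate the suffixed ‘Full Name - T’ column from a CSV export. It is not part of Peer Assess Pro; the official Class List Template generates this column for you. The file name class_list.csv and its column headings are assumptions matching the Basic requirements table below.

  # Illustrative sketch only: appends the first character of the Team name to
  # each student's Full Name, producing the 'Full Name - T' good-practice column.
  import csv

  # class_list.csv is an assumed export with columns: Student ID, Full Name, Email, Team
  with open("class_list.csv", newline="") as f:
      rows = list(csv.DictReader(f))

  for row in rows:
      row["Full Name - T"] = f'{row["Full Name"]} - {row["Team"][0]}'

  # Sort by Team so the Survey lists team members together
  rows.sort(key=lambda row: row["Team"])
  for row in rows:
      print(row["Student ID"], row["Full Name - T"], row["Email"], sep="\t")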

Basic requirements for a Class List

Student ID | Full Name | Email | Team
98123456 | Able Archer | able@noreply.com | Alpha
97123456 | Amanda Anderson | amanda@noreply.com | Bravo
97999999 | Barbara Baker | barbara@noreply.com | Alpha
96123456 | Brenda Ball | brenda@noreply.com | Charlie
12356799 | Charlie Chan | charlie@noreply.com | Delta
22222222 | Daphne Davis | daphne@noreply.com | Bravo
33333333 | Dilbert Donaldson | dilbert@noreply.com | Delta
55555555 | Erica Enron | erica@noreply.com | Alpha
54321001 | Erin Ericson | erin@noreply.com | Charlie
44444444 | Fiona Flagg | fionna@noreply.com | Delta
33333333 | Frank Fox | frank@noreply.com | Bravo
12312312 | George Galo | george@noreply.com | Bravo
32132132 | Greta Gundmunson | greta@noreply.com | Charlie
35353535 | Hannah Hudson | hannah@noreply.com | Delta

Good practice layout for a Class List

Student ID | Full Name | Email | Team | Full Name - T
98123456 | Able Archer | able@noreply.com | Alpha | Able Archer - A
97999999 | Barbara Baker | barbara@noreply.com | Alpha | Barbara Baker - A
55555555 | Erica Enron | erica@noreply.com | Alpha | Erica Enron - A
97123456 | Amanda Anderson | amanda@noreply.com | Bravo | Amanda Anderson - B
22222222 | Daphne Davis | daphne@noreply.com | Bravo | Daphne Davis - B
33333333 | Frank Fox | frank@noreply.com | Bravo | Frank Fox - B
12312312 | George Galo | george@noreply.com | Bravo | George Galo - B
96123456 | Brenda Ball | brenda@noreply.com | Charlie | Brenda Ball - C
54321001 | Erin Ericson | erin@noreply.com | Charlie | Erin Ericson - C
32132132 | Greta Gundmunson | greta@noreply.com | Charlie | Greta Gundmunson - C
12356799 | Charlie Chan | charlie@noreply.com | Delta | Charlie Chan - D
33333333 | Dilbert Donaldson | dilbert@noreply.com | Delta | Dilbert Donaldson - D
44444444 | Fiona Flagg | fionna@noreply.com | Delta | Fiona Flagg - D
35353535 | Hannah Hudson | hannah@noreply.com | Delta | Hannah Hudson - D

B3. Prepare your version of the Survey form

  1. Videos: II. Prepare the Analyser for receiving Survey (Responses)
  2. View the schedule of Official download links. Open Survey PRODUCTION - Peer Assess Pro in a browser such as Chrome.

  3. Your COPY of Survey - Peer Assess Pro will be created when you click on the url link.
  4. Adjust the Google form name of your copy to the file name you desire. For example, change “Copy of SURVEY - Peer Assess Pro - v14” to “Class 1234 Mid-Block SURVEY”.
  5. WITHIN Section 1 of YOUR Google Form (in this example named “Class 1234 Mid-Block SURVEY”) adjust the title to that which you desire. Suggestion: make it similar to the Google form name.
  6. In Section 1, you may wish to adjust the instruction concerning the due date for responses to be sent.

Section 2, Responder Identification. For the following instructions, you require a Class List prepared according to the guidance in Section B2. Create the Class List.

  7. In the sub-section “Assessor” paste the names of the students into the line marked “Teacher! Paste your sorted list of names here. Delete this line!”. Beforehand, ensure the names are sorted alphabetically. Ensure all OLD names are deleted.
  8. In the sub-section “Student ID” paste the Student IDs into the line marked “Teacher! Paste your sorted list of Student IDs here. Delete this line!”. Beforehand, ensure the IDs are sorted numerically for your students’ convenience. More importantly, this ensures students cannot match names with Student IDs! Ensure all OLD IDs are deleted.
  9. In the sub-section “Team Name” paste the list of team names as required for your class. PLEASE NOTE: Alpha or Numeric team numbers may be used.
  10. Section 3, Part A: Overall Assessment. In the sub-section “Assessee” paste the names of the students into the line “Teacher! Paste your sorted list of names here. Delete this line!”. GOOD PRACTICE HINT: You can complete this task immediately after you complete the list of Assessors, Step 7.


Avoid adjusting the number or content of questions

DO NOT adjust the number of questions from the current limit of 10 quantitative and four qualitative.

DO NOT change the type of questions from scales to paragraphs or any other messing about.

YOU MAY NOT change the headings of the questions numbered 1 through 14.

You may change the long description for each question.

B4. Prepare the (Responses) spreadsheet for receiving responses from the survey form

  1. Click on the RESPONSES tab, shown to the right of the QUESTIONS tab, near top of the form.
  2. Click on the green spreadsheet icon to create the spreadsheet to receive respondents’ responses

  3. Note that the default name for your responses spreadsheet is the name of your Google form with the suffix (Responses). For example, Class 1234 Mid-Block SURVEY (Responses).
  4. The (Responses) file should immediately appear when you click CREATE. Note the filename matches your Google Form name. Note the spreadsheet is empty except that the first line contains the headings of the data that will be submitted by the respondents to the survey form.
  5. The (Responses) spreadsheet will be created in your Google Drive. Locate the (Responses) sheet and Survey Form by clicking on ‘Recent’.
  6. GOOD PRACTICE HINT: To keep track of your Peer Assess Pro assets we suggest:
    a. Create a Google Drive Folder titled Peer Assess Pro Projects.
    b. For EVERY cycle of using Peer Assess Pro, create a specific subfolder for the project, eg Class 1234 Mid-Block.
    c. Collect the related Peer Assess Pro assets into their own project folder:
      - Class List, eg Class List Class 1234 Mid-Block
      - Survey Form, eg Class 1234 Mid-Block - Survey
      - Survey (Responses), eg Class 1234 Mid-Block - Survey (Responses)
      - Analyser, eg Class 1234 Mid-Block - Analyser
    d. Don’t ‘recycle’ the Survey. Make a copy if you are sending it to the identical class with identical teams.
    e. Don’t ‘recycle’ the Analyser. Always return to the User Manual Downloads to retrieve the latest ‘new, improved’ version.
  7. FAQ: What happens if I forget to include a Student ID, Team Name, or Assessor?
    a. The Google Survey system is most forgiving. Edit the ‘live’ form to include the new data.
    b. Important! See Appendix F, FAQ 1: May I select a new destination for my Peer Assess Pro Survey?


B5. Send the Survey form to your respondents

  1. Video: III. Preview and send the Survey to students
  2. On the Google Form, click on SEND. You will be provided with a variety of methods to send the form to your respondents: email, URL. Your choice!

  3. Once the survey has been sent out to respondents, you can view the progress of the survey responses by either:
    a. Examining directly the (Responses) spreadsheet in your Google Drive
    b. Revisiting the Google Form, clicking on RESPONSES, then clicking on the green spreadsheet icon
    c. IDEALLY, viewing the RESPONSES and progress WARNINGS sub-sheets in your Peer Assess Pro Analyser. See APPENDIX C: Instructions for using Peer Assess Pro Analyser.
  4. GOOD PRACTICE HINT: Once you have created your Survey, proceed directly to set up the Peer Assess Pro Analyser so that it is linked to the Survey (Responses) sheet and Class List. Instructions are provided in APPENDIX C: Instructions for using Peer Assess Pro Analyser.
  5. Use Peer Assess Pro Analyser to check the progress of responses, using the Warnings sub-sheet to identify students who:
    a. Have yet to respond
    b. Have not been rated
    c. Have not rated all members of their team
    d. Have rated team members over a narrow or nil range
    e. Have made errors in identifying their team membership, team members, or Student ID.
  6. Note that Peer Assess Pro Analyser is especially helpful for monitoring the progress status of students completing the survey. Peer Assess Pro Analyser automatically imports and analyses the survey responses as they are submitted by the respondents. This real time monitoring enables you to view WARNINGS. For instance, you can respond by encouraging specific students to respond and/or submit corrected responses.


APPENDIX C: Instructions for using Peer Assess Pro Analyser

C1: Set up the Analyser Dashboard

  1. Video: II. Prepare the Analyser for receiving Survey (Responses)
  2. Video: IV. Review and correct Analyser warnings concerning incorrect responses from the Survey
  3. Prepare your copy of Peer Assess Pro Analyser: View the schedule of Official download links. Click on the link to Peer Assess Pro Analyser (PRODUCTION). This will create your personal version of the spreadsheet.


  4. GOOD PRACTICE HINT: Rename your copy of Peer Assess Pro Analyser to align with the Google Form survey you created for your class as described in APPENDIX B. For example, if your Google form is named “Class 1234 Mid-Block Survey” then we suggest the following name for your Peer Assess Pro Analyser spreadsheet: Class 1234 Mid-Block Analyser.
  5. Open the ‘Dashboard’ sub-sheet in the copy of the Analyser you renamed.
  6. Enter a ‘project title’ such as the name of your class. For example, Class 1234 Mid-Block Survey. This title will display on each spreadsheet and report produced by Peer Assess Pro, combined with the current date of analysis.
  7. Enter your Teacher Name.
  8. Enter your choice of method for calculating the Personal Result for each student. There is a drop down menu of alternatives. The alternatives are explained in APPENDIX A: Calculation of the Personal Result, PR.
  9. You can adjust the foregoing settings later, before you despatch the results to your students.
  10. In Steps C2 and C3 you will link the Analyser to the Survey (Responses) and (optionally) the Class List. When you have successfully completed these linking processes, your DASHBOARD will look something like this:

C2: Link the Analyser to the Class List

  1. These steps are OPTIONAL. But read the GOOD PRACTICE HINT first, please!
  2. Video: Link Analyser to class list

https://www.youtube.com/watch?v=b7ohDi8UFas&list=PLrrxmJ7TOi8NIIUUq9lMi88enry92-ovx&t=01s

  3. GOOD PRACTICE HINT: Immediately after you have despatched the Peer Assess Pro Survey to respondents, link the Analyser to BOTH the Class List and Survey (Responses). Rationale: through linking the Class List, you will receive warnings about respondents in the entire class who have failed to submit any Survey responses. Through linking the (Responses) from the Survey you will receive in-progress WARNINGS about respondents who have made mistakes in their submissions, or insufficient submissions. How to identify and correct these mistakes is detailed in Section C4: Review and correct Analyser WARNINGS concerning incorrect Survey responses.
  4. Email addresses for snapshot despatch: (Optional) Copy and paste into the DASHBOARD sub-sheet a URL that links to the Class List spreadsheet of Student IDs, Names, and Email addresses. This linkage enables the Snapshot sub-sheet in the Analyser to display the email address for the specific student to receive their personal snapshot. This simplifies the task of manually dispatching a personal snapshot to one student.
  5. Moreover, linking the Analyser to the Class List is essential to enable automatic mass email of personal snapshots to every student. See APPENDIX D: Mass distribution of personal snapshots to students using the YAMM mailmerge addon.
  6. The spreadsheet containing the emails must be structured as follows:
    a. Top row with column headers which MUST include ‘Student ID’ and ‘Email’.
    b. Other column names are optional, and in any order, such as student names and Team Name.
  7. For further guidance on setting up the Class List, including an optional template, see Section B2. Create the Class List.
  8. The next few steps explain how to locate the URL link from your Class List sheet so you can paste it into the ‘DASHBOARD’ subsheet of the Analyser.
  9. Open your Class List spreadsheet. Note the blue Share icon to the top right of the sheet.
  10. From your Class List sheet, get the shareable link. Click Share…
  11. Ensure the ‘can view’ option is selected.
  12. Click ‘Copy Link’. Click ‘Done’. Clicking DONE is CRUCIAL!

  13. Paste the copied link of your Class List sheet into the DASHBOARD subsheet, after the field: 'Email lookup' Google Sheet File/Share link. See the second Dashboard illustration in Section C1: Set up the Analyser Dashboard.
  14. Next, enter the sub-sheet of the Class List from which the data should be drawn. In general, this will be ‘Sheet1’.
  15. IMPORTANT: If you used the Class List template, ensure the subsheet is specified as ‘ClassList’.
  16. When you have completed this task successfully, in the Analyser you will find that the EmailLookup subsheet:
    a. Correctly imports the Class List
    b. Correctly identifies the columns in the Class List that contain the Student ID and Email Address.
  17. In the example illustration, there is a WARNING that a Student ID has been identified that does not match any Student ID listed in the Analysis subsheet. This means the Student ID may be surplus to requirements, or, more significantly, a student has yet to begin responding to the Peer Assess Pro Survey.


C3: Link the Analyser to the Survey Responses

  1. The next few steps explain how to locate the URL link for your (Responses) sheet so you can paste it into the ‘DASHBOARD’ subsheet of the Peer Assess Pro Analyser.
  2. This step presumes you have completed the steps in Appendix B up to and including Section B4. Prepare the (Responses) spreadsheet for receiving responses from the survey form.
  3. Open your (Responses) spreadsheet. This example uses responses generated from a Google Form titled Class 123 ALPHA Survey.
  4. From your (Responses) sheet, get the shareable link for your response data. Click Share…

  5. Ensure the ‘can view’ option is selected.
  6. Click ‘Copy Link’. Click ‘Done’. Clicking DONE is CRUCIAL!

  7. Paste the URL of your shared (Responses) sheet into the DASHBOARD subsheet, after the field: Survey (Responses) File/Share Link. See the second Dashboard illustration in Section C1: Set up the Analyser Dashboard.
  8. Next, enter the sub-sheet from which the responses should be drawn from the (Responses) sheet. In general, this will be ‘Form Responses 1’.
  9. Magic will happen! The responses from the spreadsheet created by your Google Form survey will now import directly into the Peer Assess Pro ‘Responses’ subsheet, as shown in the figure below. Furthermore, as students continue to respond to the survey, each submission will automatically transfer to your copy of Peer Assess Pro. Thus you can review the WARNINGS sub-sheet to identify incomplete submissions, incorrect submissions, or zero submissions by students.
  10. Check: Examine the (Responses) sub-sheet copied from your use of the Peer Assess Pro Survey form. If there are warning messages or no responses, check that you copied and SHARED the URL of the Google Form (Responses) correctly, as explained in Steps 3 through 6 above.
  11. Ensure the sequence of headings in your Survey (Responses) sheet matches the sequence of column headings in the Responses subsheet in Peer Assess Pro. That is: Timestamp, Assessor, Student ID, Team Name, Assessee, Recommendation, 1. Initiative, 2. Attendance, 3. Contribution, 4. Professionalism, 5. Ideas and learning, Contribution to Task Accomplishment, 6. Focus and task allocation, 7. Encourages contribution, 8. Listens and welcomes, 9. Conflict management and harmony, 10. Chairmanship, Contribution to Leadership, Developmental feedback, Teacher Advice.

  12. Note that you CANNOT make changes to this imported data in Peer Assess Pro. If you do need to make changes to the survey data, then make the changes directly in the Survey (Responses) sheet. For example, you may need to correct the responses of a student who has specified membership of the wrong team, or given an incorrect Student ID.
  13. You can practice this process using the Peer Assess Pro Survey Response Demonstration Dataset, shown in Section 13. Demonstration versions.
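
If you wish to double-check the heading sequence outside Google Sheets, the following illustrative Python sketch (not part of Peer Assess Pro) compares the first few column headings of a CSV export of your Survey (Responses) sheet against the expected sequence listed above. The file name responses_export.csv is an assumption.

  # Illustrative sketch only: checks that a CSV export of the Survey (Responses)
  # sheet begins with the column headings Peer Assess Pro expects.
  import csv

  EXPECTED_PREFIX = [
      "Timestamp", "Assessor", "Student ID", "Team Name", "Assessee",
      "Recommendation", "1. Initiative", "2. Attendance", "3. Contribution",
      "4. Professionalism", "5. Ideas and learning",
  ]

  with open("responses_export.csv", newline="") as f:       # assumed export filename
      headings = next(csv.reader(f))

  for position, expected in enumerate(EXPECTED_PREFIX):
      found = headings[position] if position < len(headings) else "(missing)"
      status = "OK" if found == expected else "MISMATCH"
      print(f"{status}: column {position + 1} expected '{expected}', found '{found}'")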

C4: Review and correct Analyser WARNINGS concerning incorrect Survey responses

  1. Video: IV. Review and correct Analyser warnings concerning incorrect responses from the Survey
  2. Open the ANALYSIS sub-sheet. On the left of the sheet note any missing Student IDs. That is a WARNING that those students have yet to make ANY submissions of the Peer Assess Pro Survey Form.

  3. Note you may sort the ANALYSIS subsheet on any of the columns, such as Student ID, Student Assessed, Team Number… In this case Recommendation is the sort criterion.
  4. Check for warnings and errors arising from your (Responses) data in the 'WARNINGS' sub-sheet. Errors specifically identified include (see the illustrative sketch at the end of this section):
    a. A student failing to submit any ratings of other students
    b. Missing assessments by a combination of Assessor/Assessee
    c. A student failing to self-assess
    d. Multiple assessments by one student of another student
    e. A student giving a Team Name different from that given by other assessors in their team. This error causes major catastrophes: it will raise spurious ‘missing assessments’ until the fault is corrected
    f. A student rating someone in another team
    g. A student selecting more than one Student ID for themself
    h. A student rating everyone the same, high rating
    i. A team rating everyone within a narrow spread.

  5. You can repair some of the response errors by deleting self-assessment responses in the original (Responses) worksheet. Otherwise, you can communicate with your students to have them complete the survey. The additional data and corrections made to the (Responses) dataset will automatically import and correct the Responses sub-sheet in Peer Assess Pro Analyser.
  6. Peer Assess Pro attempts to correct several warnings as follows:
    a. Multiple assessments by one student of another student: the average of the assessments is calculated.
    b. A student giving a team name different from that given by other assessors in their team: Peer Assess Pro guesses by assuming the maximum team number (or equivalent) provided for the student. This is a haphazard assumption! Check the (Responses) dataset carefully and correct the student’s team membership.
    c. Many weird consequential warnings will result if Peer Assess Pro guesstimates the wrong team name, such as warning 4(e) above. Fix the wrong team membership in the Survey (Responses), and many of the consequential warnings will disappear.
  7. HINT: To locate an incorrect team assignment by an assessor or assessee, try investigating your (Responses) data as follows:
    a. Sort by Team Name
    b. Sort by Assessor - Look for inconsistencies - Examine especially a Team Member who is given in the WARNINGS as belonging to multiple teams.
    c. Sort by Team Name
    d. Sort by Assessee - Look for inconsistencies, as above.
  8. A student rating someone in another team causes a big mess of consequential WARNINGS. Delete the rating.
  9. A student supplying multiple Student IDs for themself causes a big mess of consequential WARNINGS. Correct the (Responses) data.
  10. GOOD PRACTICE HINT: To help you minimise the risk of students selecting an incorrect team, see the GOOD PRACTICE HINT in Section B2. Create the Class List.
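
To cross-check the WARNINGS outside the Analyser, the following illustrative Python sketch (not part of Peer Assess Pro) flags two of the commonest problems in a CSV export of the (Responses) data: an assessor reporting more than one Team Name, and a team with fewer ratings than the expected n x (n - 1). The file name responses_export.csv is an assumption.

  # Illustrative sketch only: flags assessors who report more than one Team Name,
  # and teams with fewer ratings than the expected n x (n - 1).
  import csv
  from collections import defaultdict

  teams_by_assessor = defaultdict(set)   # assessor -> set of team names reported
  ratings_by_team = defaultdict(int)     # team name -> number of ratings submitted
  members_by_team = defaultdict(set)     # team name -> assessors seen for the team

  with open("responses_export.csv", newline="") as f:   # assumed export filename
      for row in csv.DictReader(f):
          assessor, team = row["Assessor"], row["Team Name"]
          teams_by_assessor[assessor].add(team)
          ratings_by_team[team] += 1
          members_by_team[team].add(assessor)

  for assessor, teams in teams_by_assessor.items():
      if len(teams) > 1:
          print(f"WARNING: {assessor} appears in multiple teams: {sorted(teams)}")

  for team, members in members_by_team.items():
      expected = len(members) * (len(members) - 1)   # each member rates every other member
      if ratings_by_team[team] < expected:
          print(f"WARNING: {team} has {ratings_by_team[team]} ratings; expected at least {expected}")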

C5. Reports produced by Peer Assess Pro

Use the sub-sheets to explore the several reports produced by Peer Assess Pro. Sample reports and explanations are presented in Section A7. Illustrative data for calculations and reports. A pdf document of all reports is presented in Section 13 Demonstration Peer Assess Pro - Reports.

  1. Video: V. Review team, class, and individual results in Peer Assess Pro Analyser
  2. Check Team Results: Open the DASHBOARD sub-sheet. Locate the sub-section headed Team Results.  Enter data for the column Team Result (TR) in the green cells… ONLY.
  3. BE PATIENT: It can take several minutes for the spreadsheet to undertake the calculations required to produce the data in the 'Analysis' and 'Team Marks' sub-sheets.
  4. DO NOT delete any rows in the DASHBOARD… or anywhere else in the Analyser! If you have fewer teams, just clear (delete) the data in rows that are not required.
  5. As you review the Analyser whilst the Survey is in progress, the Team Results part of the DASHBOARD will warn you if there are too few ratings for each team. This feature is illustrated here. For a team of four members, there should be 4 x (4 - 1) = 12 ratings in total. Team Stavros has the correct number of ratings, but Team Brazilia has insufficient ratings, with just 6 ratings so far.

  6. DASHBOARD: View the results grouped by team in the 'Dashboard’ sub-sheet. Similar to Figure A.3.
  7. ANALYSIS: Use the ‘Analysis’ subsheet to view the entire class gradebook sorted on any column you require: Student ID, Team Name, Team Result, Team Based Learning Score. Similar to Figure A.4.
  8. QUALITATIVE: View qualitative feedback in the 'Qualitative' sub-sheet, aggregated and anonymised for each assessed student. See the illustration above.
  9. CHARTS-CLASS: View histogram frequency distributions of marks for the class as a whole in the ‘Charts-Class’ sub-sheet. Similar to Figure A.2.
  10. SNAPSHOT: View all the results for a specific student by making a selection in the 'Snapshot' sub-sheet. Similar to Figure 2.2 explained in Section 25. Example personal snapshot report.
  11. TEACHER: View feedback to the teacher in the ‘Teacher’ sub-sheet.
  12. SELF-ANALYSIS: Use ‘SelfAnalysis’ to view comparisons of students’ self-assessment ratings compared with the ratings provided by other members of their team.


C6: Dispatch personalised result summaries to students using SNAPSHOT

  1. There are two methods for dispatching results to students:
    a. Use the YAMM mailmerge addon to send out all results to all class students in one operation. This feature is detailed in APPENDIX D.
    b. Use the SNAPSHOT sheet to email or print individual results on a one-by-one, as-needed basis, detailed here.
  2. A personal snapshot appears similar to Figure 2.1: Snapshot of personal results for an individual student, shown in Section 25. Example personal snapshot report. However, the Snapshot includes several additional features, such as a spider chart of comparative ratings (Figure C.1), plus the specific quantitative ratings given by the student to each of their team members.
  3. To create and email a snapshot, select the SNAPSHOT subsheet in the Analyser.
  4. Select the student whose results you wish to display using the pulldown menu.


  5. Select the Google Sheets menu item: File: Download as: PDF Document (pdf).
  6. Check the Export parameters are:
    a. Export: Current sheet
    b. Paper size: typically A4 or Letter (US)
    c. Page Orientation: Landscape
    d. Scale: Fit to width
  7. Select EXPORT.
  8. Open an email addressed to the student. If you linked the Class List including email addresses, then the student’s email address will show in the SNAPSHOT display to the right of the name you selected. See the SNAPSHOT illustrations above.
  9. ATTACH the downloaded pdf to the email.
  10. SEND the email.
  11. REPEAT for every student in the class.

Good practice note: Use the SNAPSHOT email method only when you have a few results to send out. Otherwise, the method is tedious and error prone: you might miss out sending results to a student or two! It is worth the investment of your time to use the YAMM mailmerge addon, APPENDIX D.


Figure C.1 Example spider chart produced in Snapshot


APPENDIX D: Mass distribution of personal snapshots to students using the YAMM mailmerge addon

The YAMM mailmerge optional feature enables you to mass distribute by email a personal snapshot of each class member’s results. The snapshot the student receives is similar to Figure 2.1: Snapshot of personal results for an individual student, shown in  Section 25. Example personal snapshot report.

D1: Install the Google Sheets add-on YAMM

  1. Video: VI. Distribute personal results using the Yet Another Mail Merge (YAMM) add-on
  2. Install the Yet Another Mail Merge Add-on (YAMM) onto your Google Sheet. You need to install this add-on just ONCE on each computer you use with Peer Assess Pro.

Yet Another Mail Merge - Google Sheets add-on. (n.d.). Retrieved 15 June 2016, from https://chrome.google.com/webstore/detail/yet-another-mail-merge/mgmgmhkohaenhokbdnlpcljckbhpbmef?hl=en

  3. For your broader education about mailmerge using YAMM, see

Vialard, R. (n.d.). How to do mail merge with Yet Another Mail Merge (Gmail & a spreadsheet). Retrieved from https://www.youtube.com/watch?v=SzGkAQJSgJU

D2: Ensure the Class list with emails is linked in the Dashboard

  1. Link to the Analyser the Class List that contains Student IDs and email addresses as explained in Section C2: Link the Analyser to the Class List
  2. In the Analyser subsheet ‘MailOut’ you will notice (a) a consolidated set of results (b) the email address for each student that the Peer Assess Pro Analyser recognises through looking up the subsheet ‘EmailLookup’.
  3. KNOWN LIMITATION: If a student has FAILED to provide a single response to the Peer Assess Pro Survey, then their Student ID will be absent from the Analysis subsheet and, therefore, from the ‘MailOut’ sheet. Consequently, you will need to print a pdf version of their Snapshot from the Analyser, and mail out that copy. This process is detailed in Appendix C, Section C6: Dispatch personalised result summaries to students using SNAPSHOT.

D3: Create a draft Gmail email copying MailMergeDraft

  1. View the subsheet ‘MailMergeDraft’. Copy and paste this entire sub-sheet into a Google Mail Draft email.
  2. Follow the instructions embedded within the MailMergeDraft, which include:
    a. SELECT this entire subsheet from Cell A1 through approximately G128.
    b. COPY and PASTE this entire subsheet into a Google Mail Draft.
    c. CUT and PASTE the SUBJECT: Snapshot .... row into the SUBJECT line of your GMail Draft.
    d. Make minimal amendments to the draft email. TAKE CARE: the YAMM characters {{ and }} surround the names of data fields in the subsheet MailOut. If you accidentally adjust, delete, spindle, or mutilate those characters your MailMerge may not function correctly.
    e. Your name and project title will be copied from the DASHBOARD.
    f. Delete the red lines of instruction from your draft.
    g. Save, but do not send, the draft in your ‘Drafts’ folder on Google Mail.

D4: Link the draft Gmail to the Mailout subsheet

  1. Go to the MailOut subsheet. Activate the Add-on: Yet Another Mail Merge (YAMM).

  2. As the YAMM Add-on takes over your spreadsheet, select the Draft Google Mail you created earlier.
  3. Select the option to ‘Receive a test email’ sent to yourself.

  4. Check you received the test email in your Google Mail inbox.
  5. Review and adjust the draft Gmail should you notice errors that need correcting.

D5. Mass mail the Gmail to all students in the class using the YAMM addon

  1. Repeat the process from Section D4: select Add-ons… Yet Another Mail Merge… Start Mail Merge. Ensure the correct template from your GMAIL Drafts remains selected!
  2. This time, initiate your YAMM Mail Merge by selecting ‘Send xxx emails’. That selection will send out snapshot summaries to all the students with email addresses in the MailOut subsheet.
  3. You can return to the MailOut subsheet later to review the Merge Status column that displays which students have received and opened the emails you sent them!

Email Address | Merge status | Student Assessed | Student ID | Team Number | Team Name
ceo@myndsurfers.org.nz | EMAIL_OPENED | Frank Fox | 33333333 | 4 | Delta
ceo@myndsurfers.org.nz | EMAIL_OPENED | George Galo | 12312312 | 6 | Foxtrot
pdodd@unitec.ac.nz | EMAIL_SENT | Greta Gundmunson | 32132132 | 3 | Charlie
pdodd@unitec.ac.nz | EMAIL_SENT | Hannah Hudson | 35353535 | 5 | Echo


APPENDIX E: Features and calculation of Net Recommender

Net Recommender enables a more precise comparison of a student’s peer rating by their team members. Net Recommender combines normalised values of the raw Recommendation and Team Based Learning Score (TBL) for the team member.

The Net Recommender calculation can be conceived as a function equally weighted according to the

Absolute rating within the class, determined from normalising the student’s Recommendation rating relative to the class as a whole.

Relative rating of a student WITHIN their team, determined from normalising their TBL score relative to their team members.

Net Recommender is analogous to the Net Promoter score used in assessing the reputation of a company.

E1. Design features of Net Recommender

By design, values of Net Recommender for a particular class response set have these features:

Average: zero

Standard Deviation: 40

Maximum possible range: from -100 to +100

By virtue of the design of the Net Recommender calculation, the following effects occur:

One half of the class values of Net Recommender will fall in the range -100 to zero (below the target average). Naturally, the remaining one half of values will fall in the range zero to +100 (above average).

Approximately ⅔ of Net Recommender values in the class will lie between -40 and +40. That is, within one standard deviation of the mean value, zero. (More precisely, 68.27 percent of values will lie between plus and minus one standard deviation of the mean).

Approximately ⅙ of students in the class will receive a Net Recommender value of either greater than +40, or less than -40. (More precisely, 15.9 percent of values).

Approximately 2.5 percent of students in the class will achieve a Net Recommender of greater than +80, or less than -80. In other words, 95 percent of students in the class will obtain a Net Recommender value between -80 and +80.

E2. Mathematical calculation

In the following definitions, the variables Normalised_Recommendation and Normalised_TBL are defined statistically so that they have a mean of zero and standard deviation of 1.0.

For a particular student, s

Net Recommender = 1.20 x SD_Target x (Normalised_Recommendation + Normalised_TBL)/2

Where:

SD_Target = the target standard deviation = 40, by definition.

Normalised_Recommendation = (Recommendation - AV_Recommendation) / SD_Recommendation

Recommendation = the Recommendation score for student, s, awarded by all peer team members of student s

AV_Recommendation = the class average Recommendation score

SD_Recommendation = the standard deviation of the class’s Recommendation scores

        

Normalised_TBL = (TBL  - AV_Team_TBL) / SD_Team_TBL

TBL = Team Based Learning Score for student, s

AV_Team_TBL = Average of the TBL scores for the members of the team specific to student s

SD_Team_TBL = Standard deviation of the TBL scores for the members of the team specific to student s

The factor 1.20 is a correction factor found in practice that ensures the normalisation of the raw Recommendation and TBL scores achieves the target standard deviation, SD_Target.

The divisor 2 is required because the sum of the two normalised functions, each with unit standard deviation, gives a result, (Normalised_Recommendation + Normalised_TBL), with standard deviation of 2.0.

Values of Net Recommender that calculate above +100 are trimmed down to +100. Similarly, values of Net Recommender that calculate below -100 are trimmed up to -100.


E3. Example calculations of Net Recommender

As defined in section E2, for a specific student:

Net Recommender = 1.20 x SD_Target x (Normalised_Recommendation + Normalised_TBL)/2

Student | Peter Johns | Michael Mass | Lydia Loaded
Recommendation | 2.0 | 4.5 | 3.0
AV_Recommendation | 3.0 | 3.0 | 3.0
SD_Recommendation | 0.5 | 0.5 | 0.5
Normalised_Recommendation | (2.0 - 3.0)/0.5 = -2 | (4.5 - 3.0)/0.5 = +3 | (3.0 - 3.0)/0.5 = 0
TBL | 30 | 50 | 50
Team | A | A | B
AV_Team_TBL | 40 | 40 | 30
SD_Team_TBL | 10 | 10 | 20
Normalised_TBL | (30 - 40)/10 = -1 | (50 - 40)/10 = +1 | (50 - 30)/20 = +1
N(Recommendation+TBL) | -2 - 1 = -3 | +3 + 1 = +4 | 0 + 1 = +1
SD_Target | 40 | 40 | 40
Correction factor | 1.2 | 1.2 | 1.2
Net_Recommender | 1.2 x 40 x (-3)/2 = -72 | 1.2 x 40 x (+4)/2 = 96 | 1.2 x 40 x (+1)/2 = 24
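
The worked examples above can be reproduced in a few lines of code. The following Python sketch is illustrative only (it is not the Analyser's implementation); it applies the Section E2 formula, including the trimming to the range -100 to +100.

  # Illustrative sketch only: applies the Net Recommender formula from Section E2
  # and reproduces the three worked examples in Section E3.
  def net_recommender(recommendation, av_rec, sd_rec, tbl, av_team_tbl, sd_team_tbl,
                      sd_target=40, correction=1.20):
      normalised_rec = (recommendation - av_rec) / sd_rec
      normalised_tbl = (tbl - av_team_tbl) / sd_team_tbl
      value = correction * sd_target * (normalised_rec + normalised_tbl) / 2
      return max(-100, min(100, value))          # trim to the range -100 .. +100

  print(net_recommender(2.0, 3.0, 0.5, 30, 40, 10))   # Peter Johns  -> -72.0
  print(net_recommender(4.5, 3.0, 0.5, 50, 40, 10))   # Michael Mass ->  96.0
  print(net_recommender(3.0, 3.0, 0.5, 50, 30, 20))   # Lydia Loaded ->  24.0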


E4. Feedback provided to students

In the personalised feedback email to students, a statement similar to the following is included:

The Net Recommender Index is a normalised composite index combining your Team Based Learning Score (TBL) and Recommendation score. Your score is …. Normalisation means that half the class will achieve a Net Recommender Index between zero and +100, whilst the remaining half of the class will achieve a Net Recommender Index between -100 and zero. A Net Recommender Index of exactly zero means that typically 1 in 2 students would recommend a person with a zero score (or higher) to another team, colleague or employer. A Net Recommender Index greater than +40 implies that 5 out of 6 students would recommend you to another team, colleague or employer. A Net Recommender Index less than -40 implies that fewer than 1 in 6 students are likely to recommend you to another team, colleague or employer, considering the class as a whole.

E5. Example charts for Net Recommender

The following figures show a Net Recommender histogram, and the histograms for the Recommendation and Team Based Learning Score (TBL) data that contribute to the Net Recommender chart.

Figure E.1: Net Recommender histogram

Average = 0, Standard deviation = 40.4


Figure E.2: Histogram of Recommendation

Average = 3.7, standard deviation = 0.53

Figure E.3: Histogram of Team Based Learning Score

Average = 67, standard deviation = 11.3


E6. Assumptions about Net Recommender

The calculation of Net Recommender assumes several conditions, described as follows.

The statistical distributions of the Recommendation and Team Based Learning Scores (TBL) are assumed to be normally distributed. In practice, the distributions are typically asymmetric with negative skew.

The Recommendation score awarded to a student s in team t is assumed to be absolutely comparable to a similar Recommendation score awarded to another student x in another team y. In other words, a Recommendation score of 3.5 for student s in team t means exactly the same as a Recommendation score of 3.5 for student x in team y. Similarly, a difference in Recommendation ratings of 1.0 unit means the same in any team. In practice, the Recommendations made by one team may not be consistent with the Recommendation values assigned by another team. However, given that Recommendation is a ‘top of mind’ peer assessment made at the start of the Peer Assess Pro survey, we think this is a reasonable assumption. Consequently, the Recommendation values are normalised using the mean and standard deviation of the entire class of responses.

In normalising the Team Based Learning Score (TBL) it is assumed that each team possesses a uniform, random mix of student capabilities drawn from the entire class. Therefore, all things being equal, one would expect that the mean and standard deviations of each team’s TBL would be equivalent. However, in practice, this equivalence is rarely observed. Consequently, the need for normalising the TBL scores to create Normalised_TBL arises.

The Team Based Learning Score awarded to a student s in team t is assumed NOT to be comparable to a similar TBL score that might be awarded to another student x in team y. Instead, it is assumed that the TBL score of the average-rated student in one team implies an equivalent peer rating to that of the average-rated student in another team, even though the arithmetic values of their TBL scores may differ. The same reasoning applies to the spread of TBL values within teams: the best team member in one team should be rated comparably with the best team member in another team, even if their TBL scores differ. Consequently, the relative values WITHIN each team are scaled to match the relative values within other teams through normalisation using each team’s mean and standard deviation.

E7. Frequently Asked Questions about Net Recommender

Would a student receive the same Net Recommender score if rated in another class?

In general, no. If a student receives a Net Recommender of +50 in one class, they may well receive a different Net Recommender in another class.


APPENDIX F: Frequently Asked Questions, known errors and limitations

FAQ 1. May I select a new destination for my Peer Assess Pro Survey?

Avoid repeatedly selecting a destination for your survey (see Section B4. Prepare the (Responses) spreadsheet for receiving responses from the survey form). The result will be that both your existing responses and NEW responses will be sent to the new (Responses) sheet. Is that what you want?

Furthermore, repeatedly selecting the existing spreadsheet destination will add additional sub-sheets to your survey (Responses) spreadsheet. That could be a useful extra feature. Otherwise, to recover from this mishap, you will need to manually copy or transfer all the responses into one sub-sheet of your spreadsheet, such as “Form Responses 1”.

Perhaps you have conducted a mid-term survey and wish to complete an end of term survey? If you wish to ‘recycle’ an IDENTICAL survey form created for a specific class, to collect a NEW set of Survey responses, try the following:

From Version 7.2 (10-May-2016) the Analyser has been modified so that data is IMPORTED from the ”Form Responses 1” subsheet. You can select an alternative Form Responses subsheet from which to import data in the Peer Assess Pro Dashboard subsheet.

FAQ 2. May I adjust the number or content of questions in the survey?

DO NOT adjust the number of questions from the current limit of 10 quantitative and four qualitative.

YOU MAY NOT change the headings of the questions for the 10 quantitative questions.

YOU MAY change the long description for each question.

Peer Assess Pro conducts a check and provides a WARNING if you adjust the headings or number of questions. Don’t!

FAQ 3. May I remove rows or columns of Peer Assess Pro spreadsheet?

No. NO. NO!!

Your copy of Peer Assess Pro gives you ‘brain surgeon’ access to everything in the spreadsheet. No cells are protected. So, you can make dangerous messes if you delete or change anything except data in the green cells. Unpredictable, probably chaotic and erroneous results will occur.

You may hide rows or columns using the Google Sheets Hide Row and Hide Column feature.

You can produce print or PDF reports that select specific cells using the Google Sheets Print or Download as PDF feature.

FAQ 4. Why are the members of a team not appearing correctly in some parts of the Analyser?

Problem:

User defined

“I am having a problem with Group A not reflecting on the analyser.”

Symptoms

Environment

Peer Assess Pro Version 14.2 (PRODUCTION)

Google Sheets

Browser: Unknown.

Solution

  1. User must Interrogate CAREFULLY the (Responses) sheet.
  2. Check that there are no individuals that could be identified as being in more than one team. That is, an Assessor may have placed themself (and their Assessees) in, say, Team A, whereas another team member might state that they belong to Team D (as occurred in this case). This check can be done by SORTING your (Responses) sheet by Assessee. Check carefully the Team Name against each Assessee. Do the same for Assessors.
  3. CORRECT the (Responses) sheet to ensure Team Assessors and Assessees are listed in one AND ONLY ONE team. Check against your Class List of Team Members.
  4. In future, refer to Prevention measure below.

Prevention

For Users: User Manual updated to advise Good Practice Hint

Reference:

B2. Create the Class List

Add a one character suffix to the Student Name indicating the team to which that student belongs. Example Able Archer A; Amanda Anderson B; Barbara Baker A; … Hannah Hudson D. When you create the Survey Form later, list the names sorted by team. Then the team members will get a reminder about what team they belong to, and who their team members are. (User Manual Version 14.6)

Systems development: Bug fixes and enhancements

Peer Assess Pro Version 14.2 updated to Version 14.2.2 (PRODUCTION) to trap team overflow error and warn user as follows:

Import Progress
Responses retrieved = 121
Unique Teams = 6
Maximum team members per team identified = 11

WARNING: CATASTROPHE! A team with 11 members has been identified from the 'Responses' subsheet. However, Peer Assess Pro restricts the number of members in a team to a maximum of 10.

This general purpose Warning identifies Team Count Overflow. This occurs when Peer Assess Pro determines that a team member has been designated as being in multiple teams.

The user must identify which team is causing the overflow, then follow the instructions above to correct the team membership of the people in that team, some of whom should be in another team.

Systems development: Future requirements


APPENDIX G: Future features requested


APPENDIX H: Workshop and interest group registration


Peer Assess Pro: enhancing the effectiveness of student team peer feedback

Innovations in Learning and Teaching

Peter Mellalieu and Patrick Dodd

Peer Assess Pro™ Ltd, Auckland, New Zealand

Next workshop

Date: To be advised

Location: To be advised

Abstract

There has been a long-standing call from employers to develop graduates’ soft skills in addition to the traditional academic competencies. In recent years, the call for developing graduates’ soft skills is being answered through wider adoption of team-based pedagogies. Several challenges related to formative and summative assessment now arise as teachers increasingly adopt team-based pedagogies such as project-based learning and action learning. Specifically, in terms of summative assessment, how can a teacher quantify an appropriate contribution mark for each member of the team? Equally important, as a team proceeds to work together, how can the team be supported to improve team members’ contribution to team effectiveness in a proactive manner that will contribute to an improved end-of-semester outcome?

This workshop reports progress results for a prototype decision support system (Peer Assess Pro) designed to enhance the effectiveness of student team peer feedback. Peer Assess Pro provides teachers and team members with quantitative and qualitative information that enables timely, constructive conversations focussed on precise pinpointing of team members’ strengths, weaknesses, and opportunities to improve their contribution to their team’s achievement. Specifically, Peer Assess Pro comprises several elements:

At the workshop, the presenters will:

Register interest

If you wish to attend the workshop, or join our project newsletter about this project, please register your interest here:  www.peerassesspro.com

Pre-workshop preparation

VIEW the video, which demonstrates the essential components of Peer Assess Pro.

Peer AssessPro: Demonstration [Teacher Playlist Version]. (2017). Auckland, New Zealand: Peer Assess Pro. https://www.youtube.com/watch?v=oGzH2kVsD7A&list=PLrrxmJ7TOi8NIIUUq9lMi88enry92-ovx&index=1&t=1s

User documentation

Mellalieu, P. J., & Dodd, P. (2017). Peer Assess Pro (User Manual). Auckland: Unitec Institute of Technology. Retrieved from http://tinyurl.com/peerfeed


APPENDIX I: Previous download versions

These previous versions are provided in case you really need them! Ideally, use the latest versions available here: 1. Official download links

I1. History of key improvements to Peer Assess Pro

Version 10.4  enables you to email snapshot summaries to each student. This mailout is achieved using the YAMM Google Sheet Add-on from the ‘MailOut’ subsheet in the Peer Assess Pro Analyser. Refer APPENDIX E: Using the YAMM mailmerge add-on to email personal snapshots to students

I2. Production version 10 released 2016-07-10 to 2017-05-31

Peer Assess Pro Survey

The teacher must make their personal copy of the survey form that can be adapted to her purposes. See APPENDIX B: Instructions for teacher’s deployment of Peer Assess Pro Survey

Peer Assess Pro Survey - [PRODUCTION] (Version 2.0) [Google Form]. Auckland. Retrieved from https://docs.google.com/forms/d/1wCgODC9tupHGzVDsVJQ-9r2j3ZGKuQvTboM6jOJZX8Y/copy

Peer Assess Pro Analyser

The following link creates the teacher’s copy of the Peer Assess Pro Analyser into which the teacher pastes the URL link to the (Responses) from the Peer Assess Pro Survey form he has created. See APPENDIX D for instructions on usage.

Version 10.4 and above incorporates the YAMM Google Sheet Add-on to email snapshot summaries to each student. See APPENDIX E.

Mellalieu, P. J., & Dodd, P. (2016). Peer Assess Pro Analyser (PRODUCTION) [Copy] (v11.1) [Google Sheets]. https://docs.google.com/spreadsheets/d/1aGgRN1eyCPK12dqhIo6OpbGhqubB6k943be3wVcCyQA/copy

I3. Demonstration versions v.11

These demonstration versions enable you to explore the functioning of the components of the Peer Assess Pro.

Demonstration survey form for students and teachers

Peer Assess Pro Survey - DEMONSTRATION (Version 2.0) [Google Form]. Unitec Institute of Technology. Retrieved from http://goo.gl/forms/E4TwJjq8Zl

Demonstration Peer Assess Pro Survey Analyser

Mellalieu, P. J., & Dodd, P. (2016). Peer Assess Pro Analyser (DEMONSTRATION) [Copy] (Version 11.1) [Google Sheets]. Retrieved from https://docs.google.com/spreadsheets/d/1H9DDJwe2tIxyk290nJLbme04RWH6CDgmCIw2lxgEPIQ/copy

Demonstration (Responses) dataset

You can practice copying or linking to this response data into the PRODUCTION version of Peer Assess Pro. BE PATIENT! It takes several minutes for the data to be analysed and percolate through the several sub-sheets. Remember to enter the team results in the Team Marks sub-sheet after copying the (Responses) data into the Responses sub-sheet.

Mellalieu, P. J. (2017). Peer Assess Pro Analyser - Demonstration Dataset (Responses) (Version 5.2) [Google Sheets]. Auckland, N.Z: Peer Assess Pro. Retrieved from https://docs.google.com/spreadsheets/d/12vM-qu0fjDiIR-o3liEw1fegYOSqbrXcVfoj1MJEIOQ/copy

I4. Production version 12 released 2016-06-01 to 2017-08-31

Peer Assess Pro Survey v12

Survey PRODUCTION - Peer Assess Pro (Version 12.0, 2017-06-01) [Google Form].

https://docs.google.com/forms/d/1tIxRnU2WkSeT0naGYQ4lc2xlzEQSULnwAB5oFnVcF7M/copy

The above link creates your personal copy of the Peer Assess Pro Survey. You adapt the survey form to include your class names and student IDs.

Peer Assess Pro Analyser v12

Analyser PRODUCTION - Peer Assess Pro (Version 12.0, 2017-06-01) [Google Sheets]. https://docs.google.com/spreadsheets/d/16dJMcSLCxCVdAshzQ10D2m9xNxTrrCM9v8ZzNM32OaQ/copy

The above link creates your copy of the Peer Assess Pro Analyser. Into the Analyser, you paste the URL that links to the (Responses) sheet generated by students’ responses to the Peer Assess Pro Survey you created and launched. For instructions on usage, see APPENDIX D: Instructions for using Peer Assess Pro Analyser v12.

Demonstration Peer Assess Pro - Analyser v12

Analyser DEMO - Peer Assess Pro (Version 12.0) [Google Sheets].

https://docs.google.com/spreadsheets/d/1ICg8aR38Yc8xtHbjlLGfBh_-t42Y6k_oMuZTR9I-yWc/copy

This demonstration version gives you an Analyser already linked to a dataset of Survey responses and Email addresses for a mailmerge.

User Manual v12

User Manual: Peer Assess Pro [PDF]. Version 12.2.4. 2017-06-02. Auckland: Peer Assess Pro Ltd. https://drive.google.com/file/d/0BzwyjX8GotyaSXYxYm92bW5HSk0/view?usp=sharing

Mellalieu, P. J., & Dodd, P. (2017). User Manual: Peer Assess Pro [WEB]. Auckland, New Zealand: Peer Assess Pro Ltd. https://docs.google.com/document/d/1NNHqqtnt2USMLyfxCP6bJNtiRx5ewy6bPJwF21qx7cw/pub

I5. Production version 14 released 2017-09-01 to 2018-01-31

IMPORTANT: Version 14 was superseded by Version 15 on 2018-02-01. View the Version 15 User Manual here.

Peer Assess Pro Ltd. (2018). User Manual: Peer Assess Pro v15 [WEB] (Version 15). Auckland, New Zealand. Retrieved from http://tinyurl.com/pfmanual

Peer Assess Pro Survey v14

Survey - Peer Assess Pro. (2017). (Version 14.2) [Google Form]. Auckland: Peer Assess Pro. https://docs.google.com/forms/d/1QQycKDEr1rdbC00JwE6VwmhmkUdNte4IrwIvHarAVSY/copy

The above link creates your personal copy of the Peer Assess Pro Survey. You adapt the survey form to include your class names and student IDs, typically presented in your Class List.

Peer Assess Pro Analyser v14

Analyser - Peer Assess Pro. (2017). (Version 14.2.2) [Google Sheets]. Auckland: Peer Assess Pro.

https://docs.google.com/spreadsheets/d/1Si_iNvONRtpaARENbgadG8eq9qRUSjjItkRX5jjnKw4/copy

The above link creates your copy of the Peer Assess Pro Analyser. In the Analyser, you link to the (Responses) spreadsheet generated by students’ responses to the Peer Assess Pro Survey you created and launched. It is good practice, but optional, to link the Analyser to the Class List.

Demonstration Peer Assess Pro - Analyser v14

Class 123 ALPHA Analyser - Peer Assess Pro

https://docs.google.com/spreadsheets/d/1zpP0Y7mDnj-yzbZAZ84A7djEc7fIeS4cdXNEtLiaNCw/copy

This demonstration version creates a copy of the Analyser linked to a demonstration dataset of Survey responses (Class 123 ALPHA) and email addresses for a mailmerge.

Class List Template v14

Class List Template - Peer Assess Pro. (2017). (Version 14.2.2) [Google Sheets]. Auckland: Peer Assess Pro.

https://docs.google.com/spreadsheets/d/1Pr7QcbThvzCV91yo1BBS8NG7gVKtN9ZHpv_PXC20838/copy

This optional template is structured in the correct format required by Peer Assess Pro. The template includes a utility feature that (a) generates Full Names appended with the Team ID, and (b) generates a list of the Unique Team Names in Sheet2.

User Manual v14

Peer Assess Pro Ltd. (2017). User Manual: Peer Assess Pro  [PDF] (Version 14). Auckland, New Zealand. Retrieved from https://drive.google.com/file/d/1ZvamSoIHdDPzh-oTNwoQGXfQkIXdyKoI/view?usp=sharing


APPENDIX J: Definitions of terms

Each entry below gives the Attribute, its Abbreviation, its Definition, and, where applicable, its Formula.

Team Based Learning Score

TBL

A relative measure of the degree to which a team member has contributed to their team's overall achievement, team processes, and leadership. The Team Based Learning score (TBL) is calculated for each team member directly from their Average Task Contribution (ATC) and Average Leadership Contribution (ALC). A TBL score can generally be used only to compare the relative contribution of students WITHIN the same team, not BETWEEN teams. When used to create a relative index and combined with the team's Team Result, the Team Based Learning score can be used to calculate several variants of a student's Personal Result (PR) that are comparable between teams. (See Indexed Personal Result (IPR), Normalised Personal Result (NPR), and Rank Based Personal Result (RPR).) Values for the TBL score range from zero through 100.

TBL = 12.5 x ((ATC - 1) + (ALC - 1)) = 12.5 x (ATC + ALC - 2). This formula creates values ranging from 0 to 100, because ATC and ALC each range from 1 through 5, a spread of 4. The (- 1) terms correct for the fact that the rating scales in the Peer Assess Pro Survey use a scale of 1 through 5.
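As a quick illustration of the arithmetic (hypothetical ratings only, not drawn from any class), the calculation can be sketched in a few lines of Python:

    # Sketch only: Team Based Learning score (TBL) from hypothetical
    # peer-rating averages on the survey's 1-5 scale.
    def tbl_score(atc, alc):
        """Return TBL on the 0-100 scale from ATC and ALC (each 1 to 5)."""
        return 12.5 * ((atc - 1) + (alc - 1))

    print(tbl_score(4.2, 3.6))  # 12.5 x (3.2 + 2.6) = 72.5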

Team Based Learning Index

TBLI

The Team Based Learning Score (TBL) indexed so that the person in the team with the highest TBL Score is awarded a TBLI of 100. All other team members receive a proportionally lower TBLI in the ratio TBL / TBLmax.

TBLI = 100 x TBL / TBL_max
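For example (hypothetical figures), if the highest TBL score in a team is 80, the student holding that score receives a TBLI of 100, while a teammate with a TBL score of 60 receives a TBLI of 100 x 60 / 80 = 75.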

Team Result

TR

The result awarded to the team for the outputs of their work. The Team Result (TR) may derive from grades from team reports, presentations, and results of Team Readiness Assurance Tests. When combined with a student's Team Based Learning Score (TBL), the Team Result is used to calculate a variety of Personal Results (PR) for each student, reflecting their specific relative contribution to the Team Result.

Measures of a student's personal result

Personal Result

PR

A student's personal result gained from combining their Team Result and Team Based Learning Index (TBLI). The teacher can select from several methods to calculate the Personal Result, including: Indexed Personal Result (IPR), Normalised Personal Result (NPR), and Rank-Based Personal Result (RPR). Figure 1 illustrates the statistical features, such as team average, range, and standard deviation, associated with each method.

Indexed Personal Result

IPR

A student's Personal Result calculated from the Team Result (TR) and the student's specific Team Based Learning Index (TBLI). The Indexed Personal Result method awards the TOP RANKED student (TBLI = 100) in the team the full Team Result. All remaining students in the same team earn a proportionally lower share of the Team Result. The Indexed Personal Result calculation implies that NO team member can earn an Indexed Personal Result greater than the Team Result. Values for the Indexed Personal Result range from zero up to the Team Result. Optional feature: the tutor can award a bonus mark to the Indexed Personal Result of the top ranked student in each team. The teacher designates the bonus mark on the DASHBOARD of the Peer Assess Pro Analyser.

IPR = TBLI x TR / 100
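For example (hypothetical figures), with a Team Result of 70, the top ranked student (TBLI = 100) earns IPR = 100 x 70 / 100 = 70, while a teammate with TBLI = 80 earns IPR = 80 x 70 / 100 = 56.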

Normalised Personal Result

NPR

A student's Personal Result calculated from the Team Result and the student's specific Indexed Personal Result (IPR). The Normalised Personal Result method awards the AVERAGE student in the team the Team Result (TR). All remaining students are awarded a personal result above or below the Team Result depending on whether their IPR is above or below that team's average Indexed Personal Result (TeamAv(IPR)). Two features of the Normalised Personal Result are that (a) the average of the team's Normalised Personal Results matches the Team Result, and (b) the standard deviation of the team's Normalised Personal Results matches the standard deviation of the Indexed Personal Results (IPR) calculated for that team. In contrast to the Indexed Personal Result method, the Normalised Personal Result method calculates a Personal Result ABOVE the Team Result for the above-average peer rated students in the team. Values for the Normalised Personal Result range from zero to 100. Calculations that exceed this range are clipped to fit within zero to 100. Optional feature: The tutor can increase the Spread Factor (SF) from the default value of 1 to increase the spread of the results around the Team Result. An increase in the Spread Factor will maintain an average Normalised Personal Result that matches the Team Result. The teacher makes the adjustment to the Normalised Personal Result Spread Factor on the DASHBOARD of the Peer Assess Pro Analyser. A Spread Factor of 2.0 is recommended.

NPR = TR + CF(IPR), where CF(IPR) = SF x (IPR - TeamAv(IPR))
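The following short Python sketch (hypothetical figures, default Spread Factor of 1) illustrates how the Normalised Personal Result keeps the team average equal to the Team Result:

    # Sketch only: Normalised Personal Result (NPR) for a hypothetical
    # three-member team with Team Result 65 and Spread Factor 1.
    def npr(ipr, team_avg_ipr, team_result, spread_factor=1.0):
        """Return NPR, clipped to the 0-100 range."""
        value = team_result + spread_factor * (ipr - team_avg_ipr)
        return max(0.0, min(100.0, value))

    iprs = [70, 56, 49]              # hypothetical Indexed Personal Results
    avg = sum(iprs) / len(iprs)      # team average IPR = 175 / 3, about 58.33
    results = [npr(x, avg, 65) for x in iprs]
    print([round(r, 1) for r in results])  # [76.7, 62.7, 55.7]; unrounded values average 65 = Team Result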

Rank Based Personal Result

RPR

A student's Personal Result calculated from the Team Result and the student's specific Rank Within Team based on that student's Team Based Learning score. Similar to the Normalised Personal Result, the Rank Based Personal Result method awards the AVERAGE student in the team the Team Result. All remaining students are awarded a personal result above or below the Team Result depending on whether their Rank Within Team is above or below that team's middle-ranked student. Two features of the Rank Based Personal Result are that (a) the average of the team's Rank Based Personal Results matches the Team Result, and (b) the team's Rank Based Personal Results are spread over a MUCH WIDER range than under the Indexed Personal Result and Normalised Personal Result methods. In contrast to the Indexed Personal Result method, the Rank Based Personal Result method calculates a Personal Result significantly ABOVE the Team Result for the top ranked student in the team. Values for the Rank Based Personal Result range from zero to 100. Calculations that exceed this range are clipped to fit within zero to 100.

RPR = SFR x Team Result x Number of Team Members, where SFR = (Reverse Rank for student s) / (Sum of Reverse Ranks over all team members)
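As an illustration only (a hypothetical four-member team with Team Result 65), the following Python sketch shows the calculation and the clipping of values above 100:

    # Sketch only: Rank Based Personal Result (RPR) for a hypothetical
    # four-member team with Team Result 65.
    def rpr(reverse_rank, reverse_rank_sum, team_result, team_size):
        """Return RPR, clipped to the 0-100 range."""
        share = reverse_rank / reverse_rank_sum   # SFR for this student
        return max(0.0, min(100.0, share * team_result * team_size))

    reverse_ranks = [4, 3, 2, 1]   # 4 = best in team, 1 = lowest ranked
    total = sum(reverse_ranks)     # = 10
    print([rpr(r, total, 65, 4) for r in reverse_ranks])  # [100.0, 78.0, 52.0, 26.0]; 104 is clipped to 100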

Measures of a student's relative performance within a team

Rank Within Team

RWT

A student's rank in the team as measured by the Team Based Learning (TBL) score, where 1= best in team. Useful for pinpointing the extreme performers in the class as a whole, that is, rank positions of 1, and rank positions at the other extreme. A 'worst in team' ranking is not necessarily an indicator of poor student performance, particularly when a Team Result is relatively high and/or when the spread of Total Peer Score within the team is low.

Percentile Rank Within Team

PCR

The percentage of all team members in the team, including yourself, who are rated equal to or lower than yourself. 100 is the best score. 0 is the worst. Like the Rank Within Team, the Percentile Rank Within Team is useful for pinpointing extreme performers in the class as a whole, excluding consideration of the Team Results. When a class has teams with DIFFERENT numbers of team members, the poorest performers in a team will achieve a PCR of zero. If there are multiple 'best in team' ranks of 1 for a particular team, however, the top ranked students DO NOT gain a PCR of 100, but the next percentile lower! To identify the top performers, use Rank Within Team and look for those students with a value of 1 = best in team. Alternatively, students with a Team Based Learning Index (TBLI) of 100 are the best ranked in their team, irrespective of multiple top ranked team members in that team.

Attributes contributing to Average Task Contribution (ATC)

Average Task Contribution

ATC

A relative measure of the degree to which a team member has contributed to the task output of the team. Average Task Contribution is the average of all the five task contribution ratings for a particular student given by all their team members through the Peer Assess Pro Survey. Values for ATC range over the values 1 through 5, a spread of 4.

ATC = (Initiative + Attendance + Contribution + Professionalism + Ideas and learning) / 5. (A brief worked example follows the five attributes listed below.)

1. Initiative

Shows initiative by doing research and analysis. Takes on relevant tasks with little prompting or suggestion.

2. Attendance

Prepares for, and attends scheduled team meetings and class meetings.

3. Contribution

Helps the team achieve its objectives. Makes positive contributions to meetings.

4. Professionalism

Reliably fulfils assigned tasks. Work is of professional quality.

5. Ideas and learning

Contributes ideas to the team's analysis. Helps my learning of course and team project concepts.
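As a brief worked illustration (hypothetical ratings), averaged peer ratings of 4, 3, 5, 4, and 4 on the five task attributes above give ATC = (4 + 3 + 5 + 4 + 4) / 5 = 4.0.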

Attributes contributing to Average Leadership Contribution (ALC)

Average Leadership Contribution

ALC

A relative measure of the degree to which a team member has contributed to the team's leadership and group processes. Average Leadership Contribution is the average of all the five Leadership and Group Process contribution ratings for a particular student given by all their team members through the Peer Assess Pro Survey. Values for ALC range over the values 1 through 5, a spread of 4.

ALC = (Focus and task allocation + Encourages contribution + Listens and welcomes + Conflict management and harmony + Chairmanship) / 5

6. Focus and task allocation

Keeps team focused on priorities. Facilitates goal setting, problem solving, and task allocation to team members.

7. Encourages contribution

Supports, coaches, or encourages all team members to contribute productively.

8. Listens and welcomes

Listens carefully and welcomes the contributions of others.

9. Conflict management and harmony

Manages conflict effectively. Helps the team work in a harmonious manner.

10. Chairmanship

Demonstrates effective leadership for the team. Chairs meetings productively.

Attributes relating to overall recommendation

Recommendation

How likely is it that you would recommend this team member to a friend, colleague, or employer?

Rated on a 5-point scale, from 1 = Extremely unlikely, to 5 = Extremely likely

Net Recommender

A composite index combining normalised values of Recommendation and the Total Peer Score.

Rated on a normally-distributed curve, clipped to range from -100 to +100. 50 percent of the class will achieve a Net Recommender score of 0 or greater.

Measures relating to realistic self assessment

Team Based Learning Score (INDEX)

TBL (INDEX)

A measure of the extent to which your SELF assessment is matched by the assessment of the OTHER members of your team. The index is calculated from the ratio of 100 x OTHERS / SELF for your Team Based Learning Score (TBL). An index above 95 indicates you may have a tendency to UNDERESTIMATE your team contribution when perceived by other team members. Consider developing more confidence in displaying your strengths. A score between 75 and 95 suggests you realistically understand your team contribution and how your team contribution is perceived by other team members. A score between 75 and 95 is typical of about 2/3 of students in the class. An index below 75 indicates you tend to OVERESTIMATE your team contribution when perceived by other team members.
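For example (hypothetical figures), if your self-assessed TBL score is 80 and the assessment by the other members of your team yields a TBL score of 68, the index is 100 x 68 / 80 = 85, within the typical 75 to 95 band.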

Average Task Contribution (INDEX)

ATC (INDEX)

A measure of the extent to which your SELF assessment is matched by the assessment of the OTHER members of your team. The index is calculated from the ratio of 100 x OTHERS / SELF for your Average Task Contribution (ATC). An index above 95 indicates you may have a tendency to UNDERESTIMATE your team contribution when perceived by other team members. Consider developing more confidence in displaying your strengths. A score between 75 and 95 suggests you realistically understand your team contribution and how your team contribution is perceived by other team members. A score between 75 and 95 is typical of about 2/3 of students in the class. An index below 75 indicates you tend to OVERESTIMATE your team contribution when perceived by other team members.

Average Leadership Contribution (INDEX)

ALC (INDEX)

A measure of the extent to which your SELF assessment is matched by the assessment of the OTHER members of your team. The index is calculated from the ratio of 100 x OTHERS / SELF for your Average Leadership Contribution (ALC). An index above 95 indicates you may have a tendency to UNDERESTIMATE your team contribution when perceived by other team members. Consider developing more confidence in displaying your strengths. A score between 75 and 95 suggests you realistically understand your team contribution and how your team contribution is perceived by other team members. A score between 75 and 95 is typical of about 2/3 of students in the class. An index below 75 indicates you tend to OVERESTIMATE your team contribution when perceived by other team members.

Ten Attributes (Correlation)

The Ten Attributes Correlation statistically compares your SELF assessment with the assessment by OTHERS across all ten individual attributes. A perfect match is 100. However, for typical student classes, a correlation index above 45 suggests that others moderately agree with your self-assessment of your strong and less strong attributes. A score above 45 is typical of less than 1/6 of a typical class. A correlation index of less than 20 suggests that others disagree strongly with your own assessment of your strong and weak attributes.
