Evaluation report:

Evaluation of an educational learning resource

Rationale

At the start of 2013, every staff member and student of Mary MacKillop College gained access to their own school-based Google account and the ability to use Google Educational Applications for teaching and learning. Staff and students are becoming multimodal learners in an ever-growing online community of learners, working not only within the College but also with schools from across the CEO Sydney diocese. A variety of online learning spaces are being utilised by staff and students to maximise learning and make it accessible anytime, anywhere, with anyone. One of these spaces is the Irene McCormack Centre (IMC) site.

This evaluation report aims to assess the effectiveness of the IMC site as an educational learning resource, as identified by students, its primary stakeholders. The report also provides a brief description of the resource, a description and justification of the evaluation instrument, the results of a pilot evaluation, and a reflection on the evaluation process.

Brief description of the resource

The Irene McCormack Centre (IMC) (Appendix A) is a newly launched concept for the Mary MacKillop College Library. The ideas underpinning the IMC concept are demonstrated and reflected in the design of the IMC site, whose target audience is students and staff. Through the IMC site, students can access engaging educational learning experiences in the online and physical ‘zones’ of the IMC. The online learning space is also designed to allow students to extend their skills and understanding of ideas covered in their subject courses, as well as to access support and resources that assist them in their learning. Learning experiences refer to the activities students engage in through the IMC zones: Collaboration, Creation, Exploration, Investigation and Reading. Each zone is named according to the type of learning activities available to students and staff. It is important to note that whilst the purpose of the IMC site is to foster motivation and encourage deep engagement with learning, the site does not measure student achievement against the outcomes of the courses students are undertaking. As such, a space for submitting assignments or accessing teacher feedback on tasks is not currently a feature of the IMC.

Description and justification of the instrument and evaluation

Gathering data from students is the important first step in evaluating the IMC site, as they are the primary users of the resource. The College is strongly focused on student-centred learning, and allowing students to evaluate this resource gives them a voice in assessing an online space designed with their learning in mind. A selected group of students has also contributed to some of the content of the IMC site. As such, the evaluation instrument reported on here has been designed for students as evaluators.

In selecting the most appropriate evaluation method, criteria for data collection were adapted from Taylor-Powell et al. (1996, p. 11), keeping in mind that the “goal is to obtain trustworthy, authentic and credible evidence that will be used” (Taylor-Powell et al., 1996, p. 6). Applying the criteria highlighted that the most effective instrument for collecting information to evaluate the IMC site is a questionnaire (Appendix B) (Harvey, 1998, p. 50). Conducted as a survey, it would allow the entire College student population to be involved in evaluating the resource and would provide sufficient data to evaluate this learning resource. This report demonstrates and discusses the findings of a pilot student questionnaire.

Set out in four parts, the questionnaire is designed to collect the most accurate information aligned with the intended purpose of the evaluation, as shown in Table 1 (Olney & Barnes, 2008, p. 6).

Evaluation question 1: How and to what extent are students engaging with the IMC site resources?

Questionnaire items (a selection of questions):

1.1 Approximately how often do you use the IMC site? (dropdown menu)
2.4 When you visit the site, which of the learning zones do you most often access? (dropdown menu)
3.1 Do you like the design of the IMC site?
3.5 Do the visual aspects of the pages draw you to investigate the site further?

Evaluation question 2: What areas of the IMC site might need to be adapted and improved to suit the students’ needs?

Questionnaire items (a selection of questions):

3.6 Is the text on the pages easy to read in terms of font?
4.1 Is the purpose of the IMC site clearly identified in the ‘About the IMC’ section?
4.2 Is the information about the Learning Advisors clear?
4.4 Is there anything you need assistance with that is not covered on the site?
4.8 Which of the tools do you find most valuable for your study? (checkboxes)
4.29 Are there any additional resources you would like to see added to this space?
4.30 Are there any resources listed that you feel are not needed?

Evaluation question 3: Does the IMC site foster motivation and collaboration and, if so, to what extent?

Questionnaire items (a selection of questions):

2.7 Does the site motivate you to learn?
4.5 Are you able to collaborate on the IMC site?
4.17 Is there anything you would like to suggest about this zone?

Table 1: Evaluation questions mapped to selected questionnaire items

The questionnaire is structured so that it is accessible to all student participants in terms of the questions asked, the language used and ease of completion (Taylor-Powell et al., 1996, p. 11). It is set up as a Google Form and collects students’ electronic identification automatically upon completion. This allows data to be gathered from many participants while still providing an overview of the educational resource being evaluated, and it allows the data to be analysed efficiently (Taylor-Powell et al., 1996, p. 11). To gauge whether students offer accurate responses, questions have been phrased so as to highlight any discrepancies.

It can be expected that not all students will complete the fifteen-minute questionnaire. However, delivering the questionnaire at a suitable time, when students are not busy with assessment preparation (Harvey, 1998, p. 10), will support a good response rate as well as students’ full attention and focus. Asking students to complete the questionnaire during their Connected Learning Circles, a designated time during which similar procedures already take place, would be beneficial in gathering accurate data and would also be the least disruptive option given the school structures and student learning (Taylor-Powell et al., 1996, p. 11). Regarding social and cultural factors that may influence data collection, an electronic questionnaire, rather than a paper copy, is a process already familiar to students (Taylor-Powell et al., 1996, p. 11). The rationale for conducting this evaluation, set out at the start of the questionnaire, is phrased in a positive manner, encouraging students to contribute to improving a resource that will assist their learning.

Evaluation results

For the purpose of this pilot study, eight students were randomly selected and sent a personalised email a few days before the link, asking them to participate. A reminder email was sent one week later (Appendix C). Six students completed the questionnaire anonymously, a 75% response rate. A similar rate could be expected if the questionnaire were sent in the same way to all MMC students, and this is regarded by researchers as an acceptable standard (Olney & Barnes, 2008, pp. 10-11).

Questions were set out in a way that allows for both quantitative and qualitative data collection, which resulted in more substantive data being obtained (Olney & Barnes, 2008, p. 1). The numerical data produced clear graphs and pie charts, generated automatically through the Google Forms spreadsheet (Appendix D). The qualitative pilot data generated generous feedback rather than simple ‘No’ answers (Figure 1). Each section concludes with an open-ended feedback question so that students have an opportunity to raise anything pertinent not covered by the other questions. This data is collected as a list of suggestions and coded according to authenticity (i.e. ‘No’ responses can be discarded when formulating a summary, but not ignored in terms of quantity).

Figure 1: Sample feedback question from Part 4 (Reading Zone) responses (extracted from the Google Forms summary of responses - Appendix D)
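To illustrate the coding step described above, the sketch below is a hypothetical Python example only: the pilot coding was done manually from the Google Forms spreadsheet, and the file name, column heading and list of ‘No’-style phrases are assumptions. It separates substantive suggestions from ‘No’-style answers while still retaining a count of both.

import csv

# Answers treated as 'No'-style responses: excluded from the summary,
# but still counted so the proportion of such answers is not lost.
NO_STYLE = {"no", "n/a", "none", "nothing", "-"}

suggestions = []      # substantive feedback to be summarised
no_style_count = 0    # 'No'-style answers, counted only

# Hypothetical CSV export of one open-ended feedback question.
with open("reading_zone_feedback.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        answer = row["Is there anything you would like to suggest about this zone?"].strip()
        if not answer or answer.lower() in NO_STYLE:
            no_style_count += 1
        else:
            suggestions.append(answer)

print(f"Substantive suggestions: {len(suggestions)}")
print(f"'No'-style responses: {no_style_count}")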

Part 1 identifies students who never use the site, providing data on how many students this applies to and their reasons for not accessing the site. As there is a danger that some students might choose ‘never’ simply to finish the questionnaire quickly, this choice does not skip them to the end; the instrument is designed to continue through the remaining sections regardless. The data shows that participant 6 continued to give answers and optional qualitative feedback even after answering that they never visit the site (Appendix D, 2). For accuracy of the evaluation, these answers can be omitted from the analysis: this might be the first time the student has visited the site, so they are not familiar enough with the resource and their answers would not necessarily generate reliable data.

Participants who answered ‘yes’ in Part 2, on accessibility, provided data determining which areas of the site students engage with and how they access it. To evaluate the extent of their engagement and understanding of the resource, a Likert rating scale was used in selected questions. Mixing up the types of questioning was valuable in maintaining students’ interest in continuing to provide focussed and accurate responses. The data shows that whilst students are accessing the resources and are able to contact Learning Advisers, 60% of them do not find the site very motivating (Figure 2).

Part 3 dealt with the presentation of the site, as a means of determining whether aesthetics influence student engagement. The data indicates that there is a link, as shown particularly by question 3.4 (Figure 3). Part 4, the largest section, dealt with content and its relevance to student learning, to the material required for study and to the extension of skills and understandings. Divided into learning zones as per the site organisation, it provided an overview of each zone, how students engage with the material and how these pages may be improved to meet students’ learning needs.
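As a sketch of how a single rating-scale item could be summarised outside the automatic Google Forms charts, the following is illustrative only: the file name, column heading, and the assumption that item 2.7 was answered on the Likert scale are not drawn from the actual export. It tallies each rating and its share of responses, the kind of percentage reported above.

import csv
from collections import Counter

# Hypothetical column heading; whether this item used the Likert
# scale is an assumption made for illustration only.
ITEM = "2.7 Does the site motivate you to learn?"

# Hypothetical CSV export of the pilot responses spreadsheet.
with open("imc_pilot_responses.csv", newline="", encoding="utf-8") as f:
    ratings = [row[ITEM].strip() for row in csv.DictReader(f) if row[ITEM].strip()]

counts = Counter(ratings)
total = len(ratings)
for label, count in counts.most_common():
    # Print each rating label with its count and percentage of responses.
    print(f"{label}: {count} ({count / total:.0%})")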

In addressing the intended purpose of the evaluation, it can be concluded that:


Figure 2: Accessibility - Appendix D

Figure 3: Presentation - Appendix D

Reflection

The evaluation results show that emailing an appropriately timed, well-phrased and easily accessible questionnaire to students is an effective method of collecting data. Looking at the quality of the data collected in relation to the purpose, the instrument successfully collected relevant data that can be analysed and used to improve the IMC site and address the gaps identified by students.

Some technical aspects of the instrument need adjusting to improve efficiency and completion time. Multiple-choice buttons for Yes/No answers would be more effective than the current dropdown menus. The Resources section is also missing a ‘none of these’ option, which may have caused inaccurate results; judging from the answers collected, this was not the case in this pilot, but the option would need to be added for a future large-scale evaluation. Moving through the questionnaire, the content section may become monotonous for some respondents, as there are possibly too many items listed in some of the questions that allow multiple checkboxes. Questions could be rephrased to allow for ‘yes/no’ answers where more than six items are listed, as respondents may not consider every item in a long list (Olney & Barnes, 2008, p. 7).

In Part 2, clearer questions are needed. Questions 2.1 and 2.2 ask similar things, yet their responses deliver conflicting data. Question 2.1 asks if the site is well organised, with 40% of responses ‘agree’; question 2.2 asks if the site is easy to follow/navigate, with an equal percentage ‘disagree’ (Figure 4). It could be argued that students found the site well organised overall but did not know what the specific sub-sections referred to; yet one could equally conclude that a well-organised site should be easy to navigate. Because the responses were consistent in what they measured but missed the intended aspect of the site, this data can be considered reliable but not valid (Trochim, 2006).

Figure 4: Part 2 of the questionnaire - Accessibility - Appendix D

Student and staff involvement in the creation and evaluation of the IMC site is ongoing. Systems are in place with the IMC team to ensure strategic change takes place as needed in order to keep students engaged and motivated (Oliver, 2000, p. 21). The evaluation will therefore be repeated following major changes to the site, aligning resources with students’ needs. A teacher evaluation will also be conducted. This might show a very different set of data, in which case evaluators will need to consider the reasons for the differences and devise a strategy for aligning students’ and teachers’ views. The IMC site uses technology to support innovation in teaching and learning (Oliver, 2000, p. 3) and to embed it in the learning experiences it facilitates.

(1991 words)

References

Harvey, J. (Ed.). (1998). Evaluation cookbook: A practical guide to evaluation methods for lecturers. Edinburgh: The Learning Technology Dissemination Initiative. Retrieved 20-03-2013, from www.icbl.hw.ac.uk/ltdi/cookbook/cookbook.pdf

Oliver, M. (2000). An introduction to the evaluation of learning technology. Educational Technology & Society, 3(4), 20-30. Retrieved 05-04-2013, from http://www.ifets.info/index.php?http://www.ifets.info/issues.php?id=16

Olney, C. A., & Barnes, S. (2008). Collecting and analysing evaluation data. Planning and evaluating health information outreach projects booklet. Retrieved 10-04-2013, from http://www.pol.ulaval.ca/perfeval/upload/publication_231.pdf

Reeves, T. C. (1997). Evaluating what really matters in computer-based education. Retrieved 10-04-2013, from http://www.eduworks.com/Documents/Workshops/EdMedia1998/docs/reeves.html

Taylor-Powell, E., & Steele, S. (1996). Collecting evaluation data: An overview of sources and methods. Program evaluation and development. Retrieved 25-03-2013, from http://learningstore.uwex.edu/pdf/G3658-4.pdf

Taylor-Powell, E., Steele, S., & Douglah, M. (1996). Planning a program evaluation. Program evaluation and development. Retrieved 25-03-2013, from http://learningstore.uwex.edu/pdf/G3658-1.PDF

Trochim, W. M. K. (2006). Reliability & validity. Retrieved 12-04-2013, from http://www.socialresearchmethods.net/kb/relandval.php

Appendix A: Irene McCormack Centre Google Site

Appendix B: Questionnaire

Appendix C: Email to students

Appendix D: Evaluation Results

EDPC 5012 - Assignment 1: Evaluation Report. Alma Loreaux