A. Problem statement
Feedback during and after assessment tasks is critical for effectively promoting student learning. Without feedback, students are limited in how they can make judgements as to their progress, and how they can change their future performance. Feedback is the linchpin of students' effective decision making, and the basis of improved learning outcomes. However, feedback is under-utilised and often misunderstood by both students and educators. This project is about improving student learning (and experience) by improving institutional, educator, and student capacity to stimulate and leverage assessment feedback.
The aim of this project is to improve student learning and experience by improving the way in which the Australian Higher Education sector enacts feedback. Our approach will deliver a pragmatic, empirically based framework of feedback designs to guide educators, academic developers and instructional designers, as well as institutional policy. This will be supported by large-scale data highlighting patterns of success, and 7 rich cases of feedback designs to demonstrate how that success can be achieved. In addition, this project will increase the likelihood of adoption through a series of dissemination activities, including national workshops built on a participatory design approach.
This project has been designed so that each of the above four challenges is aligned with one of four phases of project activity, each with specific deliverables. From the outset, issues of sustainability, impact and dissemination have been considered in the project design. This project is ambitious but achievable. The project team has (a) substantial expertise in the field of feedback, (b) considerable experience in conducting high-impact large-scale projects, and (c) a track record of working together. The deliverables are:
[* Note: due to the time frames of writing and peer review, these publications are unlikely to fall within the phase period. However, journals/publishers with early release or open access options will be preferred. The other academic outputs in this section should be read in a similar way.]
The following IMPEL key questions and matrix also include responses to the D-Cubed Model for dissemination. This section should be read in conjunction with section B in relation to specific outputs that also feed into the dissemination strategy.
This response builds on the arguments around sector need and readiness in Section A. In brief, feedback is generally recognised as potentially valuable but highly variable in quality and impact, with resulting student dissatisfaction. As a consequence, all Australian universities, to our knowledge, seek to improve their approach to feedback. Universities need a way to systematically plan for effective feedback at the institutional, educator and student levels. Examples of the sector's appetite for empirically sound, pragmatic approaches are included in Section A, including the fact that a single research paper by Henderson and Phillips resulted in invited presentations to 7 faculties/divisions and 5 other universities. Consequently we expect (and are experienced in leveraging) considerable interest from across the sector in the outcomes of this project.
A project of this size and national importance is driven by a desire to have an impact on the sector as a whole. As described above, there is a climate of readiness. However, the framework and other outcomes of this project are unlikely to provide simple single-step 'fixes' to the critical problem of feedback in Higher Education. Consequently, this project is designed to invest considerable time in working with stakeholders, particularly in the Phase 4 participatory design workshops (please see Question 3 of this section for more details about strategies to engage stakeholders). These are seen as being particularly powerful in supporting participants (academic developers, leaders in Learning and Teaching divisions and other key stakeholders) to plan how the framework can be implemented within their ecologies.
The IMPEL matrix below offers an indicative plan for impact.
Anticipated changes (projected impact) at:

| IMPEL level | Project completion | Six months post-completion | Twelve months post-completion | Twenty-four months post-completion |
|---|---|---|---|---|
| 1. Team members | · Increased inter/national profiles through project activities – with increased attention on the project, and opportunities for invited presentations, papers, awards, policy advice, etc. · Improvements to own feedback practice | | | |
| 2. Immediate students | · Improved learning & satisfaction through enhanced feedback designs · Improved student capacity for seeking and acting on feedback. | · Improved learning & satisfaction – via refined feedback designs. [Also provides ongoing comparative data for the project, which supports ongoing impact through future dissemination] | | |
| 3. Spreading the word (contributions to knowledge in the field) | · Dissemination of outputs, including infographics, videos, and professional learning resources via social media, direct email, media coverage, etc. · 2 articles published – peer recognition of validity of findings (see Outputs) · 2 conference presentations – to expand influence (see Outputs) | · 2 articles published – to demonstrate peer recognition of validity of findings (see Outputs) | · Additional journal and conference papers arising from ongoing data analysis. | |
| 4. Narrow opportunistic adoption (at participating institutions) | · Adoption of the framework and use of resources by educators, academic developers and leaders who participated in the study | · Presentations to University & Faculty leading to adoption of practices and policy changes · Improved student learning and experience. Improved student capacity for seeking feedback and self-regulated learning. | | |
| 5. Narrow systemic adoption (at participating institutions) | · Policy change in the Faculty of Education, Monash [establishing feasibility] | · Policy change within Faculties at Monash and Deakin. | · Policy changes & funded initiatives at institution level at Monash and Deakin. | |
| 6. Broad opportunistic adoption (at other institutions) | · Changed practices of up to 180 academic developers, instructional designers and lecturers from all Australian universities who participated in the Phase 4 workshops across 6 capital cities. | · Snowballing impact from the workshops – esp. driven by participants in academic development and leadership roles – leading to improved student learning and experience, and improved student capacity for seeking feedback and self-regulated learning. This in turn feeds into broad systemic adoption. | · Take-up of practices through dissemination of findings, e.g. via conference presentations | |
| 7. Broad systemic adoption (at other institutions) | · Raising of awareness of the framework and policy implications via the Phase 4 leadership webinar conducted with up to 80 leaders and managers from all 39 Australian universities. | · Raising awareness through the reference group and media releases to peak bodies such as CADAD, ACODE, etc. · Policy changes and funded initiatives. | | |
This project has been carefully designed so that dissemination strategies are an integral component of the activities and outputs of each Phase (please read the below in conjunction with Sections B and D).
Each phase of our project requires us to have substantial engagement with students, educators, professional staff, and senior leaders. These groups are key informants, but we also see them as key audiences for our work. A portion of the budget has been assigned to support this engagement, for example incentives to participate in the survey, focus groups and case studies. The Phase 4 participatory design workshops, and to a lesser extent the leadership webinar, are designed to be a particularly effective mechanism for influencing the 'change enablers' across the nation.
In each Phase of the project we also intend to engage with the broader HE community (i.e. beyond our project participants) through dissemination of our findings, particularly in 'easy to digest' forms such as infographics, media releases, and "first glimpses of the data".
A further key strategy involves the reference group, as well as the external evaluator, further disseminating and drawing attention to the project through their extensive networks.
Finally, the project team is highly accomplished at, and deeply invested in, the dissemination of findings through their networks, in particular through their regular attendance at national and international conferences such as HERDSA, AARE, and ASCILITE. The team will produce at least two conference papers during the project, and four journal articles/book chapters in relation to each phase.
Our outputs will be designed with reuse and adaptation in mind. We will ensure that the source documents for our work are readily available so they can be embedded in institutional and departmental programmes. University teachers, academic developers, committees and leaders each have different needs and ecologies within which they work. The participatory design workshops are a particularly powerful strategy: they disseminate our findings while also providing a way in which participants can be supported to adapt them to their own needs (with the further benefit of these outcomes 'feeding back' into the project outcomes to provide further examples of adaptations that other institutions could find helpful).
We will also provide ready-to-use continuing professional development resources, tested through the 6 participatory design workshops, that can be used by individuals, teams and academic developers to build capacity to improve feedback.
The risk to this project is a continuing investment in old conceptions of feedback as merely "giving comments to students", with resistance to what may be perceived as time-consuming and complex alternatives. However, this project is designed from the outset to address the issue of why feedback designs in one context may not be successfully adopted in another. Phases 2-4 specifically seek information from students, educators, academic developers, managers and senior leaders about the conditions for success and the barriers involved at all levels. This project is designed to address the pragmatic need for effective feedback designs by raising awareness of the conditions that need to be in place. We will have a particular focus on feedback designs that operate within standard learning and teaching workload models. It is important to recognise how we can work within (rather than ignore) constraints such as workload, budget, and time.
We will record and track each engagement with the project participants and wider HE community, enabling us to confidently measure our 'immediate reach'. We will also track the volume of website visitors, as well as the views and downloads of specific resources (e.g. videos, infographics, and professional development PowerPoints). These are in addition to familiar analytics such as academic article downloads, citations, conference attendees, workshop attendees, invited presentations, media interviews, etc. We will also email all of the Phase 4 workshop participants two months after their participation, seeking information about the traction of any outcomes (changes to policy, training, practices, plans, etc.). Tracking these metrics will be a part of the Project Manager's role.
Our website with its resources will be hosted for at least five years. It will be maintained and updated during that five-year period by the Learning with New Media Research Group, under the supervision of A/Prof Henderson. The website will be updated with relevant academic articles and sector initiatives as they become known to the project team. In addition, the website will feature social media streams (e.g. Twitter), which will help establish the site as a 'go-to' location for feedback-related designs.