JISC Developing Digital Literacies - Evaluation REALITY-CHECK Task
About this checklist

The questions below were used for the Jisc Developing Digital Literacies programme. They may be helpful to others as a reality-check exercise for clarifying and ratifying an evaluation design. Reflecting on where you are with evaluation, both at the start of a project and mid-way through, is a useful exercise to do as a team and with stakeholders.

A clear picture of evaluation designs is of practical benefit to projects (i.e. overall methodology and specific tasks & timelines: what, who, where & when). The Jisc DesignStudio provides a number of resources to assist in planning evaluation activities across a project timeline.

This checklist should help you in: (1) identifying relevant types of evidence - early/final key indicators, intended/unintended, direct/indirect outcomes, final impact measures; and (2) planning evaluation activities that gather and analyse that specific evidence in a variety of forms (surveys, interviews, usage stats, video responses, etc).

You can bookmark and return to this form at: http://bit.ly/dlevalform

Jay Dempster: email: jay@belanda-consulting.co.uk
---------------------------------------------------------------------------------------------------------------------------------------------

PROJECT NAME *
For reference to answers :-)
Your answer
Your email address *
A copy of your responses will be emailed to you prior to the webinar.
Your answer
Q1. Briefly describe your approach to evaluating your DL project. *
What's needed here is not a lot of detail or tables, just an outline of your overall methodology/design and/or the stage you feel you are at.
Your answer
Q2. In what ways have you used the DL synthesis framework or evaluation templates offered by the programme, and/or created your own tools/approaches? *
For example, you may have developed your evaluation design or plan in each of the six focus areas (strategy/policy, infrastructure, support, practice, expertise, stakeholders); some areas may be more relevant to your project and its evaluation, or you may have extended or adapted these tools or those from other programmes, institutional audits, skills/competence/graduate frameworks, etc.
Your answer
Q3. What are your 'big picture' questions in relation to your intended changes/enhancements? *
You should refer to your project aims, objectives & impact descriptions in your most recent project plan. Clarify what and who you are aiming to change or enhance. You may need to distinguish questions for your different stakeholders (who is most interested in/concerned about these areas?).
Your answer
Q4. How far along are you in identifying what 'success' looks like in relation to those intended outcomes? *
Please say where you are with creating/sharing a clear set of key indicators and impact measures at interim and final stages. For instance: what kinds of early evidence of progress towards outcomes do you expect and have plans to gather? What processes do you have in place to capture anecdotal/unintended outcomes (blogs, user diaries, observation, focus groups, note-taking at stakeholder events)? And how are you engaging with stakeholders to review, respond and come to conclusions about emerging findings, benefits and impact measures that are interesting and convincing to them?
Your answer
Q5. How well do you feel your evaluation activities integrate with other project activities and gather data about uptake and engagement levels? *
E.g. do you have work-package tasks that detail specific evaluation activities, including the above as well as other impact measures such as usage/participation/satisfaction data? Does your timeline show clearly the linkages/opportunities to integrate evaluation with baselining, development work, implementation/pilots, stakeholder consultation/dissemination, etc.?
Your answer
Q6. What further support do you feel you need or would benefit from in relation to designing, planning out or undertaking your project's evaluation? *
For instance, a preference for wiki resources/toolkit, webinars, cluster sessions, individual Skype conferences, or identifying external evaluators.
Your answer
Q7. Are there any other points you'd like to raise about evaluating your project?
Anything that hasn't come up in these questions.
Your answer
Q8. Finally, reading through your answers above, what are the critical elements you feel you may need to pay closer attention to? (I suggest you put some ideas down now, and revisit this as an action point after the webinar.) *
For instance, you might wish to adjust the scope or focus/priorities of your evaluation questions; review evaluation milestones or timeframes for identifying/gathering different kinds of quantitative and qualitative evidence; consider alternative methodologies & data gathering or analysis needs; align evaluation with other stakeholder engagement activities as opportunities to ask about types of evidence that would be relevant and convincing to them; discuss early findings; review project effectiveness & communication; or consult on emerging outcomes and final impact measures.
Your answer