
Trauma Informed Communities - Learning Briefing 6
Peer learning from the Trauma Informed Work in the Communities Fund around the trauma informed approach. March 2024
Summary
This short report is based on the reflections of 9 different Trauma Informed Communities projects from the Trauma Informed Work in the Communities Fund and the Trauma Informed Work with Minoritised Ethnic Families Fund. The peers have attended two sessions with Emily Ashdown, a Trainee Clinical Psychologist working with the Integrated Trauma Informed Practice team, and are looking at evaluating through a trauma informed lens.
The work is to build upon the evaluative tools and processes that already exist within an organisation rather than duplicating or adding more evaluation. The projects have non-recurrent funding and there is uncertainty around the funding landscape across the third sector. The aim of this evaluation focus is to work collaboratively so that the evaluation supports how the impact of the Trauma Informed Communities projects is shared. Themes included:
- Evaluation as telling a story
  - Making data meaningful
  - Showing evidence, playing the game
  - Deciding what to evaluate: process, impact or outcome
- Different ways to evaluate
  - Qualitative vs quantitative
  - Standardised vs collaborative tools
- Evaluating through a trauma informed lens
  - How can we make evaluation more trauma informed using the principles?
  - Is it worded in a positive, strengths-based way?
  - Do evaluations support the idea that we are helping people to build up resilience and trying to connect them with protective factors in their lives?
- Finding a united framework to use
  - Five Ways to Wellbeing
  - Strengthening Families™ and the Protective Factors Framework
*Tools - a method of collecting data
*Framework - a model or theory
Discussion Points
Evaluation as telling a story
- Sometimes evaluation can feel like a tick-box exercise or an additional chore, but it can be really useful and meaningful depending on how we think about it, how we create it and what we do with it. If we do it for the sake of it, we can miss opportunities for growth.
- Different pressures, commissioners and organisations require different outcomes, so there is a need to consider ‘playing the game’ within the boundaries and limits placed upon us, and to use evaluation as a persuasive tool to highlight service impact when applying for grants.
- When thinking about what kind of story you are wanting to tell about your service, it can be useful to think about:
- Process (How is the programme implemented and how is it operating?)
- Impact (the effectiveness of the programme - have the goals been achieved?)
- Outcome (What are outcomes like for people who have accessed the programme?)
- Some of these outcomes might remain unchanged for people who are in really difficult circumstances. How do we make sure that the outcomes we are looking at are really meaningful?
Different ways to evaluate
Quantitative vs Qualitative
Quantitative approaches - Looking at numbers
- Facts and figures can provide a universal language for anyone.
- Data is captured quickly and cost-effectively.
- Offers opportunity for comparisons.
- Does not always capture detail or the mechanism for change: we can see that something has changed, but not why.
- It does not really allow voices to be heard, because we have decided the questions in advance.
Qualitative approaches - Looking at personal accounts
- By looking at personal accounts rather than numbers, the data is richer and can give us more context about someone’s experience.
- Allows us to hear the voices and perspectives of people accessing the services.
- These approaches take longer to implement and require more consideration in how findings are reported.
Standardised Tools vs Collaborative Tools
Standardised tools - pre-existing measures that are ready to use
- Already developed and quick to implement.
- Useful to compare results across services if other organisations/contexts use the same tools.
- Evidence-based measures that are recognised by commissioners, which lends more credibility to what you are using.
- Might not capture what you or people using the organisation find important.
- Does it have to be one or the other? Can collaborative approaches be woven in?
Collaborative tools - Measures created in collaboration with the people who are using the organisation.
- Allow the people we support and their needs to be centred in the evaluation.
- It is empowering for people to be involved in the process: What do you think is important for us to assess here? What is important for you?
- Offers an opportunity for information exchange.
- Collaborative or co-produced tools are not scientifically validated; the measures have not been tested for reliability.
- These tools are more time consuming to create.
- There is a balance to strike between a measure that has been tested for reliability and one that is relevant and created in partnership with those we are working with.
Evaluating through a trauma informed lens
- There is not one way to evaluate in a trauma informed way.
- Doing it in a trauma informed way is really going to depend on your service context and the context of the people you are supporting.
How can we make evaluation more trauma informed using the principles?

- Does the evaluation have an element of collaboration?
- Does it offer an opportunity for the people to be empowered through the process?
- Are the Cultural, Historical and Gender issues of the people we support considered in the content or the delivery of the evaluation?
- Are you transparent about how the data is being used?
- Is there trust? Do the people we work with trust how we use this data?
- Do we ask questions in a supportive, safe way?
Does the evaluation take a strengths-based approach?
- Shifting from what’s wrong to what’s strong.
- Considering how your evaluative tools are worded. Are they worded in a positive, strengths-based way? Are we thinking about how we can build upon someone’s strengths, or is the wording framed in quite a negative way?
- In the tool’s content, are we focusing only on everything that is difficult for the people we support, or are we also thinking about how strengths show up in that person’s journey through our organisation or the work we are doing with them?
- Is there an opportunity to highlight people’s strengths a bit more and get people connected to those strengths through the process?
- Do our evaluations support the idea that we are helping people to build up resilience and connecting them with protective factors in their lives?
Can evaluation capture the context rather than only individual outcomes?
- Often it is not the individual outcomes of people that are the issue; it is not something about the person that needs to be fixed or is deficient.
- Often it is people’s contexts that mean they are stuck in difficulty.
- Change does not always need to happen at the level of the person but at a wider level, and the question becomes how we address that.
Finding a united framework to use
The group looked at two frameworks: Five Ways to Wellbeing and the Strengthening Families™ and the Protective Factors Framework. The group were asked to pick a framework that would support them to collectively tell their story, while each organisation/project would use its own evaluative tools. Both frameworks could be amended to suit the needs of the group, e.g. wording could be adapted for a specific age range.
Five Ways to Wellbeing
- Reminded a grant holder of the five points in the early years curriculum - you could make a board to share yourself. Clear and achievable.
- Loose guidelines to assess where you would want to evaluate your work.
- Involving the young people (YP) in telling the story of the service - an informal and less structured way to gauge feedback from service users and see how the project has impacted them.
- This felt the simplest option, especially when already struggling for time and balancing workload.
Strengthening Families™ and the Protective Factors Framework
- Potentially a really useful framework for those who work with young people, parents and families.
- Can be amended to suit the organisation’s needs, e.g. ‘parental resilience’ could become ‘YP personal resilience’.
- Due to the wording needing to be adapted, it did not feel like this would be the simplest framework to use.
Next steps
The group will next be focusing on the tool that they use in their evaluation and applying a trauma informed lens to their chosen tool.

www.forumcentral.org.uk | Trauma Informed Communities Learning Briefing 6 | Version 1 | 04.04.24