This spreadsheet was created by Lime Green Consulting. Please check the data independently before using it yourself.

| Funder | Checked | Source of info | Overall stance on using AI | Summary of their overall position and advice | Pros | Cons | Are they using it to assess applications? | How are they using it? |
|---|---|---|---|---|---|---|---|---|
| Arts Council England | 10/12/2025 | Website statement | Generally negative | Use with caution | We understand that AI technologies may be useful in streamlining reports, and/or the process of applying. | Generative AI tools create potential risks around bias, transparency, data protection and the moral and legal rights of creators. We are also increasingly aware that the use of these tools can lead to the submission of applications with similar language and text, which can distract from an application’s unique proposition. | No | Assessment of Arts Council England funding applications is carried out by Arts Council staff, who are experts in their field. We do not use AI technologies in carrying out the assessment, or decision-making on applications. |
| Baring Foundation | 10/12/2025 | PDF guidance notes | Neutral | A decision for individual organisations to take and we have no preference. | None cited | [Need to] make sure it is an accurate reflection of your work and plans. | Not currently but may review | The Baring Foundation does not currently use AI to aid its decision-making about grants or in our administrative processes. We recognise that this is a fast-moving area of technology and our approach may need to shift as the possibilities and implications of AI become clearer. |
| Barts Charity | 10/12/2025 | PDF guidance notes | Generally negative | Use with caution | None cited | [Applicants need to] protect confidential information and personal data; consider the risks when using AI tools with unpublished research, original intellectual property, commercially sensitive findings; consider the risk of bias; remove any false, misleading or hallucinated information. | For some purposes but not assessment | Whilst Barts Charity may use AI tools to support our processes, we do not use it when deciding whether to award funding. Staff may use generative AI tools to assist in the administration of grant applications, in creating content to promote funded grants, and when writing proposals for fundraising. |
| Children In Need | 10/12/2025 | Included in their A to Z Policies and Guidance | Generally negative | Use with caution and carefully review the output. | We understand [AI] can support organisations writing applications. | AI often produces generic content without capturing your organisation’s unique skills, experience and the voice of the children and young people you work with. If using AI generated content in your application, it is vital that you carefully review it to ensure it is an accurate representation of your organisation and the work that you do. Be careful what information you input into AI tools, to not compromise your legal obligations in relation to confidentiality, personal data, intellectual property or similar. We also recognise the environmental impact of AI and encourage applicants to consider sustainable and responsible AI use. | For some purposes but not assessment | Whilst we may use AI tools to support our funding processes we do not use AI for decision making at BBC Children in Need. |
| City Bridge Foundation (specifically for their Access to Justice programme) | 10/12/2025 | PDF guidance notes | Neutral | Use AI ethically and transparently. AI should support, not replace, your thinking, expression, and lived experience. | AI tools can be a helpful way to improve clarity, structure, or grammar; translate content, adapt language, or rephrase for readability; support early-stage drafting, provided that the final content reflects your organisation’s voice. | It’s often noticeable when AI has been over-relied on. Using AI does not give you the best chance of success. Such applications can feel generic and may not bring out your organisation’s voice, context or uniqueness. [Applicants should] avoid submitting content that is generic or detached from your values; check for accuracy (AI tools can invent data, produce false or misleading information, and incorrectly reference); protect privacy; comply with data protection, copyright and intellectual property law. | Unknown | |
| Coop Foundation (specifically for their Future Communities Fund Round Two) | 10/12/2025 | PDF guidance notes | Generally negative | Whilst we will consider applications that have been partially generated using AI, we recommend that you consider the following [see column F]. We reserve the right to reject any AI-generated applications if we have concerns around their factual accuracy. | AI is a powerful tool that can help organisations to work in more efficient ways. We recognise that AI tools can support individuals who have little to no experience of writing funding applications and can remove barriers for people with disabilities and individuals with English as an Additional Language (EAL). | Applications generated using AI are often generic and fail to capture an organisation’s unique knowledge, value, distinctiveness, skills or impact. This can make it difficult for us to understand what is different or special about your organisation and the work that you deliver. Being too generic may disadvantage your application. AI can generate inaccurate content which can undermine the strength of your application. [Other concerns mentioned are storage of data; ethical challenges (lack of regulation; use of data and imagery without consent; and in-built biases which can reinforce discrimination); and water and energy requirements.] | Not currently but may review | Co-op Foundation will not use generative AI tools in any part of the assessment or decision-making process for this funding programme. As a funder, we are still exploring and learning more about the potential implications of AI tools. As such, in the future, we may update our position on the use of these tools. |
| Esmee Fairbairn Foundation | 10/12/2025 | Website statement | Neutral | No preference - a decision for individual organisations. If you use AI to help draft applications, make sure it’s an honest reflection of your work and plans. | None cited | None cited | For some purposes but not assessment | Using AI tools to support funding processes, not for decision-making. Using it to streamline processes, facilitate better communication and collaboration, improve accessibility. Examples include notetaking and live transcription. |
| Forever Manchester | 10/12/2025 | Website statement | Neutral | Use with caution - as a starting point for drafting and formatting, but check carefully | We know that AI tools can be a huge time-saver for grassroots community groups and charities when it comes to writing applications, and we recognise how useful they can be. They have the potential to level the playing field for smaller groups and charities who don’t have paid staff or fundraisers to help. | Using AI to write a funding application can give you a useful starting point but often what it produces for you is not as strong as it might first appear. AI-generated applications can sometimes be generic in the answers they give and can sometimes make things up entirely. It may not capture the personality, knowledge and strengths of your organisation. [They also flag concerns with safety/security and environmental impact] | Unknown | |
| Henry Smith Foundation | 10/12/2025 | Included in their FAQs | Neutral | Use AI thoughtfully to support – not replace – your unique voice. Let it help with your structure, but write the words yourself. | AI tools, like ChatGPT, can be powerful aids, offering new ways of working that save time, improve accessibility, and support creativity. | Be sure not to share any sensitive or personal information when using AI. We have noticed that AI-generated applications can include incorrect facts. If you use AI for research, we suggest using tools that provide links to their sources, like Perplexity. This allows you to double-check the information you are using and make sure it’s from reputable sources. | Unknown | |
| John Lyon's Charity | 10/12/2025 | Website statement | Neutral | You can use it, but remember the most important elements of a good grant application. | AI can bring benefits, improve efficiency, and support those who are not familiar with, or face additional barriers in, writing grant applications. | However, AI-generated text can be generic, inaccurate, and may not fit [our] grant guidelines, which could affect your chances of receiving a grant. As a relational funder, John Lyon’s Charity often visits grantees, and we expect your written submissions to reflect the experience we would have when visiting you in person. | No | Does not currently use generative AI tools in any part of the assessment or decision-making process. |
| Joseph Rowntree Foundation | 10/12/2025 | Website statement about the use of AI in policymaking (NOT funding applications) | Strongly negative | Adopting AI for public policymaking would be to submit policy to wider corporate and ideological agendas. | None cited | Despite its claim to assist or even automate the production of policy solutions, AI is incapable of addressing the tricky structural issues that impede actual change. What it offers instead is a performative ‘fix’ that both obscures and amplifies the underlying problems. [The main issues explored are] Correlations and hallucinations, Scaling injustice, Unsustainable solutionism | Unknown | |
| Kent Community Foundation | 10/12/2025 | Website statement | Neutral | Use these tools responsibly and ethically | Digital and AI tools and technologies are becoming increasingly accessible and have the potential to save time for charities in many ways. Technology should be used to support you, giving you prompts and ideas, speeding up the process of editing. | Generic applications created by AI alone will not provide the information we need to assess a funding application. [...] we are not looking for perfect spelling and grammar; what matters most is that we clearly understand what you are trying to achieve with the funding you are applying for. | Not currently but may review | We do not currently use AI as part of our assessment process and we review every individual application. If this changes, we will tell you. |
| Leeds Community Foundation | 10/12/2025 | PDF guidance notes | Generally negative | Use with caution | AI can help people to communicate their own thoughts and ideas, particularly for those for whom English is an additional language or those who face barriers to making written applications. [Identical wording to London Community Foundation] | In most cases of AI generated answers we have seen to date, they’ve made the application or report worse. Answers tend to lack depth, local context and include inappropriate use of jargon. In applications, this has made the proposals seem less feasible and poorly aligned with the aims of the fund, making them unlikely to meet the required quality threshold to be fundable. | Not currently but may review | Currently, our staff don’t use AI software anywhere in the grant making process, and we will inform our stakeholders if this changes. |
| Lloyds Bank Foundation | 10/12/2025 | Website statement | Generally negative | We understand if you want to use AI to support you, but be mindful of some key considerations. | There are some benefits to using AI, from increased productivity to removing barriers for Disabled people and those whose dominant language is not English. | As a funder and an employer, we are seeing a trend of over-reliance on AI in applications. It is often noticeable when AI has been over-relied on and, in many cases, this is not giving applicants the best chance of success. These applications are often generic, and do not bring out your own voice or uniqueness. [Applicants need to] Avoid generic responses; Look out for AI inaccuracies; Protect your data; Consider the impact on our environment. | No | We read every single funding and job application we receive and do not use AI in any part of our decision making. |
| London Community Foundation | 10/12/2025 | Website statement | Neutral | Be cautious when using AI to generate answers to questions in grant applications and reports. | AI can help people to communicate their own thoughts and ideas, particularly for those for whom English is an additional language or those who face barriers to making written applications. | If using AI generated answers, make sure the final submission includes sufficient depth, relevant local context, and avoids inappropriate use of jargon. | For some purposes but not assessment | We have recently established an AI working group, and [...] are beginning to trial AI tools to streamline internal processes, such as AI-powered notetaking. However, AI tools are not currently influencing any grant decision-making, and we will inform our stakeholders should this position change. |
| National Lottery Community Fund (also used as the basis for the National Lottery Heritage Fund policy) | 10/12/2025 | Website statement | Neutral | Use with caution | AI tools can support you if English is not your first language or if you’re new to writing funding applications. Many of our customers find that using AI helps them write their applications faster and with less effort. | AI supported applications do not tell the unique story of your community and how you want to support them. Being too generic in content may disadvantage your application. AI tools often produce generic content or include buzzwords that don’t capture your unique perspective or your community’s voice. [Other advice:] Look out for inaccuracies, Your data might not be private, AI has an environmental impact | Unknown | No info, but they have published a short document summarising their 10 principles when using AI |
| Paul Hamlyn Foundation | 10/12/2025 | Website statement | Generally negative | Use AI tools responsibly, following relevant legal and ethical standards | Generative AI can save time, and it can make processes more accessible, such as by helping with translation or transcription. | They provide detailed information about "Concerns about AI" broken down by Ethics (e.g. in-built bias, lack of regulation, human/environmental cost), Uniqueness and Accuracy. | For some purposes but not assessment | We do not use AI to assess grant applications or job applications. Some of our administration processes are supported by AI – for example, using transcription services and meeting summaries. We may also use tools that identify the use of AI in your grant and job applications. These are used to provide additional information to staff rather than acting as a replacement for any previous ways of gathering information, assessing applications, and making decisions. |
| Suffolk Community Foundation | 10/12/2025 | PDF guidance notes | Neutral | Use with caution to ensure applications remain authentic, detailed, and impactful | AI tools are increasingly shaping the way we work, offering efficiency and accessibility | Inaccuracies, data privacy, environmental impact | Unknown | |
| UFI VocTech Trust | 10/12/2025 | Website statement | Generally positive | Organisations should provide guidance to their staff on how to use AI responsibly, and be honest, transparent and accountable about its use in funding applications | AI is a powerful tool that, when used effectively and ethically, has the power to enhance human capabilities and unlock human potential. | They emphasise the need to protect confidential information and personal data; remove any false, misleading or hallucinated information; be aware of third party intellectual property rights. | Unknown | We will respect confidential information and will declare any use of generative AI in the administration of grant applications. |
| Wellcome Trust (as part of the Research Funders Policy Group) | 10/12/2025 | Website statement | Neutral | Use responsibly and in accordance with relevant legal and ethical standards where these exist or as they develop | AI tools present opportunities and bring benefits in the context of research, such as supporting content generation for computer code, assisting neurodivergent researchers, or reducing potential language barriers [as part of the funding application process] | AI tools present potential risks for research in areas such as rigour, transparency, originality, reliability, data protection, confidentiality, intellectual property, copyright, and bias. We want to protect against potential ethical, legal and integrity issues in the use of generative AI tools to maintain the high standards of the research and innovation we fund. | Unknown | |