Over the last year, the considerable impact of Artificial Intelligence (AI) has become a prominent topic of national discussion, particularly within the charity sector. Whether you’re a trustee currently using AI, considering its implementation, or uncertain about its potential, it’s crucial to understand both the advantages and the pitfalls. The primary focus should be on using AI responsibly, in a way that advances your charity’s objectives.
What is AI?
AI, often defined as the capacity of machines to carry out cognitive tasks typically associated with human intelligence, is not a novel concept, despite the recent media attention. Machine learning, a subset of AI, has been employed in healthcare since the 1970s. While much attention has been drawn to the emerging ChatGPT tool, numerous widely used services that may not immediately register as “AI” already integrate it to varying degrees. These include the recommendation algorithms powering streaming platforms, predictive traffic data in navigation applications, email spam detection systems, and the voice-activated assistants ubiquitous in our daily lives.
How are charities currently using AI?
AI is anticipated to trigger significant transformations across all economic sectors, with charities being no exception. According to the 2023 Charity Digital Skills report, 35% of charities were already employing AI for specific tasks, and an additional 26% had intentions to integrate it in the future.
The potential benefits of AI are manifold, especially in enabling charities to streamline resource-intensive activities, freeing up more time for critical areas. In discussions within the sector, Generative AI stands out as one of the most rapidly expanding fields. This technology uses human prompts to generate written and visual content. Some charities are leveraging these writing tools for fundraising materials, bid writing, speeches, and policy drafting, while also employing ‘speech to text’ tools for meeting minutes.
Furthermore, there are emerging opportunities to incorporate AI directly into service delivery. For instance, the Surrey Wildlife Trust is spearheading the three-year Space4Nature programme, funded by the People’s Postcode Lottery. This initiative combines satellite earth observation imagery with volunteer observations and AI to map and assess habitats across Surrey, illustrating one innovative application of AI in the charitable sector.

Evaluating the potential…
First, evaluate the potential applications of AI for your charity and assess their suitability. Reflect on the advantages and potential risks involved, and devise strategies for managing these within the framework of your trustee responsibilities and charity objectives. This analysis may entail identifying areas where AI tools could offer valuable insights or fill existing gaps, as well as determining the skills required to use these tools effectively within your charity. Assess whether your trustees, staff, or volunteers possess these skills, and take stock of any AI initiatives already underway within your organisation.
Maintaining good governance protocols
While opportunities abound, it’s prudent to exercise caution: AI carries inherent risks that must be carefully evaluated and managed. As a dynamic, still-evolving field, AI may not always produce accurate results. For instance, it’s not yet sophisticated enough to offer reliable legal advice, and Generative AI models can inadvertently produce flawed, plagiarised, or biased outputs without flagging the potential issues.
Ultimately, trustees retain the responsibility for decision-making. Reliance solely on AI or AI-generated content for critical decisions, without conducting independent verification, could lead to breaches of duty. Charities must also adhere to broader legal obligations, including copyright and content safety, especially when incorporating AI into their operations, which can elevate risks such as data security and regulatory compliance.
Given AI’s nascent stage, responsible usage is paramount. Understanding data handling practices associated with different AI tools is essential, particularly for charities dealing with vulnerable beneficiaries or sensitive information like medical records.

Human oversight remains crucial, as it not only helps mitigate errors but also aligns with the relational aspect of charity work. Additionally, charities should be mindful of external risks like reputational damage stemming from AI misuse, such as the proliferation of fake news or deep fakes.
Remain mindful of trustee duties and manage AI risks
The Charity Commission is advancing its work on AI to gain further insight into its potential and risks, and to assess its compatibility with the Commission’s regulatory role. Additionally, the Commission continues to engage with the sector, central government, and other regulators.
The Charity Commission does not currently anticipate producing specific new guidance on the use of AI, preferring – as for cryptocurrency – to encourage trustees to apply their existing guidance to new technologies as they emerge. However, the Commission will update guidance where appropriate to reference examples of new technology, as it did with the refreshed guidance on internal financial controls.
Charity digital skills report: Charity-Digital-Skills-Report-2023.pdf (charitydigitalskills.co.uk)