Generative AI has revolutionized many sectors, yet it has made comparatively little impact in philanthropy. While the rise of AI tools has driven a wave of innovation elsewhere, uptake in the sector has been relatively limited. One reason for this might be the sheer abundance of freely available AI tools that can be used for a wide range of tasks.
Grant applications, for instance, present a prime use case for generative AI in philanthropy. The process is often manual and time-consuming, requiring organizations to answer broadly similar questions in slightly different ways for each funder. This is where AI tools can shine: fed with existing internal documentation, they can quickly draft answers that meet a funder's criteria, such as specific word counts. Many charities have already started using AI tools to support their grant applications.
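To make that concrete, here is a minimal sketch of what such a drafting step might look like in code, assuming the OpenAI Python client. The model name, documentation excerpt, question, and word limit are illustrative placeholders rather than any specific charity's setup.

```python
# Sketch: drafting a grant-application answer from internal documentation.
# Assumes the OpenAI Python client and an OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

internal_docs = """
Our charity provides after-school tutoring for 300 children each year
across three boroughs, focusing on literacy and numeracy.
"""  # hypothetical excerpt from existing internal documentation

question = "Describe the impact of your programme on the community."
word_limit = 150

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model choice
    messages=[
        {
            "role": "system",
            "content": "You draft grant-application answers using ONLY the "
                       "supplied documentation. Do not invent facts or figures.",
        },
        {
            "role": "user",
            "content": f"Documentation:\n{internal_docs}\n\n"
                       f"Question: {question}\n"
                       f"Answer in no more than {word_limit} words.",
        },
    ],
)

print(response.choices[0].message.content)
```

The key design choice here is instructing the model to draw only on the supplied documentation, which keeps the draft grounded in the organization's own material rather than in generic training data.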
However, there are drawbacks to this growing reliance on generative AI in philanthropy. Some organizations have reported a surge in generic grant applications, because AI tools make it so easy to produce plausible-sounding content, and those generic applications are being rejected. This points to an important issue: the need for careful, human-centered consideration when using generative AI tools in philanthropy.
The effectiveness of AI tools ultimately depends on the questions posed to them. Asked to write an annual report, for example, a tool might request information about the organization, or it might simply generate text that resembles an annual report based on its training data. The problem with the latter is that the content may be factually inaccurate and offer no context or references.
Using AI as a shortcut to create content undermines the authenticity of philanthropy. It may seem like a time-saver for nonprofits, but it is crucial to develop a strategy and build a compelling story before turning to generative AI. Powerful storytelling requires an understanding of the target audience, the intended medium, and a clear articulation of the organization’s purpose, none of which AI tools can grasp.
Whether writing a grant application or a social media post, AI tools can be valuable for providing feedback or generating a first draft. But humans should remain in control: AI should be seen as the first officer supporting the pilot (the human) in achieving their objectives.
At Lightful, we have taken a comprehensive approach to AI, establishing a dedicated cross-functional team called the AI Squad. Starting from the challenges faced by the nonprofits we work with, we explored how AI tools could help overcome them. Trust, equity, and responsibility are the core principles guiding our AI development: we actively address potential bias through prompt design, collect feedback from diverse groups, and work to ensure that AI tools are not perpetuating biases unknowingly.
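As one illustration of what prompt design for bias mitigation can look like in practice, the sketch below folds a set of guardrail instructions into every prompt and records human review of the outputs. It is a simplified, hypothetical example, not Lightful's actual AI Squad implementation; the guardrail wording, function names, and feedback fields are all assumptions.

```python
# Sketch: reusable bias guardrails baked into prompt design, plus a simple
# record of human feedback on each output. Wording and fields are illustrative.
BIAS_GUARDRAILS = (
    "Use inclusive, people-first language. Do not assume the gender, "
    "ethnicity, income, or location of beneficiaries unless the source "
    "material states it. Flag any claim you cannot support from the input."
)

def build_prompt(task: str, source_material: str) -> str:
    """Combine the guardrails, the task, and the source material into one prompt."""
    return (
        f"{BIAS_GUARDRAILS}\n\n"
        f"Task: {task}\n\n"
        f"Source material:\n{source_material}"
    )

def record_feedback(output: str, reviewer_group: str, concerns: list[str]) -> dict:
    """Capture a human review of an AI output so recurring biases can be tracked."""
    return {
        "output": output,
        "reviewer_group": reviewer_group,
        "concerns": concerns,
    }
```

Keeping the guardrails in one place makes them easy to review and update as feedback from different groups comes in.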
Furthermore, concern about biased content stems from uncertainty about the data AI models were trained on. The specific content of the training set is often unknown, especially where it relates to underrepresented groups. Only by using AI tools, applying them to niche or complex tasks, and critically reviewing the outputs can biases be uncovered and addressed. There is also a risk of a self-fulfilling prophecy: if the primary users of AI tools bring their own biases and accept the outputs uncritically, those biases are reinforced over time.
While AI tools have the potential to deliver incredible benefits in terms of storytelling, audience reach, and time saved, they have limitations. Only through use can those limitations be explored, and only then can decision-makers determine whether AI tools are suitable for their philanthropic endeavors.
In conclusion, the smart use of generative AI tools offers significant opportunities to enhance efficiency and effectiveness in philanthropy. But it is vital to avoid generating plausible yet generic content and to stay committed to the human-centered approach that makes philanthropy powerful. By adhering to these principles and using AI tools responsibly, organizations can harness the potential of AI while keeping the focus on their mission to create a positive impact in the world.
Jonathan Waddingham is the Managing Director of Learning at Lightful, a B Corp.