Funders are receiving more AI proposals, but not all are well-equipped to assess them
Why It Matters
Very few funders are giving grants for AI projects to begin with. Corporate and technology funders are more likely to have the in-house technical talent to evaluate grant applications, while traditional philanthropic foundations might need to call on third-party resources.

Not all funders feel confident assessing, from a technical or ethical perspective, the artificial intelligence grant proposals they receive, according to new research.
Findings from Project Evident show that funders who have their own AI strategy in place internally feel more confident in evaluating the feasibility of grant applications they receive.
“If funders are not AI-engaged, they risk falling behind,” the report said.
The report also calls for more unrestricted funding for non-profits wanting to implement AI solutions, as it can enable more experiments and testing.
In a separate piece of research last year, the U.S.-based organization, which helps social purpose organizations with their data and evidence-gathering strategies, also found that non-profit staff were more likely to be using artificial intelligence tools than those in grantmaking organizations.
This year, Project Evident interviewed 34 U.S.-based funders and grantmakers that had made one or more grants for AI implementation projects.
Many funders are seeing an upward trend in the number of proposals that come their way with some element of AI implementation. According to one interviewee, the number of these proposals went from three in 2024 to 20 so far in 2025.
While the sample size was too small to assess how different funders were approaching AI adoption, there was a clear distinction between corporate and philanthropic funders, said Sarah Di Troia, managing director of the OutcomesAI stream at Project Evident.
“It’s not surprising that there are assets inside of a corporate entity that can support grantmaking [to non-profits] in that they have access to engineering talent and data scientists,” she said. “That is just not as readily available in a standalone foundation.”
Philanthropy has the same technology problems as the non-profit sector
Project Evident specified the types of grants it wanted to explore as “funding that could support a technical build of an algorithm, purchase of an AI application or tool, support for an AI technical consultant, purchase of compute, purchase of AI credits (e.g. to run ChatGPT, Claude etc.), or other investments to implement AI.”
Interviewees included corporate foundations such as Google.org and OpenAI; family foundations; general foundations, such as the Patrick J. McGovern Foundation, which focuses on technology granting; and intermediary foundations / pooled funds such as Fast Forward.

Community foundations were missing from the sample, and 88 per cent of the grantmakers in the sample came from the largest 10 per cent of foundations, with annual grantmaking budgets of more than US$10 million.
“I don’t think that’s terribly surprising,” Di Troia said. “I think these are folks who have more dollars and who perhaps are willing to move forward into things that are new.
“One of the interesting things for foundations is that they are behind non-profits, but they actually share some of the same problems,” she said, adding that finding the right technical talent remains a challenge for philanthropic organizations, too.
Despite the wealth of education and training on artificial intelligence, much of it is tailored to the for-profit sector, Di Troia found.
Adjusting the non-profit / funder partnership
A lack of funding dedicated to technology and general operations in the non-profit sector has been well-documented.
In its previous research, carried out in 2023, Project Evident found that just under 80 per cent of the grantmakers surveyed did not have a separate priority funding area for technology. At the time, 70 per cent did not plan to create such a funding area in the upcoming year, either.
“You can talk to any [non-profit] practitioner and they’ll tell you how hard it is to get general operating support,” Di Troia said.
Instead, many grantmakers still prefer to fund specific programs and projects.
She likened this to funding an HR policy within a single program when such a policy should underpin the whole organization. The same is true for technology investments, she said.
Six of the funders shared the criteria they use to assess AI grant proposals as part of the research. Project Evident evaluated each for common patterns and trends, blending them into a combined grantmaking rubric.
Funders were interested in knowing how prospective grantees had arrived at the conclusion that AI was the best and most appropriate tool to address the problem, how affected stakeholders and beneficiaries had been included, what sorts of data the tool would collect, and who would own the final product.
They also asked questions about risk mitigation to offset artificial intelligence’s potential negative impacts and data privacy and security standards.
Most funders said they needed some support to evaluate AI proposals: 59 per cent of program officers said they relied on internal expertise, while 68 per cent said they also sought support from external subject matter experts.
Here, too, there may be a distinction between the type of funder, corporate or philanthropic, and the support networks each depends on.
As part of the research, a representative from the Chan Zuckerberg Initiative said their AI grant assessments “involve input from in-house software engineers and data scientists who work on our education team, and also from a central technology team that supports data privacy, security and trust infrastructure questions across our initiative.”
Meanwhile, the Siegel Family Endowment “brings in third parties as part of [their] inquiry.
“Not as a consultant or in a paid relationship necessarily, but more that when we see somebody who has a keen level of insight and who is willing to take an hour to come and talk to us, we really value that,” said a representative from the Endowment.
Project Evident also spoke to five non-profits that had implemented AI about their grant-seeking and fundraising practices. They said they felt “enormous pressure to move quickly” in developing their AI tools and innovations.
Practitioners also noted a difference in the grants received from different types of funders. AI accelerators from large technology companies were thought of as great learning opportunities, but “there is also recognition that these grants are not entirely altruistic,” the researchers found.
“They [technology companies] are doing it because they’re trying to get your business. Everything isn’t free. They’re giving you money, but they’re also charging you money. It’s in their interest to do this,” said one interviewee.
This over-reliance on technology companies “could also contribute to a ‘leaky pipeline’ for funding where accelerators provide start-up capital, but there are no funders willing to invest in growth and sustaining AI applications,” the report pointed out.
AI adoption will need trust-based philanthropy
“Given the emergent nature of AI in non-profits, it is unlikely that the plans outlined in a grant proposal will come to fruition as expected,” the research highlighted.
Di Troia concluded that unrestricted funding—that is, funding that is not dedicated to particular programs but can be applied as general operating dollars—is best suited to AI experimentation in non-profits.
This can feel risky to grantmakers, as there may not be clear outcome metrics to track and report on progress.
“You can always find a reason to not fund something, to not give organizations the space and capacity to be able to do exploratory work because it can feel precarious or dangerous,” said one of the interviewees from the Ballmer Group.
“Our role as the funder is not for us to mitigate every single possible scenario that can happen as a piece of these projects. You need to empower organizations to do that work as part of a trust-based philanthropy approach.”
NOTE: Future of Good is interested in conducting similar research in the Canadian philanthropic sector. If you are a grantmaker / funder that is setting aside dollars for AI in the non-profit sector, we encourage you to reach out to: sharlene@futureofgood.co.