Is AI your new non-profit coworker? Here are the ways that social purpose organizations are adopting ChatGPT.
Why It Matters
It only took five days for ChatGPT to reach a million users. While there are clear benefits to using an AI-enabled tool when it comes to reducing time and resources spent on tasks, there are also some open questions for the social impact world in particular: what are the ethics of using artificial intelligence when working directly with communities?
This independent journalism on data, digital transformation and technology for social impact is made possible by the Future of Good editorial fellowship on digital transformation, supported by Mastercard Changeworks™. Read our editorial ethics and standards here.
To what extent does ChatGPT have a role in community work? Future of Good decided to ask the tool itself that question. In response, it said: “While I can’t directly participate in community work, I am here to support and assist those who do.” Future of Good then asked what the most pressing social issues in Canada are at the moment, to which it gave only four responses: the Covid-19 pandemic, climate change, income inequality, and Indigenous rights.
Launched in November 2022 by OpenAI, ChatGPT is a conversational bot driven by artificial intelligence. The intuitive nature of the algorithm, and the human-like responses it provides, meant that it only took five days for the platform to reach a million users. In comparison, it took Instagram two months to get to its first million, and Netflix three and a half years.
With non-profits constantly under pressure to become leaner in their operations and cut costs, ChatGPT has presented a great opportunity to relieve some of these administrative burdens, and shift staff focus to more high-value, urgent, people-facing tasks.
Heenal Rajani is a social entrepreneur who set up a package-free grocery store and community hub in London, Ontario, the traditional lands of the Anishinaabek, Haudenosaunee, Lunaapeewak and Attawandaron. “I run a small non-profit and I’m also a contractor. In those contexts, while it’s not earth-shattering, it has given me excess capacity,” Rajani says. “I’ve been using it almost like an intern, to summarize reports, reply to emails and build social media posts. I can direct my energy to more high-value things.”
Even for more complex tasks, Rajani has found himself using ChatGPT. As a contractor, he has used it to draft an initial framework of an operational plan for one of his clients, which is a document outlining the goals of an organization, and the processes required to get there. “It’s not my area of expertise, but ChatGPT was helpful for giving me some initial pointers. It came up with a framework, which I could take back [to the client], and then put feedback back into ChatGPT.”
While Rajani will often start with very broad parameters – essentially asking ChatGPT for initial pointers on what to include in an operational plan – he will then repeatedly ask the AI tool to build out each point until the output becomes more focused. However, he has also noticed that the longer the thread of text, the more likely ChatGPT’s output is to lose focus or repeat itself.
When Future of Good asked ChatGPT itself why its output became less focused as its responses became longer, it seemed confused by the question. After around five minutes of waiting, its response remained unfinished – likely due to high website traffic.
Another example of effective use of ChatGPT comes from Roots and Rivers, a social innovation consultancy. They are currently working with a non-profit client to help them improve their understanding of spreadsheets and data analysis. In order to do that, they need some data to play around with: specifically, they need stories of people who have been part of the non-profit’s programs, and their opinions about the programs. But using existing, real-life data from the non-profit’s own clients raises all sorts of privacy and sensitivity issues, says Interim Executive Director Hayley Rutherford.
So instead, Rutherford and the team have been using ChatGPT to generate dummy case studies that resemble client data. For example, Rutherford will ask the tool to produce a short story of an individual of a certain age, who has attended a community program run by a local non-profit, and ask what their experience of it has been.
These dummy stories help the non-profit in two ways: they provide a set of data that staff can use to practise and improve their data analysis skills, and they do so without compromising real client data in the process.
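For readers curious what this looks like in practice, below is a minimal sketch of generating a dummy case study with OpenAI’s Python SDK. This is not Roots and Rivers’ actual workflow: the function names, prompt wording, and model choice are all illustrative assumptions, and the prompt deliberately asks for mixed experiences to counter the uniformly positive stories Rutherford describes.

```python
def build_prompt(age: int, program: str) -> str:
    """Compose a prompt asking for a fictional participant story.

    Asking explicitly for something that did not go well helps avoid
    the all-positive bias Rutherford observed in ChatGPT's output.
    """
    return (
        f"Write a short, realistic but entirely fictional story about a "
        f"{age}-year-old who attended a community program called "
        f"'{program}' run by a local non-profit. Describe their experience, "
        f"including at least one thing that did not go well, so the data "
        f"is not uniformly positive."
    )

def generate_dummy_story(age: int, program: str) -> str:
    """Call the OpenAI API to produce one synthetic case study."""
    from openai import OpenAI  # third-party SDK: pip install openai
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # assumed model; any chat model works
        messages=[{"role": "user", "content": build_prompt(age, program)}],
    )
    return response.choices[0].message.content
```

Because no real client details ever enter the prompt, the resulting stories can be shared freely with staff for analysis practice.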
But the AI’s output is far from perfect, Rutherford says. “It starts to pick up certain bits of language and feed it back, when that’s not necessarily how people would describe their experience,” she says. ChatGPT would also generally produce stories that had a positive angle, making the assumption that everybody who had interacted with the non-profit’s program had had a positive experience.
How is ChatGPT helping social purpose organizations manage content, communications and social media?
Meanwhile, the team at Venture2Impact, which links skilled volunteers with non-profits, has been using ChatGPT to break down and execute an SEO strategy – in other words, the tool is helping the organization increase its likelihood of being higher in Google search rankings, both for corporate volunteers and non-profits looking for technical help.
Co-founder and Executive Director Fadi Al Qassar and his team have been feeding one-line prompts into ChatGPT so it can produce relevant long-form content. “We can go from [producing] one blog post a month to one a week,” Al Qassar says. “That is going to have an impact on our search rankings on Google, so non-profits and for-profits looking to get involved in our initiatives can find us.”
Al Qassar has also had to contend with the challenge of vague outputs when working with ChatGPT. “I asked it to produce one blog post about the benefits of skills-based volunteering. It gave pretty accurate information, but it was pretty general. There were no citations, for example.” He adds that it’s likely that teams will still need to take time to customize any output that ChatGPT produces. Here is a snippet of the output from ChatGPT, specifically after Al Qassar asked the tool for specific research to validate its findings:
“One study found that skills-based volunteering can help nonprofit organizations to improve their capacity and effectiveness, leading to more positive outcomes for the communities they serve.
Other research has found that skills-based volunteers report higher levels of personal and professional satisfaction compared to traditional volunteers, possibly because they are able to use their skills and expertise to make a bigger impact.”
Like Rutherford, Al Qassar recognizes the importance of storytelling to non-profits. “But it’s tough and it takes a long time to do, and we’re all resource-strapped,” he says. He adds that ChatGPT can help non-profits build some of that communications and storytelling capacity. And instead of needing to expand the staff team to focus on these areas, the organization can instead invest in hiring staff to expand outreach and community programming, as well as investing more in staff support.
The team at End Homelessness St. John’s have also begun using ChatGPT to craft social media content. “[ChatGPT] immediately gives you a starting point and a prompt that we can then build off of,” says Manager of Communications Matt George. “It gets you over writer’s block.” While George adds that he doesn’t think the tool is “world-changing” at this point, it can help non-profits save time on certain “low-stakes” tasks.
What are some of the challenges of relying on ChatGPT in its current form?
But George also emphasizes that the content has never been perfect on the first try, and that each social media post that is generated by ChatGPT still goes through multiple people in the organization before being posted out to the community. The free version of the tool is, at the moment, a “research preview”, which means that it is gathering information and feedback to inform an updated – and likely paid-for – version.
“It will confidently give you the wrong answer,” George says. “Everyone who is using it needs to hold that close to their heart.”
There’s also the challenge of blindly trusting what the tool produces, without questioning whether the information is correct and accurate – ultimately, George suggests, non-profits just don’t know what they don’t know. As someone who works in the housing and homelessness sector, he can generally gauge the accuracy of the information that ChatGPT provides him with. However, he adds, “I don’t know about particle physics, and I’m not good at sussing out how accurate the information is there.”
“It’s not going to be a replacement for people just yet,” George adds. “You’re always going to need human input – from someone who knows the ins and outs – on these things.”
This is where the need to input specific parameters into ChatGPT comes in. The more vague the question you ask the tool, the more vague the response is likely to be. For Rutherford, that raises some red flags when it comes to generating internal policies. “An HR policy might look completely different in Ontario in comparison to BC,” she says. “ChatGPT might not be accounting for critical information unless you’re asking it to.”
Similarly, lawyer Mark Blumberg asked the tool a number of charity law questions, finding that it gave incorrect information about how quickly the CRA would revoke a charity’s registration if it failed to file its T3010. “In fairness to ChatGPT, which is probably using internet content, much of the internet content on Canadian charity regulation is inaccurate, misleading [and] out of date,” Blumberg concludes. However, for organizations looking for factual or regulatory information, ChatGPT can end up spitting out inaccurate information, and it’s vital to carry out additional due diligence.
What are the wider implications of relying on an AI-based tool in social purpose work?
“Though we’re in the beginning stages of ChatGPT and other AI-based tools, we’re also at the start of a fundamental paradigm shift for technology,” George says. “We’re at the level that the internet was in 1989 right now, but in a couple of years, people are going to be surprised how it evolves and what it will be able to do. It will change how we’re all interacting with the internet itself.”
While ChatGPT may be the most ubiquitously used AI-based tool at present, Rajani points out that there is a growing portfolio of others, some with quite specific uses: Bearly generates blogs, summarizes documents and corrects grammar, while Cody acts as a virtual HR assistant, using artificial intelligence to respond to employee queries within an organization.
Having started out as a free tool, ChatGPT is often oversubscribed, meaning not all users trying to access the site at once can get in. The steep increase in demand led OpenAI to announce a paid subscription to the tool, called ChatGPT Plus, in February 2023. At $20 per month, ChatGPT Plus customers are guaranteed access to the tool even during heavy website traffic, as well as priority access to new features.
That opens up more questions about “how the technology is going to be equally distributed,” Rajani adds. “It can be a leveler, or it can exacerbate differences. It can enable someone who is already making a lot of money to make even more money. What is that going to do for the distribution of wealth and power?”
Michelle Baldwin, Senior Advisor of Transformation at Community Foundations of Canada, also raises the issue of the knowledge that the AI is trained on coming from “colonized and white-dominant structures.”
She adds: “How are we perpetuating certain knowledge and voices?”
When Future of Good posed the question to ChatGPT on its use of Indigenous knowledge, the response seemed vague. While it recognized the importance of Indigenous knowledge to its own efficacy, and acknowledged that it wasn’t able to “include the full breadth of Indigenous knowledge”, it also didn’t seem to be taking active steps to incorporate more non-Western forms of knowledge into its training data.
The team at Keela, a donor management platform, have been using ChatGPT both to create first drafts of documentation, and to “summarize lengthy articles into digestible takeaways,” says Meredith Gray, senior marketing manager. For non-profits, she says, it’s important to use ChatGPT in relation to “the areas of your work that can be enhanced with the tool, not replaced.”
And it’s likely that non-profits will be relying on the free version of ChatGPT, which is, at present, gathering data and feedback to inform the development of an even more intuitive tool. “Since the prompts that we are putting into the tool are record[ed] and helping it learn, it’s important to consider the privacy and trust of stakeholders when entering [prompts] into ChatGPT,” Gray says.
“Privacy is incredibly important and every organization should have policies in place to ensure the safety of their data. For non-profits specifically, it may not be the right tool to use for things such as prospect research.” It’s also important for non-profits to be wary of inputting potentially sensitive client data into the tool, especially without the explicit consent of clients themselves.
For Mark Abbott, the managing director at the Engineering Change Lab who recently launched the Tech Stewardship Program, it’s the rapid pace and power of ChatGPT that differentiates it from other transformative technologies. There’s an ongoing tension between being able to do something and considering what we should be doing with the technology, he adds.
“Tech is human, so it’s up to us what this means. What are the opportunities to help people transition to new jobs? Can [ChatGPT] be used for retraining in specific fields? Can it be used to make jobs more fulfilling?” The answer, he says, is yes, but it’s likely that these will not be the focus areas for people using the technology for some time to come – which will only perpetuate the same systems of inequity we see today.
“It’s not hard to imagine that generative AI [artificial intelligence that can create content, such as text, images and video] will be more often focused on selling us stuff we don’t need, as opposed to expanding quality healthcare to underserved areas,” he says. “It’s not hard to imagine that we will charge ahead with using generative AI to disrupt multiple sectors and then eventually clean up the mess of unintended consequences that will result with regulation some years down the road.”