Online misinformation is adding to the administrative burden of community organizations – what can they do about it?

Charities and non-profits are increasingly handling the effects of false, inaccurate and misleading information, which impacts staff wellbeing and the communities they serve.

Why It Matters

Misinformation and disinformation that originates online can not only cause confusion and distrust among communities, but can also be directly tied to racism, misogyny and queerphobia, putting certain people at risk. For staff in community organizations, having to speak to the community about the origins of false information, or reporting the information appropriately such that it doesn’t spread, can add to their already heavy workload.

A forum held by the Anti-Black Racism Committee from the Family Services of Peel, in April 2023. Photo: courtesy of Family Services of Peel

This independent journalism on data, digital transformation and technology for social impact is made possible by the Future of Good editorial fellowship on digital transformation, supported by Mastercard Changeworks™. Read our editorial ethics and standards here.

Elder Raymond is a fictional character on the Our Medicine Path platform, a “highly respected Cree elder who lives in Timmins, Ontario,” and a member of the Moose Cree First Nation. “Like many of you, I’ve had questions about decisions made by the government regarding the pandemic, vaccinations and now boosters,” he says. “I’ll take you along on my journey and share the answers to many of the questions I had.” 

Elder Raymond then addresses the “historical mistreatment of Indigenous people in Canada” and how “being first in line to get vaccinated reminded [him] of medical experimentation that was done on Indigenous people in residential schools and Indian hospitals.” Along with a user on the platform, he validates those concerns, outlines the government’s responsibility towards Indigenous communities, and shares ongoing research and data about the safety of COVID vaccinations. 

Through gamification and culturally relevant messaging, Our Medicine Path aims to unpack vaccine hesitancy within Indigenous communities. Users can choose from four personas that speak most closely to their own experiences, including a community leader, a doctor, and a mother of two children. Digital Public Square – which built the platform with funding from the Public Health Agency of Canada – has also developed Moving at the Speed of Trust, which seeks to explore vaccine hesitancy in the Black community through confidential and optional surveys.  

During the COVID-19 pandemic, health and science misinformation may have claimed at least 2,800 lives, while those who believed the disease was “a hoax or exaggerated added roughly $300 million to hospitalization or ICU costs in Canada,” according to a report on the socioeconomic impacts of health and science misinformation released by the Council of Canadian Academies in early 2023. 

“Science and health misinformation exposes us to harms, both personal and collective. On an individual level, it can leave us vulnerable to baseless fears, harm from preventable diseases, and exploitation by those who promote misinformation for profit or power,” the report says.

“On a collective level, it erodes trust, fosters hate, undermines social cohesion, and diminishes our capacity for collective action.”

In the context of health misinformation, given how closely it is tied to “ideology and identity”, there is potential for misinformation to become “increasingly weaponized for political gain,” the report continues. 

Aside from the Public Health Agency of Canada, the Department of Canadian Heritage has also been paying close attention to the impact of misinformation and disinformation, both in the context of health and more broadly. In an email interview with Future of Good, the Canadian Heritage team said: “Online disinformation is a complex and multi-faceted problem. When it falls on fertile ground in Canada because of social isolation, racism and other reasons, this can cause significant impacts on society and ultimately our democracy.” While misinformation refers to false or inaccurate information, disinformation is content that is deliberately false, created with the intent to mislead those who read or consume it online.

Since January 2020, the Digital Citizen Contribution Program (DCCP) “has provided over $15 million, for 96 projects in total, to third-party organizations undertaking research and carrying out learning activities, such as public awareness tools and online workshops, to help Canadians become more resilient and think critically about the information they consume online.” The most recent call for proposals, for which award recipients were announced earlier this year, focused specifically on projects looking to understand and counter misinformation and disinformation. 

For some community organizations, fielding online misinformation and disinformation among the communities they serve has been unexpected, leading to a drain on staff time and resources. Future of Good spoke to four organizations that have received funding under the DCCP in 2023 to actively tackle misinformation and disinformation, each using the funding for different use cases. Some are specifically developing programs to address misinformation and disinformation in their communities, while others are focusing on the effects that false and inaccurate information has on non-profit staff and resources. 

Combating anti-Black racism and social media microaggressions 

The Peel region – the traditional territory of the Anishinabek, Huron-Wendat, Haudenosaunee and Ojibway/Chippewa peoples – is extremely diverse, says Sandra Rupnarain, the executive director of Family Services of Peel (FSP). Sixty-nine per cent of those living in the region identify as being part of a racialized group, compared with 34 per cent in Ontario overall, and the region’s racialized population has grown by 72 per cent since 2006. “We were getting feedback about Black youth not seeing themselves represented in the community or in the media,” Rupnarain adds, saying this was impacting their sense of belonging to the community in Peel. 

Using funding from the DCCP, the organization launched the Peel Community Anti-Black Racism Social Action Response. “One part of the project focused on microaggressions, and social media targeting that is being used to demoralize the group. Even those posting about injustice [towards the Black community] were getting pushback.” As well as a literature review into social media microaggressions and harms, the FSP team carried out a full-day training for members of the community to understand how these online harms affect young people. Many young people attended, Rupnarain adds, and were financially compensated for their time. 

One of the solutions FSP has adopted is using social media itself to counteract the narratives within the community. “Say we have a Black member of staff who has just graduated: we want that to be posted online and be big. We can, as an agency, take responsibility,” Rupnarain says. The effects of the training can already be seen in the form of new allies, she adds. “A young man – who is white – came to the training with his grandmother. He has since formed an organization in his school, and is going to be working with the Anti-Black Racism Committee [at FSP] to get the message out to white communities of his age.” 

Finding new channels of communication for information 

On Treaty 6 Territory, on land colonially known as Edmonton, the Islamic Family & Social Services Association (IFSSA) has been working directly with the local Muslim community since 1992. According to executive director Omar Yaqub, public health messaging that is aimed at racialized and faith-based communities has missed the mark. “The default approach is to take conventional messaging and translate it from English. So even if people have gone through the trouble of translating something into Punjabi for example, that assumes a bias towards literacy. In 2016, 20 per cent of female refugees who came to Canada didn’t read their first language, so translation might be equally incomprehensible to them.” 

In comparison, Yaqub says, the vast majority of refugees who arrive in Canada are able to use a smartphone, often sending voicenotes or using WhatsApp. For community organizations, these can become vital mediums of delivering messages and broadcasts at a “hyperlocal” level, he adds. In developing and iterating on these community broadcasts, Yaqub and the IFSSA team found that people responded well to information that was sandwiched between community updates and more human content – especially if that information pertained to something that they either didn’t know a lot about or would otherwise have ignored.

“If you [as an organization] mandate that someone should believe something, you’ll be discounted immediately. In comparison, you could communicate from a place of support and build relationships,” Yaqub says. “We have to focus on relationships more than trying to change opinion.” 

Empowering frontline workers to communicate in this way can also improve the dissemination of crucial local information. For example, Yaqub adds, if communities see a frontline worker speaking with them through a text message or WhatsApp message on their phones – an “intimate device”, he says – they’re more likely to trust the information coming from them, and to feel more empowered to ask questions. “We set up frontline workers to just check on levels of deprivation in a community rather than guiding people towards support,” he says. 

Equipping non-profit staff with the tools to handle misinformation 

The team at the Ontario Digital Literacy and Access Network (ODLAN) used the funding from the DCCP to research the ways that non-profit organizations and charities that serve the 2SLGBTQIA+ community in Canada have been targets of queerphobic hate in online spaces. “While we didn’t ask specific questions about misinformation and disinformation, it came up a lot as the underpinning of the hate that folks are receiving,” said Alex Tesolin.

“This includes citing biased or poorly conducted research on trans folks and gender affirming care, and narratives such as queer and trans people grooming or posing threats to youth.” 

The knock-on effect of these misinformed narratives is twofold: firstly, there is an emotional and mental health impact on staff and the communities they serve. “Seeing their work portrayed as ‘grooming’ can be damaging to staff’s mental health,” Tesolin adds. “For community members, having misinformation or disinformation visible on an organization’s website [such as comments] can compromise the perception of it as a safe space to access.” In addition, there is a new “administrative drain that comes with addressing and countering misinformation and disinformation,” they say, particularly if staff in organizations are attempting to engage with and correct misinformation. “That takes valuable time and energy away from work they are normally doing in the community.”

The pace at which misinformation can spread means that not only do non-profit staff need to protect communities from its impact – with limited funds, time and resources – but also protect themselves, adds Christopher Dietzel. “In the last few months, with the rise of generative AI, people are recognizing more and more the impacts that technology can have on their perceptions of reality,” he adds. “It’s challenging to critically assess what is real and fake – these are not necessarily skills that young people are learning in schools and community organizations are not trained on digital literacy when they’re just trying to help others.” 

Tesolin and Dietzel both add that organizations they’ve spoken with are “hungry” to learn more about handling online misinformation. “Many organizations we spoke with are in the process of updating their policies, but they’re all really diverse in terms of size, type and the capacity they have to tackle misinformation,” Tesolin says. “For a lot of smaller grassroots organizations, the moderation of social media is falling to general staff or volunteers who might not have the education or training to deal with a targeted queerphobic hate campaign. People might not be experienced in social media, but come into a role in an organization where it falls into their list of things to do.

“People are super keen to learn about privacy settings, reporting and blocking across different social media platforms. Smaller organizations are interested in touching base with other organizations that are facing this to compare strategies.” 

Researching young people’s reactions to misinformation on social media

Banner on online misinformation. Photo: 2023 MediaSmarts

MediaSmarts is a charity that has been campaigning for digital media literacy for around 30 years, and online harms are one of its key areas of research, says Kara Brisson-Boivin, director of research. “Cultures of prejudice such as racism and misogyny are unfortunately the baseline of many online communities. It’s not just on the deep net, but everyday social and video-sharing platforms that everyone is using.” Using the DCCP funding, MediaSmarts conducted a series of interactive focus groups with young people between the ages of 13 and 29. Participants were given the opportunity to evaluate the reporting processes on major social media platforms for their efficiency and meaningfulness. 

“Even if youth aren’t exactly the target population of these platforms, they certainly are one of the most frequent to occupy these spaces,” Brisson-Boivin adds. “Demographic factors like age are misunderstood – young people were more likely to believe COVID-19 misinformation, because young people are using [social media] sites as a search engine.” That can very quickly progress from just using a platform to find information about the best bars in a city, to validating health information on social media, she adds.

“Some young people ignore misinformation altogether because it takes too much time or energy to report, and they don’t want to draw more attention to the content. Some went elsewhere to verify the information because they were interested in wanting to know whether it was fake or a piece of propaganda. They were only more likely to engage with the content if it came from somebody close to them, but they’d engage offline or in a private message to the person sharing it.” 

Fundamentally, young people didn’t feel like they could trust social media platforms to keep them informed or safe. “They were all quite confused as to what the platform would consider to be misinformation,” Brisson-Boivin adds. “What are the processes for deciding that? It would be a helpful learning opportunity to understand what experts [at social media platforms] are doing to authenticate information. The platforms are also very opaque about this – [as consumers] we don’t know how many reports are being made or what happens to reports.” 

Brisson-Boivin also stresses the importance of building a national framework for measuring digital media literacy in Canada. While countries like the UK have built these frameworks and mapped out a network of organizations working to counter misinformation, Canada lags behind, making it difficult to measure the impact of programs in the short and long term. And, she highlights, there’s still a big elephant in the room: “Where is the funding to be able to do that kind of measurement work?”


  • Sharlene Gandhi

    Based between the UK and Canada, Sharlene has been reporting on responsible business, environmental sustainability and technology since 2018. She has worked with various organizations in this time, including the Stanford Social Innovation Review, the Pentland Centre for Sustainability in Business at Lancaster University, AIGA Eye on Design, Social Enterprise UK and Nature is a Human Right. Sharlene moved to Tkaronto in early 2023 to join the Future of Good team, where she has been reporting at the intersections of technology, data and social purpose work. Her reporting has so far spanned a number of subject areas, including AI policy, cybersecurity, ethical data collection, and technology partnerships between the private, public and third sectors.
