You may not be relying on the best available evidence to make decisions and design programs — here’s why.

Boosting the social impact sector’s culture of preparedness by bringing global evidence practices to Canada

Why It Matters

Sometimes the right evidence reaches the right people at the right time. Sometimes it doesn’t, with dire consequences. The pandemic heightened the need to inject best evidence into the social impact response. And there’s never been a more pressing time to learn from what worked and what didn’t in getting best evidence to those who need it so we can better prepare for catastrophes of all kinds. From public health crises to natural disasters and humanitarian emergencies, relying on the best local and global evidence is crucially important, but it shouldn’t take an active crisis to invest in the infrastructure needed for reliable and decision-relevant data and evidence.

This story is in partnership with the Canadian Red Cross (Red Cross).

In the throes of the pandemic’s first waves and lockdowns, Dr. Paul Hébert, special advisor for the Canadian Red Cross, advised the public to reach out to anyone in their social group who may be vulnerable to the social and psychological impacts related to COVID-19. At the time, new survey data showed nearly half of young adults in Canada were experiencing elevated anxiety and symptoms of depression, 40 percent were not feeling hopeful about the future, and 38 percent worried they couldn’t get immediate help if they needed it. 

“Young people (18- to 34-year-olds) were adversely affected — financially, emotionally, everything, their world was rocked. You expect to see rates of loneliness at 10-20 percent in the average population, but we saw rates of 40 percent. For depression and anxiety, we expect rates of 20 percent, but we saw 40-50 percent,” says Hébert, who worked with the Red Cross to commission a polling firm to conduct the study. 

The findings were a call to action, but they also stumped the Montreal-based physician. Why hadn’t the data pointed to older adults (aged 65+) — particularly those who are frail, isolated and living alone — as being disproportionately affected too? Years in emergency rooms and critical care showed Hébert the detrimental effects social isolation and emotional disconnectedness have on frail older adults, who have less ability to cope with what, for others, may be considered minor setbacks. And so, the critical care doctor asked the polling firm to sample more frail older adults. When they did, rates of loneliness shot up for older adults.

“It was a sampling problem. It showed two groups were disproportionately affected — the old and the young,” says Hébert. “Polling, surveys, even town halls — they help [us] understand the problem,” he says, but those evidence sources do not address all the information needed to inform a health and social impact response.

“Do our programs really help people? We’re currently not equipped to answer that well,” says Hébert, referencing the information gaps between understanding a problem, determining a solution and evaluating its outcomes. 

Overall, the Canadian Red Cross study — which tracked a range of self-reported social and psychological impacts of COVID-19 on more than 2,000 Canadians aged 18+ over a four-month period — shows the power and pitfalls of relying on data analytics in crisis response and preparedness. Data analytics is one type of research evidence, holding promise for some questions — for example, elucidating the scope of a problem or showing progress once course-correction measures are taken — but inadequate for others. Understanding the benefits, harms or cost-effectiveness of various solutions, for example, is better suited to evaluations that examine whether, how and why an option works.



Aligning best evidence to decision-making is the focus of the Global Commission on Evidence to Address Societal Challenges. The Evidence Commission aims to strengthen the use of evidence by all decision-makers — government policymakers, organizational leaders, professionals, or citizens — in addressing societal challenges, be it for routine or crisis decision-making. 

“Now is the time to systematize the aspects of using evidence that are going well and address the many shortfalls, which means creating the capacities, opportunities and motivation to use evidence to address societal challenges, and putting in place the structures and processes to sustain them,” says Dr. John Lavis, co-lead of the Evidence Commission secretariat as well as Canada Research Chair in Evidence-Informed Health Systems and director of the McMaster Health Forum at McMaster University.

Lavis says that in his thirty-year career, there has never been a more opportune time to inject evidence into decision-making. 

“The world has reached a crucial time point in the pandemic, with governments and other decision-makers forced to rethink the way they’re doing things. They can either build on what works in using evidence to inform decision-making, or carry on with what doesn’t. The pandemic has offered numerous examples of both scenarios,” says Lavis.

What doesn’t work is relying on single studies or “expert knows best” approaches, says Lavis, while what works is relying on syntheses of all studies addressing a common question. 

Lavis says the same ‘what works’ lessons for relying on best evidence in a health emergency response are instructive for responding to other societal challenges – from homelessness to food insecurity to income inequality to climate change. And with that breadth come learning opportunities too. Some evidence producers create regularly updated or “living” evidence products: the Organisation for Economic Co-operation and Development (OECD) produces weekly data analytics tracking economic activity for most OECD and G20 countries, and the Global Carbon Project annually updates its estimates of the five major components of the global carbon budget (based on modeling and empirical studies). Unlike typical studies that show their age once published, “living evidence” is regularly updated as new data become available or new studies are published. Another example is the COVID-19 Evidence Network to support Decision-making (COVID-END), which produces living evidence syntheses about COVID-19 vaccine effectiveness against variants.

“Many government policymakers and other decision-makers have come to expect such regular updating for COVID-19 and will likely start to ask why such products can’t be maintained for other high-priority societal challenges,” says Lavis, who co-leads COVID-END with Dr. Jeremy Grimshaw, who is also a co-lead of the Evidence Commission.



Dr. Ahmad Firas Khalid, CIHR health systems impact fellow at the Ottawa Hospital Research Institute, wanted to know what factors influence whether and how research evidence informs policy development in humanitarian crises. Khalid’s doctoral work examined health-system policy responses to the Syrian refugee crisis in Lebanon and Ontario.

“There’s an unprecedented amount of resources dedicated to humanitarian aid, and using evidence to develop humanitarian aid policy can increase the effectiveness and efficiency of interventions, whether for health services or offering protection, shelter or food,” says Khalid. But his research, which examined Lebanon’s 2016 Health Response Strategy and Ontario’s 2016 Phase 2: Health System Action Plan, Syrian Refugees, found research evidence often took a back seat to politicking. 

As background: the Syrian conflict began in 2011 and escalated into a civil war that has displaced an estimated 6.6 million people within Syria and driven more than 5.6 million refugees to seek safety around the world. When it comes to their health, Syrian refugees face a range of medical issues, from trauma-related mental health disorders to skin, digestive and respiratory diseases to chronic health conditions (half of Syrian refugee households in Lebanon report at least one member living with a non-communicable disease). 

To assess whether solutions to these health challenges are effective, it’s imperative to use the best available research evidence, says Khalid: for example, relying on evidence syntheses rather than single studies, expert opinion or jurisdictional scans. But humanitarian settings are often highly stressful and volatile, and those calling the shots are more likely to seek professional judgments and expert opinions than to dive into research evidence, he says.

One of the reasons policymakers facing a humanitarian response do not source best evidence, says Khalid, is the misperception that such evidence doesn’t exist or is scarce. In fact, he says, there are reliable sources such as Evidence Aid, which curates best-available evidence for decision-makers facing a range of humanitarian challenges, from the health of refugees and asylum seekers to climate change to natural disasters like earthquakes. Khalid’s research pointed to other helpful strategies, such as involving decision-makers in research priority-setting; skilling up aid workers to find and apply research evidence; and enhancing search optimization of research databases so decision-makers can find what they need when they need it.



“There’s a saying that data is boring until we have a crisis and then everyone wants to know, ‘what does the data say?’” says Paul Hébert.  

But it shouldn’t take an active crisis to invest in the infrastructure needed to have a steady flow of reliable data and evidence. In fact, the Evidence Commission argues such infrastructure ought to be a “global public good” — in other words, available to all.

It’s a similar premise behind the growing Canadian public sentiment that online services (both public and private sector) ought to be readily available to anyone experiencing a disaster. Findings from a Canadian Red Cross poll show Canadians impacted by disasters increasingly expect access to online services, including ways to find information about their family, access to financial assistance, and updates on their property and potential damage. The findings show emergency situations led nearly a third of respondents to sign up to receive information during or after an incident, with Facebook the most preferred platform, followed by email and then text alerts.

“We saw a 6,000 percent increase in engagement on social media overnight during the May 2016 Alberta wildfires,” says Sara Falconer, former director of digital communications for the Canadian Red Cross. At the time, the Fort McMurray wildfire swept through the community before spreading across northern Alberta and into Saskatchewan. Over three months, the fire engulfed 590,000 hectares (about 1.5 million acres) and triggered the largest wildfire evacuation in Alberta’s history: more than 88,000 people were forced from their homes, roughly 2,400 homes and buildings were destroyed, and damage was estimated at $9.9 billion, making it the costliest disaster in Canadian history.

As that example showed, Falconer says social media, online alerts and email have the potential to be an effective alternative to traditional media for sharing timely information and increasing awareness of the services available. These channels are, at once, emergency communication channels and sources of data and information, e.g., pinpointing disaster zones and hotspots. It’s all part of an informed crisis response — getting on-the-ground information — but answering Paul Hébert’s broader question (how effective are our social impact supports and services?) requires building beyond data to a robust collection of evidence supports.

And if the work of the Evidence Commission teaches us anything, it’s that waiting for the next crisis will already be too late.