This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Public Health and Surveillance, is properly cited. The complete bibliographic information, a link to the original publication on https://publichealth.jmir.org, as well as this copyright and license information must be included.
True evidence-informed decision-making in public health relies on incorporating evidence from a number of sources in addition to traditional scientific evidence. Lack of access to these types of data, as well as limited ease of use and interpretability of scientific evidence, contributes to the limited uptake of evidence-informed decision-making in practice. An electronic evidence system that includes multiple sources of evidence and potentially novel computational processing approaches or artificial intelligence holds promise as a solution for overcoming barriers to evidence-informed decision-making in public health.
This study aims to understand the needs and preferences for an electronic evidence system among public health professionals in Canada.
An invitation to participate in an anonymous web-based survey was distributed via listservs of 2 Canadian public health organizations in February 2019. Eligible participants were English- or French-speaking individuals currently working in public health. The survey contained both multiple-choice and open-ended questions about the needs and preferences relevant to an electronic evidence system. Quantitative responses were analyzed to explore differences by public health role. Inductive and deductive analysis methods were used to code and interpret the qualitative data. Ethics review was not required by the host institution.
Respondents (N=371) were heterogeneous, spanning organizations, positions, and areas of practice within public health. Nearly all (364/371, 98.1%) respondents indicated that an electronic evidence system would support their work. Respondents had high preferences for local contextual data, research and intervention evidence, and information about human and financial resources. Qualitative analyses identified several concerns, needs, and suggestions for the development of such a system. Concerns ranged from the personal use of such a system to the ability of their organization to use such a system. Recognized needs spanned the different sources of evidence, including local context, research and intervention evidence, and resources and tools. Additional suggestions were identified to improve system usability.
Canadian public health professionals have positive perceptions toward an electronic evidence system that would bring together evidence from the local context, scientific research, and resources. Elements were also identified to increase the usability of an electronic evidence system.
At a time of growing funding constraints for public health in Canada and across the world, public health professionals and organizations must function efficiently to meet expanding public health needs. Changes to the funding structure of public health have been underway across Canada for several years [
In addition to the impacts of restructuring and decreasing funding, the public health sector is challenged to function effectively amid the exponential increase in the amount of scientific evidence generated and local contextual data available, as seen in response to the COVID-19 pandemic. The amount of information available now exceeds the capacity of public health professionals to comprehensively assess, consider, and use it in program planning decisions. Given these challenges, there is a need to understand how public health professionals and organizations can meet increasing demands for evidence-informed decision-making with fewer resources [
A 2016 scoping review identified 4 factors that were associated with improved efficiency in public health systems: (1) increased financial resources, (2) increased staffing per capita, (3) jurisdictions serving a population of 50,000 to 500,000 people, and (4) evidence-based organizational and administrative features [
Evidence-based public health and practice is defined as “the process of integrating science-based interventions with community preferences to improve the health of populations” [
The National Collaborating Centre for Methods and Tools (NCCMT) has developed a model to guide the consideration of different sources of evidence, providing a structure for the use of different types of evidence in the decision-making process (
The goal of precision public health is similar to that of evidence-informed decision-making—to put forth effective public health interventions that improve population health [
The National Collaborating Centre for Methods and Tools’ evidence-informed decision-making model.
The available evidence systems are limited by the type of evidence they provide, requiring large time and expertise input by professionals to gather and analyze data from multiple platforms [
A web-based cross-sectional survey was used to assess the preferences of public health professionals across Canada with respect to an electronic evidence system.
Eligible participants were individuals currently working in any field in public health organizations in Canada. The web-based survey was available for completion in either English or French. Individuals who identified as students studying public health, without any indication of work experience, were excluded. Participants were recruited over a 2-week period in February 2019 through the NCCMT’s mailing list (the survey was disseminated via email to 11,525 recipients, and 3288 emails were opened) and the Canadian Public Health Association’s bulletin listserv (the survey was disseminated via email to 1370 recipients, and 488 emails were opened). Ethics review was not required by the host institution, as this evaluation aimed to assess needs and inform the future development of an electronic evidence system.
The survey was developed by members of the research team with expertise in public health, AI, and informatics and underwent multiple rounds of consultation among study investigators. Once agreement was reached, the questionnaire was translated by a certified French translator. The final questionnaire consisted mainly of multiple-choice questions, with 1 Likert scale question and 3 open-ended questions.
Upon initiation of the questionnaire, via LimeSurvey (LimeSurvey GmbH), respondents were asked to consider the following hypothetical scenario:
Imagine an electronic system that combines data about your local population with relevant research evidence about the effectiveness of health interventions. The data in this system would include measures of determinants of health, morbidity, and demographics, and could also be compared to similar measures for other geographic regions/populations. The research evidence could include information on the effectiveness of the interventions in different settings/populations and the resources required for implementing those interventions.
Participants were asked to complete an 18-item questionnaire comprising questions on respondents’ characteristics, preference and need for an electronic evidence system, and barriers and facilitators to use (
Quantitative analysis was completed using SPSS (version 25.0; IBM Corp). Descriptive statistics were calculated as means and SDs or percentages, where appropriate. Quantitative responses were categorized post hoc into 3 types of evidence from the evidence-informed decision-making model: community health issues and local context, research evidence, and public health resources [
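The article does not name the statistical test used to compare preference proportions across public health roles, so the following is a sketch under an assumption: a Pearson chi-square test of independence on a 2x2 contingency table, a conventional choice for this kind of group comparison. The example counts (64/73 frontline providers vs 54/55 program managers preferring demographic data) are taken from Table 4; everything else is illustrative.

```python
# Sketch (assumption, not from the article): a Pearson chi-square test
# of independence, as might be used to compare preference proportions
# between two respondent groups. Stdlib only; no correction applied.

def chi_square_2x2(a: int, b: int, c: int, d: int) -> float:
    """Chi-square statistic for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    stat = 0.0
    for obs, row, col in ((a, row1, col1), (b, row1, col2),
                          (c, row2, col1), (d, row2, col2)):
        expected = row * col / n
        stat += (obs - expected) ** 2 / expected
    return stat

# Example: 64 of 73 frontline providers vs 54 of 55 program managers
# preferred demographic data (counts from Table 4).
stat = chi_square_2x2(64, 73 - 64, 54, 55 - 54)
critical_value = 3.841  # chi-square critical value, df = 1, alpha = .05
print(f"chi2 = {stat:.2f}, significant = {stat > critical_value}")
```

The statistic (about 4.8) exceeds the df=1 critical value, consistent with this row being flagged as a significant difference in Table 4; with small expected cell counts like these, a Yates-corrected or Fisher exact test would be a more cautious alternative.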
To analyze the data from the open-ended questions and all other qualitative responses included in the
A total of 487 respondents clicked on the survey link, initiating the survey. After removing surveys that were not started (n=107) or were completed by students (n=9), data from 371 respondents (347 complete and 24 partial surveys) were included in this analysis. Respondents were primarily English speakers, held at least a master’s degree, and worked in local, provincial, or territorial government (
Characteristics of included responses from professionals working in the public health field in February 2019 (N=371).
| Characteristics | Respondents, n (%) |
| --- | --- |
| **Language of survey completion** | |
| English | 361 (97.3) |
| French | 10 (2.7) |
| **Organization type** | |
| Local or regional government | 175 (47.2) |
| Provincial government | 72 (19.4) |
| University or research center | 40 (10.8) |
| Federal government | 31 (8.4) |
| Not-for-profit organizations | 26 (7) |
| Territorial government | 8 (2.2) |
| Indigenous organization | 3 (0.8) |
| Consultant organizations | 3 (0.8) |
| Primary care or hospitals | 2 (0.5) |
| Other or no response | 11 (3) |
| **Highest level of education** | |
| Master’s | 206 (55.5) |
| Bachelor’s | 96 (25.9) |
| Doctorate | 42 (11.3) |
| Diploma | 12 (3.2) |
| Doctor of Medicine | 11 (3) |
| Other or no response | 4 (1.1) |
| **Role** | |
| Program or project staff | 110 (29.6) |
| Consultant specialist | 87 (23.5) |
| Frontline public health or community provider | 73 (19.7) |
| Program or project management (eg, manager) | 55 (14.8) |
| Faculty | 29 (7.8) |
| Senior management or administration (eg, director or executive) | 25 (6.7) |
| Government official including policy | 21 (5.7) |
| Chief medical or medical or associate medical officer of health | 4 (1.1) |
| Other or no response | 12 (3.2) |
| **Practice discipline** | |
| Program evaluator or planner | 76 (20.5) |
| Health promoter | 74 (19.9) |
| Public health nurse | 68 (18.3) |
| Epidemiologist | 55 (14.8) |
| Knowledge broker or knowledge translation specialist | 52 (14) |
| Health analyst | 38 (10.2) |
| Policy analyst | 36 (9.7) |
| Administrator or administration | 29 (7.8) |
| Policy advisor | 24 (6.5) |
| Public health educator | 21 (5.7) |
| University or college educator | 20 (5.4) |
| Dietitian | 18 (4.9) |
| Student | 17 (4.6) |
| Librarian or information specialist | 13 (3.5) |
| Physician | 12 (3.2) |
| Public health inspector | 11 (3) |
| Nutritionist | 10 (2.7) |
| Other health clinician | 10 (2.7) |
| Research staff | 6 (1.6) |
| Dentist | 3 (0.8) |
| Other or no response | 7 (1.9) |
| **Area of public health** | |
| Social determinants of health | 131 (35.3) |
| Chronic disease (eg, nutrition and physical activity) | 130 (35) |
| All areas of public health | 105 (28.3) |
| Health policy | 88 (23.7) |
| Mental health including substance use | 85 (22.9) |
| Injury prevention | 62 (16.7) |
| Infectious disease | 58 (15.6) |
| Family health or reproductive health | 54 (14.6) |
| Environmental health | 44 (11.9) |
| Reproductive health | 29 (7.8) |
| Emergency preparedness or response | 25 (6.7) |
| Dental health | 17 (4.6) |
| School or child health | 11 (3) |
| Hospital care | 6 (1.6) |
| Other or no response | 9 (2.4) |
The majority of respondents reported that the proposed electronic evidence system would extremely (186/371, 50.1%), very much (141/371, 38%), or moderately (37/371, 9.9%) assist them in their roles. Less than 2% of respondents indicated that an electronic evidence system would only slightly (3/371, 0.8%) or not at all (3/371, 0.8%) help with the work they do, and 1 respondent (1/371, 0.3%) did not answer. Participants’ preferences for community health issues and local contextual data are shown in
Preferences for community health issues and local context among public health professionals who completed the web-based needs assessment in February 2019 (n=370).
| Community health issues and local context | Respondents, n (%) |
| --- | --- |
| **Type of data** | |
| Risk | 357 (96.5) |
| Demographics | 352 (95.1) |
| Other | 107 (28.9) |
| **Geographic comparisons** | |
| Local to regional | 283 (76.5) |
| To smaller subdivisions | 283 (76.5) |
| To larger regions | 255 (68.9) |
| Other | 26 (7) |
| **Morbidity measures** | |
| Prevalence | 351 (94.9) |
| Incidence | 347 (95.1) |
| Other | 41 (11.1) |
| **Demographic variables** | |
| Age | 363 (98.4) |
| Sex | 352 (95.1) |
| Income | 351 (94.6) |
| Education | 336 (90.8) |
| Ethnicity | 326 (88.1) |
| Other | 98 (26.5) |
A summary of preferences for the types of research evidence is shown in
Preferences for research evidence among public health professionals who completed the web-based needs assessment in February 2019 (n=347).
| Research evidence | Respondents, n (%) |
| --- | --- |
| **Type of research evidence** | |
| Best practice guidelines | 323 (93.1) |
| Systematic reviews or meta-analyses | 312 (89.9) |
| Practice-based evidence | 305 (87.9) |
| Qualitative | 131 (37.8) |
| Quantitative | 121 (34.9) |
| Other | 13 (3.7) |
| **Information about intervention effects** | |
| Magnitude of effect | 316 (91.1) |
| Quality of study | 315 (90.8) |
| Required human resources | 271 (78.1) |
| Required financial resources | 261 (75.2) |
| Heterogeneity in effect | 230 (66.3) |
| Other | 47 (13.5) |
Preferences for information about the public health resources required are presented in
Preferences for information on public health resources among public health professionals who completed the web-based needs assessment in February 2019 (n=347).
| Public health resources | Respondents, n (%) |
| --- | --- |
| **Human resources** | |
| Type and intensity of training | 295 (85) |
| Type of training to sustain program | 277 (79.8) |
| Number of staff required | 273 (78.7) |
| Discipline of staff | 235 (67.7) |
| **Financial resources** | |
| Cost-effectiveness | 310 (89.3) |
| Cost | 263 (75.8) |
| Cost-utility | 160 (46.1) |
| Economic modeling data | 124 (35.7) |
| Other | 23 (6.6) |
When comparing preferences across the 3 decision-making levels (ie, frontline staff, program management, and senior management), a few notable differences were found (
Preferences for an electronic evidence system among frontline public health or community providers, project or program management, and senior management or administration who completed the web-based needs assessment in February 2019.
| Preference | Frontline public health or community providers, n/n (%) | Program or project management, n/n (%) | Senior management or administration, n/n (%) |
| --- | --- | --- | --- |
| **Type of data** | | | |
| Risk factors | 71/73 (97) | 52/55 (94) | 24/25 (96) |
| Demographics | 64/73 (88)a | 54/55 (98)a | 25/25 (100) |
| **Geographic comparisons** | | | |
| Compare your local region with a similar region in size | 53/73 (73) | 43/55 (78) | 21/25 (84) |
| Compare subregions within your local regions | 59/73 (81) | 41/55 (75) | 18/25 (72) |
| Compare your local region with a larger region in size | 50/73 (68) | 33/55 (60) | 19/25 (76) |
| **Morbidity measures** | | | |
| Prevalence | 69/73 (95) | 52/55 (95) | 25/25 (100) |
| Incidence | 68/73 (93) | 50/55 (91) | 25/25 (100) |
| **Demographic variables** | | | |
| Age | 71/73 (97) | 54/55 (98) | 25/25 (100) |
| Sex | 68/73 (93) | 53/55 (96) | 25/25 (100) |
| Income | 66/73 (90) | 53/55 (96) | 25/25 (100) |
| Education | 68/73 (93) | 47/55 (85) | 22/25 (88) |
| Ethnicity | 62/73 (85) | 51/55 (93) | 23/25 (92) |
| **Information about intervention effects** | | | |
| Magnitude of effect | 57/67 (85) | 50/53 (94) | 22/24 (92) |
| Quality of study | 55/67 (82) | 48/53 (91) | 22/24 (92) |
| Required human resources | 55/67 (82) | 42/53 (79) | 18/24 (75) |
| Required financial resources | 52/67 (78) | 39/53 (74) | 18/24 (75) |
| Heterogeneity in effect | 36/67 (54)c | 40/53 (75)c | 15/24 (62) |
| **Type of research evidence** | | | |
| Best practice guidelines | 64/67 (95) | 50/53 (94) | 23/24 (96) |
| Systematic reviews or meta-analyses | 53/67 (79) | 45/53 (85) | 19/24 (79) |
| Practice-based evidence (program evaluations) | 56/67 (84) | 51/53 (96) | 20/24 (83) |
| Qualitative | 27/67 (40) | 21/53 (40) | 6/24 (25) |
| Quantitative | 24/67 (36) | 18/53 (34) | 7/24 (29) |
| **Human resources** | | | |
| Type and intensity of training required to be competent to deliver interventions or programs | 23/67 (96) | 49/53 (92) | 19/24 (79) |
| Type of training required to sustain program | 19/67 (79) | 41/53 (77) | 21/24 (87) |
| Number of staff required to implement the program | 20/67 (83) | 47/53 (89) | 20/24 (83) |
| Discipline of required staff | 6/67 (25)a | 40/53 (75) | 19/24 (79)a |
| **Financial resources** | | | |
| Cost-effectiveness | 61/67 (91) | 47/53 (89) | 21/24 (87) |
| Cost | 47/67 (70) | 41/53 (77) | 19/24 (79) |
| Cost-utility | 28/67 (42) | 29/53 (55) | 12/24 (50) |
| Economic modeling data | 23/67 (34) | 20/53 (38) | 10/24 (42) |
aIndicates statistically significant difference (
bSome participants only provided partial answers to the survey; thus, the sample sizes differ across questions.
cIndicates statistically significant difference (
Qualitative data from open-ended questions identified several specific needs, concerns, and suggestions for an electronic evidence system. Echoing the preferences for cross-jurisdictional comparisons found in the quantitative results, respondents identified the ability to compare indicators across geographic areas, the inclusion of equity indicators and epidemiologic data, and the use of geographic information systems as
...generally, any data that might link to poverty measures, immigration status, housing situation (e.g., housed, homeless), recipient of childcare subsidy, recipient of social assistance etc.
A major theme that emerged with respect to the type of research evidence to be included was the usefulness of research beyond what is typically considered public health interventions, such as organizational interventions and interventions from the fields of education, social services, and law. Regardless of the type of research, there was a strong desire for all evidence to be critically appraised and presented alongside summaries or statements to help interpret it, as illustrated in the following quote:
...while I would be open to including all kinds of research, I would want them to be graded, to ensure that one could assess the quality of the evidence.
Similarly, participants also emphasized the need for practice-based evidence that provides contextual information on the outcomes of interventions and implementation. This included evidence on the context in which an intervention was implemented, adoption of the intervention, and considerations on how to deliver and sustain it in the community. This is reflected in a respondent’s comment:
[I] need a way to analyze context where an intervention is used. For example, if previously similar interventions had been tried in an area or subpopulation there may already be a delivery system or key partnerships in place, and there may also be a learning effect from previous work that is beneficial to achieving results with a “new” intervention.
To support the need for contextual and implementation data, respondents also specifically mentioned the need for qualitative and mixed-methods research and needs assessments conducted within other communities or organizations.
Related to resources and tools for practice, respondents identified a need for theories, methods, or frameworks to support the adaptation or implementation of a program in their community. Some respondents mentioned specific frameworks, such as the Reach, Effectiveness, Adoption, Implementation, and Maintenance framework, whereas others had general suggestions for
In addition to the specific needs for an electronic system, a number of potential concerns or barriers emerged. Concerns were related to either the electronic system itself or the ability to adopt a system within public health organizations. Concerns about keeping a system up to date stemmed from the understanding that evidence is created at rapid rates and new data are constantly being collected. For such a system to be useful, data would need to be current. Sustainability of the system beyond its initial creation was seen as a critical element for successful implementation, with some participants citing concerns if the system were to be funded by a research grant. An understanding of plans for long-term upkeep and sustainability may be a requirement for individual users or organizations to invest time in learning how to use the system.
The potential for duplication of existing resources was another concern related to such a system, with respondents citing specific databases or systems that already exist, and how existing databases and systems would complement or conflict with any new system. One respondent captured this sentiment, stating that:
...these systems are difficult to set up AND keep up to date. In addition, other similar systems (except for intervention data) already exist and this may add to the confusion for users (which data is THE official data?) Why do we observe differences between two systems for same indicator? Etc.
Related to the ability of individuals and organizations to adopt and implement the system, major themes about usability and costs emerged. The cost of the proposed system was seen as a key potential barrier, with questions about who would pay for it arising frequently. Second, the ability of a system to work with existing information technology infrastructure, such as outdated or restrictive computer systems and limited or slow internet connectivity, was raised as a concern. Beyond the initial barriers of cost and access, an organization’s ability to adopt the use of a system in their regular workflow was reported to be dependent on the ability of individual staff to use the system adequately, which requires not only buy-in by the individual employee but also senior-level management. Finally, concerns about data privacy and maintenance of confidentiality were also expressed.
A number of suggestions for success emerged from the qualitative data. The most frequently mentioned requirement to facilitate use of the system was transparency of the methodology used, including the criteria to select evidence for inclusion, the methods used to evaluate and synthesize evidence, and the overall quality of the evidence included. One respondent stated that they “...would need a very detailed ‘methods’ section of this system to be able to be confident in it.” Sufficient staff training was also suggested to support the use of the proposed system.
Finally, respondents requested specific functions or system formatting elements, such as the ability to make graphs, print or export data, and retrieve contact information of data sharers on the system.
The purpose of this study was to understand the preferences of Canadian public health professionals for an electronic evidence system. The results indicate a perceived need for such a system; however, certain considerations related to the type of information included and how it is presented must be addressed for the system to be adopted and used effectively in public health decision-making.
Preferences for all 3 types of evidence (community health issues and local context, research evidence, and public health resources) were generally high. This aligns with previous research showing that public health professionals value different sources of evidence [
The need for information to examine and address the determinants of health and health equity came through strongly in this study. This is not surprising given the previous literature that suggests that equity information is commonly lacking in scientific publications. A 2016 scoping review of population health interventions found that most studies included minimal contextual information on the target population and intervention setting [
A key concern that emerged from the qualitative data was avoiding duplication of existing resources, some of which were already in use within their organization. For example, in Canada, the Canadian Best Practice Portal captures intervention evidence on effective health promotion and chronic disease prevention, but it is no longer updated [
Barriers to the use of an electronic evidence system identified in this survey are similar to those found in a previous systematic review on barriers to public health data sharing [
In our survey, motivational, political, and ethical barriers were not raised; however, the survey did not specifically seek feedback on these factors. Although motivational barriers, which limit data sharing at an individual or organizational level, were not explicitly mentioned, some respondents suggested possible ways to overcome a component of this barrier: disagreements in data use [
An additional barrier identified in the qualitative responses was the need for ongoing training of staff to use the system. Although AI has the potential to compile, process, synthesize, and analyze patterns at rapid rates and to improve efficacy in the use of evidence, blind reliance on its outputs runs the risk of misrepresenting variables or groups of people as it is dependent on data collection methods and evidence inputs [
There are some limitations to this study that should be considered when interpreting the findings. First, we did not collect individual demographic data or years of experience working in public health, limiting the extent to which we can characterize the individuals who took part in this survey. Because participants could select all that apply for organization type, role, area of public health, and practice discipline, our analysis was limited in its ability to compare preferences across these categories. In addition, although the survey was disseminated through 2 large Canadian listservs to recruit public health professionals, there was no qualifying question to confirm that the preferences that emerged were solely those of public health professionals in Canada. Second, respondents ranged across roles, areas, and disciplines, and the results may not be generalizable to all Canadian public health professionals, as those who participated may have had prior awareness of, or an interest in, electronic evidence systems or evidence-informed decision-making. Finally, because the survey asked about a hypothetical electronic evidence system without considering feasibility, responses may have been skewed positively, with respondents more favorably indicating the need for all items listed [
Public health professionals and organizations face many hurdles, including changes in structure, lack of funding and time, and exponential increases in new evidence. However, there is broad agreement that the proposed electronic evidence system would make evidence-informed decisions more accessible. On the basis of our findings, public health professionals see value in an electronic evidence system that combines local contextual evidence, research and intervention studies, and public health resources and tools. Our findings also highlight a number of elements that should be considered to ensure usability and facilitate trust in such a system, including quality appraisals, interpretations of evidence, and transparent methods and funding models. Such a system may support professionals in evidence-informed decision-making, thereby enabling the Canadian public health system to be more effective in an environment with limited investment.
Categorized needs assessment questionnaire items and answer options.
AI: artificial intelligence
NCCMT: National Collaborating Centre for Methods and Tools
This work was supported by a Canadian Institutes of Health Research project grant (application 397883) awarded to DLB and MD. At the time of the study, DLB held a position as the Canadian Institutes of Health Research Chair in Applied Public Health Research and SENS was supported by a postdoctoral fellowship from the Canadian Institutes of Health Research. Data collection, analysis, and content of this study were not influenced by the funding received from the Canadian Institutes of Health Research.
None declared.