Background: True evidence-informed decision-making in public health relies on incorporating evidence from a number of sources in addition to traditional scientific evidence. Lack of access to these types of data, together with the limited ease of use and interpretability of scientific evidence, contributes to low uptake of evidence-informed decision-making in practice. An electronic evidence system that includes multiple sources of evidence and potentially novel computational processing approaches or artificial intelligence holds promise as a solution for overcoming barriers to evidence-informed decision-making in public health.
Objective: This study aims to understand the needs and preferences for an electronic evidence system among public health professionals in Canada.
Methods: An invitation to participate in an anonymous web-based survey was distributed via listservs of 2 Canadian public health organizations in February 2019. Eligible participants were English- or French-speaking individuals currently working in public health. The survey contained both multiple-choice and open-ended questions about the needs and preferences relevant to an electronic evidence system. Quantitative responses were analyzed to explore differences by public health role. Inductive and deductive analysis methods were used to code and interpret the qualitative data. Ethics review was not required by the host institution.
Results: Respondents (N=371) were heterogeneous, spanning organizations, positions, and areas of practice within public health. Nearly all (364/371, 98.1%) respondents indicated that an electronic evidence system would support their work. Respondents had high preferences for local contextual data, research and intervention evidence, and information about human and financial resources. Qualitative analyses identified several concerns, needs, and suggestions for the development of such a system. Concerns ranged from the personal use of such a system to the ability of their organization to use such a system. Recognized needs spanned the different sources of evidence, including local context, research and intervention evidence, and resources and tools. Additional suggestions were identified to improve system usability.
Conclusions: Canadian public health professionals have positive perceptions toward an electronic evidence system that would bring together evidence from the local context, scientific research, and resources. Elements were also identified to increase the usability of an electronic evidence system.
At a time of growing funding constraints for public health in Canada and across the world, public health professionals and organizations must function efficiently to meet expanding public health needs. Changes to the funding structure of public health have been underway across Canada for several years. In the province of Quebec, the public health budget was cut by 33% in 2015; cuts of up to 30% were proposed in Ontario in 2019; and more recently, cuts of up to 10% were proposed in Alberta [ - ]. Constraints on public health funding are not limited to Canada; countries such as the United States and England have seen similar trends [ , ]. Exceptions to this trend can occur during times of crisis, including the current COVID-19 (SARS-CoV-2) pandemic, whereby further funding cuts are halted or funding is even increased; however, these exceptions may be limited in duration [ ].
In addition to the impacts of restructuring and decreasing funding, the public health sector is challenged to function effectively amid the exponential increase in the amount of scientific evidence generated and local contextual data available, as seen in response to the COVID-19 pandemic. The amount of information available now exceeds public health professionals' capacity to comprehensively assess, consider, and use it in program planning decisions. Given these challenges, there is a need to understand how public health professionals and organizations can meet increasing demands for evidence-informed decision-making with fewer resources.
A 2016 scoping review identified 4 factors that were associated with improved efficiency in public health systems: (1) increased financial resources, (2) increased staffing per capita, (3) jurisdictions serving a population of 50,000 to 500,000 people, and (4) evidence-based organizational and administrative features. Although the first 3 factors are controlled at a subnational or federal government level, institutional changes to support evidence-based practices occur at a local level and, therefore, present opportunities for change. Within the category of administrative evidence-based features, one umbrella review identified 5 high-priority, locally modifiable best practices that contribute to public health system productivity: workforce development, leadership, organizational climate and culture, interorganizational relationships and partnerships, and financial processes [ ]. Specifically, access to and free flow of relevant information were identified as factors that can contribute to public health system performance in the short term; this includes ready access to high-quality information and tailored messages for evidence-based decision-making [ ].
Evidence-based public health practice is defined as “the process of integrating science-based interventions with community preferences to improve the health of populations”, whereas evidence-informed public health is defined as “using research evidence with public health expertise, resources, and knowledge about community health issues, local context, and political climate to make policy and programming decisions” [ , ]. Using the term informed rather than based allows for nuances of the decision-making process that are not solely based on research evidence, such as considerations of the political climate and the expertise of public health professionals [ , ]. Using evidence to inform program planning decisions increases the likelihood that services with known effectiveness will be delivered and supports the efficient use of human and financial resources. Across Canada, evidence-informed decision-making is becoming a central tenet of public health and is now incorporated into public health standards in a growing number of provinces, including Ontario, Nova Scotia, and British Columbia [ - ]. Globally, similar concepts are gaining traction; for example, evidence-informed practice has been acknowledged by the Centers for Disease Control and Prevention as a central component of essential public health services to improve and innovate public health functions [ ].
The National Collaborating Centre for Methods and Tools (NCCMT) has developed a model to guide the consideration of different sources of evidence, providing a structure for the use of different types of evidence in the decision-making process () [ ]. The 4 spheres of this model are research evidence (published scientific literature, including qualitative or quantitative studies), local context (consideration of the specific needs of the community through quantitative surveillance data, ie, population health indicators), community preferences (using qualitative methods to assess the needs and interests of community members), and resources (human and financial) [ ]. Gathering evidence within each of these spheres and making sense of the evidence in relation to a specific jurisdiction is an increasingly daunting task, as the amount of evidence in all spheres grows exponentially [ , , ]. Previous research has shown that public health professionals value evidence-informed decision-making but encounter barriers such as lack of time; management support; and knowledge and skills to locate, critically analyze, and interpret evidence [ ]. Additional challenges exist in appraising, synthesizing, and interpreting different types of evidence, such as limited capacity to apply evidence from the local context and community preferences to program planning [ ]. Acquiring and analyzing data to support evidence-informed decision-making can be an intensive process; thus, to truly increase efficiency and effectiveness, system-level support, multiorganization data sharing, and computational methods such as artificial intelligence (AI) may offer solutions [ , , ].
The goal of precision public health is similar to that of evidence-informed decision-making—to put forth effective public health interventions that improve population health. Precision public health is defined as an “emerging practice to more granularly predict and understand public health risks and customize treatments for more specific and homogenous subpopulations, often using new data, technologies and methods” [ ]; it aims to improve population health outcomes by enabling the right interventions to be delivered to the right populations at the right time to prevent disease and to protect and promote health [ , ]. Although surveillance systems have traditionally monitored infectious diseases, it is now possible for systems to simultaneously consider data from many sources and apply statistical and AI methods to estimate and monitor the impact of risk factors and diseases on health and other outcomes [ ]. AI is a generic term used to define “nonhuman intelligence that is measured by its ability to replicate human mental skills or acting rationally” [ , ]. A hypothetical evidence system encompassing multiple sources of data and evidence would require the large-scale statistical capabilities of AI to make use of the evidence feasible. AI holds promise as a methodological toolbox for supporting public health decision-making and improving population health outcomes, although the evidence is based on a small number of preliminary studies [ , ]. There are many potential uses of AI methods, such as machine learning, in public health, including processing patterns in complex data, modeling policy decisions, and understanding the causal pathways through which interventions influence health outcomes [ ]. However, there has been limited implementation of AI in public health initiatives internationally [ ].
Although the potential for AI to significantly impact population health exists, substantial human input is required to develop algorithms that can sort and assess evidence inputs and make recommendations for policy and practice [ ].
Available evidence systems are limited by the types of evidence they provide, requiring substantial time and expertise from professionals to gather and analyze data from multiple platforms [ - ]. Currently, there are no public health evidence systems described in the literature that bring together multiple evidence sources in 1 central location with large-scale statistical analysis capabilities such as those of AI; to our knowledge, there is little or no information available on the perceived need for such a system among public health professionals across Canada or internationally. An understanding of the preferences of public health professionals for an electronic evidence system and its desired functionality is critical to inform the development of such systems. The purpose of this study is to identify the needs and preferences of Canadian public health professionals for an electronic evidence system that combines data about local population parameters and context with relevant research evidence about health intervention effectiveness and the resources required for successful implementation.
A web-based cross-sectional survey was used to assess the preferences of public health professionals across Canada with respect to an electronic evidence system.
Eligible participants were individuals currently working in any field in public health organizations in Canada. The web-based survey was available for completion in either English or French. Individuals who identified as students studying public health without any indication of work experience were excluded. Participants were recruited over a 2-week period in February 2019 through the NCCMT’s mailing list (survey was disseminated via email to 11,525 recipients, and 3288 emails were opened) and the Canadian Public Health Association’s bulletin listserv (survey was disseminated via email to 1370 recipients, and 488 emails were opened). Ethics review was not required by the host institution, as this evaluation aimed to assess needs in order to inform the future development of an electronic evidence system.
The survey was developed by members of the research team with expertise in public health, AI, and informatics. The survey underwent multiple rounds of consultation between study investigators. Once agreement was reached, the questionnaire was translated by a certified French translator. The final questions were mainly multiple-choice questions, with 1 Likert scale question and 3 open-ended questions.
Upon initiation of the questionnaire, via LimeSurvey (LimeSurvey GmbH), respondents were asked to consider the following hypothetical scenario:
Imagine an electronic system that combines data about your local population with relevant research evidence about the effectiveness of health interventions. The data in this system would include measures of determinants of health, morbidity, and demographics, and could also be compared to similar measures for other geographic regions / populations. The research evidence could include information on the effectiveness of the interventions in different settings/populations and the resources required for implementing those interventions.
Participants were asked to complete an 18-item questionnaire comprising questions on respondents’ characteristics, preference and need for an electronic evidence system, and barriers and facilitators to use (). All responses on LimeSurvey were anonymous, and no identifying data were collected.
Quantitative analysis was completed using SPSS (version 25.0, IBM Corp). Descriptive statistics were calculated as means and SDs or percentages, where appropriate. Quantitative responses were categorized post hoc into 3 types of evidence from the evidence-informed decision-making model: community health issues and local context, research evidence, and public health resources. Given previous findings that preferences for specific sources of evidence vary by position level within public health [ , ] and the different perspectives that these groups bring, we planned a subgroup analysis to compare responses by position. We compared responses across 3 categories of positions that were largely independent of one another; other positions overlapped, as respondents were able to select all position levels that applied. These 3 categories are frontline public health or community providers, project or program management, and senior management or administration. For continuous data, the Levene test was used to assess the homogeneity of variance across the 3 position groups. Where the assumption of homogeneity was met (P>.05), we used a 1-way analysis of variance across the 3 independent groups. When the assumption of homogeneity of variance was not met, the Games-Howell post hoc test was used and the differences among the 3 groups were presented. For categorical data, the Pearson chi-square test was used with comparison across columns. When cell sizes were less than 5, the Fisher exact test was used to compare the groups.
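The continuous-data branch of the analysis described above (Levene test, then 1-way ANOVA or Games-Howell) can be sketched with SciPy. This is an illustrative sketch on simulated data, not the authors' SPSS analysis; the group means and SDs below are invented for demonstration.

```python
# Sketch of the subgroup analysis workflow (illustrative data only).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical continuous responses for the 3 position groups,
# with the group sizes reported in the paper (73, 55, 25)
frontline = rng.normal(3.8, 0.9, 73)
management = rng.normal(4.0, 0.8, 55)
senior = rng.normal(4.1, 1.0, 25)

# Levene test: homogeneity of variance across the 3 position groups
levene_stat, levene_p = stats.levene(frontline, management, senior)

if levene_p > .05:
    # Assumption met: 1-way ANOVA across the 3 independent groups
    f_stat, p = stats.f_oneway(frontline, management, senior)
else:
    # Assumption violated: Games-Howell post hoc comparisons would be
    # used instead. SciPy has no built-in Games-Howell test; the
    # pingouin package offers pg.pairwise_gameshowell for this step.
    p = None
```

The branch on the Levene P value mirrors the decision rule stated in the Methods; only the homogeneous-variance branch is executable with SciPy alone.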
To analyze the data from the open-ended questions and the open-text responses included in the multiple-choice questions, data were imported into NVivo (version 12, QSR International). The analysis began with an initial scan of the responses and a discussion of possible themes. Two authors (BD and SENS) independently reviewed the responses using an inductive line-by-line approach and then discussed themes emerging from the data and refined the coding scheme. Within the larger theme of needs and preferences, a deductive approach was used where appropriate to code responses according to the following spheres of NCCMT’s evidence-informed decision-making model for public health: community health issues and local context, research evidence, and public health resources [ ]. Codes and themes were discussed continuously until the final coding was agreed upon by both authors.
A total of 487 respondents clicked on the survey link, initiating the survey. After removing surveys that were not started (n=107) or were completed by students (n=9), data from 371 respondents (347 complete and 24 partial surveys) were included in this analysis. Respondents were primarily English speakers, with at least a master’s degree, and working in local, provincial, or territorial government (). Although many respondents selected multiple positions, frontline public health or community provider (73/371, 19.7%), program or project management (55/371, 14.8%), and senior management or administration (25/371, 6.7%) were largely unique. Respondents reported working in an average of 2.3 (SD 1.8) specific areas of public health, the most common being the social determinants of health, chronic diseases, and all areas of public health.
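The respondent flow and proportions above can be checked with a few lines of arithmetic; the variable names are ours, introduced only for illustration.

```python
# Respondent flow reported in the text: 487 initiated the survey,
# minus not-started surveys and student respondents
initiated = 487
not_started = 107
students = 9
included = initiated - not_started - students  # 371 respondents analyzed

full, partial = 347, 24
assert full + partial == included

# Denominator of 371 reproduces the reported percentages,
# eg, frontline public health or community providers:
frontline_pct = round(73 / included * 100, 1)  # → 19.7
```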
| Characteristics | Respondents, n (%) |
| --- | --- |
| **Organization** | |
| Local or regional government | 175 (47.2) |
| Provincial government | 72 (19.4) |
| University or research center | 40 (10.8) |
| Federal government | 31 (8.4) |
| Not-for-profit organizations | 26 (7.0) |
| Territorial government | 8 (2.2) |
| Indigenous organization | 3 (0.8) |
| Consultant organizations | 3 (0.8) |
| Primary care or hospitals | 2 (0.5) |
| Other or no response | 11 (3.0) |
| **Highest degree** | |
| Doctor of Medicine | 11 (3.0) |
| Other or no response | 4 (1.1) |
| **Position** | |
| Program or project staff | 110 (29.6) |
| Consultant specialist | 87 (23.5) |
| Frontline public health or community provider | 73 (19.7) |
| Program or project management (eg, manager) | 55 (14.8) |
| Senior management or administration (eg, director or executive) | 25 (6.7) |
| Government official including policy | 21 (5.7) |
| Chief medical or medical or associate medical officer of health | 4 (1.1) |
| Other or no response | 12 (3.2) |
| **Professional role** | |
| Program evaluator or planner | 76 (20.5) |
| Health promoter | 74 (19.9) |
| Public health nurse | 68 (18.3) |
| Knowledge broker or knowledge translation specialist | 52 (14.0) |
| Health analyst | 38 (10.2) |
| Policy analyst | 36 (9.7) |
| Administrator or administration | 29 (7.8) |
| Policy advisor | 24 (6.5) |
| Public health educator | 21 (5.7) |
| University or college educator | 20 (5.4) |
| Librarian or information specialist | 13 (3.5) |
| Public health inspector | 11 (3.0) |
| Other health clinician | 10 (2.7) |
| Research staff | 6 (1.6) |
| Other or no response | 7 (1.9) |
| **Area of public health** | |
| Social determinants of health | 131 (35.3) |
| Chronic disease (eg, nutrition and physical activity) | 130 (35.0) |
| All areas of public health | 105 (28.3) |
| Health policy | 88 (23.7) |
| Mental health including substance use | 85 (22.9) |
| Injury prevention | 62 (16.7) |
| Infectious disease | 58 (15.6) |
| Family health or reproductive health | 54 (14.6) |
| Environmental health | 44 (11.9) |
| Reproductive health | 29 (7.8) |
| Emergency preparedness or response | 25 (6.7) |
| Dental health | 17 (4.6) |
| School or child health | 11 (3.0) |
| Hospital care | 6 (1.6) |
| Other or no response | 9 (2.4) |
Most respondents reported that the proposed electronic evidence system would assist them in their roles extremely (186/371, 50.1%), very much (141/371, 38.0%), or moderately (37/371, 9.9%). Less than 2% of respondents indicated that an electronic evidence system would help only slightly (3/371, 0.8%) or not at all (3/371, 0.8%) with the work they do, and 0.3% (1/371) of participants did not answer. Participants’ preferences for community health issues and local contextual data are shown in. Interest in risk data, namely the prevalence and incidence of disease, was high, along with demographic characteristics. To a lesser degree, respondents wanted system functionality to compare their local population with other regions.
| Comparison | Respondents, n (%) |
| --- | --- |
| Local to regional | 283 (76.5) |
| To smaller subdivisions | 283 (76.5) |
| To larger regions | 255 (68.9) |
A summary of preferences for the types of research evidence is shown in. Best practice guidelines, systematic reviews or meta-analyses, and practice-based evidence elicited more favorable responses than quantitative or qualitative single studies. Related specifically to interventions, most respondents wanted information about the magnitude of effect and study quality. The required human and financial resources to deliver the intervention and heterogeneity of effects were selected less frequently.
| Evidence preference | Respondents, n (%) |
| --- | --- |
| **Types of research evidence** | |
| Best practice guidelines | 323 (93.1) |
| Systematic reviews or meta-analyses | 312 (89.9) |
| Practice-based evidence | 305 (87.9) |
| **Information about interventions** | |
| Magnitude of effect | 316 (91.1) |
| Quality of study | 315 (90.8) |
| Required human resources | 271 (78.1) |
| Required financial resources | 261 (75.2) |
| Heterogeneity in effect | 230 (66.3) |
Preferences for information about the public health resources required are presented in. The need for information about human resources, including the type and intensity of staff training, the training needed to sustain a program, and the number of staff required, was frequently selected, more so than staff discipline. With respect to financial resources, a preference for cost-effectiveness was most commonly identified, followed by cost. Information on cost-utility and economic modeling was selected less frequently.
| Resource information | Respondents, n (%) |
| --- | --- |
| **Human resources information** | |
| Type and intensity of training | 295 (85.0) |
| Type of training to sustain program | 277 (79.8) |
| Number of staff required | 273 (78.7) |
| Discipline of staff | 235 (67.7) |
| **Financial resources information** | |
| Economic modelling data | 124 (35.7) |
When comparing preferences across the 3 decision-making levels (ie, frontline staff, program management, and senior management), a few notable differences were found (). Respondents in program or project management were more likely than frontline public health or community providers to indicate a need for demographic data and for information on heterogeneity in effect. No other differences were statistically significant.
| | Frontline public health or community providers, n/n (%) | Program or project management, n/n (%) | Senior management or administration, n/n (%) |
| --- | --- | --- | --- |
| **What data would you want to be included in such a system?** | | | |
| Risk factors | 71/73 (97) | 52/55 (94) | 24/25 (96) |
| Demographics | 64/73 (88)a | 54/55 (98)a | 25/25 (100) |
| **What would you like to compare your local population with?** | | | |
| Compare your local region with a similar region in size | 53/73 (73) | 43/55 (78) | 21/25 (84) |
| Compare subregions within your local regions | 59/73 (81) | 41/55 (75) | 18/25 (72) |
| Compare your local region with a larger region in size | 50/73 (68) | 33/55 (60) | 19/25 (76) |
| **For data related to risk factors and diseases, which data would you want to be included in the system?** | | | |
| Prevalence | 69/73 (95) | 52/55 (95) | 25/25 (100) |
| Incidence | 68/73 (93) | 50/55 (91) | 25/25 (100) |
| **For data related to demographics, which data would you want to be included in the system?** | | | |
| Age | 71/73 (97) | 54/55 (98) | 25/25 (100) |
| Sex | 68/73 (93) | 53/55 (96) | 25/25 (100) |
| Income | 66/73 (90) | 53/55 (96) | 25/25 (100) |
| Education | 68/73 (93) | 47/55 (85) | 22/25 (88) |
| Ethnicity | 62/73 (85) | 51/55 (93) | 23/25 (92) |
| **For research evidence about an intervention, what information would you want to be included?b** | | | |
| Magnitude of effect | 57/67 (85) | 50/53 (94) | 22/24 (92) |
| Quality of study | 55/67 (82) | 48/53 (91) | 22/24 (92) |
| Required human resources | 55/67 (82) | 42/53 (79) | 18/24 (75) |
| Required financial resources | 52/67 (78) | 39/53 (74) | 18/24 (75) |
| Heterogeneity in effect | 36/67 (54)c | 40/53 (75)c | 15/24 (62) |
| **Which of the following research evidence options would you want to be made available?b** | | | |
| Best practice guidelines | 64/67 (95) | 50/53 (94) | 23/24 (96) |
| Systematic reviews or meta-analyses | 53/67 (79) | 45/53 (85) | 19/24 (79) |
| Practice-based evidence (program evaluations) | 56/67 (84) | 51/53 (96) | 20/24 (83) |
| Qualitative | 27/67 (40) | 21/53 (40) | 6/24 (25) |
| Quantitative | 24/67 (36) | 18/53 (34) | 7/24 (29) |
| **For human resources, which information would you want available from the evidence?b** | | | |
| Type and intensity of training required to be competent to deliver interventions or programs | 23/67 (96) | 49/53 (92) | 19/24 (79) |
| Type of training required to sustain program | 19/67 (79) | 41/53 (77) | 21/24 (87) |
| Number of staff required to implement the program | 20/67 (83) | 47/53 (89) | 20/24 (83) |
| Discipline of required staff | 6/67 (25)a | 40/53 (75) | 19/24 (79)a |
| **For financial resources, which information would you want available?b** | | | |
| Cost-effectiveness | 61/67 (91) | 47/53 (89) | 21/24 (87) |
| Cost | 47/67 (70) | 41/53 (77) | 19/24 (79) |
| Cost-utility | 28/67 (42) | 29/53 (55) | 12/24 (50) |
| Economic modeling data | 23/67 (34) | 20/53 (38) | 10/24 (42) |
aIndicates statistically significant difference (P=.04).
bSome participants only provided partial answers to the survey; thus, the sample sizes differ across questions.
cIndicates statistically significant difference (P=.045).
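The categorical comparisons in the table above follow the decision rule stated in the Methods: a Pearson chi-square test, with the Fisher exact test when expected cell sizes fall below 5. As an illustrative sketch (not the authors' analysis), the footnoted "heterogeneity in effect" comparison between frontline providers (36/67) and program management (40/53) can be set up as a 2×2 table in SciPy; the exact P value obtained will depend on the continuity correction used and may differ slightly from the reported value.

```python
import numpy as np
from scipy import stats

# 2x2 table: rows = frontline vs program or project management,
# columns = selected vs did not select "heterogeneity in effect"
table = np.array([[36, 67 - 36],
                  [40, 53 - 40]])

# Expected counts under independence, to apply the cell-size rule
expected = stats.contingency.expected_freq(table)

if (expected < 5).any():
    # Small expected cells: Fisher exact test
    odds_ratio, p = stats.fisher_exact(table)
else:
    # Pearson chi-square (Yates continuity correction is the
    # SciPy default for 2x2 tables)
    chi2, p, dof, _ = stats.chi2_contingency(table)

print(f"P = {p:.3f}")
```

With these counts, all expected cells exceed 5, so the chi-square branch runs and the difference is significant at the .05 level, consistent with footnote c.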
Qualitative data from open-ended questions identified several specific needs, concerns, and suggestions for an electronic evidence system. Echoing the preferences for cross-jurisdictional comparisons found in the quantitative results, respondents identified the ability to compare indicators across geographic areas, the inclusion of equity indicators and epidemiologic data, and the use of geographic information systems as other specific requests. Health equity indicators, such as the determinants of health, were seen as important in identifying and describing vulnerable populations. One respondent stated:
...generally, any data that might link to poverty measures, immigration status, housing situation (e.g., housed, homeless), recipient of childcare subsidy, recipient of social assistance etc.
A major theme that emerged with respect to the type of research evidence to be included was the usefulness of research beyond what is typically considered public health interventions, such as organizational interventions and interventions from the fields of education, social services, and law. Regardless of the type of research, there was a strong desire for all evidence to be critically appraised and be presented alongside summaries or statements to help interpret the evidence, as illustrated in the following quote:
...while I would be open to including all kinds of research, I would want them to be graded, to ensure that one could assess the quality of the evidence.
Similarly, participants also emphasized the need for practice-based evidence that provides contextual information on the outcomes of interventions and implementation. This included evidence on the context in which an intervention was implemented, adoption of the intervention, and considerations on how to deliver and sustain it in the community. This is reflected in a respondent’s comment:
[I] need a way to analyze context where an intervention is used. For example, if previously similar interventions had been tried in an area or subpopulation there may already be a delivery system or key partnerships in place, and there may also be a learning effect from previous work that is beneficial to achieving results with a “new” intervention.
To support the need for contextual and implementation data, respondents also specifically mentioned the need for qualitative and mixed-methods research and needs assessments conducted within other communities or organizations.
Related to resources and tools for practice, a need for theories, methods, or frameworks to support the adaptation or implementation of a program in respondents’ communities was identified. Some respondents mentioned specific frameworks, such as the Reach, Effectiveness, Adoption, Implementation, and Maintenance framework, whereas others had general suggestions for evaluation or implementation frameworks. There were also requests for tools to support practice, such as the Applicability and Transferability Tool, which supports public health planners’ use of evidence to support appropriate programming for the community, or survey question templates.
In addition to the specific needs for an electronic system, a number of potential concerns or barriers emerged. Concerns were related to either the electronic system itself or the ability to adopt a system within public health organizations. Concerns about keeping a system up to date stemmed from the understanding that evidence is created at rapid rates and new data are constantly being collected. For such a system to be useful, data would need to be current. Sustainability of the system beyond its initial creation was seen as a critical element for successful implementation, with some participants citing concerns if the system were to be funded by a research grant. An understanding of plans for long-term upkeep and sustainability may be a requirement for individual users or organizations to invest time in learning how to use the system.
The potential for duplication of existing resources was another concern related to such a system, with respondents citing specific databases and systems that already exist and questioning how these would complement or conflict with any new system. One respondent captured this sentiment, stating that:
...these systems are difficult to set up AND keep up to date. In addition, other similar systems (except for intervention data) already exist and this may add to the confusion for users (which data is THE official data?) Why do we observe differences between two systems for same indicator? Etc.
Related to the ability of individuals and organizations to adopt and implement the system, major themes about usability and costs emerged. The cost of the proposed system was seen as a key potential barrier, with questions about who would pay for it arising frequently. Second, the ability of a system to work with existing information technology infrastructure, such as outdated or restrictive computer systems and limited or slow internet connectivity, was raised as a concern. Beyond the initial barriers of cost and access, an organization’s ability to adopt the use of a system in its regular workflow was reported to depend on the ability of individual staff to use the system adequately, which requires buy-in not only from the individual employee but also from senior-level management. Finally, concerns about data privacy and maintenance of confidentiality were also expressed.
A number of suggestions for success emerged from the qualitative data. The most frequently mentioned requirement to facilitate use of the system was transparency of the methodology used, including the criteria for selecting evidence for inclusion, the methods used to evaluate and synthesize evidence, and the overall quality of the evidence included. One respondent stated that they “...would need a very detailed ‘methods’ section of this system to be able to be confident in it.” Sufficient staff training was also suggested to support the use of the proposed system.
Finally, respondents requested specific functions or system formatting elements, such as the ability to make graphs, print or export data, and retrieve contact information of data sharers on the system.
The purpose of this study is to understand the preferences of Canadian public health professionals for an electronic evidence system. The results indicate that there is a perceived need for an electronic evidence system; however, certain considerations related to the type of information included and how it would be presented must be addressed for such a system to be adopted and used effectively for public health decision-making.
Preferences for all 3 types of evidence (community health issues and local context, research evidence, and public health resources) were generally high. This aligns with previous research showing that public health professionals value different sources of evidence. An important consideration to emerge from both the quantitative and qualitative data was the need to understand the quality of the evidence included within the system. Participants suggested that the evidence included in an electronic evidence system should be preappraised and include a statement of interpretation along with a description of the methodology used to appraise the evidence. Using the best available evidence is a critical component of evidence-informed decision-making [ ]. Critical appraisal requires knowledge and skill development through training, as well as time to appraise evidence on a continual basis. Respondents recognized the need for evidence to be appraised but wanted a system to do this for them; the same was true for interpreting evidence appropriately, with recognition that users may have varying levels of skill in understanding evidence. These findings are in line with previous literature showing that time, knowledge, and skills in appraising different types of evidence are barriers to evidence-informed decision-making in public health [ , ]. A qualitative study involving public health decision makers found that clear implication statements from the evidence facilitated uptake of this knowledge in practice and decision-making processes [ ]. Including evidence that has been preappraised and accompanied by interpretation statements, possibly through AI approaches within an electronic evidence system, may make it easier for users to understand and use the evidence, effectively overcoming some challenges to the evidence-informed decision-making process.
The need for information to examine and address the determinants of health and health equity came through strongly in this study. This is not surprising, given previous literature suggesting that equity information is commonly lacking in scientific publications. A 2016 scoping review of population health interventions found that most studies included minimal contextual information on the target population and intervention setting [ ]. This contextual information is important for effective decision-making, as it is necessary to apply evidence appropriately in different settings. Furthermore, concerns have been raised about the potential of AI to "amplify inequities in society" because of inherent biases in data sets and programming at a large scale [ ]. Although AI is useful for identifying trends, some AI methods, such as machine learning, may lack the ability to explain why a pattern occurs [ ]. An understanding of contextual indicators such as the social determinants of health can improve the adoption and sustainability of public health investments and potentially limit biases embedded within an electronic evidence system [ , , ].
A key concern that emerged from the qualitative data was avoiding duplication of existing resources, some of which were already in use within respondents' organizations. For example, in Canada, the Canadian Best Practices Portal captures intervention evidence on effective health promotion and chronic disease prevention, although it is no longer updated [ ]; Open Data shares surveillance evidence nationally along with its uses in practice [ ]; Statistics Canada provides access to census-based population data [ ]; and Health Evidence provides quality assessments of systematic reviews of public health interventions [ ]. However, these are independent platforms that search for and synthesize data; they do not integrate different types of evidence, such as local context and public health resources, requiring users to search multiple platforms [ ]. There have been calls to action from experts in public health and health informatics for pan-Canadian collaborative efforts to facilitate access to databases across the country [ ]. Until then, any new electronic evidence system should explore partnerships with relevant existing platforms and mechanisms to avoid duplicating resources and efforts.
Barriers to the use of an electronic evidence system identified in this survey are similar to those found in a previous systematic review of barriers to public health data sharing [ ]. In that review, the authors identified 6 main categories of barriers: technical, motivational, economic, political, legal, and ethical [ ]. Regarding technical barriers, concerns about integrating a new electronic evidence system within an organization's existing information technology infrastructure emerged from the qualitative data [ ]. Economic barriers related to initial and ongoing financial costs were also raised [ ]. Some participants in this study expressed concerns about legal barriers, such as data privacy and confidentiality. Both technical and economic barriers illustrate the need for greater organizational capacity development [ ]. A previous review suggested allocating 5%-10% of program funds to data collection, monitoring, evaluation, and operational research, while recognizing that larger systems change must occur simultaneously to build sustainable funding mechanisms [ ]. Future research is needed to understand how best to implement such a system in a way that overcomes known technological barriers, such as interoperability and cost.
In our survey, motivational, political, and ethical barriers were not raised; however, the survey did not specifically seek feedback on these factors. Although motivational barriers, which limit data sharing at an individual or organizational level, were not explicitly mentioned, some respondents suggested possible ways to overcome one component of this barrier: disagreements over data use [ ]. Respondents suggested providing the contact information of the researchers who shared the data or including a networking component in the electronic evidence system to facilitate discussion about the data, the implementation of possible interventions, and the successes or failures of interventions in different contexts. The ability to hold discussions between the data donor and the researcher using the data may increase trust between both parties as well as the transparency and reliability of the platform. As noted in the 2014 review, the 6 categories of barriers interact in complex ways, which must be addressed through a comprehensive approach to ensure the usability of a potential electronic evidence system [ ].
An additional barrier identified in the qualitative responses was the need for ongoing staff training to use the system. Although AI has the potential to compile, process, synthesize, and analyze patterns at rapid rates and to improve efficiency in the use of evidence, blind reliance on its outputs risks misrepresenting variables or groups of people, as those outputs depend on data collection methods and evidence inputs [ , , ]. Experts recommend AI-specific training if users are to appropriately address concerns about equity and systemic biases in electronic evidence systems [ , , ]. This highlights an important consideration for future implementation of such a system. One potential avenue for training is web-based learning modules, which were found to be effective for public health professionals in one study [ ]. Web-based modules that provide training on how to make the most of the system, together with other features suggested by respondents, including videos, webinars with the system's creators, and social networking features to connect with other users and researchers, may support public health professionals in using the system efficiently and help overcome the aforementioned barriers.
There are some limitations to this study that should be considered when interpreting the findings. First, we did not collect individual demographic data or years of experience working in public health, limiting the extent to which we can characterize the individuals who took part in the survey. Because participants could select all options that applied for organization type, role, area of public health, and practice discipline, our analysis was limited in its ability to compare differences in preferences across these categories. Although the survey was disseminated through 2 large Canadian listservs to recruit public health professionals, there was no qualifying question to confirm that the preferences that emerged were solely those of public health professionals in Canada. Second, respondents ranged across roles, areas, and disciplines, and the results may not be generalizable to all Canadian public health professionals, as those who participated may have had prior awareness of, or an interest in, electronic evidence systems or evidence-informed decision-making. Finally, because the survey posed questions about a hypothetical electronic evidence system without considering feasibility, responses may have been skewed positively, with respondents more favorably indicating a need for all items listed [ ].
Public health professionals and organizations face many hurdles, including structural changes, lack of funding and time, and exponential increases in new evidence. However, there is broad agreement that the proposed electronic evidence system would make evidence-informed decision-making more accessible. On the basis of our findings, public health professionals see value in an electronic evidence system that combines local contextual evidence, research and intervention studies, and public health resources and tools. Our findings also highlight a number of elements that should be considered to ensure usability and facilitate trust in such a system, including quality appraisals, interpretations of evidence, and transparent methods and funding models. Such a system may support professionals in evidence-informed decision-making, thereby enabling the Canadian public health system to be more effective in an environment of limited investment.
This work was supported by a Canadian Institutes of Health Research project grant (application 397883) awarded to DLB and MD. At the time of the study, DLB held a position as the Canadian Institutes of Health Research Chair in Applied Public Health Research and SENS was supported by a postdoctoral fellowship from the Canadian Institutes of Health Research. Data collection, analysis, and content of this study were not influenced by the funding received from the Canadian Institutes of Health Research.
Conflicts of Interest
Categorized needs assessment questionnaire items and answer options. PDF File (Adobe PDF File), 43 KB
- Structural profile of public health in Canada. National Collaborating Centre for Healthy Public Policy. 2018. URL: http://www.ncchpp.ca/710/Structural_Profile_of_Public_Health_in_Canada.ccnpps [accessed 2021-05-05]
- Fedeli V. 2019 Ontario Budget: Protecting What Matters Most. Government of Ontario: Ministry of Finance; 2019. URL: https://budget.ontario.ca/pdf/2019/2019-ontario-budget-en.pdf [accessed 2021-05-05]
- Guyon A, Perreault R. Public health systems under attack in Canada: evidence on public health system performance challenges arbitrary reform. Can J Public Health 2016 Oct 20;107(3):326-329 [FREE Full text] [CrossRef] [Medline]
- Alberta Health Services performance review. Government of Alberta. 2019. URL: https://open.alberta.ca/publications/alberta-health-services-performance-review-summary-report [accessed 2021-07-30]
- Himmelstein DU, Woolhandler S. Public health’s falling share of US health spending. Am J Public Health 2016 Jan;106(1):56-57. [CrossRef]
- PHE annual report and accounts: 2018 to 2019. Public Health England. 2019. URL: https://www.gov.uk/government/publications/phe-annual-report-and-accounts-2018-to-2019 [accessed 2021-07-30]
- Maani N, Galea S. COVID-19 and underinvestment in the public health infrastructure of the United States. Milbank Q 2020 Jun;98(2):250-259 [FREE Full text] [CrossRef] [Medline]
- Murray CJ, Alamro NM, Hwang H, Lee U. Digital public health and COVID-19. Lancet Public Health 2020 Sep;5(9):469-470. [CrossRef]
- Brownson RC, Fielding JE, Green LW. Building capacity for evidence-based public health: reconciling the pulls of practice and the push of research. Annu Rev Public Health 2017 Nov 20;39:27-53. [CrossRef] [Medline]
- Kohatsu N, Robinson J, Torner J. Evidence-based public health: an evolving concept. Am J Prev Med 2004 Dec;27(5):417-421 [FREE Full text] [CrossRef]
- A model for evidence-informed decision making in public health. National Collaborating Centre for Methods and Tools. 2009. URL: https://www.nccmt.ca/uploads/media/media/0001/01/d9f5cec8637db62f8edda6a6a2551b293a053ede.pdf [accessed 2021-05-05]
- Yost J, Dobbins M, Traynor R, DeCorby K, Workentine S, Greco L. Tools to support evidence-informed public health decision making. BMC Public Health 2014 Jul 18;14(1):728 [FREE Full text] [CrossRef] [Medline]
- Culyer AJ, Lomas J. Deliberative processes and evidence-informed decision making in healthcare: do they work and how might we know? Evid Policy 2006 Aug 01;2(3):357-371. [CrossRef]
- Ontario public health standards: requirements for programs, services, and accountability. Ministry of Health and Long-Term Care, Government of Ontario. 2018. URL: https://www.regionofwaterloo.ca/en/regional-government/resources/Documents/Ontario-Public-Health-Standards-OPHS.pdf [accessed 2021-05-05]
- Nova Scotia public health standards 2011-2016. Government of Nova Scotia. 2016. URL: https://novascotia.ca/dhw/publichealth/documents/Public_Health_Standards_EN.pdf [accessed 2021-07-30]
- Public health economics. Ministry of Health, British Columbia. 2016 Aug 18. URL: https://www.health.gov.bc.ca/library/publications/year/2017/BC-guiding-framework-for-public-health-2017-update.pdf [accessed 2021-05-05]
- The 10 essential public health services. Centers for Disease Control and Prevention. 2020. URL: https://www.cdc.gov/publichealthgateway/publichealthservices/essentialhealthservices.html [accessed 2021-07-30]
- Shaban-Nejad A, Lavigne M, Okhmatovskaia A, Buckeridge D. PopHR: a knowledge-based platform to support integration, analysis, and visualization of population health data. Ann N Y Acad Sci 2017 Jan;1387(1):44-53. [CrossRef] [Medline]
- Shoveller J, Viehbeck S, Di Ruggiero E, Greyson D, Thomson K, Knight R. A critical examination of representations of context within research on population health interventions. Critical Public Health 2015 Dec 22;26(5):487-500. [CrossRef]
- Martin W, Higgins JW, Pauly B, MacDonald M. "Layers of translation" - evidence literacy in public health practice: a qualitative secondary analysis. BMC Public Health 2017 Oct 11;17(1):803 [FREE Full text] [CrossRef] [Medline]
- Groseclose SL, Buckeridge DL. Public health surveillance systems: recent advances in their use and evaluation. Annu Rev Public Health 2017 Mar 20;38(1):57-79. [CrossRef] [Medline]
- Dobbins M, Buckeridge D. Precision public health: dream or reality? Can Commun Dis Rep 2020 Jun 04;46(6):160 [FREE Full text] [CrossRef] [Medline]
- Dolley S. Big data's role in precision public health. Front Public Health 2018 Mar 7;6:68 [FREE Full text] [CrossRef] [Medline]
- Buckeridge DL. Precision, equity, and public health and epidemiology informatics - a scoping review. Yearb Med Inform 2020 Aug 21;29(1):226-230. [CrossRef] [Medline]
- De Spiegeleire S, Maas M, Sweijs T. Artificial Intelligence and the Future of Defense: Strategic Implications for Small- and Medium-Sized Force Providers. The Netherlands: The Hague Centre for Strategic Studies; 2017.
- Russell S, Norvig P, Davis E. Artificial Intelligence: A Modern Approach. London, United Kingdom: Pearson; 2010:1-1152.
- Cresswell K, Callaghan M, Khan S, Sheikh Z, Mozaffar H, Sheikh A. Investigating the use of data-driven artificial intelligence in computerised decision support systems for health and social care: a systematic review. Health Informatics J 2020 Sep 22;26(3):2138-2147 [FREE Full text] [CrossRef] [Medline]
- Triantafyllidis AK, Tsanas A. Applications of machine learning in real-life digital health interventions: review of the literature. J Med Internet Res 2019 Apr 05;21(4):e12286 [FREE Full text] [CrossRef] [Medline]
- Application of artificial intelligence approaches to tackle public health challenges - workshop report. Canadian Institutes of Health Research. 2018. URL: https://cihr-irsc.gc.ca/e/51018.html [accessed 2021-07-30]
- Morgenstern JD, Rosella LC, Daley MJ, Goel V, Schünemann HJ, Piggott T. "AI's gonna have an impact on everything in society, so it has to have an impact on public health": a fundamental qualitative descriptive study of the implications of artificial intelligence for public health. BMC Public Health 2021 Jan 06;21(1):40 [FREE Full text] [CrossRef] [Medline]
- Canadian best practices portal. Government of Canada. 2014. URL: https://cbpp-pcpe.phac-aspc.gc.ca/# [accessed 2021-05-05]
- Helping public health use best evidence in practice since 2005. Health Evidence. URL: https://www.healthevidence.org/ [accessed 2021-05-05]
- Open data. Government of Canada. URL: https://open.canada.ca/en/open-data [accessed 2021-05-05]
- Data. Statistics Canada. URL: https://www150.statcan.gc.ca/n1/en/type/data?MM=1 [accessed 2021-05-05]
- HealthLandscape. URL: https://healthlandscape.org/ [accessed 2021-05-05]
- CDC Wonder. Centers for Disease Control and Prevention. URL: https://wonder.cdc.gov/ [accessed 2021-05-05]
- Behavioral risk factor surveillance system. Centers for Disease Control and Prevention. URL: https://www.cdc.gov/brfss/index.html [accessed 2021-05-05]
- Database of promoting health effectiveness reviews (DoPHER). EPPI-Centre. URL: http://eppi.ioe.ac.uk/webdatabases4/Intro.aspx?ID=9 [accessed 2021-05-05]
- National Center for Health Statistics. Centers for Disease Control and Prevention. 2021. URL: https://www.cdc.gov/nchs/ [accessed 2021-05-05]
- Peirson L, Ciliska D, Dobbins M, Mowat D. Building capacity for evidence informed decision making in public health: a case study of organizational change. BMC Public Health 2012 Mar 20;12(1):137 [FREE Full text] [CrossRef] [Medline]
- Dobbins M, Traynor RL, Workentine S, Yousefi-Nooraie R, Yost J. Impact of an organization-wide knowledge translation strategy to support evidence-informed public health decision making. BMC Public Health 2018 Dec 29;18(1):1412 [FREE Full text] [CrossRef] [Medline]
- Azungah T. Qualitative research: deductive and inductive approaches to data analysis. Qual Res J 2018 Oct 31;18(4):383-400. [CrossRef]
- Buffet C, Ciliska D, Thomas H. Can I use this evidence in my program decision? Assessing applicability and transferability of evidence. National Collaborating Centre for Methods and Tools. 2007. URL: https://www.nccmt.ca/tools/assessing-applicability-and-transferability-of-evidence [accessed 2021-05-05]
- Dobbins M, Jack S, Thomas H, Kothari A. Public health decision-makers' informational needs and preferences for receiving research evidence. Worldviews Evid Based Nurs 2007 Sep;4(3):156-163. [CrossRef] [Medline]
- Panch T, Mattie H, Atun R. Artificial intelligence and algorithmic bias: implications for health systems. J Glob Health 2019 Dec;9(2):010318 [FREE Full text] [CrossRef] [Medline]
- AI for public health equity - workshop report. Canadian Institutes of Health Research. 2019. URL: https://cihr-irsc.gc.ca/e/51425.html [accessed 2021-05-05]
- van Panhuis WG, Paul P, Emerson C, Grefenstette J, Wilder R, Herbst AJ, et al. A systematic review of barriers to data sharing in public health. BMC Public Health 2014;14:1144 [FREE Full text] [CrossRef] [Medline]
- Panch T, Pearson-Stuttard J, Greaves F, Atun R. Artificial intelligence: opportunities and risks for public health. Lancet Digit Health 2019 May;1(1):13-14. [CrossRef]
- Kenefick HW, Ravid S, MacVarish K, Tsoi J, Weill K, Faye E, et al. On your time: online training for the public health workforce. Health Promot Pract 2014 Mar 27;15(1 Suppl):48-55. [CrossRef] [Medline]
- Lavrakas P. Encyclopedia of Survey Research Methods. Thousand Oaks, California, United States: SAGE Publications; 2008.
AI: artificial intelligence
NCCMT: National Collaborating Centre for Methods and Tools
Edited by T Sanchez; submitted 14.12.20; peer-reviewed by M Bruno, A Fernandes, R Boumans; comments to author 31.03.21; revised version received 26.05.21; accepted 02.06.21; published 07.09.21
Copyright
©Bandna Dhaliwal, Sarah E Neil-Sztramko, Nikita Boston-Fisher, David L Buckeridge, Maureen Dobbins. Originally published in JMIR Public Health and Surveillance (https://publichealth.jmir.org), 07.09.2021.
This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Public Health and Surveillance, is properly cited. The complete bibliographic information, a link to the original publication on https://publichealth.jmir.org, as well as this copyright and license information must be included.