
Awareness and use of systematic literature reviews and meta-analyses by ministerial policy analysts.

Proponents of evidence-informed policy-making see systematic literature reviews as important tools (Dobbins et al. 2001; Gough and Elbourne 2002; Lavis 2009; Perrier et al. 2011). In their view, systematic reviews can summarize complex evidence and provide an exhaustive and comprehensible survey of the evidence while minimizing bias. Similarly, meta-analyses, when based on a systematic review of existing studies, are presented as comprehensive statistical summaries of evidence that can pool various datasets, and as powerful means of reducing errors (Ringquist 2013; Cumming 2014). From a policy standpoint, both convey evidence that should be highly relevant to policy-makers and policy analysts. However, little empirical evidence exists on the extent to which these specific research designs are known, let alone how they are used, by such policy actors.

This article begins by reviewing the logic for greater uptake of systematic literature reviews and meta-analyses in policy analysis. We then present empirical evidence on the awareness and use of these research designs by policy analysts employed in government departments in Quebec, Canada. The data, while collected in 2008, are unique because the use of systematic reviews and meta-analyses has received little attention from empirical researchers. To the best of our knowledge, no data collection effort has targeted this question. The policy environment in which the data collection took place may have changed to a certain extent, but our analysis shows relevant patterns that provide insight into a central issue of public administration. Moreover, our findings and analysis can serve as a benchmark for future empirical assessment, which would be highly pertinent given the rapidly evolving resources and tools to support evidence-informed policy-making in general, and policy use of systematic literature reviews in particular.

This research note has four objectives: 1) to present systematic literature review and meta-analysis methodologies and argue for their relevance in policy-making; 2) to describe the extent to which these research designs are used by ministerial policy analysts; 3) to define the socio-professional profile of those policy analysts; and 4) to discuss some implications of our analyses for policy analysts and researchers.

Systematic literature reviews and meta-analyses as policy tools

Policy analysts are involved in a wide range of activities, including documenting phenomena of interest and policy options (Colebatch 2005, 2006; Wellstead et al. 2009; Howlett and Wellstead 2012). Analysts will typically search for information, data, and evidence found inside or outside their organizations in order to depict a policy context as accurately as possible. With this analysis done, their task turns to providing advice (Lomas and Brown 2009). This could involve summarizing stakeholders' opinions, estimating potential costs, identifying foreseeable obstacles in implementation, gathering evidence on the effectiveness of policy interventions in other jurisdictions, etc. With respect to the latter, multiple strategies can be used (for example, asking key actors, searching various sources, etc.), but here we consider consulting an up-to-date existing systematic review of the literature or, if resources, skills and time allow, conducting or commissioning one.

Systematic literature reviews (Higgins and Green 2011: section 1.2) can be defined as:

[...] attempts to identify, appraise and synthesize all the empirical evidence that meets prespecified eligibility criteria to answer a given research question. Researchers conducting systematic reviews use explicit methods aimed at minimizing bias, in order to produce more reliable findings that can be used to inform decision-making.

In contrast to traditional literature reviews, greater attention is placed on rigorously screening all relevant studies and on evaluating their validity by critically appraising them with formal tools (Lapointe et al. 2015). The result, if conducted properly, is a synthesis of the findings of all available studies in which individual studies' attributes have been evaluated to control for quality. In sum, a systematic literature review provides a state-of-the-art view of all available research evidence on a specific question. Such reviews need not focus only on the effectiveness of policy interventions, as is sometimes implied, but can also cover other matters which interest policy-makers (for example, problem definition, cost-effectiveness, and stakeholder opinions). (1) By carefully identifying, selecting, reviewing and synthesizing research studies, systematic reviews are thought to limit risks of bias.

A meta-analysis is sometimes, though not necessarily, included as part of a systematic review. It normally follows the same principles as a systematic literature review, but pools quantitative results. With information--usually effect sizes--extracted from published articles, or from the datasets themselves, an average or summary effect is typically estimated, representing the averaged effect of the individual studies included in the meta-analysis.

Glass (1976) defines meta-analysis as "the statistical analysis of a large collection of analysis results from individual studies for the purpose of integrating the findings. It connotes a rigorous alternative to the casual, narrative discussions of research studies which typify our attempts to make sense of the rapidly expanding research literature". (2) Meta-analyses are essentially a collection of statistical methods for calculating an overall effect from individual studies whose estimates are (ideally) on a common scale and are weighted to account for their precision (that is, more precise estimates are given more weight in the analysis). Several statistical models can be used to conduct meta-analyses (for example, frequentist or Bayesian, fixed-effect or random-effect; Egger et al. 1997), and the result can express an absolute (for example, mean difference) or relative (for example, relative risk) measure of effect. While the estimation of a specific effect is the main goal of a meta-analysis, Tufanaru et al. (2015: 200) remind us of the variety of possible objectives:

[...] reviewers should be aware that there are different, legitimate objectives for a meta-analysis: to improve statistical power to detect a treatment effect, provide the closest estimate of an unknown real effect, identify subsets of studies (subgroups) associated with a beneficial effect, and explore if there are differences in the size or direction of the treatment effect associated with study-specific variables.

By aggregating evidence with sufficiently similar methodological attributes (for example, research question, research design, and data analysis), meta-analyses are an important tool for synthesizing quantitative evidence and for exploring and, ideally, explaining variations in effects across studies.
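To make the weighting logic concrete, the following is a minimal sketch in R (the software used for our own analyses) of a fixed-effect, inverse-variance meta-analysis. The effect sizes and standard errors are hypothetical, and in practice dedicated packages such as metafor automate these computations.

# Hypothetical effect sizes (d) and standard errors (se) from five
# imaginary studies; not data from any review cited in this note.
d  <- c(0.12, 0.05, 0.20, -0.03, 0.10)
se <- c(0.06, 0.04, 0.09, 0.05, 0.07)

w      <- 1 / se^2                        # inverse-variance weights: precise studies count more
pooled <- sum(w * d) / sum(w)             # fixed-effect summary estimate
se_p   <- sqrt(1 / sum(w))                # standard error of the pooled estimate
ci     <- pooled + c(-1.96, 1.96) * se_p  # 95% confidence interval

round(c(estimate = pooled, lower = ci[1], upper = ci[2]), 3)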

For example, suppose one is interested in educational organizations in low- and middle-income countries and wants to assess the effects of decentralizing decision-making processes. One can consult a recent systematic review (Carr-Hill et al. 2015) in which school-based decision-making was evaluated for various outcomes (for example, attrition, equality of access, increased enrolment, test scores, and psychosocial and non-cognitive skills). The study's inclusion criteria allowed for three main types of research designs: experimental designs, quasi-experiments and before-and-after studies. An initial search yielded 2,817 article titles; after the selection criteria were applied, 26 eligible studies were identified for inclusion in the meta-analysis. The results show a negative but statistically non-significant effect on drop-out rates (standardized mean difference (SMD) = -0.07; 95% confidence interval (CI) = -0.14, 0.01). Despite significant heterogeneity across measurements, a positive effect on test scores was found, including a positive and significant effect on math test scores (SMD = 0.08; 95% CI = 0.02, 0.13). Similarly, a positive and significant effect was found by pooling 17 estimates of students' language test scores (SMD = 0.07; 95% CI = 0.02, 0.13). The review provides reasonable evidence that some form of decentralization might be beneficial in some contexts. An interesting conclusion reached by the authors relates to the heterogeneity of the effects: decentralization is unlikely to work in highly disadvantaged communities. Thus, the authors' conclusions and policy recommendations tend to support school-based decision-making, but with the caveat that minimal levels of community education, socio-economic status and participation are required for it to be effective.
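For a reader wishing to check the interpretation of such pooled estimates, a short R sketch can tabulate the three estimates quoted above and flag which 95% confidence intervals exclude zero (the usual criterion for statistical significance at the 5% level); the figures are exactly those reported in the text.

# Pooled estimates from Carr-Hill et al. (2015), as quoted above
effects <- data.frame(
  outcome = c("drop-out rates", "math test scores", "language test scores"),
  smd     = c(-0.07, 0.08, 0.07),
  lower   = c(-0.14, 0.02, 0.02),
  upper   = c( 0.01, 0.13, 0.13)
)

# An effect is significant at the 5% level when its 95% CI excludes zero
effects$significant <- effects$lower > 0 | effects$upper < 0
effects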

A second example might involve understanding the effects of mobile technology (for example, mobile phones and tablets) on health-care service delivery. An analyst could turn to a recent systematic review and meta-analysis (Free et al. 2013). Without reporting the data and conclusions in detail, one finds that 42 trials were identified in the review and that the pooled random-effects estimates (based on 10 studies included in the meta-analysis) demonstrate modest benefits for outcomes such as appointment attendance. More specifically, results statistically significant at the 95% level were reported for the latter outcome, with an estimated relative risk of 1.06. For various reasons beyond the statistical significance of the results, the authors conclude that such interventions are suitable for implementation.

Systematic literature reviews and meta-analyses face limitations. First, they are not always readily available for analysts to consider, as they do not cover every possible policy issue. This was a considerable problem years ago and sometimes remains an obstacle today, but it is becoming less of a problem as more and more systematic literature reviews are produced in a wide variety of policy sectors. One can find systematic literature reviews in criminology and justice (Campbell Collaboration), education (Education Endowment Foundation), education and health (EPPI-Centre), health (The Cochrane Library, Health Systems Evidence, Health Evidence, National Institute for Health and Care Excellence, Joanna Briggs Institute), environment (Collaboration for Environmental Evidence) and development in low- and middle-income countries (3ie International Initiative for Impact Evaluation). (3) One can also look for products derived from systematic literature reviews, such as rapid reviews (4), if relevant (Harnan et al. 2015).

Second, systematic literature reviews are generally designed as knowledge syntheses of intervention studies (for example, of randomized controlled trials) and might appear limited in the scope of research designs they include. However, both systematic reviews and meta-analyses need not be limited to a single type of study. (5) Third, the quality of a systematic review or meta-analysis rests on several assumptions: the quality of the included individual data, the comparability of the measures pooled, the exhaustiveness of the literature searches, etc. This last assumption is particularly important because studies that are unpublished or unavailable would not normally be considered, and the effects found in the included studies might therefore be over-represented. This is publication bias, which is widely recognized as a potential threat to internal validity. (6) A similar phenomenon can be signaled in the policy arena, where relevant information and evidence can be found in sources not normally captured by published academic reviews (for example, government white papers or think tank reports).

The evidence-informed policy movement has presented systematic literature reviews and their statistical counterpart, meta-analyses, as rigorous tools that policy actors can profitably use for quick surveys of the existing literature. However, this does not mean that policy-making is or should be entirely determined or exclusively guided by such evidence. Policy-making is complex: evidence competes with multiple streams of inputs (for example, values, economic resources, and political considerations) and can come into play at different levels of decision-making (Oliver et al. 2014b). That said, in contexts where research findings can provide helpful policy inputs, systematic literature reviews and meta-analyses might be the best means of providing a valid viewpoint on the state of the literature on a precise question.

With these considerations in mind, we sought to discover whether these research designs were known to ministerial policy analysts and, if so, to what extent they were used.

Methods and data

Our data came from a telephone survey conducted in 2008 (between 26 September and 25 November) among professional employees of the Quebec public administration. (7) Multiple professional groups of civil servants were included, and the study sought to capture their analytical and advisory tasks. Specifically, the targeted individuals were analysts working in various departments; managers, directors and administrative support staff were excluded. Respondents were selected through a stratified random sampling procedure, and 62.48% (n = 1,617) of those invited completed the survey. The survey was designed to capture the professional habits of civil servants related to knowledge mobilization. It included several measures relative to their use of various evidence sources and types, as well as measures of their professional context and individual attributes.

For systematic reviews, respondents were asked two questions (the second conditional on a positive answer to the first): "Have you ever heard of systematic literature reviews--in English (8) it is called a systematic review of literature? It is a sophisticated research protocol which differs from traditional literature reviews." If the respondent indicated awareness of that type of design, a second question followed: "In the past 12 months, have you ever read one, in part or in totality?" Regarding meta-analyses, the respondents were first asked: "Have you ever heard of meta-analyses--in English, it is called a meta-analysis?" Again, if the answer was positive, the respondent was asked: "In the past 12 months, have you ever read one, in part or in totality?" These measurements were then recoded as two three-level factors: 1) never heard; 2) aware, without use; and 3) aware, with use. The socio-professional profile of analysts was captured by recording their gender (that is, male/female), their type of training (that is, undergraduate, professional postgraduate, or research postgraduate), their reported access to electronic databases from their workspace (yes/no/don't know), and their ministry of employment over the past 12 months (18 departments).
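As an illustration of this recoding step, the following R sketch collapses the two conditional survey questions into one three-level factor. The variable names (heard_sr, read_sr) and the toy data are hypothetical, as the actual survey variables are not reproduced in this note.

# Toy data standing in for the survey responses
df <- data.frame(heard_sr = c("no", "yes", "yes"),
                 read_sr  = c(NA, "no", "yes"),
                 stringsAsFactors = FALSE)

# Collapse the two conditional questions into a single three-level factor
recode_awareness <- function(heard, read) {
  out <- ifelse(heard == "no", "never heard",
                ifelse(read == "yes", "aware with use", "aware, without use"))
  factor(out, levels = c("never heard", "aware, without use", "aware with use"))
}

df$sr_use <- recode_awareness(df$heard_sr, df$read_sr)
table(df$sr_use)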

The results detailed below present basic descriptive and bivariate statistics (for example, chi-square tests of independence and p-values) for the variables described above. Data manipulation and analysis were conducted in R (R Core Team 2015) and the graphical outputs were produced using ggplot2 (Wickham 2009).
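A minimal sketch of this analytical workflow follows. The "never heard" and "aware with use" counts for systematic reviews by type of training are those reported in the Results section below; the middle category is back-calculated from the reported percentages and is therefore approximate, and the plotting code only approximates how the published figures were actually constructed.

library(ggplot2)

# Awareness of systematic reviews by type of training; middle row is
# back-calculated from the reported percentages (approximate)
tab <- matrix(c(555, 138, 236,
                149,  42, 115,
                127,  69, 177),
              nrow = 3, byrow = TRUE,
              dimnames = list(awareness = c("never heard", "aware, no use", "aware with use"),
                              training  = c("undergrad", "prof. postgrad", "res. postgrad")))

chisq.test(tab)  # chi-square test of independence (training x awareness/use)

# Proportions with normal-approximation 95% confidence intervals,
# in the spirit of the error bars described in note 12
dfp <- as.data.frame(as.table(tab))
dfp$total <- ave(dfp$Freq, dfp$training, FUN = sum)
dfp$p  <- dfp$Freq / dfp$total
dfp$se <- sqrt(dfp$p * (1 - dfp$p) / dfp$total)

ggplot(dfp, aes(training, p, fill = awareness)) +
  geom_col(position = position_dodge(width = 0.9)) +
  geom_errorbar(aes(ymin = p - 1.96 * se, ymax = p + 1.96 * se),
                position = position_dodge(width = 0.9), width = 0.2) +
  labs(y = "proportion")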

Results

Overall, we found that 57.8% (929) (9) of policy analysts reported having never heard of systematic literature reviews, 19% (306) said they had heard of them but had not consulted one in the past 12 months, and 23.2% (373) had consulted at least one in the previous 12 months. For meta-analyses, 69.4% (1,113) of respondents had never heard of them as a research design, 19.1% (307) acknowledged knowing what meta-analyses were without having read one in the past 12 months, and only 11.5% (184) had read one or more meta-analyses in the past year.

It is reasonable to suppose that the type of training (undergraduate, professional postgraduate or research postgraduate) might affect the likelihood of using research (Oliver et al. 2014a). We found that the proportion of respondents who had never heard of systematic literature reviews decreases by roughly ten percentage points as one moves from undergraduate training (66.8%, 555) to professional postgraduate training (55.4%, 138) and to research postgraduate training (44.7%, 236). Conversely, rates of consultation increased significantly with research postgraduate training (33.5%, 177) or professional postgraduate training (27.7%, 69), compared to undergraduate training (15.3%, 127). In other words, the rate of consultation of systematic reviews more than doubles when comparing undergraduate to research postgraduate training (see Figure 1).

A similar pattern emerges when looking at the distribution of responses for meta-analyses. In this case, much higher proportions reported having never heard of meta-analyses (80.3% (666) among undergraduates, 58.2% (145) among professional postgraduates and 57.4% (302) among research postgraduates). Interestingly, a slightly higher proportion of professional postgraduates than research postgraduates reported having consulted a meta-analysis over the previous year (19.3% (48) and 17.9% (94), respectively).

Access to academic research can be considered a basic pre-condition for consulting and using evidence (Oliver et al. 2014a). Without access to research evidence, it is unlikely that policy analysts would consult and apply research evidence in their professional work. However, levels of access to evidence in policy organizational settings have not been thoroughly investigated in relation to evidence use. At the outset, we note that only 48.7% (785) of policy analysts in our dataset reported having access to electronic bibliographic databases that would allow them to download research articles from their workspace. (10) Figure 2 shows that, of the analysts reporting no access to bibliographic databases, 63.3% (522) said they had never heard of systematic reviews, 19.3% (159) reported having heard of them, and 17.5% (144) claimed to have heard of systematic reviews and to have consulted at least one in the past 12 months.

[FIGURE 1 OMITTED]

This last proportion is interesting: it suggests that some respondents accessed systematic reviews through means other than their department. Of those reporting access to electronic databases, 51.9% (406) said they had never heard of systematic reviews, 18.8% (147) claimed to know what systematic reviews were, and 29.3% (229) reported not only knowing what they were but also having consulted at least one over the past 12 months. Rates of awareness and consultation of systematic literature reviews were thus about 12 percentage points higher among respondents who reported access to electronic bibliographic databases from their workstations.

[FIGURE 2 OMITTED]

Figure 2 also shows that 73.3% (605) of policy analysts reporting no access to bibliographic databases also indicated that they had never heard of meta-analyses, 18.9% (156) without access said they had heard of meta-analyses but had not consulted one over the past 12 months, and only 7.7% (64) reported having no access to bibliographic databases but were nonetheless able to consult at least one in the previous 12 months. Turning to policy analysts reporting access to electronic databases from their workstation, we found that 65.2% (507) had never heard of meta-analyses, 19.4% (151) had heard of them but had not consulted one over the past 12 months, and 15.4% (120) both knew what meta-analyses were and had consulted at least one in the past 12 months. In sum, the proportion of policy analysts having consulted at least one meta-analysis roughly doubled when comparing those who reported access to electronic databases with those who did not.

The surveyed analysts worked in different departments, whose differing priorities, organizational resources and procedures make substantial differences across policy sectors likely. Figure 3 shows that awareness and use varied by department. (11) For instance, Health & Social Services reported an aggregate level of awareness and consultation of systematic reviews of 53% (53), Sustainable Development, Environment & Parks reported 28.2% (49) and Employment & Social Solidarity reported 26.4% (29). Conversely, some departments reported considerably lower rates of awareness and consultation: Transportation reported 16.7% (10), analysts in Finance 7.1% (6), and analysts in Justice only 4.9% (4).

[FIGURE 3 OMITTED]

In the case of meta-analyses, we found a different pattern among the most important users. As with systematic reviews, Health & Social Services reported the highest rate of consultation (38.4%, 38); Education, Leisure & Sports came second (19.6%, 7) and Immigration & Cultural Communities third (18%, 9). Figure 4 shows that substantial proportions of analysts reported having never heard of meta-analyses.

[FIGURE 4 OMITTED]

For instance, Justice (87.8%, 72), Finance (80.9%, 68) and Municipal Affairs, Regions & Land Management (80.3%, 171) had the highest proportions in that respect.

Discussion

This research note draws on findings from a broader survey to explore the extent to which systematic literature reviews and meta-analyses were known to ministerial policy analysts and to establish the socio-professional profile of users and non-users of this type of research evidence. The survey data collected from various departments suggest that both types of design were generally not well known by policy analysts: most reported having never heard of systematic literature reviews or meta-analyses (57.8% and 69.4%, respectively). Nonetheless, individuals trained in postgraduate programs (as opposed to undergraduate ones) reported considerably higher rates of awareness and consultation of both designs. Access to electronic bibliographic databases from the workstation was associated with higher rates of awareness and consultation of systematic reviews (17.5% (144) without access versus 29.3% (229) with access) and of meta-analyses (7.7% (64) without access versus 15.4% (120) with access). Closely related to this variation in organizational resources, we observed important variation across policy sectors: some departments reported considerably higher proportions of awareness of systematic reviews (for example, Health & Social Services, 53% (53); Sustainable Development, Environment & Parks, 28.2% (49); Employment & Social Solidarity, 26.4% (29)). A similar pattern held for awareness and consultation of meta-analyses, where some departments reported considerably higher rates than others (for example, Health & Social Services, 38.4% (38); Education, Leisure & Sports, 19.6% (7); Immigration & Cultural Communities, 18% (9)).

This overview of the distribution of awareness and use of systematic reviews and meta-analyses among analysts allows us to raise a few points. First, despite variations in the rates of consultation of both systematic reviews and meta-analyses (across policy sectors, types of training, and levels of access to evidence), the rates of consultation were generally low (especially in the case of meta-analyses). Perhaps most importantly, large segments of the sample indicated having never heard of either research design. Methodological considerations around social desirability suggest that even more analysts may, in truth, not have heard of them, which would be consistent with trends previously observed for knowledge utilization in policy settings (Landry, Lamari and Amara 2003).

Our findings also need to be contextualized, since today's knowledge environment differs from the one in which the data were collected. There has been a considerable expansion in the availability of systematic reviews and in the range of policy sectors and research questions they cover, along with more repositories (for example, 3ie, Campbell Collaboration Library). Our analysis suggests that awareness and use of systematic literature reviews and meta-analyses lag behind the availability of those forms of evidence. To confirm this intuition, comparable evidence needs to be collected to see whether policy practices are keeping pace with the generation of systematic literature reviews and the development of resources and tools to support their policy use.

Recommendations for policy analysts and researchers

Knowing of systematic reviews or meta-analyses does not imply a proper understanding of the methods and procedures used in those research protocols, nor does it imply a correct interpretation of their conclusions. What our descriptive inferences allow us to do, though, is, first, to signal the gap in knowledge and use of valuable research evidence syntheses and, second, to point to a greater need to engage policy analysts and policy-makers in relevant sectors in developing research skills for identifying, obtaining and interpreting systematic reviews and meta-analyses.

We observed a difference between research postgraduate, professional postgraduate and undergraduate training, with the former two associated with higher rates of awareness and use than the latter. Either professional or research-focused postgraduate training might be a potentially important factor, since rates of consultation roughly double when moving from undergraduate to postgraduate training. On the other hand, this distribution of educational backgrounds might be related to the requirements of the position and might explain why some analysts use more (or less) research evidence. In other words, the relation we observe between educational background and research use might reflect the professional and informational requirements of analysts' positions. Although overall levels of awareness and consultation were generally low, it makes sense to suggest that greater educational resources be devoted to communicating the relevance of systematic reviews and meta-analyses, and to providing analysts with sufficient cognitive tools and critical appraisal skills to understand and use them effectively. This should be seen as an investment in future policy analytical capacity (Howlett 2009; Newman, Cherney and Head 2017). Given the rapid developments in the knowledge environment, continuing professional development should complement formal education to keep analysts abreast of developments in evidence-informed practices. Such educational interventions have proven effective in other professional environments (for example, among nursing professionals: Ruzafa-Martinez et al. 2015; Leach, Hofmeyer and Bobridge 2016) and should be effective for policy analysts (Yarber et al. 2015; Langer, Tripney and Gough 2016).

The rates of awareness and consultation of both research designs partly hinge on policy analysts and policy-makers as potential users, and on their professional contexts. As found in other surveys of policy analysts, differences across levels of government (for example, Howlett and Newman 2010 on Canada) imply differences in staff composition, analysts' educational backgrounds and policy priorities. Striking differences emerge across countries as well (for example, Australia, Canada and the Czech Republic: Evans and Vesely 2014; Vesely, Wellstead and Evans 2014). Organizational and administrative resources are crucial factors in supporting evidence-informed policy practices; different professional contexts (such as the level of government, whether users sit in central agencies or operating departments, and the extent of administrative resources) will thus likely yield varying levels of capacity for evidence-informed policy practices. While the proportions of use of systematic reviews and meta-analyses in our data were low, it would be unfair to expect all departments to have equal awareness and consultation rates. Many policy questions cannot be illuminated by a systematic review or meta-analysis. However, it would also be unfair to presume that systematic reviews and meta-analyses are a priori irrelevant for a given policy sector. As discussed earlier, increasingly few policy sectors are unrepresented in systematic review repositories, such as the Campbell Collaboration. One should instead consider the type of policy question at hand, identify the most appropriate research design for obtaining evidence, and then consult a review of that evidence. Following that route will often lead to systematic reviews and/or meta-analyses. What can hinder the consultation of this type of evidence is a lack of knowledge and awareness about which policy questions systematic reviews and meta-analyses can (and cannot) answer, and that is true for all policy sectors.

Importantly, our results suggest that having access to research evidence is a basic requirement for the effective implementation of evidence-informed practices. This echoes a mainstream conclusion of the literature: a recent systematic review reported that the availability of, and access to, research is a top-cited barrier to the use of evidence by policy-makers (Oliver et al. 2014a). A natural recommendation is to improve and facilitate access to evidence in policy-analysis settings. To target such investments efficiently, we need to determine precisely the extent to which analysts have access to relevant bibliographic databases, journals, etc. Very little evidence summarizes the available resources in most policy sectors (but see, on Canadian health ministries, Leon et al. 2013). Furthermore, according to Tricco et al.'s (2016) review of commonly cited barriers to and facilitators of the use of systematic reviews, attitudes towards systematic reviews (that is, their perceived relevance) are important. While the authors suggest that failing to perceive the relevance of systematic reviews for informing policy-making might partly explain the lack of use, they also suggest that systematic reviews can be seen as a threat to policy-makers' decisional autonomy if one assumes that systematic reviews must dictate policy (an erroneous view of evidence-informed policy-making and/or a misconception of the function of systematic reviews and meta-analyses). To be sure, systematic literature reviews and meta-analyses can be powerful pieces of evidence: when conducted properly, their conclusions can be considered robust. The routinized consultation and consideration of systematic reviews and meta-analyses in policy-making must proceed in a context where users are willing to engage with evidence that may support the opposite of their policy preferences. In this view, evidence syntheses can narrow the room for manoeuvre and make it harder to ignore what they say about, for instance, the effectiveness of a given policy. As demonstrated by Weiss and Bucuvalas (1980), a priori policy preferences can colour one's view of the methods and validity of evidence. One should not accept evidence syntheses only when their conclusions coincide with one's policy preferences (or reject them when they do not), nor should one be critical of a synthesis' methodology only when it seems to interfere with a policy plan.

Recent findings suggest that greater engagement and collaboration between researchers and policy-makers can support the production of systematic reviews and meta-analyses that not only match policy analysts' needs and expectations but also facilitate timely and efficient communication of results (Oliver and Dickson 2016; Tricco et al. 2016). We need more managers and policy analysts engaged in defining research questions and designs, and more researchers involved in communicating and translating evidence for policy analysts. For instance, Australia's promising Policy Liaison Initiative (Brennan et al. 2016) fosters collaboration and the use of systematic reviews in health policy by developing a community of practice, offering skill-building workshops, and contributing to the production of formatted summaries. Developing a take-home message and a one-page summary, and providing results in plain language, promise to facilitate understanding and use. Similarly, devoting extra effort to detailing the policy relevance and applications of findings can facilitate communication to non-specialists.

Finally, many investments have been made over the past years in organizational resources for evidence-informed policy-making. It would be timely and relevant to update these empirical findings to re-assess policy analysts' awareness and understanding of systematic literature reviews and meta-analyses. Our findings provide a benchmark for evaluating the results of such a study.

Notes

(1) For details on the methodological implications of the wide ranging questions that can be answered by systematic literature reviews, see Gough, Thomas and Oliver (2012).

(2) As a side note, Glass was already noticing the rapid expansion of the literature in 1976.

(3) For further advice on how to find systematic literature reviews, see for example Lavis et al. (2009).

(4) While a formal definition is lacking in the literature, Khangura et al. (2012) refer to rapid reviews as synthesis tools addressing the need of policy-makers, decision-makers, stakeholders and other knowledge users to access a broad scope of evidence quickly. Rapid reviews have a shorter timeframe (less than 5 weeks) than systematic literature reviews (6 months to 2 years). See also Thomas, Newman and Oliver (2013).

(5) The possible combination of multiple types of studies has been discussed elsewhere (Shrier et al. 2007).

(6) For example, see Dwan et al. (2008). For a discussion of other limitations and their effects on outcomes, see Pereira and Ioannidis (2011).

(7) For further methodological detail, refer to Ouimet et al. (2010).

(8) Note that the survey was administered in French--as Quebec is a French-speaking province. As technical terms such as this are frequently known in English, the question provided the respondent with its English equivalent.

(9) Numbers in parentheses are the number of observations.

(10) Note: The variable has been recoded so as to include individuals who did not know whether they had access to those databases in the "No Access" category.

(11) In both Figures 3 and 4, shortened department names are used: Horizontal=Horizontal Departments (that is, Executive Council; Treasury Board), Municipal=Municipal Affairs, Regions & Land Management, Agriculture=Agriculture, Fisheries & Food, Culture=Culture, Communications and Women's Condition, Sustainable Development=Sustainable Development, Environment & Parks, Economic Development=Economic Development, Innovation & Exportation, Education=Education, Leisure & Sport, Employment=Employment & Social Solidarity, Finance=Finance, Immigration=Immigration & Cultural Communities, Justice=Justice, Natural Resources=Natural Resources & Wildlife, Public Security=Public Security, Health=Health & Social Services, Transportation=Transportation. Some of these departments have since changed names following elected governments' organizational restructuring over the years.

(12) As can be seen in the graph, each bar represents the proportion (point estimate) and the overprinted lines report the 95% confidence interval around the estimate. The same applies to the other figures in this article.

References

Brennan, S., M. Cumpston, M. Misso, S. McDonald, M. Murphy and S. Green. 2016. "Design and formative evaluation of the Policy Liaison Initiative: A long-term knowledge translation strategy to encourage and support the use of Cochrane systematic reviews for informing health policy." Evidence & Policy: A Journal of Research, Debate and Practice 12 (1): 25-52.

Carr-Hill, R., C. Rolleston, T. Pherali and R. Schendel. 2015. "The effects of school-based decision making on educational outcomes in low- and middle-income contexts: A systematic review." In 3ie Grantee Final Review. London: International Initiative for Impact Evaluation (3ie).

Colebatch, H. K. 2005. "Policy analysis, policy practice and political science." Australian Journal of Public Administration 64 (3): 14-23.

--. 2006. "What work makes policy?" Policy Sciences 39 (4): 309-21.

R Core Team. 2015. R: A Language and Environment for Statistical Computing. Vienna, Austria: R Foundation for Statistical Computing.

Cumming, G. 2013. Understanding the New Statistics: Effect Sizes, Confidence Intervals, and Meta-Analysis. London: Routledge.

--. 2014. "The new statistics: Why and how." Psychological Science 25 (1): 7-29.

Dobbins, M., R. Cockerill and J. Barnsley. 2001. "Factors affecting the utilization of systematic reviews." International Journal of Technology Assessment in Health Care 17 (2): 203-14.

Dwan, K., D.G. Altman, J.A. Arnaiz, J. Bloom, A.-W. Chan, E. Cronin, E. Decullier, P.J. Easterbrook, E. Von Elm, C. Gamble, D. Ghersi, J.P.A. Ioannidis, J. Simes and P.R. Williamson. 2008. "Systematic review of the empirical evidence of study publication bias and outcome reporting bias." PLoS One 3 (8): e3081.

Egger, M., G.D. Smith and A.N. Phillips. 1997. "Meta-analysis: Principles and procedures." BMJ: British Medical Journal 315 (7121): 1533.

Evans, B. and A. Vesely. 2014. "Contemporary policy work in subnational governments and NGOs: Comparing evidence from Australia, Canada and the Czech Republic." Policy and Society 33 (2): 77-87.

Free, C., G. Phillips, L. Watson, L. Galli, L. Felix, P. Edwards, V. Patel and A. Haines. 2013. "The effectiveness of mobile-health technologies to improve health care service delivery processes: A systematic review and meta-analysis." PLoS Medicine 10 (1): e1001363.

Glass, G.V. 1976. "Primary, secondary, and meta-analysis of research." Educational Researcher 5 (10): 3-8.

Gough, D. and D. Elbourne. 2002. "Systematic research synthesis to inform policy, practice and democratic debate." Social Policy & Society 1: 225-36.

Gough, D., J. Thomas and S. Oliver. 2012. "Clarifying differences between review designs and methods." Systematic Reviews 1 (1): 28.

Harnan, S.E., K. Cooper, S.L. Jones and E. Jones. 2015. "Pruning and prioritizing: A case study of a pragmatic method for managing a rapid systematic review with limited resources." Evidence & Policy: A Journal of Research, Debate and Practice 11 (4): 589-601.

Higgins, J.P.T. and S. Green. 2011. Cochrane Handbook for Systematic Reviews of Interventions. Version 5.1.0. Chichester: John Wiley & Sons Ltd.

Howlett, M. 2009. "Policy analytical capacity and evidence-based policy-making: Lessons from Canada." Canadian Public Administration 52 (2): 153-75.

Howlett, M. and J. Newman. 2010. "Policy analysis and policy work in federal systems: Policy advice and its contribution to evidence-based policy-making in multi-level governance systems." Policy and Society 29 (2): 123-36.

Howlett, M. and A.M. Wellstead. 2012. "Professional policy work in federal states: Institutional autonomy and Canadian policy analysis." Canadian Public Administration 55 (1): 53-68.

Khangura, S., K. Konnyu, R. Cushman, J. Grimshaw and D. Moher. 2012. "Evidence summaries: The evolution of a rapid review approach." Systematic Reviews 1: 10.

Landry, R., M. Lamari and N. Amara. 2003. "The extent and determinants of the utilization of university research in government agencies." Public Administration Review 63 (2): 192-205.

Langer, L., J. Tripney and D. Gough. 2016. The Science of Using Science: Researching the Use of Research Evidence in Decision-Making. London: EPPI-Centre, Social Science Research Unit, UCL Institute of Education, University College London.

Lapointe, L., M. Ouimet, M. Charbonneau and E.T. Beorofei. 2015. "Do Canadian university students in Political Science and Public Administration learn to perform critical appraisal?" Canadian Public Administration 58 (3): 487-503.

Lavis, J.N. 2009. "How can we support the use of systematic reviews in policymaking?" PLoS Medicine 6 (11): 1-6.

Lavis, J.N., A.D. Oxman, J.M. Grimshaw, M. Johansen, J.A. Boyko, S. Lewin and A. Fretheim. 2009. "SUPPORT tools for evidence-informed health policymaking (STP). 7. Finding systematic reviews." Health Research Policy and Systems 7 (Suppl 1): S7.

Leach, M.J., A. Hofmeyer and A. Bobridge. 2016. "The impact of research education on student nurse attitude, skill and uptake of evidence-based practice: A descriptive longitudinal survey." Journal of Clinical Nursing 25 (1-2): 194-203.

Leon, G., M. Ouimet, J.N. Lavis, J.M. Grimshaw and M.-P. Gagnon. 2013. "Assessing availability of scientific journals, databases, and health library services in Canadian Health Ministries: A cross-sectional study." Implementation Science 8 (1): 34.

Lomas, J. and A. D. Brown. 2009. "Research and advice giving: A functional view of evidence-informed policy advice in a Canadian ministry of health." Milbank Quarterly 87 (4): 903-26.

Newman, J., A. Cherney and B.W. Head. 2017. "Policy capacity and evidence-based policy in the public service." Public Management Review 19 (2): 157-74.

Oliver, K., S. Innvaer, T. Lorenc, J. Woodman and J. Thomas. 2014a. "A systematic review of barriers to and facilitators of the use of evidence by policymakers." BMC Health Services Research 14: 2.

Oliver, K., T. Lorenc and S. Innvaer. 2014b. "New directions in evidence-based policy research: A critical analysis of the literature." Health Research Policy and Systems 12 (1): 34.

Oliver, S. and K. Dickson. 2016. "Policy-relevant systematic reviews to strengthen health systems: Models and mechanisms to support their production." Evidence & Policy: A Journal of Research, Debate and Practice 12 (2): 235-59.

Ouimet, M., P.-O. Bedard, J. Turgeon, J.N. Lavis, F. Gelineau and C. Dallaire. 2010. "Correlates of consulting research evidence among policy analysts in government ministries: A cross-sectional survey." Evidence & Policy: A Journal of Research, Debate and Practice 6: 433-60.

Pereira, T.V. and J.P.A. Ioannidis. 2011. "Statistically significant meta-analyses of clinical trials have modest credibility and inflated effects." Journal of Clinical Epidemiology 64 (10): 1060-9.

Perrier, L., K. Mrklas, J.N. Lavis and S.E. Straus. 2011. "Interventions encouraging the use of systematic reviews by health policymakers and managers: A systematic review." Implementation Science 6: 43.

Ringquist, E. 2013. Meta-Analysis for Public Management and Policy. San Francisco: John Wiley & Sons Ltd.

Ruzafa-Martinez, M., L. Lopez-Iborra, D.A. Barranco and A.J. Ramos-Morcillo. 2015. "Effectiveness of an evidence-based practice (EBP) course on the EBP competence of undergraduate nursing students: A quasi-experimental study." Nurse Education Today 38: 82-87.

Shrier, I., J.-F. Boivin, R.J. Steele, R.W. Platt, A. Furlan, R. Kakuma, J. Brophy and M. Rossignol. 2007. "Should meta-analyses of interventions include observational studies in addition to randomized controlled trials? A critical examination of underlying principles." American Journal of Epidemiology 166 (10): 1203-9.

Thomas, J., M. Newman and S. Oliver. 2013. "Rapid evidence assessments of research to inform social policy: Taking stock and moving forward." Evidence & Policy: A Journal of Research, Debate and Practice 9 (1): 5-27.

Tricco, A.C., R. Cardoso, S.M. Thomas, S. Motiwala, S. Sullivan, M.R. Kealey, B. Hemmelgarn, M. Ouimet, M.P. Hillmer, L. Perrier, S. Shepperd and S.E. Straus. 2016. "Barriers and facilitators to uptake of systematic reviews by policy makers and health care managers: A scoping review." Implementation Science 11: 4.

Tufanaru, C., Z. Munn, M. Stephenson and E. Aromataris. 2015. "Fixed or random effects meta-analysis? Common methodological issues in systematic reviews of effectiveness." International Journal of Evidence-Based Healthcare 13 (3): 196-207.

Vesely, A., A. Wellstead and B. Evans. 2014. "Comparing sub-national policy workers in Canada and the Czech Republic: Who are they, what they do, and why it matters?" Policy and Society 33 (2): 103-15.

Weiss, C.H. and M.J. Bucuvalas. 1980. "Truth tests and utility tests: Decision-makers' frames of reference for social science research." American Sociological Review 45 (2): 302-13.

Wellstead, A.M., R.C. Stedman and E.A. Lindquist. 2009. "The nature of regional policy work in Canada's federal public service." Canadian Political Science Review 3 (1): 34-56.

Wickham, H. 2009. ggplot2: Elegant Graphics for Data Analysis. New York: Springer.

Yarber, L., C.A. Brownson, R.R. Jacob, E.A. Baker, E. Jones, C. Baumann, A.D. Deshpande, K.N. Gillespie, D.P. Scharff and R.C. Brownson. 2015. "Evaluating a train-the-trainer approach for improving capacity for evidence-based decision making in public health." BMC Health Services Research 15 (1): 547.

Pierre-Olivier Bedard is a Mitacs Canadian Science Policy Fellow, Ottawa, Ontario. Mathieu Ouimet is Professor, Department of Political Science, Universite Laval, Quebec, Quebec.

Caption: Figure 1. Awareness and Consultation of Systematic Reviews and Meta-Analyses, by Type of Training (12)

Caption: Figure 2. Awareness and Consultation of Systematic Reviews and Meta-Analyses, by Access to Electronic Databases

Caption: Figure 3. Awareness and Consultation of Systematic Reviews, by Department

Caption: Figure 4. Awareness and Consultation of Meta-Analyses, by Department