
DNP Program: Evidence-Based Practice Resources at UMFK

Copied with permission from Ximena Chrisagis at Research Guides, Wright State University

Brief published guides

Critical Appraisal Tools

Tools for several different study types are available (including systematic reviews):

Tools that emphasize a particular type of study:

Main Evaluation/Synthesis Table

This table should list all the sources that comprise your body of evidence and indicate, for each source: Author (Year), Conceptual Framework, Design and/or Method, Sample and/or Setting, Major Variables Studied (and Their Definitions), Measurement, Data Analysis, Findings, and Appraisal (Worth to Practice).

Be concise: abbreviate heavily, and include a legend for the abbreviations at the bottom of the table.

Additional Synthesis Tables (for easy visual summaries)

If accessing off campus, you will need to authenticate with your University email username and password or be logged into your UMFK Portal.

Other Methods of Assessing the Strength of the Evidence

These methods are often used with systematic reviews.

Acronyms:

  • GRADE - Grading of Recommendations Assessment, Development and Evaluation
  • CERQual - Confidence in the Evidence from Reviews of Qualitative Research
  • ConQual - Confidence in Qualitative research
  • EPC - Evidence-based Practice Center
  • AHRQ - Agency for Healthcare Research and Quality

Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA)

  • PRISMA aims to help authors improve the reporting of systematic reviews and meta-analyses.
  • PRISMA may also be useful for critical appraisal of published systematic reviews, although it is not a quality assessment instrument.

Check out these short videos from Towson University's Health Professions Librarian Carrie Price for more information about PRISMA:


Journal Credibility Questions in Evidence Synthesis

Journals that follow deceptive or fraudulent publication practices are often found alongside more legitimate publications, whether in Google Scholar or established library databases.

This study indicated that evidence synthesizers tend to advocate for having a process in place to identify predatory journals. They supported assessing the quality of the data published in such journals and incorporating only good-quality studies into the synthesis.

When synthesizing evidence, you should have a clear protocol for how you will identify potentially predatory journals, how you will evaluate quality, and to what extent you will include studies from those publications.

Avoiding Retracted Articles in Evidence Synthesis

Retraction aims to correct both fraud and unintended errors in the scientific record. Including retracted literature in evidence syntheses can cause patient harm. However, retraction notices are not always clearly publicized, so retracted articles are often still cited after retraction.

This opinion piece calls for more careful scrutiny by authors, editors, and peer reviewers to avoid citing retracted articles.

About this Guide:


This guide, originally created by Ximena Chrisagis at Wright State University, "was inspired by Jodi Jameson's University of Toledo DNP Evidence-Based Practice Project: Library Resources Guide.  It includes video, journal, and web content developed by many external authors/creators, as well as some content by Ximena Chrisagis. Some of the external content is reused under a Creative Commons Share Alike Attribution license.  When external content was not available under a Creative Commons license, it is linked rather than copied or extracted, and it is attributed to the content creator in the link name or description."   

Sofia Birden of the University of Maine at Fort Kent (UMFK) made only minor changes to the original guide, updating its format, campus-specific language, and resource links.