    Please use this identifier to cite or link to this item: http://ir.nhri.org.tw/handle/3990099045/6712


    Title: Compatibility of AGREE and clinical experts review in guideline appraisal
    Authors: Kuo, KN; Lo, HL; Chen, C
    Contributors: Division of Health Services and Preventive Medicine
    Abstract: BACKGROUND (INTRODUCTION): AGREE is the most widely accepted instrument for appraising the methodological quality of clinical practice guidelines (CPGs). Its six domains measure different aspects of CPG quality and may differ from the clinical expert perspective. LEARNING OBJECTIVES (TRAINING GOALS): 1. To compare the results and compatibility of CPG appraisal between AGREE measures and the clinical expert perspective. 2. To identify the inconsistent criteria in order to improve consensus between AGREE reviewers and clinical specialists. METHODS: We collected data from independent evaluations by AGREE reviewers and related clinical experts on 17 CPGs developed from 2007 to 2008. A "strongly recommended" rating was scored 3, "recommended with alteration" 2, and "not recommended" 1. Differences between AGREE and clinical expert scores were expressed as sensitivity, specificity, and positive and negative predictive values within and across AGREE domains. RESULTS: Nine of the 17 CPGs received similar recommendations from AGREE and clinical expert ratings. Four AGREE domains were particularly sensitive to the clinical expert perspective: stakeholder involvement (sensitivity 0.89, specificity 0.75, PPV 0.80, NPV 0.86), rigor of development (0.89, 1.0, 1.0, 0.89), clarity and presentation (0.78, 0.88, 0.88, 0.78), and editorial independence (0.78, 1.0, 1.0, 0.80). The results were unchanged when calculated from those four sensitive AGREE domains alone, omitting the other two. Regarding the consistency of items within each domain, most items under "rigor of development," "clarity and presentation," and "editorial independence" showed relatively high coherence, whereas consistency varied within the "stakeholder involvement" domain. DISCUSSION (CONCLUSION): Our findings indicate compatibility between AGREE and clinical expert appraisals of CPG quality, and predictability between the two. Improving reviewer training is crucial for strengthening the inconsistent domains of AGREE.
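    Note: The abstract does not include the underlying computation. The following is a minimal Python sketch (not from the paper) of how sensitivity, specificity, PPV, and NPV can be derived from paired AGREE-based and clinical-expert recommendations. It assumes, purely for illustration, that ratings are dichotomized with scores 2-3 ("recommended with alteration" or "strongly recommended") as positive and score 1 ("not recommended") as negative, and that the clinical expert rating serves as the reference standard; the function name and example ratings are hypothetical.

# Minimal sketch (not the authors' code): agreement metrics for paired
# guideline recommendations, treating the clinical expert rating as the
# reference standard and the AGREE-based rating as the "test".
# Assumption: scores >= 2 count as "recommended" (positive); score 1 as
# "not recommended" (negative). This dichotomization rule is illustrative.

def agreement_metrics(agree_scores, expert_scores, threshold=2):
    """Return sensitivity, specificity, PPV, and NPV for paired ratings."""
    tp = fp = fn = tn = 0
    for a, e in zip(agree_scores, expert_scores):
        agree_pos = a >= threshold    # AGREE review recommends the CPG
        expert_pos = e >= threshold   # clinical expert recommends the CPG
        if agree_pos and expert_pos:
            tp += 1
        elif agree_pos and not expert_pos:
            fp += 1
        elif not agree_pos and expert_pos:
            fn += 1
        else:
            tn += 1
    return {
        "sensitivity": tp / (tp + fn) if (tp + fn) else float("nan"),
        "specificity": tn / (tn + fp) if (tn + fp) else float("nan"),
        "ppv": tp / (tp + fp) if (tp + fp) else float("nan"),
        "npv": tn / (tn + fn) if (tn + fn) else float("nan"),
    }

if __name__ == "__main__":
    # Hypothetical ratings for a handful of CPGs, not data from the study.
    agree_ratings = [3, 2, 1, 3, 1, 2, 1]
    expert_ratings = [3, 1, 1, 2, 2, 3, 1]
    print(agreement_metrics(agree_ratings, expert_ratings))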
    Date: 2010-07
    Relation: Otolaryngology - Head and Neck Surgery. 2010 Jul;143(1, Suppl. 1):22.
    Link to: http://dx.doi.org/10.1016/j.otohns.2010.04.146
    JIF/Ranking 2023: http://gateway.webofknowledge.com/gateway/Gateway.cgi?GWVersion=2&SrcAuth=NHRI&SrcApp=NHRI_IR&KeyISSN=0194-5998&DestApp=IC2JCR
    Appears in Collections: [郭耿南(2003-2010)] Conference Papers / Conference Abstracts

    Files in This Item:

    File                 Size    Format
    SDO2012082229.pdf    80Kb    Adobe PDF


    All items in NHRI are protected by copyright, with all rights reserved.

