
Dabbs et al. Applied Cancer Research (2016) 36:3
DOI 10.1186/s41241-016-0002-8

RESEARCH ARTICLE Open Access

Quality assurance and patient safety protocols for breast and gynecologic pathology in an Academic Women's Hospital
David J. Dabbs*, Catherine T. Stoos and Abbie Mallon

Abstract
Background: Quality assurance and peer-review practices in surgical pathology have been well described in the
literature, but the majority of these reports apply to the realm of general surgical pathology. We focused on the
peer-review reporting system of a specialty women’s health pathology practice consisting exclusively of breast and
gynecologic pathology, with the specific aims of identifying diagnostic discrepancies that affected patient care.
Methods: The quality measures in this specialty practice are monitored, and the Medical Director reviews all
amended/corrected reports. Error types are qualitative, and are categorized according to impact on patient care. QA
data of all amended reports from 2012 to 2014 in breast and gynecologic pathology, as a measure of error type
and frequency, were reviewed.
Results: Of all specimens during this time period, 343 (0.54% of all reports) required amendment due to a QA
metric-discovered discrepancy. Breast specimens demonstrated a higher amendment rate than GYN specimens
(1.14% of breast specimens versus 0.27% of GYN specimens). The most common error type requiring an
amendment for both breast and GYN specimens was a type A, or Minor Disagreement (reports amended for
type A discrepancy: 78.7% of total; 81.9% of breast; 72.6% of GYN). Type B, or Moderate Disagreement
discrepancies, accounted for 21.3% of all amended cases (reports amended for type B discrepancy: 18.1% of
breast; 27.3% of GYN). Of all breast and GYN reports reviewed during the QA evaluation, there were no cases
categorized as type C, or Major Disagreements, which would significantly alter patient treatment.
Conclusion: When surgical pathology is practiced in a laboratory utilizing comprehensive quality assurance
protocols, major diagnostic interpretation errors are infrequent. Such practice minimizes error, maximizes patient
safety, and maximizes real-time educational opportunities for practicing pathologists.
Keywords: Quality, Patient Safety, Peer review, Breast pathology, Gynecologic pathology
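The rates quoted in the abstract are straightforward ratios of the counts reported in the Results and Table 4 (343 amendments out of 63,665 reports, split into 226 breast and 117 GYN). A quick sketch (Python; counts taken from the paper) reproduces them; note the paper occasionally truncates rather than rounds (e.g. 226/19,660 = 1.149…%, quoted as 1.14%).

```python
# Amendment counts reported in the paper (Table 4, 2012-2014).
totals = {"breast": 19_660, "gyn": 44_005}
amended = {
    "breast": {"A": 185, "B": 41, "C": 0},
    "gyn": {"A": 85, "B": 32, "C": 0},
}

for group, by_type in amended.items():
    n = sum(by_type.values())           # total amended reports for this group
    rate = 100 * n / totals[group]      # amendment rate (%)
    share_a = 100 * by_type["A"] / n    # type A share of amendments
    share_b = 100 * by_type["B"] / n    # type B share of amendments
    print(f"{group}: {n} amended, rate {rate:.2f}%, "
          f"A {share_a:.1f}%, B {share_b:.1f}%")

combined = sum(sum(t.values()) for t in amended.values())  # 226 + 117
overall_rate = 100 * combined / sum(totals.values())       # 343 / 63,665
print(f"combined: {combined} amended, rate {overall_rate:.2f}%")
```

This also shows the combined type B count is 73 (41 breast + 32 GYN), matching the 21.3% share quoted in the text (73/343 = 21.28…%).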

Background

Surgical pathologic diagnoses direct patient treatment, and therefore, correct diagnostic interpretation is essential for proper patient management. Recently published data regarding diagnostic disagreements among pathologists' evaluation of breast biopsy specimens have alarmed the public by reporting an almost 25% discordance rate among pathologists participating in the study, particularly in diagnoses of atypia [4]. Unfortunately, this study was misrepresentative of the true practice of pathology, and has generated misleading data created in a non-CLIA lab research environment.

As a result of this study, we chose to evaluate our own QA data to determine the frequency of diagnostic discrepancies (as measured by examination of amended reports) in breast and GYN pathology occurring at our institution. At our institution, an academic Women's Hospital, comprehensive quality assurance protocols are practiced to detect and remedy significant diagnostic error in an effort to maximize patient safety.

* Correspondence: ddabbs@upmc.edu
Magee-Womens Hospital of UPMC, Pittsburgh, PA 15213, USA

© The Author(s). 2016 Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0
International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and
reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to
the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver
(http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

Laboratory medicine is a highly structured field, of which the accuracy and safety has continuously been evaluated and regulated for the last several decades. To provide diagnostic information to other clinicians, pathologists utilize an abundance of diagnostic tools and consultation in forming a diagnostic judgment, such as access to the patient electronic medical record, access to radiographic images, submission of additional tissue levels, specialized immunohistochemical stains, access to prior related specimen slides, and, in some cases, submission of additional tissue. Importantly, after a diagnosis is rendered using these tools, re-evaluation of case material by various QA measurements frequently occurs. These QA strategies are employed by practicing laboratories, not only as a means of decreasing diagnostic error, but also to meet federal regulatory guidelines for accreditation. Practicing within this setting, diagnostic errors may still occur, but the rate of diagnostic errors which could result in major harm to a patient is low [2, 13, 17, 19, 25].

The goal of this study was to review our quality metrics to determine whether our QA protocols minimize serious events that might alter patient management.

Design and methods

In this study, we retrospectively assessed error frequency and severity occurring in breast and gynecologic (GYN) pathology specimens by reviewing QA data on intradepartmental reports requiring an amendment.

The quality assurance/peer review protocols practiced and monitored at our institution include 10% intradepartmental random case review, frozen section/permanent section diagnosis correlation, intradepartmental consensus conference review, review of cases presented at multidisciplinary tumor boards, double independent reads of all breast core biopsies and all new malignancies prior to sign out, review of prior biopsy materials concurrently with surgical resections, and real-time cytohistologic correlations. In addition, all cases that are sent to outside institutions by patient request have incoming reports that are reviewed by the Chief of service, and pathologists are required to issue addendum reports stating that their report was reviewed by an outside institution and the diagnosis was in agreement. If a diagnosis is not in agreement with an outside institution, the case is then reviewed by the Chief of service to adjudicate the diagnosis, with further external consultation if necessary.

The peer review processes are all monitored by the Chief of service, who also serves as the head of the departmental quality assurance committee. The Chief, in concert with the QA manager, initiates the production of an amended report if the criteria for an amended report are met as described below, and the Chief assigns an error severity based on the report defect and how it impacts patient care. Impact on patient care is determined individually for each case, taking into account the pathology report result, clinical information supplied, electronic medical record information, and clinician input.

Error severity in amended reports is designated as follows: A, Minor Disagreement, such as a spelling or formatting error which had no bearing on patient care; B, Moderate Disagreement, including defects/omissions in diagnoses that would not result in a change in patient care. Type B errors are deficits in the report that may include incorrect information, for example, errors of omission, missed lymphovascular invasion, or incorrect grading of a breast carcinoma. Such errors would not necessitate a change in patient management at our institution. Type C errors represent a major disagreement, including major diagnostic discrepancies that would be considered a serious event warranting a change in the treatment plan of a patient [Table 1]. Type C errors by definition are major report defects that would immediately impact patient care, such as misinterpretation of the biological nature of a tumor, benign versus malignant. If there is any doubt of error assignment as type B or C, the Chief of service consults the case with the treating physician to properly assign the error type. A class C amended report mandates reporting to the medical staff patient safety office within 24 h of error discovery according to Act 13, the Pennsylvania Medical Care Availability and Reduction of Error (MCARE) Act of 2002. The clinician/treating physician is immediately notified of the impending amendment, and the case is discussed with the treating physician to verify the degree of patient care impact. The MCARE Act mandates written notification to the patient within 7 days of discovery.

Pathology slides from amended reports are reviewed by QA committee members. There are three members of the QA committee, including the Chief, who meet quarterly to review details of all amended reports that are pre-reviewed and assigned by the Chief of service. The committee determines the final error category assigned to cases, based upon clinical information from medical records and the impact on treatment decisions of the amended report. Feedback is distributed to pathologists on error assignment, and they have the opportunity to review the case slides. If cases are deemed to have teaching value, they are reviewed as anonymous unknowns at consensus conference.

Table 1 Amendment error severity categories
Type A: Minor Disagreement - spelling errors, typographical error, formatting error
Type B: Moderate Disagreement - defects in diagnosis with no effect on patient care
Type C: Major Disagreement - major discrepancies in diagnosis that would affect patient treatment
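The amendment workflow described in the Methods (the Table 1 severities plus the MCARE obligations attached to type C) can be summarized as a small decision table. The following is a hypothetical sketch for illustration only; the function name and action strings are ours, not the institution's actual system.

```python
# Hypothetical sketch of the amendment triage described in the text.
# Severity labels follow Table 1; the extra steps for type C follow
# the Act 13 / MCARE requirements quoted in the Methods.

SEVERITY = {
    "A": "Minor Disagreement (spelling/typographical/formatting)",
    "B": "Moderate Disagreement (report defect, no change in management)",
    "C": "Major Disagreement (serious event, treatment plan would change)",
}

def required_actions(error_type: str) -> list[str]:
    """Steps triggered once an amended report is issued."""
    if error_type not in SEVERITY:
        raise ValueError(f"unknown error type: {error_type!r}")
    actions = [
        "issue amended report",
        "quarterly QA committee review of slides and final category",
    ]
    if error_type == "C":
        actions += [
            "report to medical staff patient safety office within 24 h (MCARE/Act 13)",
            "notify treating physician immediately and verify care impact",
            "written notification to the patient within 7 days of discovery",
        ]
    return actions
```

Per the text, doubtful B-versus-C assignments are resolved by the Chief of service in consultation with the treating physician before these steps are triggered.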

All breast and GYN surgical pathology reports amended for any reason from 2012 to 2014 were included in this study.

All surgical pathology reports have two parts: the gross description, which is dictated by pathologists and transcribed by secretarial staff, and the final diagnosis with microscopic description and correlative comments, which are dictated by pathologists and transcribed by secretaries. There are a limited number of templates that pathologists may use for specific types of reports, such as biomarkers for breast cancer. Templates are also typed by secretaries. The assignment of errors applies not only to the main diagnostic header, but also to supplementary studies such as immunohistochemistry results and prognostic/predictive biomarkers. If biomarker results are reported erroneously, an amended report must be issued and the error impact (B or C) is assigned in consultation with the treating physician.

Fifteen pathologists worked in the surgical pathology laboratory during this study period. The minimum number of years of experience post-training was 4 years, and the maximum, 34 years.

The QA process described here was initially introduced in the laboratory in mid-2011. Prior to 2011, second review pre-signout of benign breast core biopsies was not performed. While discrepancy rates were low prior to 2011, they were not quantified. The QA program did not require a significant learning curve, because the data being recorded by pathologists was part of the normal work reviews that pathologists performed. The learning curve was relatively flat, as witnessed by the three new clinical fellows who entered our fellowship program each year. They had no difficulty following the program and contributing to the data that was collected. Fellows typically achieved a plateau in the QA program within the first 3 months of practice.

The rate of error and frequency of error type for each specimen group was evaluated. The top five type B errors for breast and gynecologic cases are seen in Tables 2 and 3.

Table 2 Top Five Type B Breast Pathology Report Errors
1. "General Edit/Change in Final Diagnosis" (32%): examples include edits to ER/PR results and errors in the number of lymph nodes reported.
2. "Premature Signout Errors" (17%): examples include comments left in which were intended to be deleted, addition of a comment, final edits to the diagnosis, etc.
3/4. "Synoptic Template Errors" and "Changes to Diagnosis Following Additional Information" (15% each): specific examples of synoptic template changes include changes in TNM staging within the synoptic, as well as change in margin status.
5. "Specimen Part Edit" (10%): cases amended due to an incorrect part/specimen type listed in the final diagnosis.

Table 3 Top Five Type B Gynecologic Pathology Report Errors
1. "General Edit/Change in Final Diagnosis" (31%): examples include changing content of comments, diagnosis of whole parts omitted, a benign diagnosis change, and correcting measurements within the final diagnosis.
2. "Omission of Intraoperative Consult Results" (19%): cases in which the intraoperative consultation results were not listed on the final report/not dictated while grossing.
3. "Changes to Diagnosis Following Additional Information" (16%): examples include changes to the final diagnosis after external consultations and after additional clinical information.
4/5. "Premature Signout Errors" and "Synoptic Template Errors" (9% each): premature sign-out included additional descriptions and signing out a case before it was entirely completed; corrections to synoptic template data.

Results

In total, 63,665 breast and GYN specimen reports were created from 2012 to 2014. These specimens consisted of 44,005 GYN resection specimens and biopsies and 19,660 breast resection specimens and biopsies. Of all specimens during this time period, 343 (0.54% of all combined breast and GYN reports) required amendment due to a QA metric-discovered discrepancy. All amended reports were reviewed by the Chief of service and assigned an error designation. Breast specimens demonstrated a higher amendment rate than GYN specimens (1.14% of breast specimens versus 0.27% of GYN specimens). The most common error type requiring an amendment for both breast and GYN specimens was type A, or Minor Disagreement (reports amended for type A discrepancy: 78.7% of combined breast and GYN; 81.9% of breast; 72.6% of GYN). Type B, or Moderate Disagreement, discrepancies accounted for 21.3% of all amended cases combined (reports amended for type B discrepancy: 18.1% of breast; 27.3% of GYN). Of all breast and GYN reports reviewed during the QA evaluation, there were no amended cases which were categorized as type C, or Major Disagreements, which would significantly alter patient treatment [Table 4].

Discussion

Retrospective review of our combined QA data for GYN and breast specimen reports from 2012 to 2014 demonstrates a low diagnostic discrepancy rate (0.54%), with the most common reason for error being a Type A, or Minor Disagreement, which is a spelling or formatting error within the report. When a diagnostic error did occur, the effect on patient care was minimal. There were no instances of major diagnostic discrepancies. We

Table 4 Frequency of Amended Reports by Specimen and Error Type

Report Type (2012–2014)            Total Reports   Amended       Type A        Type B       Type C
Breast Specimen Reports            19,660          226 (1.1%)    185 (81.9%)   41 (18.1%)   0 (0%)
GYN Specimen Reports               44,005          117 (0.3%)    85 (72.6%)    32 (27.3%)   0 (0%)
Combined Breast and GYN Reports    63,665          343 (0.5%)    270 (78.7%)   73 (21.3%)   0 (0%)

credit this low discrepancy rate, in part, to the comprehensive QA measurements that are in place at our institution.

One may question the lack of serious events in this study as a weakness, citing a situation where, if everyone misinterprets a case, it would not be perceived as an error. While such a scenario is possible, it is extremely unlikely given the multi-faceted, comprehensive approach of the described peer review process. The peer review redundancies described herein provide a system of checks and balances that is difficult to circumvent. The redundancies appear to have prevented serious events.

Discrepancies in surgical pathology (as well as all other medical fields) exist, even when the utmost care is put into rendering a diagnosis. As the interpretation of a histologic specimen is "more subjective" than a standard clinical laboratory test, factors such as pathologists' experience, clinical information provided about a case, the use of ancillary studies, and others can play a role in the variation and accuracy of a diagnosis [13]. Diagnostic error has been extensively studied and categorized in various ways in the literature, and studies regarding discrepancies in surgical pathology reports demonstrate a range of error rates, with certain organ systems having an overall higher rate of disagreement than others, such as skin lesions, breast, bone and soft tissue, and others [16, 17, 25]. In 2014, the CAP published data on its Q-probes study from 2011, which prospectively examined any post-signout changes to surgical pathology reports from 73 institutions occurring over a 3-month time span to establish benchmarks for error rates in surgical pathology. Defects were classified using the error taxonomy suggested by Meier et al. [11]. In this study, 1,688 report defects were discovered out of the 360,218 reports reviewed, yielding an overall defect rate of 0.47% [25]. While over half of these report errors were classified as "other defects," which mainly included typographical or dictation errors, misinterpretation errors accounted for 14.6% of the overall report errors, and were found most commonly in skin and breast specimens [25]. More recently, a large literature review of 137 published articles regarding interpretive errors in surgical pathology and cytology conducted by the CAP demonstrated a median major discrepancy rate in surgical pathology of 6.3%, with significant error rates ranging from 0.1 to 10% [13]. The seemingly wide range of error rates in surgical pathology reports can be attributed to the variation among institutions in the determination of error rate and classification of errors, as well as the specimen type, and the construction and accuracy of the study itself [5, 13, 15].

To assist in error reduction and report accuracy, and to maintain institutional accreditation, pathologists employ and practice auditing systems through various QA measures, which have been evaluated in numerous published studies regarding QA in surgical pathology. In order to operate, modern-day laboratories must adhere to a QA program compliant with federal regulation, in particular the Clinical Laboratory Improvement Amendments of 1988 (CLIA'88), under the direction of a physician laboratory director. Under CLIA'88, which established standards for all national laboratories to ensure the safety and reliability of laboratory testing, laboratories must create and abide by QA protocols, as well as undergo inspections by accreditation agencies, such as the CAP, to ensure protocols are followed and major deficiencies are remedied [3]. The goal of these programs is to enhance patient safety by identifying and correcting errors in the diagnostic process which would lead to patient mismanagement. In surgical pathology, standard QA protocols for all practices do not exist. However, common methods among pathologists are employed, such as prospective and retrospective second reviews of cases, expert opinion on difficult cases, random or focused review of a selected percentage of cases, frozen section/permanent section correlation, cytology-histology correlation, multi-discipline tumor board and pathology consensus conferences, and others. The majority of these QA measures are founded on the concept of "second opinion" by a peer pathologist or subspecialty expert when assessing a diagnosis [10, 17, 21, 23, 25]. Although each method has its own benefits, with error detection by some methods being superior to others [16, 17], these and other QA methods have been studied and shown to effectively detect and reduce major diagnostic errors, the serious events which adversely affect patient care and increase medical care costs.

Second opinion pathology reviews, whether pre- or post-signout, by intradepartmental or outside consultation, are commonly employed by pathology practices and are generally accepted to have a positive impact on diagnostic accuracy and concordance. Numerous studies of various organ systems demonstrate positive benefit by

identifying errors or reaching consensus on difficult diagnoses, particularly before patient care is begun.

Pre-signout reviews hold the added benefit of the identification and alleviation of errors before pathology information is reported to clinicians. An early, large prospective study on pre-signout peer review by Whitehead et al. examined 3,000 surgical pathology cases which were double read by a separate pathologist pre-signout, and demonstrated a 7.8% discrepancy rate, with 12.4% of the discrepant cases classified as "significant" discrepancies [26]. A later prospective study regarding the benefit of intra-institutional peer review of diagnostic biopsies discovered a major diagnostic error which would affect patient care in 1.2% of the 2,694 biopsy specimens after review by a second pathologist before sign-out [8].

Later, a study by Novis in 2005 [15] retrospectively and prospectively examined surgical pathology intradepartmental error rates in a community hospital setting before and after implementation of a policy requiring a second review of all histologic material by a separate pathologist. By reviewing all amended reports for 1 year before and 1 year after the implementation of this policy, he found that the misdiagnosis rate of 1.3 per 1000 (10 of 7,909 total reports reviewed) before implementation of the pre-signout review decreased to 0.6 per 1000 (5 of 8,469 total reports) after implementation of the policy [15]. These findings are reaffirmed by the recent data from the 2014 CAP Q-probes study, which found that second review of all malignancies as a pre-signout strategy was significantly associated with a lower misinterpretation rate, and was also associated with fewer significant errors, such as defects in protocols or labeling errors [25].

Studies regarding the benefit of inter-institutional second review of outside (post-signout) pathology by expert subspecialty pathologists have yielded similar results, and mandatory second review of outside referral pathology cases before surgical intervention has been employed and studied by various institutions [7, 9, 22, 24]. Through this QA strategy, discrepancies in outside pathology with major diagnostic and prognostic implications are remedied before the initiation of treatment, thus preventing inappropriate therapy and reducing unnecessary medical costs [6, 7, 9, 12, 20, 22, 24]. Many breast pathology-specific studies on the benefits of inter-institutional review have been published. A recent study from Mount Sinai Medical Center looked specifically at discrepancies in breast pathology from excisional and needle core biopsies submitted as part of a surgical referral from an outside facility. All of the specimens were reviewed by a pathologist who specialized in breast pathology. The authors found that, after reviewing 430 biopsy specimens for 306 patients, second review by an expert in breast pathology led to changes in diagnosis in 17% of cases, the majority of which were a change in diagnosis from one benign condition to another. However, in 10% of the cases, the change in diagnosis altered surgical management of the patient [20]. In a recent, somewhat similar study from MD Anderson Cancer Center, all consultation breast pathology referral cases from a 1-year period (1,970 total cases) were examined for discrepancies between the original outside institution report and the newly-issued expert report. The authors discovered a significant discrepancy, defined as a disagreement affecting patient care, in 226, or 11.47%, of the cases [6]. These and other similar studies demonstrate the value of a second, expert opinion in breast and other surgical pathology cases to avoid wrong or unnecessary treatment, as well as savings in healthcare costs.

Finally, studies on other surgical pathology QA measures have touted similar effects on diagnostic accuracy and patient management, and have been found to be a useful addition to pathology QA protocols. One such method, review of pathology during multi-discipline conferences, was shown by various studies to identify discrepancies in breast pathology, particularly due to the benefit of additional clinical information [1, 14]. Raab et al. studied the benefit of monitoring frozen section/permanent section discrepancies over time by utilizing CAP Q-Tracks data on 174 participating institutions, based on 3 Q-probes studies from 1999 to 2003, and found institutions that practiced long-term frozen/permanent section correlations to have significantly lower discordance rates, deferral rates, and microscopic sampling errors [18].

Our overall discrepancy rate, as measured by report amendment, was 0.5% for breast and GYN specimens combined. There were no serious events catalogued. The goal of our QA program, to minimize serious events (Type C error), was accomplished utilizing a comprehensive peer review process that also enhanced pathologist education and active participation in all facets of the program.

In summary, surgical pathology is a complex practice for which a high level of training, expertise, and oversight is required to provide accurate diagnostic interpretation. Surgical pathology employs QA strategies not only to be in compliance with federal law, but also to provide "boundaries" of diagnostic standardization which help minimize sweeping variation in diagnostic accuracy, decrease diagnostic discordance, and maximize patient safety by minimizing the occurrence of serious events. When practiced in an environment of QA oversight and assistance, and not in a vacuum, as suggested by some studies in which published error rates are derived from misrepresentative study models [4], discrepancy rates are reduced and patient safety is heightened, and major diagnostic disagreements that could affect patient management for breast or gynecologic pathology

diagnoses are distinctly uncommon. When surgical pathology is practiced in a laboratory utilizing comprehensive quality assurance protocols, major diagnostic interpretation errors are infrequent. The practice minimizes error, maximizes patient safety, and maximizes educational opportunities of pathologists.

Conclusion
This study describes the quality peer review practice in an academic women's hospital that maximizes patient safety and minimizes serious diagnostic errors. These processes ensure that pathologic diagnoses are accurate for proper patient care.

Acknowledgements
Not applicable.

Funding
There was no funding source for this study. This quality study was approved by the Quality Committee of the University of Pittsburgh Medical Center.

Availability of data and material
Quality data collated by author Abbie Mallon.

Authors' contributions
DD (concept, data acquisition, data analysis, writing manuscript); CS (data analysis, writing manuscript); AM (data collation/acquisition, final review, writing manuscript). All authors read and approved the final manuscript.

Competing interests
The authors declare that they have no competing interests.

Consent for publication
Not applicable.

Ethics approval and consent to participate
Not applicable.

Received: 26 June 2016 Accepted: 25 October 2016

References
1. Chang J, et al. The impact of a multidisciplinary breast cancer center on recommendations for patient management: the University of Pennsylvania experience. Cancer. 2001;91(7):1231–7.
2. Chaudhary S, Kahn L, Bhuiya T. Retrospective blinded review of interpretational diagnostic discrepancies in surgical pathology: 18 years of experience at a tertiary care facility. Ann Clin Lab Sci. 2014;44(4):469–75.
3. Clinical Laboratory Improvement Amendments of 1988. October 31, 1988. 102 Stat. 2903, Public Law 100–578.
4. Elmore J, et al. Diagnostic concordance among pathologists interpreting breast biopsy specimens. JAMA. 2015;313(11):1122–32.
5. Frable W. Surgical pathology - second reviews, institutional reviews, audits, and correlations: What's out there? Error or diagnostic variation? Arch Pathol Lab Med. 2006;130:620–5.
6. Khazai L, et al. Breast pathology second review identifies clinically significant discrepancies in over 10% of patients. J Surg Oncol. 2015;111:197.
7. Kronz J, Westra W, Epstein J. Mandatory second opinion surgical pathology at a large referral hospital. Cancer. 1999;86(11):2426–35.
8. Lind A, et al. Prospective peer review in surgical pathology. Am J Clin Pathol. 1995;104(5):560–6.
9. Manion E, Cohen M, Weydert J. Mandatory second opinion in surgical pathology referral material: clinical consequences of major disagreements. Am J Surg Pathol. 2008;32(5):732–7.
10. Maxwell S, Raab S. Directed peer review in surgical pathology. Adv Anat Pathol. 2012;19(5):331–7.
11. Meier F, et al. Development and validation of a taxonomy of defects. Am J Clin Pathol. 2008;130:238–46.
12. Middleton L, et al. Second-opinion pathologic review is a patient safety mechanism that helps reduce error and decrease waste. J Oncol Pract. 2014;10(4):1–7.
13. Nakhleh R, Nosé V, Colasacco C, et al. Interpretive diagnostic error reduction in surgical pathology and cytology. Arch Pathol Lab Med. 2016;140:29–40.
14. Newman E, et al. Changes in surgical management resulting from case review at a breast cancer multidisciplinary tumor board. Cancer. 2006;107(10):2346–51.
15. Novis D. Routine review of surgical pathology cases as a method by which to reduce diagnostic error in a community hospital. 2005;10(2):63–7.
16. Raab S, et al. Effectiveness of random and focused review in detecting surgical pathology error. Am J Clin Pathol. 2008;130:905–12.
17. Raab S, Nakhleh R, Ruby S. Patient safety in anatomic pathology: measuring discrepancy frequencies and causes. Arch Pathol Lab Med. 2005;129(4):459–66.
18. Raab S, et al. The value of monitoring frozen section-permanent section correlation data over time. Arch Pathol Lab Med. 2006;130:337–42.
19. Renshaw A, Gould E. Measuring the value of review of pathology material by a second pathologist. Am J Clin Pathol. 2006;125:737–9.
20. Romanoff A, et al. Breast pathology review: does it make a difference? Ann Surg Oncol. 2014;21:3504–8.
21. Roy J, Hunt J. Detection and classification of diagnostic discrepancies (errors) in surgical pathology. Adv Anat Pathol. 2010;17:359–65.
22. Soofi Y, Khoury T. Inter-institutional pathology consultation: the importance of breast pathology subspecialization in a setting of tertiary cancer center. Breast J. 2015;21(4):334–44.
23. Tomaszewski J, et al. Consensus conference on second opinion in diagnostic anatomic pathology: who, what, and when. Am J Clin Pathol. 2000;114:329–35.
24. Tsung J. Institutional pathology consultation. Am J Surg Pathol. 2004;28(3):388–402.
25. Volmar K, et al. Surgical pathology report defects: a College of American Pathologists Q-Probes study of 73 institutions. Arch Pathol Lab Med. 2014;138:602–12.
26. Whitehead M, et al. Quality assurance of histopathologic diagnoses: a prospective audit of three thousand cases. Am J Clin Pathol. 1984;81:478–91.
