PII: S0737-4607(16)30061-1
DOI: https://doi.org/10.1016/j.acclit.2018.01.001
Reference: ACCLIT 39
To appear in:
Please cite this article as: Deniz A. Appelbaum, Alex Kogan, Miklos A. Vasarhelyi, Analytical Procedures in External Auditing: A Comprehensive Literature Survey and Framework for External Audit Analytics (2018), https://doi.org/10.1016/j.acclit.2018.01.001
This is a PDF file of an unedited manuscript that has been accepted for publication.
As a service to our customers we are providing this early version of the manuscript.
The manuscript will undergo copyediting, typesetting, and review of the resulting proof
before it is published in its final form. Please note that during the production process
errors may be discovered which could affect the content, and all legal disclaimers that
apply to the journal pertain.
Deniz A. Appelbaum
One Normal Avenue
Montclair, NJ 07043
USA
appelbaumd@montclair.edu

Alex Kogan
Professor
Rutgers, the State University of New Jersey
One Washington Park
Newark, NJ 07102
USA
kogan@business.rutgers.edu

Miklos A. Vasarhelyi, PhD
KPMG Distinguished Professor of AIS
Director of CARLab
Rutgers, the State University of New Jersey
One Washington Place
miklosv@business.rutgers.edu

*Corresponding author
Abstract:
There is an increasing recognition in the public audit profession that the emergence of big data, as well as the growing use of business analytics by audit clients, has brought new opportunities and challenges. That is, should more complex business analytics beyond the customary analytical procedures be used in the engagement, and if so, where? Which techniques appear to be most promising? This paper starts the process of addressing these questions by examining extant external audit research. 301 papers are identified that discuss some use of analytical procedures in the public audit engagement. These papers are then categorized by technique, engagement phase, and other attributes to facilitate understanding. This analysis of the literature is organized into an External Audit Analytics (EAA) framework, the objective of which is to identify gaps, to provide motivation for new research, and to classify and outline the main topics addressed in this literature. Specifically, this synthesis organizes audit research, thereby offering guidelines regarding possible future research on approaches for more complex and data-driven analytics in the engagement.
1.0 Introduction
There is increasing recognition in the public audit profession that the emergence of big data, as well as the growing use of analytics by audit clients, has brought new opportunities and concerns (Appelbaum, Kogan, and Vasarhelyi 2017). That is, should more complex analytics be used in the engagement, and if so, where? Which techniques appear to be most promising? Many say that the public auditing profession will be among the last to adopt new technologies, because regulations mold the scope, breadth, and methodology of the engagement. However, the standards do not explicitly define the type of analytical approaches that should be undertaken by auditors to fulfill regulatory requirements, except that the auditor should develop an expectation from appropriate analytics of reliable data for certain accounts, and then calculate the difference between these expectations and the recorded numbers (AS 2305, PCAOB, 2016). The standards require that analytical procedures be undertaken in addition to evidence collection at the preliminary review and final review stages (Daroca & Holder, 1985), but the decision about which analytical techniques to use is left to auditor judgment.
The opaqueness of this aspect of public auditing has led to numerous debates and discussions within the auditing academic community since 1958 (AICPA 1958). These debates have intensified with the emergence of big data and the automation of business financial reporting (Vasarhelyi, Kogan, and Tuttle 2015). These discussions and debates, as evidenced in academic publications, are indicative of the degree and breadth of analytical approaches available to the engagement. Therefore, it is only natural to investigate this vast body of academic audit research for insights regarding an expanded use of analytics. This research is relevant to:

Audit academics and researchers who are interested in continuing with new research about analytics in the external audit engagement and who can refer to this paper for guidance as to which areas have previously been discussed in the literature and which could benefit from further research
This paper represents an effort to identify and categorize academic publications referencing the use of analytics in the engagement. Accordingly, 301 papers are identified that discuss some aspect of analytical procedures in the external audit engagement. The large number of papers makes it difficult for academics and practitioners to identify specific analytic techniques or gaps in the research. Therefore, these papers are first categorized by technique, engagement phase, and other attributes to facilitate understanding. This preliminary analysis of the literature is subsequently organized into an External Audit Analytics (EAA) framework, derived from Business Analytics (BA), whose objective is to facilitate the identification of gaps, to provide motivation for new research, and to classify and outline the main topics addressed in this literature.

This paper organizes and synthesizes the previously uncategorized extant literature, thereby encouraging further research and exploration by academia, regulators, and practitioners. It systematically examines published literature regarding analytics in the external audit to understand the central themes and status of this research, and provides an organizing framework that positions these findings in the context of the modern business environment. The EAA framework provides a benchmark for expected research against which to compare the published research available to date.
Following this Introduction, the Background section discusses Analytical Procedures as promulgated by the standards and typically practiced by the profession. The third section begins the Literature Review by discussing the methodology for collecting these papers and how they are categorized by timeline, research method, audit stage, technique, and orientation. The fourth section discusses the implications of the literature review results, areas for future research, and gaps in the literature. An External Audit Analytics (EAA) conceptual framework is proposed in Section Four to facilitate an understanding not only of where research has been undertaken but also, given an understanding of business analytics practices by audit clients, of where future research should concentrate. This EAA conceptual framework is derived from the synthesis of the literature in the context of business analytics. The paper concludes with implications and discussion of future research regarding the broad potential for analytics in the external audit.
2.0 Background
In the planning stage, APs are typically considered reasonableness tests. At the review stage of the audit, they provide an overall review of the assessments and conclusions reached. APs may be used as a substantive test to obtain evidence about certain assertions related to account balances or types of transactions. In certain circumstances, APs may be more effective and efficient than substantive tests of details. When the data set is large and varied, APs may be more effective; when the risk of misstatement is minimal, APs may be more efficient and less costly.
The Cushing and Loebbecke (C-L) model (Figure 1) reflects the phase structure of the typical audit engagement by the Big 8 firms at that time and is the basis for the audit model in many textbooks (Louwers et al. 2016; Whittington and Pany 2014). In this model, auditors conduct a preliminary analytical review in the planning activities, and conduct analytical review procedures as well as substantive tests of transactions and tests of balances in the substantive testing phase. In the evaluation and review phases, this work requires revisiting and re-performing analytical tests (Cushing and Loebbecke 1986). Continuous Activities seem to consist primarily of project management duties, light documentation, and follow-up procedures.

(Insert Figure 1 about here)
As described in AS 2305.05 (PCAOB 2016), analytical procedures “involve comparisons of recorded amounts, or ratios developed from recorded amounts to expectations developed by the auditor.” For example, APs typically accomplish the following five tasks (Table 1).
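The expectation-and-difference comparison that AS 2305 describes can be sketched in a few lines. In this minimal sketch, the account names, amounts, growth assumption, and tolerance are all hypothetical, and the prior-year-plus-growth expectation stands in for whatever expectation model the auditor actually develops:

```python
# Hypothetical recorded balances and a simple auditor expectation model.
recorded = {"revenue": 1_250_000, "cogs": 760_000}
prior_year = {"revenue": 1_100_000, "cogs": 700_000}
growth = 0.08  # assumed growth rate used to form the expectation

def flag_differences(recorded, prior_year, growth, tolerance=0.05):
    """Flag accounts whose recorded amount differs from the developed
    expectation by more than a tolerable relative difference."""
    flags = {}
    for account, amount in recorded.items():
        expectation = prior_year[account] * (1 + growth)
        relative_diff = abs(amount - expectation) / expectation
        flags[account] = relative_diff > tolerance
    return flags

# revenue deviates about 5.2% from its expectation and is flagged;
# cogs deviates about 0.5% and is not.
print(flag_differences(recorded, prior_year, growth))
```

A real engagement would substitute a defensible expectation model and materiality-based thresholds; the sketch only illustrates the develop-expectation, compare, and investigate loop.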
Based on this description of APs, it could be expected that the literature could easily be reviewed for relevant papers and organized to facilitate understanding. However, as will be discussed in the following section, the literature about APs is not confined to the fundamental processes depicted in Table 1, but instead is much broader and more varied in scope, thereby complicating this task. This complexity may require a means of organization beyond that of the commonly understood analytical procedures.
The main objective of this research is to explore, categorize, and synthesize the available academic research on analytical procedures in the external audit engagement. As discussed in the Introduction, a primary concern of practice is whether business analytics should be used in the engagement, and if so, when and how often (Appelbaum, Kogan, and Vasarhelyi 2017). Should these techniques be more complex? Could the focus of extant research help direct practice? However, it has not yet been ascertained whether these have historically been concerns of academics.

The next objective is to organize these selected papers to assist in understanding this literature and to identify existing gaps and areas for further investigation. The third objective is to apply the results to a structured framework that can appropriately direct future research activities.

Following the methodology of a systematic literature review as proposed by Keele (2007), this research comprises the following search procedures:
Keywords Search: Keywords and search strings are collected based on the research questions. This process entailed keyword searches for “analytics”, “analytical procedures”, “analytical review”, “audit planning”, “risk assessment”, “internal control assessment”, “compliance testing”, and similar terms.

Search strings: These are constructed from the keywords in conjunction with the research questions. The string format is generic so that it may be used in most libraries. For example: (Management Fraud) OR (Earnings Misstatement).
Sources: To accomplish the task of initially identifying relevant papers, the database of auditing research compiled by a sub-committee of the AAA Auditing Section Research Committee (Trotman et al., 2009) is examined for academic papers likely to discuss audit analytics. The references of these papers are also examined for likely additions to the list, and those subsequent papers are similarly reviewed and additional references tracked, in an iterative process. This entire process is then repeated in Google Scholar and SSRN.

Filtering: The papers selected for this study had to be published as full papers in academic journals, as completed dissertations, or as completed working papers published online. After obtaining the results from the inclusion/exclusion lists that follow, all remaining studies were examined with the required additional textual analysis. Table 2 shows the selection steps for the literature review. The complete listing of all identified papers and major categorizations can be found in Table 9 of Appendix B. The inclusion criteria are as follows:
Papers discussing some aspect of analytics/statistics/sampling/data mining/machine learning and/or one of those techniques
Papers discussing at least one phase of the audit (see discussion that follows)
Papers where analytics are not the primary focus but meet all other criteria (this is typical for many behavioral studies)
Papers are excluded based on the following criteria:
Papers published in media that were practitioner journals at the time of publication
Conference papers and workshop papers
Incomplete papers and duplicate papers
Papers that mention “auditing” or “auditor” but do not distinguish internal from external and do not describe or refer to a typical engagement responsibility or task
Papers referring only to internal auditing/auditors
Papers that do not discuss some aspect of analytics/statistics/sampling/data mining
In general, a paper is considered relevant if it directly mentions external auditing and discusses an aspect of analytics that typically belongs in at least one phase of the external audit model as developed by Cushing and Loebbecke (1986); see Figure 1 (Elliott, 1983). In the public company audit setting, analytics could be the primary focus of the paper, a secondary focus, or part of another process/objective. For those papers where the use of analytics is not the primary focus, only those papers where analytics are essential to the process/argument/study are selected. For example, several behavioral studies are included that focus on professional judgment and utilize analytical procedures in the experiment or survey process (e.g., Arrington et al., 1984; Asare and Wright 1997). Furthermore, if an analytical procedure is discussed but the typical stage of the audit cycle for that procedure is neither identified directly by the author(s) nor otherwise described, the audit cycle is not identified in the categorization table (Table 9) in Appendix B online.
Literature Evaluation: A large majority of the papers (80%) discuss the effectiveness or efficiency of various APs as the primary topic. Fourteen papers mention the effectiveness and efficiency of APs as topics for future research. The overwhelming thrust of each paper is the quality of the performance of APs as either a primary or secondary factor in some aspect of the external audit (Table 4).

(Insert Table 4 about here)
Most academic research about APs in the financial audit engagement appears to be accessible online for publications as of 1958. Although publications were sparse for the first two decades, this changed in the 1980s and has maintained that pace ever since, for a total of 301 papers (Figure 2).

Analytical (Simulation, Modeling, Design Science, Internal Logic) (Vasarhelyi 1982, p 48-4)
Empirical methods are considered as both Behavioral and Archival, since both approaches are based on research that can be verified through experimentation or verification (Vasarhelyi 1982, p 48-4; Coyne, Summers, Williams, and Wood, 2010, p 634). The research methods are described more precisely per paper in Appendix B, but are summarized in the body of this manuscript at the level of Analytical, Behavioral, Archival, and Conceptual, since these general approaches are predominant. For example, a paper may be classified as a survey in Appendix B but be represented in this figure as behavioral. These 301 papers vary in both research methods and in analytical techniques (see Figures 12, 13, 14, and 15 of Appendix A online). The most popular research methods are analytical, behavioral, archival, and conceptual.

The papers are published in thirty-three different journals, with Auditing: A Journal of Practice and Theory having the highest frequency, followed by The Accounting Review, the Journal of
thought and interdisciplinary approaches (Vasarhelyi 1982). Prior to the advent of Auditing: A Journal of Practice and Theory, many papers referred to auditors as “outside accountants” or as “accountants and auditors” (Keenoy, 1958; Arkin, 1958; Hill, 1958). Auditing became more established as a field of its own, with unique issues of judgment and expertise that frequently were examined with behavioral methods (Felix and Kinney, 1982).
Specific areas of emphasis for analytical review procedures in the external audit are shown in this literature to be Financial Statement/Management Fraud (Hogan, Rezaee, Riley Jr & Velury, 2008; Trompeter, Carpenter, Desai, Jones & Riley Jr, 2013), Going Concern Opinion (Carson, Fargher, Geiger, Lennox, Raghunandan & Willekens, 2013), and Fair Value Measurement (Martin, Rich, & Wilks, 2006; Bratten, Gaynor, McDaniel, Montague & Sierra, 2013).
The papers mention analytical methods in the six audit phases with the frequency shown below in Figure 4, organized in the sequence of the typical audit engagement process. Many papers discuss applying analytical methods in more than one phase, and each instance of analytical procedures in a phase is counted separately. Analytics are discussed in the papers as follows: 36 times for the Engagement Phase, 228 times for the Planning/Risk Assessment Phase, 225 times for the Substantive Testing Phase, 167 times for the Review Phase, 46 times for the Reporting Phase, and not at all in the Continuous Activities Phase. Given the role of analytical procedures as prescribed in the standards, it is not surprising that research is concentrated primarily in the phases of planning, substantive testing, and review, and minimally in the areas of engagement and reporting.
The analytical procedures are also examined for each step of the C-L model (Figure 1). Upon initial examination, it soon became obvious that this research is broader in scope than the AP processes detailed in Table 1. Accordingly, these techniques, detailed below as mentioned in the papers, are categorized1 as follows:

1 These are the techniques described in the papers that have been applied/discussed/debated as APs in the external audit engagement. The technique names were maintained, even if there is commonality across methods (for example, time series are linear, and Box Jenkins and ARIMA are the same, yet they are mentioned separately to maintain faithfulness to the literature). Short definitions and attributes for each technique may be found online in Appendix A, Table 7.
Regression Methods: Regression Analysis, Step-Wise Logistic, Auto-Regressive Integrated Moving Average (ARIMA), Martingale Model, Multivariate Distribution, Sub-Martingale Model, Box Jenkins (ARIMA), Discriminant Analysis, Seasonal Time Series X-11, Random Walk (ARIMA), Ordinary Least Squares (OLS), Double-Exponential Smoothing Model, Single-Exponential Smoothing Model, Random Walk Drift (ARIMA), Hypergeometric Distribution, Ordinal Regression Model, Probit Model

Other Statistical Methods: Descriptive Statistics, Benford's Law, Monte Carlo Study/Simulations, Complementary Hypothesis Evaluation, Analytic Hierarchy Process (AHP)
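Of these, Benford's Law is the easiest to illustrate. The sketch below compares observed leading-digit frequencies to the Benford expectation log10(1 + 1/d); the amounts passed to it would be fabricated test data, and real applications use formal tests (e.g., chi-square) rather than this simple deviation sum:

```python
import math
from collections import Counter

def benford_expected(d):
    """Expected frequency of leading digit d under Benford's Law."""
    return math.log10(1 + 1 / d)

def leading_digit_profile(amounts):
    """Observed leading-digit frequencies for positive amounts."""
    digits = [int(str(a).lstrip("0.")[0]) for a in amounts if a > 0]
    counts = Counter(digits)
    return {d: counts[d] / len(digits) for d in range(1, 10)}

def benford_deviation(amounts):
    """Total absolute deviation of the observed profile from Benford."""
    observed = leading_digit_profile(amounts)
    return sum(abs(observed[d] - benford_expected(d)) for d in range(1, 10))
```

A population of invoice amounts whose leading digits stray far from the Benford profile would yield a large deviation and might merit further scrutiny; the cutoff itself is a matter of auditor judgment.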
The Audit Examination, Unsupervised, Supervised, Regression, and Other Statistical techniques are considered appropriate if they had been applied in the context of the Cushing-Loebbecke model (Figure 1), which may also be referred to as the “traditional” external audit model. A complete listing of the literature, with audit phases and analytical techniques identified, may be found in Appendix B.
The percentage of papers using specific analytical techniques is shown below in Figure 5. Many papers mention more than one analytical technique. Among audit analytic techniques, the most frequently discussed are Audit Examinations, followed by Regressions. Audit Examinations were discussed 459 times; Unsupervised Methods, 43 times; Supervised Methods, 171 times; Regression, 251 times; and Other Statistical Methods, 77 times.
Many of the techniques are applied to the different phases of the external audit, albeit sporadically in the case of unsupervised and supervised methods and frequently in the case of Audit Examination techniques and Regression techniques. Each of the audit phases of Engagement, Planning/Risk Assessment, Substantive & Compliance Testing, Review, Opinion Formulation and Reporting, and Continuous Activities exhibits academic research as follows (please see Table 8 in Appendix A and Appendix B for more detailed analysis per publication):

2 Unsupervised approaches are those techniques that draw inferences from unlabeled datasets in which instances either have no output specified or the value of the output is unknown (such as whether a transaction is fraudulent or not) (Han, Kamber and Pei 2012, p 330).
3 Supervised approaches are those techniques that draw inferences from labeled datasets, otherwise known as training data (Han et al, 2012, p 330).
1. Engagement: The papers from this phase primarily discuss ratio analysis, regression, descriptive statistics, and expert systems, with only a few papers handling visualization, text mining, multi-criteria decision aids, and structural models.
2. Planning/Risk Assessment: Most of the papers in this phase deal with all types of audit examination, all of the regression techniques, and descriptive statistics, with some discussion of expert systems, Bayesian Belief Networks (BBN), and probability models, and slightly less of clustering, text mining, visualization, multi-criteria decision aids, and structural models.
3. Substantive Testing & Compliance Testing: Audit examination techniques are enormously popular here, as were all of the regression techniques, descriptive statistics, expert systems, BBN, and probability models. Less popular were the unsupervised methods and other supervised techniques such as Support Vector Machines (SVM), Artificial Neural Networks (ANN), genetic algorithms, bagging/boosting, and multi-criteria decision aids.
4. Review: Ratio analysis and Computer Assisted Audit Techniques (CAATS) are discussed frequently, as were linear and time series regression and expert systems, with BBN, probability models, and descriptive statistics used occasionally.
5. Opinion Formulation and Reporting: In the opinion phase, the main techniques mentioned are ratio analysis, visualization, expert systems, log and linear regression, and descriptive statistics.
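The unsupervised/supervised distinction running through these phases can be made concrete with a toy sketch. The transaction amounts, labels, and the two miniature "learners" below are fabricated purely for illustration; the studies surveyed use far richer features and models:

```python
# Unsupervised: no labels. Group amounts by a rule learned from the data
# itself -- a minimal one-dimensional k-means with k=2 clusters.
def two_means(amounts, iterations=10):
    """Return a 0/1 cluster assignment for each amount."""
    lo, hi = min(amounts), max(amounts)
    groups = [0] * len(amounts)
    for _ in range(iterations):
        groups = [0 if abs(a - lo) <= abs(a - hi) else 1 for a in amounts]
        lo = sum(a for a, g in zip(amounts, groups) if g == 0) / max(1, groups.count(0))
        hi = sum(a for a, g in zip(amounts, groups) if g == 1) / max(1, groups.count(1))
    return groups

# Supervised: labeled training data. Learn a cutoff from known cases
# (label 1 = known anomalous, label 0 = known normal).
def learn_cutoff(amounts, labels):
    """Midpoint between the mean normal and mean anomalous amount."""
    normal = [a for a, y in zip(amounts, labels) if y == 0]
    anomalous = [a for a, y in zip(amounts, labels) if y == 1]
    return (sum(normal) / len(normal) + sum(anomalous) / len(anomalous)) / 2
```

The unsupervised routine discovers structure without being told which transactions are problematic, while the supervised routine requires labeled outcomes, exactly the difference drawn in footnotes 2 and 3.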
All the techniques observed even once in the literature are marked in Table 5 below as to the audit phase in which they occur. For example, although all instances of sampling total 164 mentions, some variations of sampling occur more than once in some papers, resulting in a total of 145 papers in Table 5. Additionally, Table 8 in Appendix A online contains a listing of the papers for each technique per audit phase that have been identified in the external audit literature.

Based on the analysis of which techniques are used in the various audit phases in the literature, a preliminary mapping (Table 5) has been created, based entirely on the discussions in the 301 papers. The predominant techniques for all phases belong to the Audit Examination and Regression approaches, with some use of BBN, probability models, descriptive statistics, and expert systems. Although it may appear in the framework that many other more complex techniques are analyzed by audit academics, their deployment in the literature is inconsistent and sporadic. Some techniques are discussed only a couple of times, as is the case with text mining, visualization, process mining, SVM, ANN, genetic algorithms, C4.5 classifiers, AHP, and hypothesis evaluation.
In the task of Audit Examination, techniques such as sampling, ratio and trend analysis, CAATS
usage, and general ledger tests are clear favorites. Sampling techniques and ratio and/or trend
analysis are discussed more frequently than any other method, at 37.8% and 43.5% respectively.
CAATS are included in this category as many of the tests conducted by external auditors in the
papers were general ledger tests and basic calculations (Figure 6).
Additionally, Bayesian statistics are applied extensively in the area of sampling (Ijiri & Kaplan, 1971; Corless, 1972; Elliott & Rogers, 1972; Hoogduin, Hall, & Tsay 2010) and in auditor judgment and planning (Felix, 1976; Chang, Bailey, & Whinston, 1993; Dusenbury, Reimers, & Wheeler, 2000; Krishnamoorthy, Mock, & Washington, 1999).
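The Bayesian flavor of this sampling research can be illustrated with a minimal beta-binomial update; the prior parameters and sample results below are invented, and the cited papers develop far more elaborate models:

```python
def posterior_error_rate(alpha, beta, errors, sample_size):
    """Posterior mean of the misstatement rate under a Beta(alpha, beta)
    prior, after observing `errors` misstatements in `sample_size` items."""
    return (alpha + errors) / (alpha + beta + sample_size)

# A weak prior centered on a 5% error rate (Beta(1, 19)), revised
# downward after finding 2 misstatements in a sample of 60 items.
prior_mean = 1 / (1 + 19)                     # 0.05
updated = posterior_error_rate(1, 19, 2, 60)  # 3 / 80 = 0.0375
```

The appeal in an audit setting is that prior engagement knowledge and current sample evidence combine into one revised error-rate estimate.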
Regression techniques are second in popularity, discussed 251 times in the audit literature. Log Regression was mentioned 81 times, Linear Regression 62 times, Time Series Regression 34 times, ARIMA 20 times, and Univariate and Multivariate Regression 54 times (Figure 7).

(Insert Figure 7 about here)
Most popular of the supervised techniques is the application of Bayes Learners/Bayesian Belief Networks, at 46 times, followed by Expert Systems at 41, Probability Models at 30, and Artificial Neural Networks at 24 times (Figure 8).
Unsupervised Methods are discussed minimally, with Process Mining being the most popular (Figure 9).

Other Statistical Methods are slightly more popular, with coverage in 77 papers; Descriptive Statistics is the most frequently mentioned.
The sheer number of academic papers still presents a challenge for researchers even after many
features have been described. The available academic research on analytical procedures goes well
beyond developing expectation models and testing actual results, which is the definition for
Analytical Procedures as described in the standards. The systematic research method (Keele 2007)
suggests that an organizing conceptual framework should be developed to facilitate understanding.
The aim of this structured research is not just to aggregate the evidence but to also provide
guidelines for future academic research and practitioner applications in a specific context.
A conceptual framework may be defined as “the way ideas are organized to achieve a research project's purpose” (Shields and Rangarajan 2013, p 24). The purpose of a framework is to organize the literature to best understand how academic researchers apply analytical procedures to the audit engagement. Since the typical engagement proceeds through the audit phases, it seems logical to organize the literature first by audit phase, with these phases subsequently divided by AP type. Table 5 summarizes this information, which is also available in more detail, with paper numbers, in Table 8 of Appendix A online. However, Table 8 may still appear overwhelming. Therefore, it may be appropriate to organize this literature within another view of APs, that of Business Analytics (BA).
4.1 Business Analytics
Since auditors examine business financial data, much of which may be generated with applications and analytics embedded in management enterprise systems, gaining knowledge of and perhaps adapting concepts of business analytics as discussed in the academic literature (Holsapple et al, 2014) could be beneficial. Business analytics is “the use of data, information technology, statistical analysis, quantitative methods, and mathematical or computer-based models to help managers gain improved insight about their operations, and make better, fact-based decisions” (Davenport and Harris, 2007).
The recently proposed three dimensions of Business Analytics (BA), domain, orientation, and techniques (Holsapple et al 2014), are useful for understanding the scope of business analytics. Domain refers to the context or environment in which the analytics are being applied. Orientation describes the outlook of the analytics (descriptive, predictive, or prescriptive), while techniques refer to the analytical processes of the domain and orientation (Holsapple et al 2014). The feasibility of the application of a technique is dictated not only by its orientation, but also by the available data.

In the environment in which the audit team operates, the domain dimension of the client is business enterprise and management. The three dimensions of orientation (descriptive, predictive, and prescriptive) are discussed below.
Descriptive Analytics
Descriptive analytics answers the question of what happened. It is the most common type of analytics used by businesses (IBM, 2013) and is typically characterized by descriptive statistics, Key Performance Indicators (KPIs), dashboards, or other types of visualizations (Dilla, Janvrin, and Raschke 2010). Descriptive analytics also forms the basis of many continuous monitoring alert systems, where transactions are compared to data-based analytics (Vasarhelyi and Halper 1991) and thresholds are established from ratio and trend analysis of historical data.
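A minimal sketch of such a threshold alert, using fabricated monthly gross-margin ratios and an assumed two-standard-deviation band:

```python
import statistics

def ratio_alert(history, current, k=2.0):
    """Continuous-monitoring style alert: flag the current ratio when it
    falls more than k standard deviations from its historical mean."""
    mean = statistics.mean(history)
    sd = statistics.stdev(history)
    return abs(current - mean) > k * sd

history = [0.42, 0.41, 0.43, 0.40, 0.42, 0.44]  # fabricated ratio history
print(ratio_alert(history, 0.33))  # a sharp margin drop trips the alert
```

In practice the threshold width, the choice of ratio, and the historical window are all matters of auditor judgment rather than the fixed values shown here.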
Predictive Analytics
Predictive Analytics is the next step taken with the knowledge acquired from descriptive analytics (Bertsimas and Kallus, 2014) and answers the question of what could happen (IBM, 2013). It is characterized by predictive and probability models, forecasts, statistical analysis, and scoring models. Predictive models use historical data accumulated over time to make calculations of probable future events. Most businesses use predominantly descriptive analytics and are just beginning to use predictive analytics (IBM, 2013).
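As a minimal illustration of the predictive orientation, the sketch below fits a least-squares trend line to a fabricated history and extrapolates one period ahead, a stand-in for the far richer forecasting models used in practice:

```python
def linear_forecast(series):
    """Fit y = a + b*x by least squares over x = 0..n-1 and
    return the extrapolated value for the next period, x = n."""
    n = len(series)
    x_mean = (n - 1) / 2
    y_mean = sum(series) / n
    num = sum((x - x_mean) * (y - y_mean) for x, y in enumerate(series))
    den = sum((x - x_mean) ** 2 for x in range(n))
    slope = num / den
    intercept = y_mean - slope * x_mean
    return intercept + slope * n

# A perfectly linear history forecasts its next step exactly.
print(linear_forecast([100, 110, 120, 130]))  # -> 140.0
```

The same fitted value could also serve as an auditor expectation in an analytical procedure, which is one reason regression models dominate the AP literature surveyed above.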
Prescriptive Analytics
Prescriptive Analytics (Bertsimas and Kallus, 2014; Holsapple et al, 2014; IBM, 2013; Ayata, 2012) answers the question of what should be done, given the descriptive and predictive analytics results. Prescriptive analytics may be described as the optimization approach: it goes beyond descriptive and predictive analytics by recommending one or more solutions and showing the likely outcome of each.

The techniques for predictive and prescriptive analytics may appear similar, but their orientation and ability to prescribe or predict depends on the type and amount of data available for analysis. The bigger and more varied the data, the more likely the solution may be prescriptive. Prescriptive techniques may pull upon quantitative and qualitative data from internal and external sources. Analytics based on quantitative financial data alone utilize only a fraction of all available data, since most data is qualitative (Basu, 2013). While earlier techniques, often based on business rules, came from statistical data analysis, more recent research has begun incorporating techniques that originate in machine learning, artificial intelligence (AI), deep learning, text mining, and data mining. Some of these recently popular techniques do not make any statistical assumptions about the underlying data and consequently generate models that are not statistical in nature. The techniques found in business analytics are classified in Table 7 of online Appendix A.4
Given the attributes of audit engagement APs as discussed in the literature, the next challenge is to obtain an understanding of how these APs can be considered as Business Analytics. This process starts by first understanding this literature to date, by undertaking the next steps of the literature review process.

4 Due to space concerns, Appendix A, Appendix B (Table 9), and the Table 9 Reference List (all papers listed in Table 9) are available online. The references in Section 6 pertain only to those citations in the body of the text.
4.2 Given its attributes, how can this literature be presented to direct future academic research?
One of the more common reasons for performing a literature review is to provide a framework or context in which to appropriately position new research activities, having identified the extant research (Keele 2007, p 3). Within the scope of a review lies the opportunity to provide an overview of the literature with the intent to influence the direction of future research. This paper began by describing the dilemma of the current audit profession: the emergence of big data, as well as the growing use of analytics by audit clients, has brought new concerns. That is, audit clients are progressively using more complex Business Analytics (BA), and auditors are concerned that APs as typically and historically applied may not be relevant or effective. Since auditors examine business financial and BA data, ideally a literature-review-based framework should be directed towards these new concerns.
This section will discuss the evolution of a conceptual External Audit Analytics (EAA)
framework, where this examination of extant audit academic research regarding Analytical
Procedures is applied to the more general context of Business Analytics (BA). Although there have
been many applications of Analytical Procedures in the external audit practice5, there should be a
framework providing guidance for academic research of the more complex analytical techniques.
External Audit Analytics (EAA) is defined as: the utilization of various analytical procedures,
methods, and models to facilitate the transformation of data into external audit evidence and
subsequently into audit decisions. EAA may be considered as a special sub-area of the wider area
of Business Analytics (BA) since public auditors examine business financial data.
Business Analytics in academic research is discussed in the previous section and its dimensions
(domain, orientation, context) are subsequently applied to the Analytical Procedures function of
the audit engagement. In this context, APs as practiced to date (Table 1) are but one component of
EAA. APs in the context of EAA provide a greater scope and variation than the APs as
conventionally understood.
The conventional Analytical Procedures (APs) process, when regarded under the view of
Business Analytics, can now be conceptually regarded as one component of External Audit
Analytics (EAA). EAA provides the generalization needed to encourage further research and use
of this expanded view of APs. For example, in Table 1 APs are limited to basic comparisons and
ratio analysis using both financial and nonfinancial data – however, EAA pertains to all BA
techniques that lend themselves to the engagement process. In this context, EAA in an audit
5 Li et al. (2016) surveyed users of audit analytics software and found very limited use of advanced analytics.
engagement could comprise ratio analysis, text mining, and network mapping – a combination
of quantitative and qualitative data sources and a varied range of techniques.
Accordingly, the three BA dimensions are useful for defining EAA. The initial findings of this
literature review will be categorized by these three BA dimensions of domain, orientation, and
technique. These dimensions, particularly that of orientation, are a new way of understanding
analytics in the external audit. The process of categorizing each paper in the context of EAA
involves the following three steps:
Determine the audit phase as described in the paper
Determine the orientation of the research task (descriptive, predictive, prescriptive)
Determine the type of analytical technique deployed in the publication
The following Figure 11 provides a high-level graphic illustration of the results of this three-step process. It illustrates a literature-based framework which identifies the APs that the papers discuss and what types of EAA orientation and techniques were examined:
(Insert Figure 11 about here)
Figure 11 categorizes the literature at a high summary level and could be regarded as the
literature based framework for APs in the external audit domain.6 Checkmarks indicate where a
paper has been identified, based on the process for that phase/orientation/technique type. All the
unchecked blank or shaded spaces theoretically represent areas where literature has not been found
to date, yet could be potential areas of research where analytical techniques and models may be applied. When applying the properties and techniques of BA to APs in the
process of developing EAA, issues which may emerge during this process could be as follows:
How different are the objectives of Internal and External Audit Analytics in the current
context (Li et al, 2016)?
Isn’t there a substantive overlap between business monitoring and real-time assurance?
Considering that there is substantive overlap in data analytic needs, are the traditional three lines of defense (Freeman, 2015; Chambers, 2014) still relevant?
6 It is recommended that interested researchers follow these procedures:
Descriptive EAA answers the question as to what happened. It is the most common type of
analytics used by auditors and is typically characterized by descriptive statistics, Key Performance
Indicators (KPIs), dashboards, or other types of visualizations. It is expected that discussion in the academic literature would be concentrated predominantly in this orientation of the audit.
Predictive EAA builds on the knowledge acquired from descriptive analytics (Bertsimas and Kallus, 2014). It answers the question of what could happen (IBM, 2013) and is characterized by predictive and probability models, forecasts, statistical analysis, and scoring
models. Most audit clients use predominantly descriptive analytics and are just beginning to use
predictive analytics (IBM, 2013). The following issues perhaps should be considered by audit
researchers in this evolving analytic environment:
Traditional auditing has a retrospective approach, as traditional technologies did not allow
for other approaches - can the current environment allow for a prospective look?
What parts / procedures of the audit are fully or partially automatable?
Prescriptive EAA (Bertsimas and Kallus, 2014; Holsapple et al, 2014; IBM, 2013; Ayata, 2012)
goes beyond descriptive and predictive by recommending one or more solutions and showing the
likely outcome of each approach. It is a type of predictive EAA in that it prescribes a solution
requiring a predictive model with two components: actionable big and varied (hybrid) data and a
validation/feedback system. A prescriptive EAA model will have a decision function that chooses
among alternatives – an optimization model. Interesting questions emerge from attempting to
prescribe:
For example, prescriptive analytics could conceivably be used to arrive at a quantitative score for the audit opinion, as opposed to the current pass/fail opinion.
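The decision function described above — choosing among alternatives so as to optimize an objective — can be sketched in a few lines. This is an illustrative toy, not a method from the paper: the candidate evidence-collection strategies and their predicted risk figures are entirely hypothetical, standing in for the output of an upstream predictive model.

```python
# Minimal prescriptive sketch: pick the action whose predicted residual
# audit risk (from a hypothetical upstream predictive model) is lowest.

def choose_action(predicted_risk):
    """Return the (action, risk) pair with the lowest predicted risk."""
    return min(predicted_risk.items(), key=lambda kv: kv[1])

# Hypothetical predicted risk for three candidate evidence strategies.
predicted_risk = {
    "sample_100_items": 0.12,
    "test_full_population": 0.03,
    "rely_on_controls": 0.08,
}

best_action, best_risk = choose_action(predicted_risk)
```

In practice the optimization would trade risk against cost and other constraints; the single-objective minimum here is only the simplest possible decision function.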
The currently mandated pass/fail opinion format 7 does not reflect the nuances and details of
the auditor’s work - the culmination of much laborious examination and careful judgement by the
auditor. With more advanced EAA techniques and reliable evidence, it is probable that this process
and resulting opinion could be quantified with prescriptive analytics. Prescriptive analytics may
allow for a graduated scale or ranking of audit opinion and audit risk. In an ideal scenario, auditors
should be prolific in their use of analytic techniques of all three orientations, as analytics should
be dominant in industries that are very data-rich and where one of the major improvements from
analytics usage is risk reduction (Banerjee et al, 2013).
Many of the techniques observed in the external audit literature are quantitative in nature. This
dominance of quantitative techniques in APs may be because the main objective of external audit
has been to provide assurance on the accounting numbers. Therefore, the accounting numbers
traditionally were the focus of APs. However, with the availability of internal textual data, social
media, and big data, the scope of APs could be expanded to that of EAA. This greater variety of
available data creates the opportunity for more advanced analytics research.
Accounting numbers are derived by manipulating (aggregating, adjusting, etc.) quantitative
descriptions of business transactions and are obviously well structured. This structured data leads
typically to analysis which is quantitative and descriptive, and can be categorized as Audit
Examination techniques. Audit Examination entails, among many procedures, basic transaction
tests, three-way matching, ratio analysis, sampling, re-confirmation, and re-performance. These
tests are applied in every external audit engagement, and are regarded as fundamental EAA.
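One of the Audit Examination procedures named above, three-way matching, is simple enough to illustrate directly. The sketch below is not from the paper; the record layout, field names, and amounts are hypothetical, and real engagements would match on richer keys and tolerances.

```python
# Toy three-way match: an invoice passes only if a purchase order and a
# goods receipt exist for the same PO number and all three amounts agree
# within a small tolerance. Everything else becomes an exception.

def three_way_match(purchase_orders, receipts, invoices, tolerance=0.01):
    po_amounts = {po["po_no"]: po["amount"] for po in purchase_orders}
    rec_amounts = {r["po_no"]: r["amount"] for r in receipts}
    exceptions = []
    for inv in invoices:
        po_amt = po_amounts.get(inv["po_no"])
        rec_amt = rec_amounts.get(inv["po_no"])
        if po_amt is None or rec_amt is None:
            exceptions.append((inv["po_no"], "missing PO or receipt"))
        elif (abs(inv["amount"] - po_amt) > tolerance
              or abs(inv["amount"] - rec_amt) > tolerance):
            exceptions.append((inv["po_no"], "amount mismatch"))
    return exceptions

pos = [{"po_no": "PO-1", "amount": 500.0}, {"po_no": "PO-2", "amount": 250.0}]
recs = [{"po_no": "PO-1", "amount": 500.0}, {"po_no": "PO-2", "amount": 250.0}]
invs = [{"po_no": "PO-1", "amount": 500.0},
        {"po_no": "PO-2", "amount": 275.0},  # overbilled vs. PO and receipt
        {"po_no": "PO-3", "amount": 90.0}]   # no matching PO or receipt
flagged = three_way_match(pos, recs, invs)
```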
Next, expectation models in the audit should be presented. The most common types of techniques
utilized in EAA, in addition to those of the aforementioned audit examination, are expectation models. The standards prescribe that auditors should develop expectations of accounts in the risk assessment phase.
7 There is ongoing discussion regarding Critical Audit Matters within the profession and the PCAOB. The PCAOB Release No. 2013-005, August 13, 2013, Docket Matter No. 034, The Auditor's Report on an Audit of Financial Statements When the Auditor Expresses an Unqualified Opinion, discusses the auditor's responsibilities regarding certain other information in certain documents containing audited financial statements and the related auditor's reports and related amendments to the PCAOB standards. See also Lynne Turner's comments (https://pcaobus.org//Rulemaking/Docket034/ps_Turner.pdf).
An expectation model is inferred from the archive of historical records. If it turns out to be
possible to infer a stable empirical relationship that fits the historical records well, then it is
reasonable to expect this relationship to hold for the near future, assuming no significant changes
take place in the business. Therefore, this relationship provides an expectation model for the
accounting numbers and other important business metrics of the near future. The accuracy of this
future relationship provides important audit evidence about the veracity of the quantities involved.
Consider, for example, an expectation model expressed as an equation for this accounting number. Then, for a given confidence level,
this equation can be used to derive a prediction interval for the future value of the accounting
number. If the actual future value turns out to be inside the prediction interval, this can be
interpreted as strong evidence that the accounting number is properly represented. Otherwise, if
the actual future value lies outside the prediction interval, the auditor will need to conduct further
investigation to determine if there is indeed a problem with this accounting number. The
expectation model forms the basis of audit examination in the engagement and determines the
direction and degree of evidence collection and audit scrutiny.
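The fit-then-compare logic just described can be sketched concretely. This is an illustrative toy, not the authors' method: the monthly revenue figures are hypothetical, a least-squares trend line stands in for whatever relationship the auditor infers, and a crude ±2-standard-error band is used instead of a proper t-based prediction interval.

```python
# Sketch of a regression-based expectation model over hypothetical history.

def fit_trend(ys):
    """OLS of y on t = 1..n; returns (intercept, slope, residual std)."""
    n = len(ys)
    xs = list(range(1, n + 1))
    x_bar, y_bar = sum(xs) / n, sum(ys) / n
    sxx = sum((x - x_bar) ** 2 for x in xs)
    sxy = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = y_bar - slope * x_bar
    sse = sum((y - (intercept + slope * x)) ** 2 for x, y in zip(xs, ys))
    return intercept, slope, (sse / (n - 2)) ** 0.5

def outside_expectation(ys, actual):
    """True if the next-period actual falls outside a rough 2-sigma band."""
    a, b, s = fit_trend(ys)
    expected = a + b * (len(ys) + 1)
    return abs(actual - expected) > 2 * s, expected

history = [112, 119, 131, 138, 152, 158, 171, 179]  # hypothetical revenue
flag_high, expected = outside_expectation(history, 250.0)  # booked value
flag_ok, _ = outside_expectation(history, 190.0)
```

A booked value of 250 falls far outside the band around the expectation (about 189) and would trigger further investigation, while 190 would not.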
The EAA usage described above has predictive orientation, and the amount of audit evidence
provided is based on the level of agreement between the observed business reality and the
predictions. This is utilized not only to verify accounting numbers, but also to provide assurance
on controls by comparing the observed business process workflow with the expectations derived
either from the existing business rules, or from the past observations of business processes. As an
example of the former, a business rule stating that “purchase orders exceeding $1,000 require
management authorization” creates an expectation with which all future purchase order
transactions would be compared. As for the latter option, if the analysis of past purchase orders
shows that 99% used vendors that were pre-approved, then it would be reasonable for the auditors
to expect that every future purchase order would use a pre-approved vendor, and those that do not
would warrant investigation.
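The two rule-derived expectations in this paragraph translate directly into exception filters. The sketch below uses a hypothetical transaction log with made-up field names; only the $1,000 authorization rule and the pre-approved-vendor expectation come from the text.

```python
# Exception screening against the two expectations described above.

APPROVED_VENDORS = {"V-100", "V-200"}  # hypothetical pre-approved list

def exceptions(purchase_orders):
    out = []
    for po in purchase_orders:
        # Rule-derived expectation: POs over $1,000 need authorization.
        if po["amount"] > 1000 and not po["authorized"]:
            out.append((po["id"], "missing management authorization"))
        # History-derived expectation: vendors are pre-approved.
        if po["vendor"] not in APPROVED_VENDORS:
            out.append((po["id"], "vendor not pre-approved"))
    return out

log = [
    {"id": 1, "amount": 850,  "authorized": False, "vendor": "V-100"},
    {"id": 2, "amount": 4200, "authorized": False, "vendor": "V-200"},
    {"id": 3, "amount": 1500, "authorized": True,  "vendor": "V-999"},
]
flagged = exceptions(log)
```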
Obtaining a complete set of such business rules can be difficult, and this manuscript focuses on other EAA expectation models obtained from more advanced techniques.
The most basic dichotomy of the EAA techniques distinguishes between structural and
quantitative methods. Structural techniques look for various structural properties in the historical
records. A recent example is process mining (Jans et al, 2013). It provides techniques for analyzing
enterprise system logs and identifying the most common paths of enterprise business workflow to
be used as expectation models. If the observed workflow of a particular process deviates
significantly from the expected path, it should warrant an investigation.
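A heavily simplified version of this idea can be sketched: learn the most common activity path from an event log and flag cases that deviate from it. Real process mining (e.g., as in Jans et al., 2013) works on control-flow models, not exact trace equality; the event log, case IDs, and activity names below are hypothetical.

```python
# Toy process-mining sketch: derive the modal trace and flag deviants.
from collections import Counter

def expected_path(event_log):
    """event_log: list of (case_id, activity) pairs in chronological order."""
    traces = {}
    for case_id, activity in event_log:
        traces.setdefault(case_id, []).append(activity)
    counts = Counter(tuple(t) for t in traces.values())
    most_common = counts.most_common(1)[0][0]
    deviants = [cid for cid, t in traces.items() if tuple(t) != most_common]
    return most_common, deviants

log = [
    ("c1", "order"), ("c1", "approve"), ("c1", "ship"), ("c1", "invoice"),
    ("c2", "order"), ("c2", "approve"), ("c2", "ship"), ("c2", "invoice"),
    ("c3", "order"), ("c3", "ship"), ("c3", "invoice"),  # approval skipped
]
path, deviants = expected_path(log)
```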
There is a great variety of EAA multivariate techniques, and no generally accepted agreement
on their taxonomy8. It could be useful to differentiate multivariate techniques by considering
whether a particular EAA technique explicitly assumes the presence of latent9 features. For
example, common classification and regression techniques do not work explicitly with any latent
features, while common clustering techniques do (with the latent feature being the cluster ID).
Often, the utilization of latent features techniques is necessitated by the lack of critical information
in the historical records. For example, while it is commonly assumed that managerial or financial
statement fraud is a routine occurrence in most enterprises, very few confirmed and documented
cases of such fraudulent transactions exist. For this reason, most audit engagement teams face the
challenge of creating expectation models for what is fraudulent versus normal, given that the
historical records do not identify past transactions in this way. In this situation, it would be ideal
for an auditor to examine transactions with a different perspective that is more exploratory in
nature.
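Clustering with the cluster ID as the latent feature, as mentioned above, can be illustrated with a deliberately tiny example. This is a sketch, not a recommended fraud model: a one-dimensional two-cluster k-means over hypothetical transaction amounts, with a deterministic initialization (the minimum and maximum as starting centroids), where the smaller cluster is simply treated as the candidate set for scrutiny.

```python
# Toy 1-D k-means (k=2); cluster membership is the latent feature.

def kmeans_1d(values, iters=10):
    centroids = [min(values), max(values)]  # deterministic initialization
    assign = []
    for _ in range(iters):
        assign = [0 if abs(v - centroids[0]) <= abs(v - centroids[1]) else 1
                  for v in values]
        for k in (0, 1):
            members = [v for v, a in zip(values, assign) if a == k]
            if members:
                centroids[k] = sum(members) / len(members)
    return assign, centroids

amounts = [102, 98, 110, 95, 105, 4900, 5050]  # two obvious regimes
labels, centers = kmeans_1d(amounts)
small_cluster = min((0, 1), key=lambda k: labels.count(k))
candidates = [v for v, a in zip(amounts, labels) if a == small_cluster]
```

In practice auditors would cluster on many features and follow up on small or distant clusters with substantive procedures rather than treating them as fraud per se.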
Another important technique dimension to consider is the scale of variables utilized in the
expectation models, with the categorical and continuous ones being the two most commonly used.
The two important measurement scales of categorical variables are nominal and ordinal, while the
two important measurement scales of continuous variables are interval and ratio.
It is often the case that a technique assumes that all the variables are measured on one type of
the scale, and adaptations are required for those measured on a different one. For example, multiple
linear regression models are developed for the case of continuous variables, while the categorical
scales of independent variables are accommodated by using dummy variables. Sophisticated
generalizations of multiple linear regression models such as ordinal regression models are utilized
to deal with the case of categorical dependent variables. On the other hand, decision trees are
8 The primary objective of multivariate techniques is to develop relationships between or among variables/features under study. In this view, the universe of multivariate techniques is wider than what is usually considered to be the domain of multivariate statistics, where joint distributional properties of more than one variable are studied. If only a single variable is viewed as the outcome or dependent variable, and its univariate distribution is studied given the values of some of the other variables, as is the case in multiple linear regression, then we view it as a multivariate technique even though it is traditionally not considered to be multivariate statistics.
9 Latent features are attributes or qualities that are not directly observed. For example, a concept such as trust is measured in terms of multiple indirect observations that have shown correlation with it, thereby deriving a value for this attribute which cannot be directly measured.
developed for nominal variables, while the continuous ones are accommodated by introducing
their comparisons with threshold values.
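The dummy-variable accommodation mentioned above is easy to show concretely. The sketch is illustrative: the categorical field (a hypothetical payment method) is expanded into 0/1 indicator columns, with one level dropped as the baseline so a regression design matrix stays full rank.

```python
# One-hot (dummy) encoding of a nominal variable for use in regression.

def one_hot(values, baseline):
    """Expand a categorical column into indicator columns, omitting baseline."""
    levels = sorted(set(values) - {baseline})
    return levels, [[1 if v == lvl else 0 for lvl in levels] for v in values]

methods = ["wire", "check", "card", "wire", "card"]  # hypothetical data
levels, dummies = one_hot(methods, baseline="check")
```

Each row of `dummies` can then be appended to the continuous predictors of a multiple linear regression; the baseline level is represented by all-zero indicators.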
An important subset of continuous EAA models consists of the time series models, where the
time variable is afforded special treatment. Note that univariate time series models are based on
two variables (including time). Also, commonly used time series models study relationships
between variable values at discrete moments in time. Those much more complicated models where
time is continuous belong to the realm of stochastic processes, and such models have not so far
found applications in audit analytics.
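A minimal discrete-time example of such a model: each period's expectation is a trailing mean of recent observations, and a deviation beyond a fixed percentage triggers scrutiny. The monthly series, window length, and threshold below are hypothetical; real audit time-series models would be considerably more sophisticated (e.g., seasonal or regression-based).

```python
# Trailing-mean expectation for a univariate discrete time series.

def trailing_mean_flags(series, window=4, threshold=0.20):
    """Flag periods deviating more than `threshold` from the trailing mean."""
    flags = []
    for i in range(window, len(series)):
        expected = sum(series[i - window:i]) / window
        deviation = abs(series[i] - expected) / expected
        if deviation > threshold:
            flags.append((i, series[i], round(expected, 1)))
    return flags

monthly_sales = [100, 104, 98, 102, 101, 99, 160, 103]  # hypothetical
flags = trailing_mean_flags(monthly_sales)
```

Here only the spike in period 6 (160 against an expectation of about 100) exceeds the 20% threshold and would be routed to the auditor for investigation.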
Combining knowledge of the EAA with the literature, a summary conceptual framework of
EAA for academic research in the external audit domain is proposed (Figure 11). By grounding
the EAA framework with analytics based on prevalent business and external audit practices, future
academic research maintains its relevance to the profession.
This framework identifies with shading those areas of APs (now considered as EAA) that have
been covered by extant literature yet require additional study, in addition to those areas of research
that exhibit gaps in the EAA domain. Audit Examination techniques form the foundation of each
step in the proposed EAA framework. Since Audit Examination techniques may be descriptive,
exploratory, and confirmatory (Liu 2014), they provide a level of domain and transaction
knowledge that are essential to the auditor. In EAA, it is expected that data preparation procedures
such as data verification, data cleaning, and data harmonizing contribute to “client knowledge” or
“client data expertise” and are similarly time-consuming and laborious to obtain.
The framework in Figure 11 illustrates with shading the general type of technique (Audit
Examinations, Unsupervised, Supervised, Regression, and Other Statistics) that potentially could
be deployed by auditors and the orientation of these techniques (Descriptive, Predictive, and
Prescriptive). This framework may serve as a foundation for additional detailed research by
practitioners, standard setters, and academia regarding the use of the various suggested techniques
for each audit phase. The framework provides guidance as follows:
Checked areas: the audit phases where published research using that technique and orientation has been found.
Shaded areas, checked or not checked: the phases where research appears to be scant or missing to date are shaded.
The shaded cells shown in Figure 11 are identified now as research-sparse EAA.
For example, clustering as an unsupervised descriptive method has been found to be missing in
the engagement phase literature and is suggested here for future analysis. Additionally,
visualization as an unsupervised method has been examined for many audit phases in some
research; however, this does not mean that there isn’t room for additional research contributions.
In general, the phases of Engagement, Opinion, and Continuous Activities are particularly sparse,
most likely because the standards do not require analytical procedures at these phases, and
therefore could benefit from additional research.
The proposed EAA framework is based on the assumptions that the auditor has few technical
constraints and has access to a significant amount of client and other external data. Figure 11
combines the discussion of the potential approaches for possible technique types in each audit
phase (see beginning of this section) with that of the literature findings.
In Figure 11 there are research gaps/sparse research in visualization, process mining, and all
prescriptive methods for every audit phase. For example, an unsupervised technique such as
Visualization, which is already predominant in BA (Holsapple et al, 2014), might be readily
accepted to supplement audit examination techniques in each phase. A different view of the
proposed EAA Framework is provided in Table 6, where areas suggested for future research are
checked:
Techniques of descriptive orientation are the closest to the audit examination process in that they are descriptive. Techniques that are of predictive orientation (unsupervised, supervised, regression, and other) would be next, followed last by prescriptive-oriented techniques (unsupervised, supervised, regression, and other).
As it stands now, auditors typically face significant challenges to obtain sufficient and reliable
client evidence, quantitative and qualitative. Looking forward, it is believed that these assumptions
regarding the EAA framework will not be unrealistic – many clients today process dozens of
terabytes of internal data, not to mention acquiring additional external sources of data, which is
more than 1000 times the data available just ten years previously (Banerjee et al, 2013). Over time,
clients may expect deeper insights from their external auditors, to maximize the potential benefits
of their investment in internal IT infrastructure and big data collection. Other client stakeholders
may also expect deeper levels of analysis from the external auditor in this big data technology
driven business environment.
By and large, most advanced analytical procedures are of value for predictive methods but not necessarily prescriptive ones. Descriptive methods complement these approaches. Traditional
descriptive methods can also be supplemented by other statistical methods. This huge potential
usage of predictive and prescriptive methods also raises the issue of the adequacy of the traditional
organization of the audit in an assurance process that is close to real time, mainly automated,
subject to deep human decision making, and complemented by analytic technology.
This preliminary understanding is then expanded with the concepts of business analytics
(Holsapple et al. 2014), applications which capture the potential information made possible with
big data. Considering audit analytics with the concepts of business analytics (BA) is appropriate
since auditors examine business processes and decisions. This combination of literature findings
and BA, now called the External Audit Analytics (EAA) framework, is organized around
descriptive, predictive, and prescriptive orientations of BA. Although predominantly literature
based, the EAA framework contains recommendations for the utilization of prescriptive
techniques.
This paper organizes and synthesizes the previously uncategorized extant literature, thereby
encouraging further research and exploration by academia, regulators, and practitioners. Due to
the very large number of publications discussing data analytics in the external audit, the process
of organizing and understanding this research is just beginning. The breadth and scope of
approaches in the literature are astonishing, given the somewhat limited and narrow applications of
analytics in assurance practice. The fact that 301 papers discuss analytics in the audit engagement
is significant. The expanse of extant research is apparent and challenges the assumption that the
profession has always been focused only on ratio analysis, sampling, and scanning. This literature review:
Summarizes and organizes the existing research about analytics and big data in the audit engagement;
Provides a framework/background – the EAA Framework – with which to understand this extant relevant research and to appropriately direct new research.
This paper details and organizes all the relevant research for any approach that occurs in a
phase of the audit engagement. Academics, practitioners, and regulators may readily identify
previous research for many techniques in the audit phases. For example, the PCAOB is re-
assessing the feasibility of a more quantitative reporting format for the Audit Opinion and CAMs
– and this paper provides an organized reference guide that directs attention to the papers that
discuss various analytics and reporting techniques for that phase of the engagement. The PCAOB
may see that 46 papers discuss analytics in the reporting phase, and these papers are identified.
Essentially, this paper may assist in directing the research process for the audit profession and
exposes the degree of thought and analysis that has already occurred, thereby offering a significant
contribution to the field.
However, when considering these papers in light of EAA, more research could be applied in
the engagement phases as follows:
1. Engagement: The auditors have access to the audited financial statements and other public
information as well as other external sources of data, not dissimilar to investment/financial
analysts. Expectation models could be developed at this time, derived from quantitative
and qualitative data. At this stage, researchers should focus on performing the following
techniques: ratio analysis of audited statements, text mining, visualization, regression, and
descriptive statistics.
2. Planning/Risk Assessment: Similar to the Engagement Phase, but the researchers can
assume that auditors would have access to the current unaudited financial statements and
could develop models of what could and should happen. Clustering, visualization,
regression, belief networks, expert systems, and descriptive statistics may be used in
addition to ratio and trend analysis.
3. Substantive Testing & Compliance Testing: Research in this phase could compare
sampling to testing 100% of the transactions, depending on the client environment.
Transactions could be tested against benchmarks and expectation models. Results that are
flagged or otherwise indicative of issues could be subject to further testing and evidence
collection. However, initially research for this phase most likely would include all audit
examination techniques, Audit by Exception (ABE) if appropriate, clustering, text mining,
process mining, visualization, SVM, ANN, expert systems, decision trees, probability
models, belief networks, regression, Benford’s Law, descriptive statistics, structural
since what should have happened will serve as the benchmark of what happened. All the
techniques outlined in Substantive Testing could be applied here, with more emphasis on
expert systems, probability models, belief networks, SVM, ANN, genetic algorithms,
It is anticipated that there may be a more nuanced measurement of risk than the current
unqualified/qualified opinion. Potentially the audit opinion could be a more informative,
graduated opinion derived from prescriptive analytics of reliable evidence. This phase
could feasibly benefit from the same approaches mentioned in earlier phases, with more
emphasis on time series regression, probability models, belief networks, expert systems,
and Monte Carlo simulation studies. The topic of the application of analytical techniques
to arrive at a more quantitative audit opinion, away from the current mainly dichotomous
outcome, is a very important area for future research.
6. Continuous Activities: The researcher and/or auditor may run continuous or interim tests
using many different models to generate predictive and prescriptive expectations of the
ongoing client’s activities and how they may impact the upcoming financial statements.
This phase would involve the use of many audit examination techniques as a foundation
for the use of regression, descriptive statistics, belief networks, probability models, expert
systems, decision trees, process mining, visualization, text mining, and clustering.
Prescriptive models would be continuously updated with new data, improving the models’
accuracy over time. Although not mentioned by Cushing and Loebbecke (1986),
Continuous Auditing (CA) (Vasarhelyi and Halper 1991) with its real-time feed of relevant
information could be considered as an interim continuous activity.
Although there exists a vast body of academic analysis providing an expanded view of APs,
this view of available research to date has many gaps. As can be seen, much additional analysis
needs to occur in every audit phase with most techniques, despite the broad expanse of extant
research in this domain. It is hoped that regarding these APs in a slightly different light will
encourage additional discussion. Academia has already conducted extensive research regarding
the use of expanded analytics in the external audit, yet even more is required. The application of
these papers towards an EAA framework maintains their relevance in the modern economy and in
the modern data-driven audit. The broad expanse of research regarding analytics in the engagement
is now exposed, in juxtaposition to the very narrow range of analytics used by the external audit
profession. What has been lacking to date is the execution in assurance practice of this rich research
– however, with the challenges that auditors face in this modern business environment of analytics
and big data, motivation for a shift in practice towards more complex analytics surely must be
strengthening.
American Institute of Certified Public Accountants. (1958). Glossary of statistical terms for accountants
and bibliography on the application of statistical methods to accounting, auditing and
management control. AICPA library, pp. 1-30.
Appelbaum, D., Kogan, A., and Vasarhelyi, M.A. (2017). Big Data and Analytics in the Modern Audit
Engagement: Research Needs. Accepted at Auditing: A Journal of Practice & Theory.
Appelbaum, D., Kogan, A., Vasarhelyi, M. A., and Yan, Z. (2017) Impact of Business Analytics and
Enterprise Systems on Managerial Accounting. Accepted at International Journal of Information
Systems.
Arkin, H. (1957). Statistical sampling in auditing. New York Certified Public Accountant (pre-1986),
27(000007), 454.
Arkin, H. (1958). A statistician looks at accounting. Journal of Accountancy (pre-1986), 105(000004),
66.
Arrington, C. E., Hillison, W., & Jensen, R. E. (1984). An application of analytical hierarchy process to
model expert judgments on analytical review procedures. Journal of Accounting Research, 298-
312.
Asare, S. K., & Wright, A. (1997). Hypothesis revision strategies in conducting analytical procedures.
Accounting, Organizations and Society, 22(8), 737-755.
Basu, A. T. A. N. U. (2013). Five pillars of prescriptive analytics success. Analytics Magazine, 8-12.
Bertsimas, D., & Kallus, N. (2014). From Predictive to Prescriptive Analytics. arXiv preprint
arXiv:1402.5481.
Bratten, B., Gaynor, L. M., McDaniel, L., Montague, N. R., & Sierra, G. E. (2013). The audit of fair values
and other estimates: The effects of underlying environmental, task, and auditor-specific factors.
Auditing: A Journal of Practice & Theory, 32(sp1), 7-44.
Carson, E., Fargher, N. L., Geiger, M. A., Lennox, C. S., Raghunandan, K., & Willekens, M. (2013).
Audit reporting for going-concern uncertainty: A research synthesis. Auditing: A Journal of
Practice & Theory, 32(sp1), 353-384.
Chambers, A. (2014). New guidance on internal audit–an analysis and appraisal of recent developments.
Managerial Auditing Journal, 29(2), pp.196-218.
Chang, A. M., Bailey Jr, A. D., & Whinston, A. B. (1993). Multi-auditor decision making on internal
control system reliability: A default reasoning approach. Auditing: A Journal of Theory &
Practice, 12(2), 1
Christensen, C. (2013). The Innovator's Dilemma: When New Technologies Cause Great Firms to Fail.
Harvard Business Review Press
Corless, J. C. (1972). Assessing prior distributions for applying Bayesian statistics in auditing. Accounting
Review, 556-566
Coyne, J.G., Summers, S., Williams, B., and Wood, D.A. (2010) Accounting Program Research Rankings
by Topical Area and Methodology. Issues in Accounting Education, Vol. 25, No. 4, pp. 631-654
Cushing, B. E., & Loebbecke, J. K. (1986). Comparison of audit methodologies of large accounting firms.
Studies in Accounting research #26, American Accounting Association
Daroca, F. P., & Holder, W. W. (1985). The use of analytical procedures in review and audit engagements.
Auditing: A Journal of Practice & Theory, 4(2), 80-92.
Davenport, T. H., & Harris, J. G. (2007). Competing on analytics: The new science of winning. Harvard
Business Press.
Deakin, E.B. (1976). Distributions of financial accounting ratios: some empirical evidence. The
Accounting Review, 51(1), pp.90-96.
Dilla, W., Janvrin, D. J., & Raschke, R. (2010). Interactive data visualization: New directions for
accounting information systems research. Journal of Information Systems, 24(2), 1-37.
Dusenbury, R. B., Reimers, J. L., & Wheeler, S. W. (2000). The audit risk model: An empirical test for
conditional dependencies among assessed component risks. Auditing: A Journal of Practice &
Theory, 19(2), 105-117
Elder, R. J., Akresh, A. D., Glover, S. M., Higgs, J. L., & Liljegren, J. (2013). Audit sampling research:
A synthesis and implications for future research. Auditing: A Journal of Practice & Theory,
32(sp1), 99-129.
Elliott, R. K., & Rogers, J. R. (1972). Relating statistical sampling to audit objectives. Journal of
Accountancy, 134.
Elliott, R.K. (1983). Unique Audit Methods: Peat Marwick International. Auditing: A Journal of Practice & Theory, 2(2).
Felix Jr, W. L., & Kinney Jr, W. R. (1982). Research in the auditor's opinion formulation process: State
of the art. Accounting Review, 245-271.
Felix, W. L. (1976). Evidence on alternative means of assessing prior probability distributions for audit
decision making. Accounting Review, 800-807
Freeman, S. (2015). Special report: Engaging lines of defense. Freeman, 31(4), 21.
Glover, S. M., Prawitt, D. F., & Wilks, T. J. (2005). Why do auditors over-rely on weak analytical procedures? The role of outcome and precision. Auditing: A Journal of Practice & Theory, 24(s-1), 197-220.
Han, J., Kamber, M., & Pei, J. (2012). Data Mining: Concepts and Techniques, 3rd ed. Waltham, MA: Morgan Kaufmann.
Hogan, C. E., Rezaee, Z., Riley Jr, R. A., & Velury, U. K. (2008). Financial statement fraud: Insights from the academic literature. Auditing: A Journal of Practice & Theory, 27(2), 231-252.
Holsapple, C., Lee-Post, A., & Pakath, R. (2014). A unified foundation for business analytics. Decision
Support Systems, 64, 130-141
Hoogduin, L. A., Hall, T. W., & Tsay, J. J. (2010). Modified sieve sampling: A method for single- and multi-stage probability-proportional-to-size sampling. Auditing: A Journal of Practice & Theory, 29(1), 125-148.
IBM. (2013). Descriptive, predictive, prescriptive: Transforming asset and facilities management with analytics. IBM Corporation.
Keele, S. (2007). Guidelines for performing systematic literature reviews in software engineering. EBSE Technical Report, Ver. 2.3. EBSE.
Keenoy, C. L. (1958). The impact of automation on the field of accounting. Accounting Review, 230-236
Krishnamoorthy, G., Mock, T. J., & Washington, M. T. (1999). A comparative evaluation of belief
revision models in auditing. Auditing: A Journal of Practice & Theory, 18(2), 105-127.
Li, H., Dai, J., Gershberg, T., & Vasarhelyi, M. A. (2016). Understanding Usage and Value of Audit Analytics in the Internal Audit: An Organizational Approach. Working paper, Continuous Auditing and Reporting Laboratory.
Liu, Q. (2014). The application of exploratory data analysis in auditing. PhD dissertation, Rutgers Business School, Continuous Audit and Reporting Lab, Newark, NJ.
Louwers, T. J., Ramsay, R. J., Sinason, D. H., Strawser, J. R., & Thibodeau, J. C. (2015). Auditing and assurance services. New York, NY: McGraw-Hill/Irwin.
Martin, R. D., Rich, J. S., & Wilks, T. J. (2006). Auditing fair value measurements: A synthesis of relevant
research. Accounting Horizons, 20(3), 287-303
Neter, J. (1949). An investigation of the usefulness of statistical sampling methods in auditing. Journal of Accountancy, 87(5), 390.
Public Company Accounting Oversight Board (PCAOB). (2016). Substantive Audit Procedures. Auditing
Standards (AS) 2305. Washington, D.C.: PCAOB
Public Company Accounting Oversight Board (PCAOB). (2016). Audit Sampling. Auditing Standards
(AS) 2315. Washington, D.C.: PCAOB
Public Company Accounting Oversight Board (PCAOB). (2016). Identifying and Assessing Risks of
Material Misstatement. Auditing Standards (AS) 2110. Washington, D.C.: PCAOB
Shields, P. M., & Rangarajan, N. (2013). A playbook for research methods: Integrating conceptual
frameworks and project management. New Forums Press.
Trompeter, G. M., Carpenter, T. D., Desai, N., Jones, K. L., & Riley Jr, R. A. (2013). A synthesis of fraud-
related research. Auditing: A Journal of Practice & Theory, 32(sp1), 287-321.
Trotman, K., Gramling, A., Johnstone, K., Kaplan, S., Mayhew, B., Reimers, J., Schwartz, R., Tan, H.T.,
Wright, B., Brazel, J., Earley, C., Krogstad, J., Cohen, J., and Jenkins, G. (2009). Thirty-Three
Years of Audit Research. AAA Audit Section Database:
www2.aaahq.org/audit/33YrsAuditResearchFinal.doc
Tucker III, J. J., & Lordi, F. C. (1997). Early Efforts of The US Public Accounting Profession to
Investigate the Use of Statistical Sampling. The Accounting Historians Journal, 93-116.
Vasarhelyi, M. A. (1982). Academic research in accounting and auditing. In Handbook of Accounting and Auditing, edited by John C. Burton, Russell E. Palmer, and Robert S. Kay, 4.
Vasarhelyi, M. A., & Halper, F. B. (1991). The continuous audit of online systems. Auditing: A Journal of Practice & Theory, 10(1), 110-125.
Weber, R. (1978). Auditor decision making on overall system reliability: accuracy, consensus, and the
usefulness of a simulation decision aid. Journal of Accounting Research, 368-388.
[Figure 1: engagement cycle diagram: 1. Pre-engagement; 2. Planning; 3. Compliance and Substantive Testing; 4. Evaluation and Review; 5. Report; 6. Continuous Activities]
Figure 1: Model of the engagement cycle, based on the Cushing and Loebbecke (1986) model and modified to reflect current standards and general practice, providing context for understanding the use of APs.
[Figure 2: bar chart of the number of publications about analytical procedures, in five-year increments from 1970 to 2015]
Figure 2: Number of analytical papers per five-year increment, from 1970 (aggregated for years prior to 1970) until 2015.
[Figure 3: bar chart of papers by research approach: Behavioral, Conceptual, Analytical, and Archival]
Figure 3: Display of the number and percentage of paper types/approaches that discuss analytics in the external audit.
[Figure 4: bar chart of papers per audit phase, including Client Engagement/Retainment (36), Review (167), Report (46), and Continuous Activities (0)]
Figure 4: Total number of papers discussing the application of analytics per audit phase.
[Figure 5: bar chart: Regression (251), Supervised (171), Other Statistics (77), Unsupervised (43)]
Figure 5: Number of papers using certain audit analytics techniques in the literature.
[Figure 6: bar chart, "Audit Examination Techniques in the Literature": Ratio Analysis/Transaction Tests (189), Sampling (164), Data Analytics (22), Data Modeling (15)]
Figure 6: Total number of papers that discuss the various audit examination techniques.
[Figure 7: bar chart of regression methods, including Time Series (54) and Log Regression (81)]
Figure 7: Total number of papers discussing regression methods.
[Figure 8: bar chart of supervised methods, including Process Optimization (10), Fuzzy ANN (2), Stacking (1), Majority Vote (1), and Multilayer Feed Forward Neural Network (MLFF) (1)]
Figure 8: Breakdown of supervised methods by technique and the total number of times each is discussed.
[Figure 9: bar chart: Clustering (8), Visualization (7), Simulation (4), Text Mining (3)]
Figure 9: The total number of papers discussing each unsupervised method.
[Figure 10: bar chart, "Other Statistical Methods in the Literature", including Descriptive Statistics (31), Benford's Law (9), Monte Carlo Study/Simulations (3), Complementary Hypothesis Evaluation (1), and Analytic Hierarchy Process (AHP) (1)]
Figure 10: The total number of times that other statistical methods are discussed.
[Figure 11: Conceptual External Audit Analytics (EAA) Framework diagram]
Figure 11: Conceptual External Audit Analytics (EAA) Framework. The check-marked areas indicate where research has already occurred; the shaded cells indicate suggested areas of great potential for continued or additional research, where research is sorely needed given the potential scope of EAA, based on the extant research on APs.
[Table 1, two columns: Typical AP engagement task | Source of information; the first rows are truncated in this extraction]
…account balances to other current-year balances for conformity with predictable patterns based on the client's experience | current period
Comparison of current-year account balances and financial relationships (ratios) with similar information for the client's industry | Industry statistics
Study of the relationships of current-year account balances with relevant nonfinancial information | Pertinent nonfinancial information
Table 1: Typical AP engagement tasks, adapted from Louwers et al. (2015), page 99.
Selection Step:
Step 1: Apply keywords and strings to all sources and follow up with source references, gathering results until additional papers cannot be extracted.
Step 2: Exclude any invalid papers.
Table 4: Research Focus of the papers that mention Analytical Procedures in the External Audit
[Table 5, reconstructed from a flattened extraction; counts marked (?) are not recoverable. Columns: Audit Examination | Unsupervised | Supervised | Regression | Other Statistics]
(row label lost): Process Mining (1), Time Series (2), Descriptive Statistics (11), Univariate and Multivariate (6)
Planning: Transaction Tests (20), Ratio Analysis (159), CAATs (19) | Clustering (6), Text Mining (6), Visualizations (7) | Process Optimization (4), Expert Systems/Decision Aids (33), BBN (22), Probability Model (9?) | Log Regression (65), Linear Regression (36), Time Series (33), ARIMA (?), Univariate and Multivariate (25) | Multi-criteria Decision Aid (15), Descriptive Statistics (27), Structural Models (7)
Substantive & Compliance Testing: Ratio Analysis (139), Sampling (26?) | Visualizations (8), Text Mining (?) | SVM (1), ANN (22?), Bagging/Boosting (4), BBN (29), Probability Models (17) | Linear Regression (50), Time Series (?) | Benford's Law (7), Descriptive Statistics (?), Monte Carlo Study (3)
Opinion: Ratio Analysis (35) | Visualizations (3) | Expert Systems/Decision Aids (7), Process Mining (1) | Log Regression (22), Linear Regression (11) | Multi-criteria Decision Aid (3), Descriptive Statistics (10)
Continuous Activities: (none)
Table 5: Summary listing/draft framework of the techniques occurring at least once in the various audit phases in the literature, where the number of papers containing that technique type per phase is indicated in parentheses.
[Table 6: rows are techniques, columns are audit phases (Engagement, Planning, Testing, Review, Opinion, Continuous activities); check marks are not reproduced here]
Descriptive: Clustering Models; Descriptive Statistics; Process Mining: Process Discovery Models; Ratio Analysis; Spearman Rank Correlation Measurement; Text Mining Models; Visualization
Predictive: Analytical Hierarchy Processes (AHP); Artificial Neural Networks (ANN); Auto Regressive Integrated Moving Average (ARIMA); Bagging and Boosting models; Bayesian Theory/Bayesian Belief Networks (BBN); Benford's Law; […] Models; Expert Systems/Decision Aids; Genetic Algorithms; Hypothesis Evaluations; Linear Regression; Log Regression; Monte Carlo Study/Simulation; Multi-criteria Decision Aid; Probability Theory Models; Process Mining: Process Optimizations; Structural Models; Support Vector Machines (SVM); Time Series Regression; Univariate and Multivariate Regression Analysis
Prescriptive: Artificial Neural Networks (ANN); Auto Regressive Integrated Moving Average (ARIMA); Expert Systems/Decision Aids; Genetic Algorithms; Linear Regression; Log Regression; Monte Carlo Study/Simulation; Time Series Regression; Univariate and Multivariate Regression Analysis
Table 6: Gaps and areas of scant research in the APs literature in the EAA context are checked; these are techniques that should be explored in more depth in future academic research (adapted from Appelbaum et al. 2016).