GO CKD WP DA DWB UserGuide 2022
Data Workbench
User Guide
June 2022
Contents
1. Introduction to Data Workbench ......................................................................................................... 8
Extracting the Right Data ............................................................................................................. 8
Transferring the Data ................................................................................................................... 9
Importing the Data ........................................................................................................................ 9
Key Considerations for Data Import ........................................................................................... 10
1.4.1. Pre-processing of data ........................................................................................................ 10
1.4.2. Use of multiple company codes .......................................................................................... 10
2. Walkthrough Data Workbench .......................................................................................................... 12
Introduction to Data Workbench................................................................................................. 12
Supported Analytics and Data Import Scenarios ....................................................................... 12
2.2.1. Supported Data Scenarios for KPMG Clara analytics Routines ......................................... 13
2.2.2. Supported Data Scenarios for Financial Services Routines ............................................... 14
Accessing Data Workbench ....................................................................................................... 15
Advanced Capabilities Portal ..................................................................................................... 17
2.4.1. Creating an Analysis ........................................................................................................... 17
2.4.2. Viewing the Analysis Status ................................................................................................ 19
2.4.3. Accessing the Analytical Results ........................................................................................ 19
2.4.4. Accessing Data Workbench for Data Processing ............................................................... 20
2.4.5. Selecting Results Widgets to Display ................................................................................. 20
2.4.6. Editing an Analysis .............................................................................................................. 20
2.4.7. Resetting, deleting or requeuing an Analysis ..................................................................... 21
My Analysis/Overview ................................................................................................................ 22
Analytics Selection ..................................................................................................................... 23
2.6.1. Select Analytics for Processing........................................................................................... 23
2.6.2. View the Status and Requirements for Selected Analytics ................................................. 25
2.6.3. Delete an Analytic ............................................................................................................... 25
Upload, Import and Transform Data .......................................................................................... 26
2.7.1. Upload and Import Client Data............................................................................................ 26
2.7.2. Transform the Data Using the Data Field Mapping Functionality ....................................... 32
2.7.3. View the Analysis Details .................................................................................................... 33
2.7.4. Activate SQL Insert to Perform Manual Transformation of Data ........................................ 34
2.7.5. Review the Data Import and Data Transform Validations .................................................. 35
Account Mapping ....................................................................................................................... 36
2.8.1. Loading the Financial Statement Structure ......................................................................... 38
2.8.2. Mapping General Ledger Accounts to the Financial Statement Structure ......................... 39
2.8.3. Changing and Un-mapping Accounts ................................................................................. 40
2.8.4. Saving the Account Mapping .............................................................................................. 43
2.8.5. Using the Export and Import functionality ........................................................................... 43
© 2022 Copyright owned by one or more of the KPMG International entities. KPMG International entities provide no services to clients. All rights reserved.
INTERNAL USE ONLY.
2
2.8.6. Reviewing the Account Mapping Validation Report ............................................................ 46
2.8.7. Publish account mapping .................................................................................................... 48
Parameters ................................................................................................................................. 50
2.9.1. General Ledger Analysis Parameters ................................................................................. 50
2.9.2. Financial Services Routines Parameters ............................................................................ 55
Validations ................................................................................................................................ 57
2.10.1. Upload Validation Routines............................................................................................... 57
2.10.2. Import Validation Routines ................................................................................................ 57
2.10.3. Transformation Validation Routines .................................................................................. 58
2.10.4. Analytic Reports and Reconciliation ................................................................................. 59
2.10.5. Confirmation ...................................................................................................................... 61
Process .................................................................................................................................... 61
Results ..................................................................................................................................... 63
Database Archiving & Restore ................................................................................................. 64
2.13.1. Archive an analysis ........................................................................................................... 64
2.13.2. Restore an analysis .......................................................................................................... 68
Activity Log ............................................................................................................................... 68
2.14.1. Analysis Creation .............................................................................................................. 68
2.14.2 Importing, Transforming and Processing data ................................................................... 69
Central Team Portal ................................................................................................................. 70
2.15.1 Project Overview ................................................................................................................ 70
2.15.2 Project User Management ................................................................................................. 71
Data Preparation Toolbox ........................................................................................................ 75
Appendix A. Summary of Data Import and Transformation Validations ............................................ 76
A.1. Summary of Data Upload and Import Validations ................................................................. 76
A.2. Summary of Data Transformation Validations ...................................................................... 81
A.3. Summary of FSR Data Transformation Validations .............................................................. 97
How to navigate this document
To navigate between the topics included in this user guide, use the bookmarks (example illustration
included below) to jump to relevant sections.
How to select the correct version of the Data Workbench User Guide
As of the 2022.1 release, there are two different deployment modes of KPMG Clara workflow – Data Workbench:
− PaaS mode, leveraging Azure services for the data processing back end
− Non-PaaS mode, which continues to use Microsoft SQL Server for data processing
As there are slight differences in functionality between the PaaS and Non-PaaS deployments of Data Workbench, there are two separate versions of the KPMG Clara workflow – Data Workbench User Guide, each specific to the data processing back-end technology used by Data Workbench.
To choose the right Data Workbench guidance, confirm with your local KPMG Clara deployment team, Central Team or IT department whether PaaS or Non-PaaS applies to the member firm. The following scenarios exist:
− Regional cloud environments (e.g. EMA and ASPAC Training/STG/Prod) are deployed with the PaaS mode.
− Satellite cloud environments may be deployed with PaaS or Non-PaaS based on the member firm deployment strategy.
− On-Premises environments are always deployed with Non-PaaS (Microsoft SQL back-end technologies).
Another way to identify the deployment mode is to review the analysis details directly in Data Workbench after creating a new analysis, following these steps:
1. Open a 2022.1 (or higher version) engagement from the “Engagements Dashboard”.
2. From My Engagement (the four-square icon in the top right), select Advanced Capabilities.
3. On the Advanced Capabilities Portal, create a new analysis or open an existing analysis.
4. In the right-hand icon list, select the disk pack icon, which opens the Analysis Details.
5. Compare the Analysis Details screen from your analysis to the screen prints below.
Data Workbench
Highlights
Data Workbench is the universal data management platform that prepares and processes the data in support of the features and activities teams perform within the KPMG Clara workflow – Enhanced / Core (from hereon: KPMG Clara). This version of the guidance specifically addresses instances of KPMG Clara which are implemented in a Satellite cloud environment or as local instances that are not deployed with PaaS (e.g. using SQL Server technologies). A separate version of the user guide addressing the Data Workbench as implemented on a regional or satellite cloud as a Platform as a Service (PaaS) is available.
This guidance covers the import, data field mapping, account mapping, setting of
parameters and processing of data for the KPMG Clara analytics general ledger
routines, data-enabled working papers and GL apps. All of these capabilities will be
referred to as “KPMG Clara analytics” throughout the remainder of this guide. This
guide also pertains to the processing of the financial services routines, which are
highlighted throughout the guide in individual sections.
Purpose
KPMG Clara data workbench is the one-stop shop for extraction, transformation, load, and data processing efforts for all capabilities requiring the use of data. It provides the ability to import and process data to support the KPMG Clara workflow Advanced Capabilities and is essential to support the extraction, transferring and importing of the data necessary to execute the KPMG Clara workflow Advanced Capabilities and Financial Services Routines.
KPMG Clara data workbench has an embedded “data management” process which
includes the following high-level activities:
− Utilizing processed analytics for the entry of balances in the KPMG Clara
workflow
− For certain KPMG Clara analytics (e.g. Planning Analytics, Account Analysis,
Data-Enabled Working Papers, GL Apps), a Financial Statement Structure must
have been created and Process Mapping must have been performed within the
KPMG Clara workflow – Financial Statements Module.
Why is this Important?
The use of various data-driven activities within the KPMG Clara workflow requires data to be processed using the KPMG Clara data workbench. This includes the use of the General Ledger analyses (i.e. Journal Entry Analysis, Planning Analytics and Account Analysis), the Financial Services routines, the data-enabled working papers and GL apps, and the entry of balances into the Financial Statements module.
Following the steps outlined in this document, and the diligent execution and review of validations within the workflow of the KPMG Clara data workbench application, will facilitate the complete and accurate processing of analytics.
For more information …
Watch the Data Workbench ”How to” videos for details relating to the Data Workbench functionality.
1. Introduction to Data Workbench
Before heading into the steps to process data in Data Workbench, it is important to understand how data flows from the entity (i.e. the client) to KPMG. The overall process starts with the engagement team identifying the relevant analytics. From that point, the engagement team and the client discuss the detailed data requirements and the availability of automated extraction scripts. This can be broken down into three stages:
Figure 1 – Flow of Data from the Perspective of the Engagement Team
− Engagement letter: Consider the terms and conditions of the relevant engagement letter. A country’s
local Risk Management function may provide additional clauses to include in the engagement
letter(s)/contract(s).
− Country guidelines: Work with the Central Team and local Risk Management to understand the
guidelines in place regarding secure data transfer of large files or to establish local guidelines
regarding secure data file transfer when local guidelines do not exist.
− Understand the environment: Work with the Central Team and the local IT Support team to
understand how and to where the data transfer will occur.
− Work with the client: Consider whether the entity has a preference regarding an acceptable method
of transferring data from their facility to the KPMG environment.
It is recommended that an encrypted medium be used when transferring such data. Refer to the Risk
Management Considerations document, local Risk Management and local ITS team guidelines to establish
a secure data transfer process. Examples of these can include the setup and availability of a Secure File
Transfer Protocol (SFTP) that allows for a secure online transfer of files, or the use of encrypted media,
such as encrypted external hard drives to provide a secure method of transferring and protecting both
KPMG’s and the entity’s information during transit.
In cases where large(r) data files need to be imported, other forms of uploading data files are available:
− Upload these directly to the file upload path of the engagement (as indicated in section 1.2 Transferring the Data).
− Upload Zip files containing supported data files (as indicated in section 2.7.1.2 Zip File Upload) in order to streamline the file upload process.
Follow the guidance in section 2.2 Supported Analytics and Data Import Scenarios to import the entity’s data files.
− Using the ERP ETL Guidance (SAP, others) and the SQL Insert process (see the separate SQL Insert – Data Workbench guide).
− Using IDEA.
The Data Workbench – Data Transformation guide provides procedures that can be used with Microsoft Excel®, Microsoft Power Query® or IDEA® v10 (IDEA®) to accomplish data preprocessing tasks prior to import into KPMG Clara. Note that these are just some of the tools that can be used. Best practice is for teams to use the tools they are most familiar with, whether one of the globally available solutions outlined above or other local or third-party resources that achieve the goal of transforming the data into a KPMG Clara compatible format.
Depending on the complexity of the data received and the transformations needed, it is recommended that SMEs (Central Teams, IRM professionals) be involved during this process to make the data extraction and data transformation processes as efficient and effective as possible.
Note: When documenting data pre-processing steps, engagement teams can use the KPMG Clara
workflow Computer Assisted Audit Techniques (CAATs) Document, which provides certain pre-populated
content specifically related to the use of Data Workbench and the Advanced Capabilities.
For data sets in which general ledger accounts share the same number but carry different names across company codes (i.e. translated account names within the same chart of accounts), it is important to ensure that the account number is unique for account mapping purposes. A possible workaround in this scenario is to concatenate the company code to the account number, making each account unique with its own description.
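The concatenation workaround can be sketched as a small pre-processing step. This is a minimal sketch with hypothetical field names, not the actual GLA file layout:

```python
def make_accounts_unique(rows):
    """Concatenate the company code onto the account number so that the
    same account number used by different company codes becomes unique.
    Field names are illustrative only."""
    for row in rows:
        row["account_number"] = f'{row["company_code"]}-{row["account_number"]}'
    return rows

accounts = [
    {"company_code": "1000", "account_number": "4000", "account_name": "Umsatzerloese"},
    {"company_code": "2000", "account_number": "4000", "account_name": "Revenue"},
]
make_accounts_unique(accounts)
# Account numbers are now unique across company codes ("1000-4000", "2000-4000"),
# each retaining its own description.
```

If this workaround is applied, the same concatenation would need to be performed consistently across all related data files so the transformed account numbers still match for account mapping purposes.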
When reviewing the results of imported data with multiple company codes, engagement teams may need to
consider the following:
− The Excel dashboards are generally set up such that a global slicer can be used to show results by
company. If results are needed on a company code basis, the company code attribute needs to be
included within the pivot table builder.
− Engagement teams should be aware of cases where document numbers are shared (i.e. recurring)
across company codes within their imported data. The native behavior of Microsoft Excel® is to provide
aggregated information, and as such, engagement teams should review the need to include the
company code attribute within the pivot tables to provide disaggregated results.
− The Transaction viewer within the Excel dashboards will provide information by document number,
regardless of company code.
− The Journal Entry Workpaper (within the Journal Entry Analysis dashboard) will provide results at a
document number level.
− Within the Journal Entry Analysis, the Pre-Defined reports are not initially set up by company code. As
such, engagement teams using multiple company codes may want to include this attribute within the
pivot table builder.
− When using multiple company codes, the bifurcation of journal entries for Account Analysis purposes is
performed at a company code and document number level.
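The aggregation pitfall for document numbers shared across company codes can be illustrated with a short sketch. Field names here are assumptions for illustration, not the actual JET file layout:

```python
from collections import defaultdict

# Two unrelated journal entries that happen to share a document number:
journal_lines = [
    {"company_code": "1000", "doc_number": "900001", "amount": 150.0},
    {"company_code": "2000", "doc_number": "900001", "amount": 75.0},
]

# Aggregating by document number alone merges the two entries:
by_doc = defaultdict(float)
for line in journal_lines:
    by_doc[line["doc_number"]] += line["amount"]
# by_doc["900001"] == 225.0 -- an aggregated, potentially misleading figure

# Including the company code attribute keeps the entries disaggregated:
by_cc_doc = defaultdict(float)
for line in journal_lines:
    by_cc_doc[(line["company_code"], line["doc_number"])] += line["amount"]
# by_cc_doc[("1000", "900001")] == 150.0 and by_cc_doc[("2000", "900001")] == 75.0
```

This is the same effect the pivot tables exhibit: adding the company code attribute to the grouping keys is what separates entries that coincidentally share a document number.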
Users are now able to allocate balances that have been processed for multiple company codes within a
single data set, through the use of the multi-opinion feature within the Financial Statements – Balances
module. Once the data set has been processed, simply go to the Balances module, select “Advanced
Capabilities” as the source of data for the balances, and subsequently allocate the identified company
codes (as per imported data) to each of the opinions.
Figure 2 – Mapping data set company codes to reporting opinions when importing account balances
2. Walkthrough Data Workbench
Introduction to Data Workbench
Data Workbench is the universal data management platform that prepares and processes the data in
support of the features and activities teams perform within the workflow. This includes the import, data field
mapping, account mapping, setting of parameters and processing of data for the general ledger routines,
as well as financial services routines.
In order to support the objective of bringing client data into the workflow, most commonly through the
processing of Advanced Capabilities, the following process flow is embedded within Data Workbench:
1. Data Workbench Portal (Section 2.4)
2. My Analysis/Overview (Section 2.5)
3. Analytic Selection (Section 2.6)
4. Upload, Import and Transform Data (Section 2.7)
5. Account Mapping (Section 2.8)
6. Parameters (Section 2.9)
7. Validations (Section 2.10)
8. Processing (Section 2.11)
9. Results (Section 2.12)
2.2.1. Supported Data Scenarios for KPMG Clara analytics Routines
With respect to the KPMG Clara analytics, the General Ledger Analysis group contains three analytics:
a. Journal Entry Analysis (JEA)
b. Planning Analysis (PA)
c. Account Analysis (AA)
The execution of these analytics is tied to the availability of four distinct data files, further outlined in the
Data Workbench - Data Requirements guide. These four files include the General Ledger Accounts (GLA)
file, the General Ledger Account Balances (GLAB) file, the Periodic General Ledger Trial Balance (GLTB)
file, and the Journal Entry Transactions (JET) file. Depending on the availability of these files, the data can
be offered to Data Workbench following three “scenarios”:
− “Compact” scenario (balances only) – Either only the GLA and GLAB files, or the GLTB file alone, with, at a minimum, all mandatory fields (i.e. account balance data only).
− “Full” scenario (balances and transactions) – The GLA, GLAB and JET files, or the GLTB and JET files, with, at a minimum, all mandatory fields.
− “JET only” scenario (transactions only) – Only the JET file with, at a minimum, all mandatory fields.
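The mapping from available files to scenario can be sketched as a simple helper. This is a simplified illustration only; Data Workbench performs its own validations, including checks on mandatory fields:

```python
def data_scenario(available_files):
    """Return the import scenario implied by the available file types.
    Simplified sketch: does not check that mandatory fields are present."""
    files = set(available_files)
    has_balances = {"GLA", "GLAB"} <= files or "GLTB" in files
    has_transactions = "JET" in files
    if has_balances and has_transactions:
        return "Full"
    if has_balances:
        return "Compact"
    if has_transactions:
        return "JET only"
    return "Unsupported"

data_scenario({"GLA", "GLAB"})   # -> "Compact"
data_scenario({"GLTB", "JET"})   # -> "Full"
data_scenario({"JET"})           # -> "JET only"
```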
The following tables specify the required files, first by Analytics and then by File Scenario (i.e. Full or
Compact):
Table 1 – Required files by analytic
Columns: Compact scenario (GLA, GLAB, GLTB) | Full scenario (GLA, GLAB, GLTB, JET) | JET Only scenario (JET)

GLA Analysis
− Planning Analysis: x x x x
− Account Analysis: x x
− Journal Entry Analysis: x x x
− Business Process Leadsheets: x x x x

General Ledger App
− KPI dashboard: x x x x
− Materiality aide: x x x x

Data Enabled Workpapers
− Trend Analysis and waterfall analysis: x x x x
− Payroll expenses substantive analytical: x x x x
− Gross margin percentage substantive analytical: x x x x
− Gross margin substantive analytical: x x x x

Table 2 – Required file types by Financial Services routine
Columns: Loan Portfolio Overview (LPO) | New Loans Analysis (NLA) | Payment Behavior Analysis (PBA) | Interest Income Analysis (IIA)

Loans Analysis
− Loans Sub Ledger (LSL): X X X X
− Loans Cash Flow Transactions (CFL): X X X
− Interest rate changes data (IRC): X
Financial Instruments Analysis (FIA)
The following table provides an overview of the files that are needed to process each of the respective
analytics.
Table 3 – Link between Financial Instrument Analysis and file type
In addition to accessing Data Workbench via the engagement page, users can navigate to the Data Workbench Portal (i.e. Central Team Portal) to see all of their data projects and analyses in one overview. To access Data Workbench via the Central Team Portal, users can either use the tile on the KPMG Clara workflow home page
Figure 5 – Data Workbench / Central Team Portal access point within KPMG Clara workflow
or access the Central Team Portal using the access point within any given Data Workbench analysis.
Figure 6 – Central Team Portal access point within a Data Workbench analysis
For more information about the Central Team Portal, please see section 2.15 Central Team Portal.
Advanced Capabilities Portal
The Advanced Capabilities Portal is a landing screen within Data Workbench where the user can access all
analyses associated with the selected KPMG Clara workflow engagement. Within the Advanced
Capabilities Portal, the user can perform the following activities:
Note: Beginning with KPMG Clara workflow 2021.3, the user can choose to analyze a single financial period within the fiscal year. When entering the Start and End periods, the user enters the same period number for both.
Figure 8 – Create Analysis pop-up window with available parameters
Below is an overview indicating supported and unsupported scenarios as per the current release:
− Company codes: Single and multiple company code data sets – Supported
− Period / Phase: Single fiscal year (Planning, Interim, Year-End, Stub) – Supported
After clicking the +Create Analysis button, the user is automatically returned to the Advanced Capabilities
Portal, and the created analysis will display on the screen. Information on each analysis created is
displayed on its own Analysis Card.
Figure 9 – An Analysis Card for a created analysis displayed on the Advanced Capabilities Portal
2.4.2. Viewing the Analysis Status
During data processing, the user can always return to the Advanced Capabilities Portal to see the progress
of each analysis by viewing the respective analysis subway map. The subway map indicates the current
processing stage for each analysis.
− Steps that are not yet available (the activity has yet to be completed) are shown as grey “stations”.
By clicking on the “subway stations”, the user can directly navigate to the corresponding activity:
Figure 10 – The “analysis subway map” displaying the data processing workflow
Note: After the creation of an analysis, certain parameters (such as Country, Fiscal Year, Period Start Date, Period End Date, Start Period and End Period) can no longer be edited. If these parameters need to be corrected, the user must create a new analysis in which the data has to be reprocessed.
After confirming that the analysis needs to be reset, the previous data container is removed, and a new one is created supporting current release features.
2.4.7.2. Deleting an Analysis
The user has the ability to remove an analysis permanently by clicking the Delete button on the Analysis Card. Upon pressing the Delete button, an alert opens, asking the user to confirm deletion of the analysis and its progress. Note that the same considerations apply as outlined for resetting an analysis, and that various objects are permanently removed when selecting this feature.
My Analysis/Overview
As outlined above, when clicking the Workbench button, the user is navigated to the My Analysis screen.
This screen displays the following information:
− The status (via the subway map) for the selected analysis: This shows the status of the individual
processing steps leading up to successful completion of the analysis. The user can also use the
subway map to navigate to any of the stages individually by clicking on them
− My Analytics: This screen lists out the analytics selected for this analysis, including their readiness for
processing and their respective status
› “Ready for Processing” means that data has been prepared, validated, confirmed and
analytics processing can be started.
› “Processing Successful” means the analytics processing has been completed and results
dashboards are available.
− The Notifications window: This window displays information relevant to the analysis, including the status of analytics processing (i.e. messages about successfully completed or failed processes, calling the user to action) and information about users who have been added to or removed from the engagement in which the analysis was created.
Figure 18 – My Analysis screen
Analytics Selection
The Analytics Selection screen is the start of the Data Workbench data processing journey. Within this
screen, the user has the ability to:
− View the Analytic Name, (Analytic) Group, (Analytic) Process and the required file types to execute the
selected analytics
− Delete one or more of the selected analytics from the current analysis
Figure 20 – Analytics Tab (before Analytics selected)
In addition to selecting analytics, the pop-up window provides information on the analytics, and several options for filtering, searching and displaying the available analytics, as shown below.
Filtering can be performed on the following fields in the Analytics library:
‒ Group (i.e. a set of analytics covering the same area, brought together into the same group)
‒ Process (i.e. the business process to which the analytic applies, e.g. financial accounting)
‒ File Type(s) (i.e. the file types required for processing the specific analytics)
Once dropdown selections have been made in one or more of the library fields (using the check boxes to
pick values on which to filter), click the Filter button to activate the filter. Filters can be removed by
selecting the “Select All” option in the affected fields and clicking the Filter button again.
A keyword search by group, process, and file type is also available. Note that the keyword search
takes the current filters into account; if no results are returned, ensure that the filters are set appropriately
before searching.
Click Insert to add the selected Analytics to your Analysis. Once analytics have been inserted, they are
added to the My Analytics list and the “Add Analytics” pop-up closes.
Upload, Import and Transform Data
In the Data screen, the user can perform the following actions:
− Transform the data to the KPMG Clara data requirements using the data field mapping functionality
This opens a Windows File Explorer, where the user can select one or more files to be uploaded. The
Data Workbench allows for the upload of various file formats, including text files (TXT), comma-delimited
files (CSV), and Excel (XLSX).
Note that this single or multiple file upload approach works best with files less than 2GB size. For larger
files, please refer to section 2.7.1.4 Large file upload.
option, Data Workbench will only allow the upload of files with “.zip” as extension (other archived
extensions are not supported).
The Zip file upload feature does not support a hierarchical structure such as nested zip files or a zip
file which includes a folder structure.
After selecting the file, a message banner displays at the top of the screen to indicate that a
backend request has been queued to upload the file. The archive file is then displayed on the My Data
screen with the “Upload in progress” status.
Once the upload has completed, Data Workbench starts to unzip the files, and the status changes to
“Upload Successful and Unzip in progress. Refresh to get the latest update”.
! Do not refresh the web browser or navigate to another place in the active web browser tab until the
zip file displays the status below.
After that, the user should refresh the screen until the standard File Upload/Import pop-up comes up,
where the user can set the parameters for the TXT, CSV, or Excel files that were within the archive
(see 2.7.1.5 File Import).
Notes:
− The upload of other archive file formats is currently not supported through Data Workbench.
− The zip-file upload function will not unzip “nested” zip files; if the zip file contains another zip file, the
inner one will not be automatically unzipped. It is therefore recommended to upload separate zip files
(i.e. two zip files) rather than one archive containing other zip files.
− In cases where the zip-file contains multiple folders, ensure that the file names are distinct. In cases
where multiple files with the same name are detected, the application will overwrite the file that already
exists.
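As a pre-flight check before uploading, the conditions above (nested zip files that will not be unzipped, and duplicate file names across folders that will overwrite each other) can be detected locally. The following is a minimal, illustrative sketch and not part of Data Workbench itself; the function name and warning texts are assumptions:

```python
import io
import zipfile

def check_zip_for_workbench(zip_bytes: bytes) -> list[str]:
    """Return warnings for conditions the guide calls out: nested zip
    files (not auto-unzipped) and duplicate file names across folders
    (later files overwrite earlier ones on upload)."""
    warnings = []
    seen = {}
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        for info in zf.infolist():
            if info.is_dir():
                continue
            name = info.filename
            if name.lower().endswith(".zip"):
                warnings.append(f"nested zip will not be unzipped: {name}")
            base = name.rsplit("/", 1)[-1]      # file name without folders
            if base in seen:
                warnings.append(f"duplicate name, will overwrite: {base}")
            seen[base] = name
    return warnings
```

Running such a check before upload avoids discovering these issues only after the backend unzip has completed.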
Figure 28 – Set up ERP Pipeline
After setting up the ERP Pipeline, organizational scope can be defined in the following screen:
Then, in the subsequent pop-up window, the user can select the data files to be uploaded.
By clicking the Browse button, the user is directed to a window to select the data from their computer. Data
files can also be uploaded via the direct transfer to the upload folder, for which the details are shown on the
popup window, or in the Analysis Detail window. Files can be loaded either as text files or a zip file
containing the text files. Only data extracted using the official KPMG Data Extraction tools for SAP and
Oracle is compatible with the functionality of the ERP pipeline. For data obtained from SAP and Oracle
systems in different formats, please use the SQL Insert functionality, and the accompanying ERP ETL
Guidance.
After uploading data using the direct transfer method, click the Refresh button to display the files on
screen, and then click the Import data button. Note that the Data screen must not be refreshed manually
during the file upload process; doing so can prevent the import process from completing automatically.
A Quick Reference Card detailing the functionality and operation of the ERP pipeline can be found here.
This icon launches the “Analysis Details” window, which lists the server names (including the analysis
database and cube servers) and network file shares being used in your analysis (these vary by
analysis).
Open the Analysis Details for your engagement by clicking the disk icon. Select the entire Upload Path,
copy it (Ctrl-C), and close the Analysis Details window. The Upload Path highlighted in the screen print
below is where you will copy the files to upload into Data Workbench.
Open a Windows File Explorer window and paste the Upload Path into the Address Bar of the File
Explorer. You may need to refresh the Windows File Explorer to ensure that it is pointing to the Upload
Path you copied in.
Figure 33 - Windows File Explorer with the Upload Path
From a separate File Explorer, drag or copy the files you wish to upload to the Upload Path File Explorer
window and refresh the Data Workbench browser screen.
When the upload of all your selected files has completed, refresh the web browser at the Data screen;
the files will be displayed, one line for each file, in the Select Data from Hard drive screen. In this
screen, you enter the import parameters discussed in section 2.7.1.5 File Import below.
The same process can also be used for large archive (zip) files. Once the zip file is uploaded to the file
upload location, Data Workbench will automatically unzip the archive. The user will have to wait until the
unzip process is complete (once it is, the zip file is deleted automatically and only the contents of the
zip file are shown in the file upload folder).
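When scripting uploads to the direct-transfer folder, the completion signal described above (zip file deleted, extracted contents visible) can be checked programmatically. This is a minimal sketch assuming local access to the upload path; the function name is an illustration, not a Data Workbench API:

```python
from pathlib import Path

def unzip_complete(upload_dir, zip_name: str) -> bool:
    """Heuristic from the guide: the unzip is finished once the zip
    file itself has been deleted and at least one extracted file is
    present in the upload folder."""
    folder = Path(upload_dir)
    zip_gone = not (folder / zip_name).exists()
    has_contents = any(p.is_file() for p in folder.iterdir())
    return zip_gone and has_contents
```

A script would poll this check (with a delay between attempts) before proceeding to the import step.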
− Column Delimiter (the text character(s) used to separate fields within the data file)
− Text Qualifier (the text character(s) that appear around text strings to differentiate them from other
types of data within the data file)
− Thousand Separator (the text character used to separate thousands places in data fields; the
character used varies in different regions of the world, and can be comma, period, or none)
− Decimal Separator (the text character used to separate decimals from whole numbers in data fields;
like the Thousand Separator, the character used varies by region, and can be comma, period, or
none)
− Select file type(s) (the data requirement to which the uploaded data file relates)
− Date Format (the order and formatting of the month, day, and year components of dates in the
data).
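To illustrate how these parameters interact, the following sketch parses a single data line using a combined column delimiter, a text qualifier, region-specific separators, and a date format. It is illustrative only; the function name and default format strings are assumptions, not Data Workbench internals:

```python
from datetime import datetime

def parse_row(line: str, column_delimiter: str = "#|#",
              text_qualifier: str = '"',
              thousand_sep: str = ",", decimal_sep: str = ".",
              date_format: str = "%d.%m.%Y"):
    """Split one data line on the column delimiter and normalize
    quoted text, numeric fields, and date fields where they match."""
    fields = []
    for raw in line.rstrip("\r\n").split(column_delimiter):
        value = raw.strip()
        # Qualified text stays text, with the qualifiers stripped.
        if text_qualifier and len(value) >= 2 and \
                value.startswith(text_qualifier) and value.endswith(text_qualifier):
            fields.append(value[1:-1])
            continue
        try:  # try the file's single date format first
            fields.append(datetime.strptime(value, date_format).date())
            continue
        except ValueError:
            pass
        # Drop thousand separators, normalize the decimal separator.
        numeric = value.replace(thousand_sep, "").replace(decimal_sep, ".")
        try:
            fields.append(float(numeric))
        except ValueError:
            fields.append(value)
    return fields
```

Note how the thousand and decimal separators must differ, mirroring the restriction described in the notes below the parameter list.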
Please note that before the “Import Data” button becomes available, the “Select All” box (or the boxes
for the individual files) must be checked in order to proceed with the upload.
Figure 34 - Upload/Import Configurations
In order to populate these values correctly, a “Preview” feature has been implemented which displays the
first 10 records of each data file. This feature can be accessed by clicking the Preview button underneath
each data file.
After having set all the parameters, the Import Data button will become available, and the user can import
the data into the analysis.
Notes:
− With respect to the column delimiter, it is recommended to use a column delimiter that does not
usually occur elsewhere within the client data file², so that the data import process goes smoothly. A
good example of a “strong” column delimiter is a “combined” delimiter of two or more uncommon
characters. The ‘#|#’ (hash-pipe-hash) column delimiter (which is used in other KPMG data analysis
solutions) is provided as a standard column delimiter within the interface. However, users can define
their own column delimiter by selecting Other from the drop-down menu and then entering the
delimiter used in the data.
− The Text Qualifier option is currently only available for CSV files.
− Data files cannot have the same thousand and decimal separators within the data. Upon choosing one
of the options for either thousand separator or decimal separator, this option will be greyed out on the
subsequent selection.
− Each data file can only be mapped against one file type (i.e. data requirement) at a time. In case
a data file contains more than the data for a single file type, the user either needs to split the data
files, or upload the data file a second time (with a different name) and then tag it to the other file type.
² Some special characters, such as an “@” sign, may appear within data fields (e.g. email addresses or price
quotes). If such a character is also used as a delimiter, Data Workbench will not be able to distinguish actual data
from delimiters, and will likely experience import issues. The use of a text qualifier in combination with a more
common delimiter can overcome these issues.
− Date formats have to be consistent within one data file. In cases where the date formats within one file
are not consistent, this will result in data import and transformation issues and failed validations. Date
formats can be different between files, as date formats are set per data file.
− The preview feature only shows the first 10 records of a data file. This is generally sufficient to identify
the common settings in the file. In case different data permutations are present in the file after the
10th row, these will not be visible in the preview, and might still result in failed data import and
transformation processes once the data is loaded.
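Because the preview stops at 10 records, a full-file scan can catch date-format permutations further down before the import is attempted. The following is a minimal, illustrative sketch (the function name and defaults are assumptions):

```python
from datetime import datetime

def find_date_format_breaks(lines, column_index: int,
                            column_delimiter: str = "#|#",
                            date_format: str = "%Y-%m-%d"):
    """Scan every row (not just the first 10 shown in the preview) and
    return the line numbers whose date column does not match the single
    date format chosen for this file."""
    bad = []
    for lineno, line in enumerate(lines, start=1):
        value = line.rstrip("\r\n").split(column_delimiter)[column_index]
        try:
            datetime.strptime(value, date_format)
        except ValueError:
            bad.append(lineno)
    return bad
```

Rows reported by such a scan would need to be corrected in pre-processing, since date formats must be consistent within one data file.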
Upon clicking Import Data, the user is returned to the Data screen. In the My Data window, the
selected data file names will be displayed. Once these names display, their status will automatically
proceed from “Import in Progress” to “Import Successful”.
Note: The Data Upload and Data Import stages have automated triggers to refresh the screen. It is
therefore critically important that the user does not refresh the screen manually while data files are
not yet listed as “Import in Progress”. Refreshing the screen before this status is reached for all files
means that the remaining progress of the files being uploaded is no longer picked up. In those cases,
the user will have to remove the data files, and then upload and import them again. Depending on the
file size and system activity, it might take some time before the status automatically moves to
“Import in Progress”.
2.7.2. Transform the Data Using the Data Field Mapping Functionality
After the successful import of data, the user needs to perform the data field mapping for each imported
data file. The purpose of data field mapping is to map client data fields to the mandatory and optional fields
needed to process the selected analytics. Data field mapping is a prerequisite for being able to continue
with the subsequent data processing stages.
In order to perform the data field mapping, the status of the data files needs to be “Import Successful”.
Then, the user should select each file individually, and perform the mapping of the data fields received in
the client data against the data requirements set by KPMG Clara for each of the selected file types (i.e., the
Common Data Model). This is done by selecting the “correct” value from the source field list (a drop-down
feature), for the field that corresponds with the desired target field value, as shown below.
During this process, the user needs to make sure all “Mandatory” fields are mapped prior to transforming
the data files. This can be reviewed in the pane on the right-hand side, where a “Priority” is indicated (an
aggregated conclusion), as well as an assessment of each data requirement (“Target Field”) against the
analytics selected. Fields for the individual analytics can take two statuses: either Mandatory or Optional.
With respect to the Priority field, this is the conclusion over all selected analytics for a given target field.
Priority can have one of three statuses:
− Mandatory in case the field is mandatory for all selected analytics,
− Optional in case the field is optional for all selected analytics, and
− Partial in case the field is mandatory for at least one of the selected analytics, but not all.
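The aggregation rule can be sketched as follows. The Mandatory case (the field is mandatory for all selected analytics) is inferred from the three-status description, and the function name is illustrative:

```python
def field_priority(statuses):
    """Aggregate a target field's status over all selected analytics.
    statuses: iterable of "Mandatory" / "Optional", one per analytic."""
    statuses = list(statuses)
    if all(s == "Mandatory" for s in statuses):
        return "Mandatory"
    if all(s == "Optional" for s in statuses):
        return "Optional"
    # Mandatory for at least one analytic, but not all of them.
    return "Partial"
```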
Figure 37 – Data transformation field Priority and data field requirements per analytic
In cases where the field names in the source data file have the same names as the target fields (from the
KPMG Clara data requirements, such as companyCode, documentNumber, etc.), a simple form of
auto-mapping (based on identical names) will already have taken place, leaving the user to map only
those fields whose names in the source file do not match one of the target field names. This auto-mapping
functionality simplifies the effort within the Data Workbench application and promotes the use of
pre-processing stages to prepare data.
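The name-based auto-mapping described above amounts to an exact-match lookup, which can be sketched as follows (illustrative only; Data Workbench's internal implementation is not documented here):

```python
def auto_map(source_columns, target_fields):
    """Pre-map source columns whose names exactly match a target field
    (e.g. companyCode, documentNumber); everything else is left for
    manual mapping by the user."""
    targets = set(target_fields)
    mapping = {col: col for col in source_columns if col in targets}
    unmapped = [col for col in source_columns if col not in targets]
    return mapping, unmapped
```

This is why renaming columns to the Common Data Model names during pre-processing reduces the manual mapping effort.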
Once the data fields for each of the imported files have been (individually) mapped, the user should then
select all the files using the checkboxes in the My Data screen and press the Transform button. This will
start the transformation process and begin moving the imported data into our Common Data Model.
− The analysis server and instance name: The name of the server and SQL instance on which the
relational analysis database is located
− The download path: The location where the generated Excel dashboards are stored for this analysis
− Cube DB Name: The name of the multi-dimensional data cube (for results generation)
− Cube DB Server: The name of the server and SSAS instance on which the multi-dimensional data
cube is located
− The upload path: The location where (large) data files can directly be copied to, and will be picked up
by the application during data import
− The archive path: The location where the database backup is stored for this analysis.
Each of these have relevance in different stages of the data processing, such as during the file import (for
larger files), the transformation of data (using SQL Insert), or the debugging of issues.
Figure 40 – SQL insertion selection in data transformation process
In order to make use of the SQL Insert option, the user performing the transformation must have been
given SQL access via a new role created in the Management Console, and must also have been given
access to the engagement (managed by the engagement team). In addition, the access granted is not
full database access; it is limited to the following:
− Read/Write access only to the database schema in which the data transformation is happening (DBO
and FIN)
− Read access to all remaining Schemas (i.e. GEN, ERP, PTP, OTC, etc.).
Figure 42 – Data transformation validation details pop-up
For detailed information about file import and transformation validations, refer to Appendix A. Summary of
Data Import Validations.
Account Mapping
The Account Mapping feature is accessed from the Accounts tab and enables engagement teams to map
General Ledger (GL) accounts imported from the entity’s data files to the KPMG Clara Knowledge
Accounts that have been authored within the Financial Statements module in KPMG Clara workflow.
The essential activities performed within the Accounts screen are as follows:
Figure 44 – Account mapping interface
1 Switching between the Map Accounts feature and the Mapping Validation Report
2 The General Ledger Accounts window (also referred to as “left hand side” in account
mapping)
3 The Financial Structure window (also referred to as “right hand side” in account mapping)
4 Toggles to show current year ending balances and prior year ending balances in the mapping
grid
5 Statistical information for the account mapping being performed, including the total number of
accounts loaded, the number of mapped accounts, and the number of unmapped accounts –
split between active accounts (i.e., accounts that need to be mapped) and inactive accounts
(i.e., accounts that have no activity and are optional for mapping)
6 Account mapping features to load a financial structure (i.e., obtain from the Financial
Statements module), clear the mapping of all mapped accounts, and import and export
account mapping reports (both from/to an Excel format)
7 Filter and search features within the account mapping windows to search for certain accounts
or groups
8 A legend to identify the state of each account (i.e., unmapped accounts, mapped accounts,
and inactive accounts)
9 The ability to save the account mapping
10 Refresh changes done to the financial statements structure
11 Run mapping suggestions (mapping bot) when the general ledger account group data column
has been mapped in the GLA or GLTB files. For more information on using mapping bot,
please refer to the KPMG Clara mapping bot user guidance.
With respect to the filter functionality, note the following:
− The filters on the General Ledger accounts window can be applied by account type (e.g. revenue,
trade receivables, etc.), by account number ranges or by using key words included in the account
name or account number and then clicking the Apply button. To remove all the applied filters and
display all GL accounts again, click the Reset button.
− The filters also allow the user to hide or show general ledger accounts with zero balances or
transactions in the period.
Figure 45 – Account mapping filters
− The list of general ledger accounts is grouped by Balance Sheet/Income Statement. Additionally, if the
column for account group is imported for the GLA file, these account group levels will be displayed as
well. The following information displays for each account:
− Account number
− Account name
− The list of Financial Statement structure nodes displays all the Financial Statement captions and
workflow accounts. Totals created in the Financial Statement module are not being brought over to the
account mapping, as totals per group are automatically calculated upon mapping accounts to these
nodes.
− Financial Statement captions or workflow accounts can be filtered by selecting the check boxes on
the left and then clicking the Save button. It is also possible to display the Library account mapped to
each workflow account in the Financial Statements structure by using the Show knowledge accounts
toggle button.
Lastly, it is highly recommended to use the Account Mapping Save button frequently to avoid losing any
changes during the mapping process. Additionally, it is a best practice to download the account mapping
report at the end of the mapping stage. This report can always be reused in cases where data needs to be
reprocessed (without needing to remap all the accounts again), or for roll-forward purposes (e.g. interim to
year-end roll-forward) on the same data set.
file. However, upon first entering the screen, there will be no Financial Statement structure displayed yet to
map the general ledger accounts to on the other side. To load the Financial Statement structure in the right
pane, click the Refresh button located above the Financial Statement grid:
Once the financial statement structure is retrieved, the date and time of the last retrieved structure
display on the screen:
Note: The financial statement structure in the account mapping screen is only retrieved when the user
actively does so, using the functionality described above. Any changes made in the meantime within the
Financial Statement module are not automatically brought over. To reload the financial statement
structure in account mapping, click the Refresh button again. Upon doing so, an alert message will
display asking the user to confirm the reload.
Note that upon reloading the financial statement structure, the account mapping of various nodes might be
lost due to changes in the underlying structures. This includes changes such as renaming Financial
Statement Captions and Workflow Accounts, removing Financial Statement Captions and Workflow
Accounts, and adding new ones, or the mapping/un-mapping of knowledge to the Workflow Accounts.
It is possible to select more than one account using the mouse in combination with the Control or Shift
keys on your keyboard.
2. Once the account is selected, use the mouse to drag and drop the account to the Financial Statement
structure node on the right-hand side of the screen.
3. Make sure the accounts are dragged and dropped to Workflow Account nodes on the right side, and
not to Financial Statement captions. Otherwise, the Unable to Map error message will display in the
upper left corner of the screen.
4. Once the account is mapped, it will be removed from the left pane and display on the right pane with
the Account mapped indicator:
5. Upon mapping, the Workflow Account and its parent Financial Statement captions nodes will be
updated with the aggregated ending balance of the accounts, based on accounts mapped to these
nodes.
Figure 53 – Account mapping feedback
It is possible to select more than one account by using the mouse in combination with the Control or
Shift keys.
2. Drag and drop the selected GL accounts into another workflow account. In case the accounts are
placed into a Financial Statement caption or a total, the Unable to Map error message displays in the
top left corner of the screen.
3. Once the account is moved, the balances will be updated in both the previous workflow account and
the new account, along with the corresponding parent Financial Statement captions nodes.
Similar to moving an account from one node to another, the user can make use of the Unmap feature
within the account mapping. To do so, the user should:
1. Verify the account to be un-mapped is highlighted in gray.
Figure 56 – Mapped account selection
It is possible to select more than one account by using the mouse in combination with the Control or
Shift keys.
2. Right-click the selected accounts and select the Unmap menu option.
3. The un-mapped GL account will be removed from the Financial Statement structure on the right pane
and will display again on the left-hand side (as part of the General Ledger accounts) with the
Unmapped indicator.
In order to perform a “bulk remove” activity, the account mapping screen also offers the ability to un-map all
accounts with one click by using the Clear Mapping button. This option removes all the general ledger
accounts mapped to Workflow Accounts and returns them to the left-hand side of the account mapping
screen with an indication of “Unmapped” accounts.
1. Click the Clear Mapping button on the upper right side of the screen.
Figure 59 – Clear Mapping functionality
3. All the accounts will be removed from the right pane and return to the left pane with the Unmapped
account indicator. A confirmation message displays in the upper left corner of the screen.
It is highly recommended that teams save the account mapping frequently, and in particular whenever
changes are made. This ensures the latest copy of the mapping is available and prevents situations
where data needs to be reprocessed because the latest changes were not reflected.
Note: In cases where the number of accounts exceeds 5,000 for import or 1,000 for export, the account
mapping will be handled through a back-end service. As a result, the user will have to wait for the import/
export Account Mapping task to be completed; the process can be monitored through the Activity Log.
− “Export”: the Account Mapping report will be exported into an Excel file.
If the Account Mapping Report was previously downloaded, when the Export function is selected again, the
pop-up will include the date and time of the previous download. Progress of the task can be tracked
through the Activity log feature.
The downloaded Excel file, which by default is named “Account Mapping” + (name of the Analysis),
displays the following information:
3 Version template of the account mapping report file.
4 Status of the account (Possible values: Mapped/Unmapped)
5 GL Account number
6 GL Account Name
7 GL Account type (Possible values: Income Statement (P) / Balance Sheet (B))
8 Labelled account: This is the workflow account to which the GL account was mapped.
Depending on whether the workflow account was linked to knowledge accounts, it displays
the value in one of the following ways:
− Workflow account linked to knowledge account:
(Knowledge account identifier) – Workflow account Name.
e.g. 83948 - Equity-accounted investees, 1044862 - Deferred tax assets
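When reusing a downloaded report (e.g. for roll-forward or reprocessing), the Mapped rows can be read back into a lookup of GL account number to labelled account. The sketch below assumes a hypothetical CSV rendering of the Excel report with column headers matching the fields listed above; the actual export is an Excel file, and the exact header texts are assumptions:

```python
import csv
import io

def read_mapping_report(report_csv: str):
    """Parse a (hypothetical CSV rendering of the) Account Mapping
    report and return {gl_account_number: labelled_account} for the
    rows whose Status is Mapped."""
    mapping = {}
    for row in csv.DictReader(io.StringIO(report_csv)):
        if row["Status"] == "Mapped":
            mapping[row["GL Account Number"]] = row["Labelled Account"]
    return mapping
```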
2. When the applicable file is selected, the “Browse File” pop-up is displayed again, providing status of
the import and instructions to refresh the web browser to display the accounts mapped from the
imported Account Mapping Report.
Figure 67 – Account mapping import file selection
After refreshing the web browser, the account statistics are updated, and the GL accounts are mapped
to the Workflow Accounts.
If the account mapping report file cannot be read, an error message displays in the upper corner of the
screen.
− Accounts with Unmatched Account Type: Cases where balance sheet accounts (as per source
data) are mapped to income statement accounts within the Financial Statement hierarchy and vice
versa. This validation is performed irrespective of whether the workflow accounts are linked to
knowledge.
− Accounts with Unexpected Balances: This validation is performed only when the workflow accounts
are linked to the Library. Based on the Library account association, the validation checks whether the
accounts mapped to each workflow account match with the expected balance of that workflow account
(debit or credit).
− Unmapped accounts with current year non-zero balances / non-zero transactions: This validation
displays any unmapped accounts that are relevant to the analysis, i.e. accounts that have balances at
the ending period or transactions during the period. Only accounts that have neither (i.e. no balance,
and no transactions) do not have to be mapped.
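Two of the three validations above (unmatched account type, and unmapped accounts with activity) can be expressed as simple checks over the mapped accounts; the unexpected-balances check additionally requires the Library's expected debit/credit data and is omitted here. The record shape below is an assumption for illustration:

```python
def validate_mapping(accounts):
    """Sketch of two account mapping validations. Each account is a
    dict with: number, type ("B"/"P"), mapped_type (type of the node
    it was mapped to, or None if unmapped), balance, transactions."""
    unmatched_type, unmapped_relevant = [], []
    for a in accounts:
        # Balance sheet account mapped to income statement node, or vice versa.
        if a["mapped_type"] is not None and a["mapped_type"] != a["type"]:
            unmatched_type.append(a["number"])
        # Unmapped account that still has a balance or transactions.
        if a["mapped_type"] is None and (a["balance"] != 0 or a["transactions"] != 0):
            unmapped_relevant.append(a["number"])
    return {"unmatched_account_type": unmatched_type,
            "unmapped_with_activity": unmapped_relevant}
```

Accounts with neither a balance nor transactions fall outside both checks, consistent with the guidance that such accounts do not have to be mapped.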
The steps to take to review the Account Mapping Validation report are:
1. To access the Account Mapping Validation report, click Mapping Validation Report.
2. To display the detail of the accounts falling in each validation, expand the section.
− GL Accounts
When selecting the Unmap action, the selected accounts return to the left pane and the confirmation
message displays in the top left corner of the screen.
When selecting the Ignore action, the selected accounts will remain in the Validation report and the
confirmation message displays in the top left corner of the screen.
In addition, an Excel file with the account mapping validation information can be downloaded by
clicking the Export button.
Figure 76 – Account mapping validation report navigation
2. Verify all the accounts have been mapped and there are no warnings requiring any action. Then click
the Publish button.
3. In the Review and Publish window, confirm the account mapping has been reviewed and is ready to
be published. Then click the Publish button.
Parameters
Figure 80 – Parameters
Within the Parameters screen, the user can enter processing-relevant parameters, most notably driven by
high-risk criteria within the analysis. The available parameters depend on the analytics selected and are
mostly optional in nature. Parameters can be split into two areas: KPMG Clara analytics parameters and
Financial Services Routine parameters.
− Defining the number of periods in the fiscal year (excluding the adjustment periods) for the
annualization of balances on the Planning Analytics dashboard
− Defining relevant weekend days for the weekend-posting analysis in the Journal Entry Analysis
dashboard
− Defining relevant holiday days for the holiday-posting analysis in the Journal Entry Analysis dashboard
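The weekend- and holiday-posting parameters above can be sketched as follows. This is a minimal illustration of how such parameters might be applied to posting dates; the parameter values and the Python representation are assumptions, not the product's internal logic:

```python
from datetime import date

# Hypothetical parameter values of the kind entered on the Parameters screen:
# which weekdays count as weekend, and which dates are holidays.
WEEKEND_DAYS = {5, 6}                                  # Saturday=5, Sunday=6
HOLIDAYS = {date(2022, 1, 1), date(2022, 12, 26)}      # illustrative dates

def posting_flags(posting_date: date) -> dict:
    """Flag a journal-entry posting date for the weekend/holiday analyses."""
    return {
        "weekend_posting": posting_date.weekday() in WEEKEND_DAYS,
        "holiday_posting": posting_date in HOLIDAYS,
    }

print(posting_flags(date(2022, 12, 26)))   # a Monday that is also a holiday
```

Any posting whose flags are True would then surface in the corresponding Journal Entry Analysis view.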
2.9.1.3. Account Analysis Scoping
The Account Analysis Scoping feature is used to select which accounts will be analyzed using Account
Analysis. Once accounts are selected, Account Analysis will only process those journal entries which
contain, in at least one of their line items, one or more of the selected accounts.
1 This parameter is only available when Account Analysis has been selected in the Analytics
screen.
2 Use the search functionality to assist in searching and filtering the Workflow Accounts.
3 Workflow Accounts can be selected or deselected using the checkboxes found next to the
Workflow Account name.
Refer to the KPMG Clara Account Analysis User Guide for more insights into this functionality.
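The scoping rule described above (a journal entry is processed when at least one of its line items posts to a selected account) can be sketched as follows. The data layout and entry names are illustrative assumptions, not the product's data model:

```python
# Hypothetical journal entries: each entry is a list of line items,
# and each line item carries a GL account number.
journal_entries = {
    "JE-001": [{"account": "1000"}, {"account": "4000"}],
    "JE-002": [{"account": "2000"}, {"account": "2100"}],
}

def in_scope(entry_lines, selected_accounts):
    """An entry is in scope when at least one line item posts to a selected account."""
    return any(line["account"] in selected_accounts for line in entry_lines)

selected = {"4000"}
scoped = {je for je, lines in journal_entries.items() if in_scope(lines, selected)}
print(scoped)  # JE-001 only: its second line item posts to account 4000
```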
This drop-down menu is populated with all industries selected within the engagement profile page in KPMG
Clara workflow. When attempting to select an industry not authored (see industries above), a notification
displays stating the industry cannot be used with Account Analysis:
When there are multiple industries that have been mapped to the engagement profile, and one of the
industries has been authored, the “Up-Front Expectations” section should be used to view the default
account pairing expectations for the accounts mapped in the “Account Mapping” screen and to review the
appropriateness of the industry assumptions that will be applied.
If there are no up-front expectation changes to be made from the standard industry template, then select
No for both questions. Engagement teams are still able to make changes to the account pairing
expectations in the Account Matrix, found within the Account Analysis portal.
If a roll-forward file is available, select Yes for “Do you want to roll forward expectations from a previous
engagement?” This activates the Import button and allows the upload of a roll-forward file (see Rolling
Forward the Expectation Matrix).
When rolling forward an “Expectation Matrix”, engagement teams review the prior phase “Expectation
Modification” report and the “Expectation Matrix” before the file is imported (e.g. during the process
walkthrough(s)), and (1) confirm that any changes made in the prior phase are still applicable and (2)
consider the need for any additional changes to be made to the “Expectation Matrix”.
When up-front expectation changes are to be made without using a roll-forward file, the engagement team
should select Yes to the question “Do you want to make up-front modifications to the current phase?”. This
activates the Export button and allows the download of the “Expectation Matrix” file.
The Expectation Matrix is used to change the default Knowledge applied prior to data processing.
Figure 87 – Expectation matrix
The Expectation Matrix allows engagement teams to make the same modifications that would be
available within the account analysis portal within the Account Matrix. Engagement teams can modify the
cells using drop-down menus as follows:
− Default Expected (green) cell can be modified to Unexpected (red) or back to Expected (green).
− Default Unexpected (red) cell can be modified to Expected (green) or back to Unexpected (red).
− Default Unique (grey) cell can be modified to Expected (green), Unexpected (red), or back to Unique
(grey).
− Default Same (black) cell can be modified to Expected (green), Unexpected (red), or back to Same
(black).
This matrix uses the knowledge account mapping performed in the Financial Statement Module. Therefore, engagement teams cannot modify the names within the matrix or add/remove columns/rows. Any change to the Expectation Matrix other than an account combination expectation will cause the import of the modified matrix to fail.
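The import rule above (only the cell values may change, never the account names or the matrix shape) can be sketched as a structural check. This is a hypothetical illustration of the rule, not the product's actual import logic:

```python
# Valid drop-down values for an expectation cell, per the list above.
ALLOWED_VALUES = {"Expected", "Unexpected", "Unique", "Same"}

def validate_matrix(default, modified):
    """Both matrices are dicts: {row_account: {col_account: value}}."""
    if modified.keys() != default.keys():
        return False                      # rows added, removed, or renamed
    for row, cols in modified.items():
        if cols.keys() != default[row].keys():
            return False                  # columns added, removed, or renamed
        if any(v not in ALLOWED_VALUES for v in cols.values()):
            return False                  # value outside the drop-down choices
    return True

default = {"Cash": {"Revenue": "Expected"}}
ok      = {"Cash": {"Revenue": "Unexpected"}}   # cell change only: accepted
broken  = {"Cash": {"Sales":   "Expected"}}     # renamed column: rejected
print(validate_matrix(default, ok), validate_matrix(default, broken))
```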
Once the Expectation Matrix has been modified, it can be saved locally to an engagement team
member’s desktop. Once saved locally, the Expectation Matrix can then be imported by selecting Import
to complete the upload.
A confirmation message displays at the top of the page stating “Alert! Expectation modification has been
successfully imported”, verifying the file has been imported.
Once the modified Expectation Matrix has been successfully imported, an engagement team can review
and document the up-front expectation changes made by clicking Review Expectations. (1)
Review Expectations is used to document procedures and rationale considered by the engagement team
in making modifications to the default knowledge. When modifications are made, the debit, credit, prior year
expectation, default expectation, final expectation and source columns are automatically populated. The
prior year expectation column displays the modified expectation from the roll-forward file (if applicable).
Otherwise, it displays Not Applicable.
The Procedures Performed and Rationale and WP Reference columns must be completed by the engagement team, whether modifications were made through a rolled-forward Expectation Matrix or a newly modified Expectation Matrix. The documentation requirements for these columns are the same as those for the Account Analysis Expectation Modification Report.
All pre-populated and additional documentation completed are carried forward to the Expectation
Modification Report within the Account Analysis Portal.
Once the relevant text boxes have been populated, select Save and click the Next arrow to continue to
Scoping.
Figure 89 – Create Analysis pop-up window with post-closing settings
1 Set the Defined Period End Analysis of JE Analysis in Scope option to Yes using the radio button, enter the number of adjustment periods, and click the refresh icon to add the additional periods to the grid.
2 These (uneditable) checkboxes show the additional periods added to the grid.
− Defining client loan types, and mapping the values from the client data against these customized categories
− Defining how the client's loan grades match the standardized risk ratings within the Financial Services routines
− Defining the lower and upper limits for the stratification buckets for loan balances, loan-to-value ratios, and payment delinquency days.
With respect to these parameters, there are two "mapping" efforts: the mapping of the client loan types and the mapping of risk ratings. For the mapping of the client loan types, the user can enter a description for (a group of) loan types (e.g. commercial loans) and subsequently map the values from the client data to this caption. The reports then display the self-authored description. In addition, for each created loan type, the user can define which analytics should consider this loan type by selecting the Yes/No option under each analytic.
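The loan-type mapping just described can be sketched as follows. The captions, client values, and analytic names are illustrative assumptions, not values from the product:

```python
# Hypothetical loan-type mapping of the kind configured on the Parameters
# screen: a self-authored caption, the client data values mapped to it, and
# Yes/No flags for which analytics should consider this loan type.
loan_type_map = [
    {"caption": "Commercial loans", "client_values": {"COM", "CML"},
     "analytics": {"Loan Portfolio Overview": True, "Payment Behavior": False}},
    {"caption": "Mortgages", "client_values": {"MTG"},
     "analytics": {"Loan Portfolio Overview": True, "Payment Behavior": True}},
]

def caption_for(client_value):
    """Return the self-authored caption that a client loan-type value maps to."""
    for entry in loan_type_map:
        if client_value in entry["client_values"]:
            return entry["caption"]
    return None   # unmapped client value

print(caption_for("CML"))
```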
Similarly, the mapping of client risk ratings works against a standardized risk rating scale used within the Financial Services Routines (e.g. 1-10), where the user maps the values received in the client data (e.g. A+, A++, etc.) against these standardized categories.
Lastly, there are three parameters for the setting of stratification limits for various routines:
− Setting lower and upper limits for loan balances (for the Loan Portfolio Overview charts)
− Setting lower and upper limits for loan-to-value ratios (for the Loan Portfolio Overview charts)
− Setting lower and upper limits for payment delinquency days (for the Payment Behavior Analysis
charts).
Each of the buckets can be customized to fit the client data. Users should ensure that all lower and upper limits are defined; otherwise, data processing may fail due to missing limits.
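The bucket assignment implied by these limits can be sketched as follows, including the failure mode when a limit is missing. The bucket values are illustrative assumptions:

```python
# Hypothetical stratification buckets for loan balances: every bucket needs
# both a lower and an upper limit, mirroring the guidance above that missing
# limits can make data processing fail.
buckets = [(0, 100_000), (100_000, 500_000), (500_000, 1_000_000)]

def bucket_index(amount, bucket_limits):
    """Return the index of the bucket whose [lower, upper) range holds amount."""
    for i, (lower, upper) in enumerate(bucket_limits):
        if lower is None or upper is None:
            raise ValueError(f"bucket {i} is missing a limit")
        if lower <= amount < upper:
            return i
    return None   # amount falls outside all defined buckets

print(bucket_index(250_000, buckets))  # falls in the second bucket (index 1)
```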
Validations
The Validations screen is used to summarize the different types of validation routines performed through data upload, import, and transformation. The Validations screen also enables manual validation of the completeness of the imported data by using the available reconciliation reports. Users can view the validations and reports either by analytic group or by analytic (Account Analysis, Planning Analytics, Journal Entry Analysis, etc.), as explained in the following subsections.
The Validations screen has the following functionalities:
Figure 94 – 0 KB Validation
The status of the import validation section includes the following features:
− Validation passed
− Validation failed
− Exporting the validation results into Microsoft Excel® can be done by clicking the Open in Excel button
in the Validation pop-up on the Data screen. This Excel report may be used to assist in obtaining
updated data files when file validation issues have occurred.
The status of the transformation validation section includes the following features:
− Validation passed
− Validation failed
− Exporting the validation results into Microsoft Excel® can be done by clicking the Open in Excel button
in the Validation pop-up on the Data screen. This Excel report may be used to assist in obtaining
updated data files when file validation issues have occurred.
For detailed information about the above validation routines, refer to Appendix A. Summary of Data Import
Validations. Note that the results in the validation pop-up display on two pages.
2.10.4. Analytic Reports and Reconciliation
The Reconciliation Reports enable the user to perform a manual validation of the completeness of imported
data by providing the ability to reconcile imported data to the source data files.
The screen contains three access points to download the following reports:
− Cumulative account balances at the end of each financial period, calculated by adding opening
account balances and total balance of transactions for each financial period (i.e. sum of debits and
credits) obtained from the Journal Entry Transactions file
− Closing balances, which display the cumulative balance calculated for the last financial period of the
year.
Opening account balances and closing account balances shown in all three reports discussed above
should agree with each other.
To export the data to Microsoft Excel®, click the Open in Excel button.
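The cumulative-balance calculation described above can be sketched as follows: the balance at the end of each financial period is the opening balance plus the running total of net transaction activity per period, and the closing balance is the last period's cumulative balance. The figures below are illustrative assumptions:

```python
def cumulative_balances(opening_balance, net_activity_by_period):
    """Period-end balances: opening balance plus running net activity
    (sum of debits and credits) per financial period."""
    balances, running = [], opening_balance
    for net in net_activity_by_period:
        running += net
        balances.append(running)
    return balances

# Hypothetical account: opening balance 1,000 and three periods of activity.
periods = cumulative_balances(1_000.0, [200.0, -50.0, 300.0])
closing = periods[-1]   # closing balance = last financial period's balance
print(periods, closing)
```

As the surrounding text notes, the opening and closing balances computed this way should agree with those shown in the other reconciliation reports.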
2.10.4.2. Trial Balance Report
The Trial Balance Report is the second option in the Validations screen. Clicking the Open in Excel button
exports the Trial Balance Report. While the Account Balance Report and the Periodic Balance Report
provide similar information in a different format from the Trial Balance Report, the Trial Balance Report
reflects the standard format that can be used to verify the completeness of the imported data files by
Company Code. Evaluation of the completeness of the journal entries can be performed in Data
Workbench through the use of the Trial Balance Report.
The Trial Balance Report contains the following information at the general ledger account level:
− Opening Balance column, which displays current period opening account balances obtained from the
Balances file.
− Total Debits column, which displays the total amount of debit postings recorded to each account
throughout the fiscal year per the Journal Entry Transactions file.
− Total Credits column, which displays the total amount of credit postings recorded to each account
throughout the fiscal year per the Journal Entry Transactions file.
− Calculated Closing Balance column, which displays the calculation based on the Opening Balance,
Total Debits and Total Credits columns in the Trial Balance Report.
− Imported Closing Balance column, which displays the current period closing account balances
obtained from the Balances file.
− Differences column, which displays the calculation based on the difference between the calculated
closing balances and the imported closing balances.
The engagement team should export the Trial Balance Report to Microsoft Excel® via the Open in Excel button and use it to (a) assist with the manual reconciliation of the imported data with the current year trial balance subject to audit and (b) document the performance of such data validation procedures; refer to data verification in the KPMG Clara Computer Assisted Audit Techniques (CAATs) document.
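The column arithmetic of the Trial Balance Report can be sketched as follows: the Calculated Closing Balance is the Opening Balance plus Total Debits minus Total Credits, and the Differences column compares it with the Imported Closing Balance. The account figures below are illustrative assumptions:

```python
def trial_balance_row(opening, total_debits, total_credits, imported_closing):
    """Return (calculated closing balance, difference vs. imported closing)."""
    calculated = opening + total_debits - total_credits
    return calculated, calculated - imported_closing

# Hypothetical per-account figures (not client data):
print(trial_balance_row(500.0, 300.0, 100.0, 700.0))    # balances agree
print(trial_balance_row(-200.0, 50.0, 150.0, -290.0))   # imported is off by 10
```

A non-zero value in the Differences column indicates the imported closing balance does not reconcile with the posted activity and should be investigated.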
− Total balance of transactions (i.e. sum of debits and credits) recorded at each financial period, which
are obtained from the Journal Entry Transactions file.
− Closing balances, which are calculated by adding opening balances and period ending balances for
the entire fiscal year.
This report can also be used for trend analysis and seasonal analysis, once the data has been validated.
To export the data to Microsoft Excel®, click the Open in Excel button.
2.10.5. Confirmation
When the Data Validations and Analytics reports have been reviewed by the engagement team, the Confirmation section is used to manually confirm the analytics are ready for processing by checking the "Confirmation" field checkbox for each analytic selected in the analysis. This confirmation is a mandatory step and only confirmed analytics can be processed.
After saving the Confirmation screen, click Next to proceed to the processing stage.
Process
The purpose of the Process screen is to start analytics processing once all analysis parameters have been completed and to provide information about the progress of the process. If the Process button is not active, as shown highlighted below, navigate to the Confirmation screen, review the analytics status, and confirm all the analysis steps have been completed (e.g. green tick marks in the subway map).
Figure 103 - the Process screen
− Processing Business Logic: Lists all the business logic details of the analytics that are being processed
− Preparing Results Sets: List the status of the results data model preparation (e.g. Power BI data
model)
− Generating Result Visualizations: Displays the status of the results dashboards configuration (e.g.
setting up the data source connection and refreshing the dashboards)
The Processing Business Logic table shows the Process Status for each analytic, together with start and end timestamps for processing. Each analytic has a chevron that expands to show the status of the individual procedures executed on the transformed data. In case of a processing failure, the user can identify the specific components that failed. A checkbox next to each business logic component lets the user rerun specific steps when imported data, account mapping, parameters (or any other part of the analysis) change, or when a technical issue occurs during the process.
The Preparing Results Sets table lists the underlying results data source used to run and display the results of the selected analytics. As with the Processing Business Logic table, this table displays the status and the start and end times of processing of the results data model. A checkbox is also available to re-run (reprocess) the package in case of changes to the analysis or technical issues.
Results
The Results tab displays the following information related to successfully processed analytics:
− Analytic Name
− Related Group
− Related Process
To access the analytics results, click on the View Results button to view the My Results tab that displays
the data widgets for the analytics selected in the analysis.
Figure 109 – Data widget
Users can view the results displayed on the data widgets and download the dashboards results:
1 Open in Excel button to download analytics dashboards.
2 Click on the hyperlink(s) to open analytics specific visualizations
Archiving of the analysis database is possible once the database is fully processed. This feature is available from the pencil icon in the Action section of the analysis, in the "Database Properties" tab of the pop-up.
Before archiving becomes available, a setting in the Management Console should be reviewed. Note that the archiving feature is enabled by default and can be disabled at a country level. Member firms that host KPMG Clara workflow in the regional cloud can log a support ticket to change this setting and turn off the archiving functionality.
Figure 111 – Archive feature default configuration in Management Console
Once all analytics are processed successfully (database and cube processing have completed and the database has been shrunk to release unused space), and no subsequent changes have been made to the analysis (e.g. parameters, data, account mapping publish, etc.), archiving is initiated.
This information is also available in the "Analysis Detail" window for each analysis:
− Manually triggered by the user; from the “Edit Analysis” window, the user can initiate the process by
clicking the “Archive” button. This option is only available when the current status is displayed as
“Shrunk Successfully” or “Archive Failed”, and the analytic processing status is “Processed
Successfully”. This step will then execute the archiving process immediately.
The database backup is stored in the location given in the "Archive Path" defined in the Analysis details (available from the database icon in the analysis).
Note: The archive folder is only accessible to users after the Archive feature has been triggered.
The archiving status is available in the Edit Analysis pop-up window for each analysis. The following statuses can occur:
− Provisioned Successfully: The Analysis DB has been provisioned (created) successfully, archive is
not available.
− Provisioning Failed: The Analysis DB provisioning (creation) has failed, archive is not available.
− Database Shrunk Successfully: After data processing, storage space that is no longer required is released by shrinking the database. The database is ready to be archived (if the Analytic Processing status displays "Processed Successfully on <DATE>").
− Database Shrink in progress: Database shrink has been started and is in progress, archive is not
available.
− Archiving in Progress: The archiving has started; no changes should be made to the analysis until the process completes.
− Archiving Successful: The archiving has taken place, meaning that the analysis is now in a “read
only” state.
− Archiving Failed: The archiving has failed; check the Management Console to identify the root cause.
Once the database is archived, space is reduced by removing from the analysis database all tables that are no longer needed in read-only mode. The following information remains available:
Table 6 – List of reports available in Read-only Mode
After archiving is successful, the analysis will be in “read only” mode, meaning that the following actions will
be deactivated:
Table 7 – List of disabled functions in Read-only Mode
Screen            Action
Analytics         Add analytics
                  Delete analytics
Data              Add data
                  Delete data
Account mapping   Map account
                  Unmap account
                  Publish account mapping
Parameters        Save parameter
                  Map/Unmap parameter
Validation        Confirm/Unconfirm analytics
                  Save validation
Screen            Action
Process           Process/Reprocess
Note: When deleting or resetting the analysis, the backup is also deleted from the archive folder.
An archived analysis database can be restored on the server. This feature is available from the pencil icon in the Action section of the analysis, in the "Database Properties" tab of the pop-up.
The restore feature is available only if the Current Status is "Archiving Successful" or "Restoring Failed". To restore the database, the user can click the "Restore" button available in the Analysis details pop-up window.
The restoring status is available in the Edit Analysis pop-up window for each analysis. The following statuses can occur:
− Restoring Successful: The database has been restored successfully and read-only mode has been deactivated (i.e. the database is restored, including all database objects, to the point before archiving, shrinking, etc.).
− Restoring Failed: Restoring the analysis database has failed; check the Management Console to identify the root cause.
Activity Log
Information contained within the Management Console is useful for determining why an analysis failed and for debugging and fixing the error. To display the information from the Management Console (user, task type, created on, last update, status, server, error info) within a specific analysis, an Activity Log screen can be accessed from two different steps:
Figure 118 – Access to Activity Log during Analysis Creation
The Activity Log can subsequently be exported into Microsoft Excel format using the button.
This file can be shared with individuals supporting your engagement (e.g. Central Teams, support teams) in case of issues, and contains additional information about the engagement, such as:
− Project Name
− Engagement ID
− Analysis Name
− Analysis Server Name and Instance
Central Team Portal
1 The project grid view can be exported to Excel (for reporting purposes)
2 The project view can be toggled between a tile view and a list view (shown here)
3 The user can filter the listed projects to which they have access by Analysis Name
4 The user can control which fields to display in the Project Overview grid and the order in which they are displayed, from a dropdown menu launched from the list icon.
5 The user can drag and drop one or more column headings to this location (the grouping bar)
to group the Analyses by those categories. Here the column heading “Fiscal year” is the
grouping key.
The Column Options dropdown list allows the user to reorder the available fields on the grid (vertical
ellipsis), remove fields they do not wish to see (trash can), or add fields that are not displayed (+ Add New).
2.15.2 Project User Management
The Project User Management tab allows the user to view which other users have access to the Data
Workbench project either added through Team management access or through Central team Portal, this
functionality is available not just for the currently selected project but across any projects to which the
current active user has access.
When the user first selects the Project User Management functionality, a search box is displayed. It allows
the user to search for Projects by Project Name or Client Name.
As soon as the user enters one or more characters contained within the Project, Client or Username, a list of projects to which the user has access and that meet the search conditions is displayed. As additional characters are entered, the list is refreshed to include only those projects that continue to meet the search criteria.
When a user selects one of the projects meeting the search criteria, its details are displayed, as shown below.
Figure 125 - Project User Management screen
1 Project Name – Hyperlinked to the Advanced Capabilities Portal for the project
2 Client Name
3 Country
4 A list of Analyses within that Project – each analysis listed has a hyperlink to My Analysis
page
5 A list of users assigned to the selected project
6 A button to add additional users to a project – note that users added from the Central Team
Portal only have access to features in the workbench module of the workflow, not the
workflow itself.
The “Users assigned to this project” listing displays information about each user with access to the project, including their first and last names, email address, alias (i.e. the user ID portion of their email address), initials, and an icon indicating their role in the project.
The user roles for projects displayed in the “Users assigned to the project” listing are defined as follows:
Table 8 - Roles defined in Central Team Portal
KCW               The user was added from Team management. This role has full
                  access to the Engagement and Data Workbench.
Workbench         The user was added in the Central Team Portal and has access
                  only to the workbench project, not to the Engagement (e.g.
                  Financial Statements, Workflow screens, etc.).
Reporting Groups  The user was added in the Central Team Portal and has access
                  only to specific analyses and company codes in the analysis.
                  This role is only available on Data Workbench with the
                  Sub-Ledger package deployed (e.g. KPMG Clara Analytics).
The “Search User” field allows you to search for a user by one of the following key words:
− First Name
− Last Name
− Email Address
− Alias
Figure 128 - Adding the selected user
After the user has been selected, a pop-up displays the remaining fields of the user's information, with a drop-down to select the desired role for the user being added. Select the Workbench User role and the "Add User" button becomes active.
Click "Add User" (which is hidden while the Role dropdown menu is open) and the user is assigned to the project.
Users listed in Project User Management in the Central Team Portal can also have several of their information fields in the "Add User" form edited by selecting the icon; First Name, Last Name, Initials, and Role can be changed. However, the Email address and Alias cannot be modified because these link the user back to their KPMG identity information. When modifications are complete, select the "Update" button that appears on the user information form.
Figure 131 - Updating user information
Users added via Project User Management can also be deleted by selecting the icon. Selecting that icon launches a confirmation message. Accepting the confirmation deletes the user from that project, but not from other projects to which they are assigned.
After clicking the Data Preparation Toolbox button, the user sees a message containing directions for finding the relevant content in the global Data Preparation Toolbox portal.
Appendix A. Summary of Data Import and Transformation
Validations
This appendix summarizes the different types of validation routines performed through the data import process, grouped as import validation routines and transform validation routines. It explains how each routine works, in what sequence the routines are executed, and the downstream impact of each validation on subsequent validations in the data import process.
Validation # 1 – 0 KB Validation
Illustration
How it works This file validation routine checks whether raw data files are provided that are 0 KB
in size, indicating that they are blank files, without data or header lines and so
cannot be staged to the analysis database.
Validation type Hard stop. Process cannot continue until all validation issues are addressed
− GL Account Balances
− GL Trial Balance
− Financial Instruments
Potential reasons for validation to fail and recommended action
If the raw data files are blank, the files will not pass the validation and data import cannot proceed. Contact appropriate entity personnel and ask for a new dataset to be provided with the missing data, or proceed with processing without this file if it is not required.
Other validations affected
− If this file validation fails, data import will not continue with the subsequent validations for the affected data file(s). Note: As there is an additional validation in the upload screen that checks when 0 KB files are uploaded (and prompts the user that this is not possible), this validation should generally not cause any failures.
− Data validation routines will not be triggered for the affected data file(s).
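The 0 KB check described above can be sketched as a simple file-size test. The temporary files below are hypothetical stand-ins for raw data files, not product file names:

```python
import os
import tempfile

def is_zero_kb(path: str) -> bool:
    """A raw data file fails the 0 KB validation when it contains no bytes."""
    return os.path.getsize(path) == 0

# Demonstration with hypothetical temporary files:
with tempfile.NamedTemporaryFile(delete=False) as empty:
    pass                                   # written with no content at all
with tempfile.NamedTemporaryFile(delete=False) as filled:
    filled.write(b"GL_Account;Amount\n")   # at least a header line

print(is_zero_kb(empty.name), is_zero_kb(filled.name))
```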
Validation # 2 – File Encoding Validation
Illustration
How it works This file validation routine checks whether raw data files are provided in one of the
following encoding types: UTF-16, UTF-8 or ANSI. If any of the raw data files do
not match one of the required encoding types the validation will fail.
Validation type Hard stop. Process cannot continue until all validation issues are addressed
− GL Account Balances
− GL Trial Balance
− Financial Instruments
Potential reasons for validation to fail and recommended action: If the raw data files are provided
in an unsupported encoding type, the files will not pass the validation and data import cannot
proceed. Contact appropriate entity personnel and ask for a new dataset to be provided in any of the
three acceptable encoding types. Alternatively, ask IT Support to assist with converting the provided
data files into one of the acceptable encoding type formats.
Other validations affected:
− If this file validation fails, data import will not continue with the subsequent validations for
the data file(s) affected, with the exception of the 0 KB validation, which will be run together
with the file encoding validation.
− Data validation routines will not be triggered for the affected data file(s).
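One way to approximate this kind of encoding check is to look for a UTF-16 byte-order mark and otherwise attempt a UTF-8 decode, treating anything else as single-byte ANSI text. This is a hedged sketch under our own assumptions (the function name sniff_encoding is ours, and the real routine may detect encodings differently):

```python
def sniff_encoding(raw: bytes) -> str:
    """Best-effort guess among the three accepted encoding families.

    UTF-16 files normally begin with a byte-order mark; bytes that decode
    as UTF-8 are treated as UTF-8; anything else is assumed to be ANSI.
    """
    if raw[:2] in (b"\xff\xfe", b"\xfe\xff"):  # UTF-16 LE/BE BOM
        return "UTF-16"
    try:
        raw.decode("utf-8")
        return "UTF-8"
    except UnicodeDecodeError:
        return "ANSI"
```

Note that ANSI detection by elimination is inherently heuristic; a production check would typically also verify that the decoded text is plausible for the expected file layout.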
© 2022 Copyright owned by one or more of the KPMG International entities. KPMG International entities provide no services to clients. All rights reserved.
INTERNAL USE ONLY.
77
Validation # 3 - Column Delimiter Check
Illustration
How it works This data validation routine checks the column delimiter setup in the upload data
popup with the ones available in the data files to import. The validation returns the
rows in the data file which have mismatched column delimiters.
Validation type Hard stop. Process cannot continue until all validation issues are addressed
− GL Trial Balance
− Financial Instruments
Validation Exception Details: This validation exception shows the row number which has fewer columns
than the header row.
Potential reasons for validation to fail and recommended action: When the validation fails, it is
probable that a row is missing a column delimiter. If only a small number of differences are noted,
verify the row numbers for which the validation exception is shown and check whether column
delimiters are missing. If a large number of validation exceptions are encountered, verify the number
of columns in the header row by verifying the correct number of column delimiters. The Exception
Details field will provide detailed information on why the validation has failed.
Other validations affected:
− If this file validation fails, data import will not continue with the subsequent validations for
the data file(s) affected, with the exception of the 0 KB validation, which will be run together
with the file encoding validation.
− Data validation routines will not be triggered for the affected data file(s).
Validation # 4 - Column Count Verification
Illustration
How it works This data validation routine compares the number of columns in each row with the
number of columns in the header. The validation returns the rows in the data file
which have missing delimiters.
Validation type Hard stop. Process cannot continue until all validation issues are addressed
− GL Trial Balance
− Financial Instruments
Validation Exception Details: This validation exception shows the row number which has fewer columns
than the header row.
Potential reasons for validation to fail and recommended action: When the validation fails, it is
probable that a row is missing a column delimiter. If only a small number of differences are noted,
verify the row numbers for which the validation exception is shown and check whether column
delimiters are missing. If a large number of validation exceptions are encountered, verify the number
of columns in the header row by verifying the correct number of column delimiters. The Exception
Details field will provide detailed information on why the validation has failed.
Other validations affected:
− If this file validation fails, data import will not continue with the subsequent file validations
for the data file(s) affected.
− Data validation routines will not be triggered for the affected data file(s).
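Both the column delimiter check and the column count verification reduce to the same comparison: each data row's column count against the header row's. A minimal Python sketch (illustrative only; rows_with_column_mismatch is our own name, and Data Workbench's actual parsing may handle quoting and escaping differently):

```python
import csv

def rows_with_column_mismatch(lines, delimiter=";"):
    """Return 1-based row numbers whose column count differs from
    the header row's column count (the header is row 1)."""
    reader = csv.reader(lines, delimiter=delimiter)
    expected = len(next(reader))  # column count of the header row
    return [i for i, row in enumerate(reader, start=2) if len(row) != expected]
```

Rows flagged by such a check are those reported in the Exception Details field as having missing (or surplus) delimiters.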
Validation # 5 – Upload vs Imported Record Count
Illustration
How it works This data validation routine compares the number of records in the file with the
number of records imported. In case of failure, the validation returns the total imported records
out of the total records in the file.
Validation type Warning. Not a hard stop to the process but it’s recommended to review all
exception details before continuing
− GL Trial Balance
− Financial Instruments
Validation Exception Details: This validation exception shows the total imported records versus the
total records in the file.
Potential reasons for validation to fail and recommended action: When the validation fails, it is
probable that a row delimiter and the last data attribute in the row are missing. If only a small
number of differences are noted, verify the row delimiter for which the validation exception is shown
and check whether row delimiters are missing. Depending on the size and complexity of the file, the
missing row delimiters can be added. It is recommended to involve a Central Team member when
performing transformation procedures on the data. The Exception Details field will provide detailed
information on why the validation has failed.
Other validations affected: If this validation fails, there may be an impact on subsequent data
validations depending on the data rows that may be missing as per the record count.
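The comparison behind this warning can be sketched as follows (a hedged illustration under the assumption that the file carries exactly one header line; record_count_check is our own name):

```python
def record_count_check(file_lines, imported_count, has_header=True):
    """Compare the record count present in the raw file with the
    count of records actually imported into the analysis database."""
    in_file = len(file_lines) - (1 if has_header else 0)
    return {"in_file": in_file,
            "imported": imported_count,
            "match": in_file == imported_count}
```

A mismatch does not block processing, but as the section above notes, the affected rows should be investigated before continuing.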
A.2. Summary of Data Transformation Validations
Transformation validation routines are automated routines (i.e. performed in Data Workbench) designed
to assist the engagement team in validating the completeness and accuracy of the imported entity
data. These automated validation routines do not replace the engagement team's required procedures
over data integrity. Furthermore, they ensure that KPMG Clara workflows can use the data for
data-driven analytics and workpaper creation.
Validation # 1 - Data Attribute Validation – Number format
Illustration
How it works This file validation routine checks whether data type in the uploaded files are in
the required numerical format as specified in Data Workbench – Data
Requirements.
Validation type Hard stop. Process cannot continue until all validation issues are addressed
− GL Trial Balance
− GL Account Balances
Validation Exception Details: Scenario: The data in the file has a period as a decimal separator,
except for 1 field in the endingBalanceLC column. This field has a comma as a decimal separator.
Potential reasons for validation to fail and recommended action: If data in any of the number fields
of the raw data files is not in the supported number format, the validation will not pass and the
affected files will not be imported. Use the export into Microsoft Excel® functionality to export the
exception details into file validation reports. The Exception Details field will provide detailed
information on why the validation has failed. Refer to the Data Workbench - Data Transformation guide
for guidance on how to transform existing data files into an acceptable format, or contact
appropriate entity personnel and ask for a new dataset to be provided with the required number
formats.
Other validations affected:
− If this file validation fails, data import will not continue with the subsequent file validations
for the data file(s) affected.
− Data validation routines will not be triggered for the affected data file(s).
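As an illustration of the kind of pattern such a number-format check can apply, the sketch below assumes a period as the decimal separator and an optional comma as the thousands separator. This is our own simplification, not the actual Data Workbench rule set:

```python
import re

# Accepts e.g. 100.00, -2,500.75, 3.141 (period decimal, optional comma thousands)
NUMBER_RE = re.compile(r"^-?\d{1,3}(,\d{3})*(\.\d+)?$|^-?\d+(\.\d+)?$")

def bad_number_fields(values):
    """Return (row, value) pairs whose value does not match the
    expected number format, using 1-based row numbers."""
    return [(i, v) for i, v in enumerate(values, start=1)
            if not NUMBER_RE.match(v)]
```

A value such as "2500,75" (comma as decimal separator) would be flagged, mirroring the endingBalanceLC scenario described above.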
Validation # 2 - Data Attribute Validation – Date format
How it works This file validation routine checks whether all date fields are in one of the
following supported date formats:
− YYYY/MM/DD
− YYYY.MM.DD
− YYYY-MM-DD
− MM/DD/YYYY
− MM.DD.YYYY
− MM-DD-YYYY
− DD/MM/YYYY
− DD.MM.YYYY
− DD-MM-YYYY
Validation type Hard stop. Process cannot continue until all validation issues are addressed
Validation Exception Details: Scenario: The data in the JET file has the date format "mm/dd/yyyy",
except for 1 field in the creationDate column. This field has "dd/mm/yyyy" as its date format.
Crucial for the validation to fail is the order of the date parts day (d), month (m) and year (y),
whereas the date part separator (i.e. "-", "." or "/") of the format selection has no impact on the
validation outcome.
Potential reasons for validation to fail and recommended action: If the date in any of the date
fields of the raw data files is not in an allowable date format, the validation will not pass, and
the affected files will not be imported. Use the export into Microsoft Excel® functionality to export
the exception details into file validation reports. The Exception Details field will provide detailed
information on why the validation has failed. Refer to the Data Workbench - Data Transformation guide
for guidance on how to transform existing data files into an acceptable format, or contact
appropriate entity personnel and ask for a new dataset to be provided with the required date formats.
Other validations affected:
− If this file validation fails, data import will not continue with the subsequent file validations
for the data file(s) affected.
− Data validation routines will not be triggered for the affected data file(s).
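The rule that only the order of the date parts matters, not the separator, can be sketched in Python as follows. This is an illustrative approximation (the function names are ours, and the real routine may resolve ambiguous dates differently):

```python
from datetime import datetime

# Separators ("/", "." or "-") are interchangeable; only the part order matters.
ORDERS = ("%Y-%m-%d", "%m-%d-%Y", "%d-%m-%Y")

def date_order(value):
    """Return the first supported part order the value parses under, or None."""
    norm = value.replace("/", "-").replace(".", "-")
    for fmt in ORDERS:
        try:
            datetime.strptime(norm, fmt)
            return fmt
        except ValueError:
            continue
    return None

def inconsistent_dates(values):
    """Flag values whose date part order differs from the first row's order."""
    expected = date_order(values[0])
    return [(i, v) for i, v in enumerate(values, start=1)
            if date_order(v) != expected]
```

Note that a value like "05/06/2019" is ambiguous between mm/dd and dd/mm orders; a real validation must resolve this against the format declared for the file.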
Validation # 3 – Blank Data
Illustration
How it works This data validation routine checks whether the raw data files contain data in the
required fields (columns). Optional fields can have an actual value, be blank, or not be
provided in the data file. The required fields, however, cannot be blank and must have
an actual value. This automated validation checks whether any data in the required
fields of the raw data files is missing. In instances where data is missing in required
fields, the validation will show a warning message.
Validation type Hard stop. Process cannot continue until all validation issues are addressed
− GL Account Balances
− GL Trial Balance
Validation Exception Details: Scenario: The column accountName in the GLA file has an empty field.
Scenario: The column accountNumber in the GLA file has an empty field.
Note: If the account number is missing in the GLA file, the following validations
may also show a failed status.
Scenario: The column accountNumber in the GLAB file has an empty field.
Note: If the account number is missing in the GLAB file, the following
validations may also show a failed status because the system cannot match a
<blank> account number with the GLA file.
Account Number Match in GLA and GLAB files
Scenario: The column endingBalanceLC in the GLAB file has an empty field.
This will trigger the number format validation because the field does not have a
thousand or decimal separator.
Scenario: The column financialPeriod in the GLAB file has an empty field.
Scenario: The column fiscalYear in the GLAB file has an empty field.
Scenario: The column accountNumber in the JET file has an empty field.
Note: If the account number is missing in the JET file, the following validations
may also show a failed status because the system cannot match a <blank>
account number with the GLA and GLAB file.
Scenario: The column amountLC in the JET file has an empty field. This will
trigger the number format validation because the field does not comply with the
selected number format (the field is empty).
Scenario: The column documentNumber in the JET file has an empty field.
Note: If the document number field is blank in this scenario, the validation exception detail
cannot indicate which document number is used; therefore, it references @DocumentNumber.
Scenario: The column financialPeriod in the JET file has an empty field.
Note: If the field financial period is blank in this scenario, the following
validations may also show a failed status.
Period Balance equal to zero
Scenario: The column fiscalYear in the JET file has an empty field.
Note: If the fiscal year field is blank in this scenario, the following validations
may also show a failed status.
Potential reasons for validation to fail and recommended action: If the raw data files do not have
data in all required fields (columns), the validation will not pass. In cases when the number of
validation failures is large, the export to Microsoft Excel® may take a longer time to generate.
Instead of exporting to Microsoft Excel®, the Exception Details field can be used, as it provides
detailed information on why the validation has failed. It is recommended to contact appropriate
entity personnel and ask for a new dataset to be provided that contains all necessary data in the
required fields.
Other validations affected: If this validation fails, the impact on subsequent data validations
depends on which field contains the missing data. For example, if (required) data in the GL account
number column is missing in the JE Transactions data file, the subsequent Account Number Match
between Trial Balance Files and Journal Entry Transaction Files will also fail, due to the
discrepancy between the respective GL account number values in the GL Accounts (or GLTB) and Journal
Entry Transactions data files.
Validation # 4 - Duplicate Record Validation
How it works This data validation routine checks the existence of duplicates in the raw data
files. This Validation will fail if data files do not meet the following requirements:
− For the GL Accounts data file, there should be a unique entry for the following
attributes: language code, company code, account name, chart of account ID
and account number.
− For the GL Account Balances data file, there should be a unique entry for the
following attributes: account number, document currency, local currency,
company code, fiscal year, financial period, and Debit/Credit indicator.
− For the JE Transactions data file, there should be a unique entry for the
following attributes: Debit/Credit indicator, company code, fiscal year,
document number and line item.
− For the GL Trial Balances data file, there should be a unique entry for the
following attributes: account number, fiscal year, company code
Note: If the data uploaded contains duplicate records, the duplication has to be
resolved by the user before being able to continue processing. It is not possible to
continue to the next stage of the data processing flow without having resolved this
validation failure.
Validation type Hard stop. Process cannot continue until all validation issues are addressed
− GL Account Balances
− GL Trial Balance
Note: If a balance is duplicated in the full scenario, the following validation may
also show a failed status.
Calculated Closing Balances versus Imported Closing Balance
Scenario: Duplicate in the JET file
Potential reasons for validation to fail and recommended action: If any of the raw data files have
duplicate entries, the validation will not pass, and the affected files will not be transformed. Use
the export into Microsoft Excel® functionality to export the exception details into file validation
reports, and contact appropriate entity personnel and ask for a new dataset to be provided with all
duplicate entries cleared. Depending on the size and complexity of the file, duplicates can be
removed. It is recommended to involve a Central Team member when performing transformation procedures
on the data. The Exception Details field will provide detailed information on why the validation has
failed.
Other validations affected: If this validation fails, subsequent data validation routines will not be
triggered.
Validation # 5 – Current Year Closing Balance Verification
Illustration
How it works This data validation routine checks the current year closing balance period number
in the balance file with the period end parameter provided for in the Dataset
creation. For example, a 2019 full year analysis (period 1-12) has a PY closing
period of 12-2018, an opening period of 0-2019 and a CY closing period of 12-2019.
Validation type Warning. Not a hard stop to the process but it’s recommended to review all
validation details before continuing
− GL Trial Balance
Validation Exception Details: Scenario: This validation exception shows that the current year closing
balance is not available in the balance file.
Potential reasons for validation to fail and recommended action: The current year closing balance is
not imported in the balance file. When the CY closing balance is not imported, a user can continue to
process, but the reports in the Validate & Confirm screen, the Account Mapping screen and the
Advanced Capabilities will not show CY closing balances. It is strongly recommended not to continue
without CY closing balances, as results downstream cannot be relied upon. Following this, contact the
appropriate entity personnel and ask for a new dataset with CY closing balances. The Exception
Details field will provide detailed information on why the validation has failed.
Other validations affected: If this validation fails, there is no effect on the execution of
subsequent data validation routines.
Validation # 6 - Account Number Match between Trial Balance files
Illustration
How it works This automated validation checks whether the general ledger accounts present in
the account balance file are included in the entity's chart of accounts (i.e. the
general ledger accounts data file). For the GL Trial Balance, this validation checks
the consistency between the chart of accounts and the balances created based on the
uploaded file.
Validation type Warning. Not a hard stop to the process but it’s recommended to review all
validation details before continuing
− GL Account Balances
− GL Trial Balance
Validation Exception Details: The validation exception informs the user which account number in the
GLAB file is not part of the chart of accounts (GLA file). The reference to the account name is a
known limitation and will show #NA#.
Potential reasons for validation to fail and recommended action:
− The imported chart of accounts does not contain all general ledger accounts which are currently
in use (i.e. the imported chart of accounts is not complete).
− The general ledger accounts in the General Ledger account balances file have been extracted
incorrectly (e.g. account numbers are incomplete).
Other validations affected: If this validation fails, there is no effect on the execution of
subsequent data validation routines.
Validation # 7 – Account Number Match between Trial balances and Journal Entry Transaction Files
Illustration
How it works This automated data validation checks whether the general ledger accounts
present in the journal entry transactions data file are included in the entity’s
chart of accounts (i.e. the general ledger accounts data file or general trial
balance file).
Validation type Warning. Not a hard stop to the process but it’s recommended to review all
validation details before continuing
Validation Exception Details: The validation exception informs the user which account number in the
JET file is not part of the chart of accounts (GLA file) or General Ledger Trial Balance (GLTB file).
The reference to the account name is a known limitation and will show #NA#.
Potential reasons for validation to fail and recommended action:
− The imported chart of accounts or the trial balance does not contain all general ledger accounts
which are currently in use (i.e. the imported chart of accounts or trial balance is not complete).
− The general ledger accounts in the journal entry transactions data file have been extracted
incorrectly (e.g. account numbers are incomplete).
− Use the export into Microsoft Excel® functionality to export the exception details into
validation reports, and contact appropriate entity personnel and ask for a new data file to be
provided with matching account numbers between the data files. The Exception Details field will
provide detailed information on why the validation has failed.
Other validations affected: If this validation fails, there is no effect on the execution of
subsequent data validation routines.
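Both account number match validations (# 6 and # 7) amount to a set difference between the accounts used in one file and the accounts defined in the chart of accounts. An illustrative sketch (unmatched_accounts is our own name):

```python
def unmatched_accounts(used_accounts, chart_of_accounts):
    """Return account numbers used in a data file (e.g. GLAB or JET)
    that are absent from the chart of accounts (GLA or GLTB)."""
    return sorted(set(used_accounts) - set(chart_of_accounts))
```

Any account numbers returned correspond to the exception rows the validation reports (with #NA# shown for the account name).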
Validation # 8 – Current year opening balance verification
Illustration
How it works This data validation routine checks the current year opening balance period
number in the balance file with the period start parameter provided for in the
Dataset creation. For example, a 2019 interim analysis (period 7-9) has a PY
closing balance of 12-2018, an opening balance of 6-2019 and a CY closing
balance of 9-2019.
Validation type Warning. Not a hard stop to the process but it’s recommended to review all
validation details before continuing
Validation Exception Details: Scenario: This validation exception shows that the current year opening
balance is not available in the account balance file.
Potential reasons for validation to fail and recommended action: The current year opening balance is
not imported in the account balance file. When the CY opening balance is not imported, a user can
continue to process, but the reports in the Validation screen and the Advanced Capabilities will not
show CY opening balances. It is strongly recommended not to continue without CY opening balances, as
results downstream cannot be relied upon. Following this, contact the appropriate entity personnel
and ask for a new dataset with CY opening balances. The Exception Details field will provide detailed
information on why the validation has failed.
Other validations affected: If this validation fails, there is no effect on the execution of
subsequent data validation routines.
Validation # 9 – Period Balance equal to zero Verification
Illustration
How it works This automated validation calculates the sum of account balances per each general
ledger account for each month (for a particular year) and determines whether all
balances sum to zero.
The same automated validation is performed independently on the current period
journal entry transactions data file. Summation is not performed at the journal entry
level but at the general ledger account level per period.
Validation type Warning. Not a hard stop to the process but it’s recommended to review all
validation details before continuing
Validation Exception Details: The validation exception shows information about the net balance of the
period that is out of balance.
Potential reasons for validation to fail and recommended action: The data was not extracted
accurately (e.g. the data has duplicated or missing records). Use the export into Microsoft Excel®
functionality to export the exception details into validation reports. Following this, contact the
appropriate entity personnel and ask for a new dataset with balanced general ledger accounts per
period(s). The Exception Details field will provide detailed information on why the validation has
failed.
Other validations affected: If this validation fails, there is no effect on the execution of
subsequent data validation routines.
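The balancing rule can be illustrated with a short Python sketch: journal amounts are aggregated per fiscal year and period, and any period whose net amount is not zero fails the check. This is our own simplification of the routine (field names follow the data requirements elsewhere in this guide):

```python
from collections import defaultdict

def unbalanced_periods(entries):
    """Sum signed journal amounts per (fiscalYear, financialPeriod);
    return the periods whose net amount is not zero."""
    totals = defaultdict(float)
    for e in entries:
        totals[(e["fiscalYear"], e["financialPeriod"])] += e["amountLC"]
    return {k: v for k, v in totals.items() if round(v, 2) != 0}
```

In double-entry data, every period should net to zero; a non-zero net typically points to duplicated or missing journal lines.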
Validation # 10 – Prior Year Closing Balance Verification Check
Illustration
How it works This data validation routine checks if the prior year closing balance period number in
the account balance file equals current year -1, period 12. For example, a 2019
interim analysis (period 7-9) has a PY closing balance of 12-2018, an opening
balance of 6-2019 and a CY closing balance of 9-2019.
Validation type Warning. Not a hard stop to the process but it’s recommended to review all
validation details before continuing
− GL Trial Balance
Validation Exception Details: This validation exception shows that the prior year closing balance is
not available in the GLAB/GLTB file.
Potential reasons for validation to fail and recommended action: The prior year closing balance is
not imported in the GLAB/GLTB file. When the PY closing balance is not imported, a user can continue
to process, but it is strongly recommended not to continue without PY closing balances, as results
downstream cannot be relied upon (e.g. the Planning Analytics Dashboard will not populate). Following
this, contact the appropriate entity personnel and ask for a new dataset with PY closing balances.
The Exception Details field will provide detailed information on why the validation has failed.
Other validations affected: If this validation fails, there is no effect on the execution of
subsequent data validation routines.
Validation # 11 – Calculated Closing Balances versus Imported Closing Balance Verification
Illustration
How it works This data validation routine reconciles the imported general ledger account closing
balances with the sum of the imported opening balances and the activity (total debits
and total credits) from the Journal Entry transactions. The reconciliation is done at
the level of company code, fiscal year, and GL account number.
Validation type Warning. Not a hard stop to the process but it’s recommended to review all
validation details before continuing
Validation Exception Details: This validation exception shows the balance calculated based on the JET
file and the balance imported using the GLAB/GLTB file.
Potential reasons for validation to fail and recommended action: If there are any differences between
the recalculated closing balance and the imported closing balance, the validation will fail. However,
the user can proceed and should use the Trial Balance Report available in Data Management to
investigate such differences. The Central Team or IT Local Support may use the export into Microsoft
Excel® functionality to export the exception details into file validation reports, and contact
appropriate entity personnel and ask for a new dataset to be provided. The Exception Details field
will provide detailed information on why the validation has failed.
Other validations affected: If this validation fails, there is no effect on the execution of
subsequent data validation routines.
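The reconciliation described above, opening balance plus journal activity compared with the imported closing balance per (company code, fiscal year, GL account number), can be sketched as follows. This is an illustrative sketch only; the tolerance parameter is our own assumption:

```python
def reconcile_closing(opening, activity, imported_closing, tolerance=0.01):
    """Compare opening balance + journal activity with the imported closing
    balance per (companyCode, fiscalYear, accountNumber) key; return the
    keys whose calculated and imported balances differ beyond the tolerance."""
    diffs = {}
    for key, imported in imported_closing.items():
        calculated = opening.get(key, 0.0) + activity.get(key, 0.0)
        if abs(calculated - imported) > tolerance:
            diffs[key] = (calculated, imported)
    return diffs
```

Keys returned by such a reconciliation are the accounts a user would investigate with the Trial Balance Report in Data Management.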
Validation # 12 - Verify that the data aligns with the phase start date and end date
Illustration
How it works The start date and end date of the dataset specified during the dataset creation in
the Dataset Overview screen defines the time span of the Data Workbench analysis.
Data files received from the entity should be related to this specified time span.
However, data files can contain additional period(s) of data as long as the effective
date (i.e. posting date) of those entries are within the specified start date and end
date (e.g. when adjusting entries are entered in period 13 while their effective date is
in period 12, the Journal Entry Transactions file can contain journal entries entered
in all 13 periods). If the effective date of the journal entry included in the raw data file
is not aligned with the specified dataset start and end date this automated validation
will fail.
Validation type Warning. Not a hard stop to the process but it’s recommended to review all
validation details before continuing
Validation Exception Details: The validation exception shows every journal entry which is outside of
the dataset boundaries.
Potential reasons for validation to fail and recommended action: The data extracted by the entity
contains journal entries with effective dates (i.e. posting dates) outside of the date range
specified in the dataset. Use the export into Microsoft Excel® functionality to export the exception
details into validation reports, and contact appropriate entity personnel and ask for a new dataset
with corrected information. Depending on the size and complexity of the file, the additional journals
can be removed. It is recommended to involve a Central Team member when performing transformation
procedures on the data. The Exception Details field will provide detailed information on why the
validation has failed.
Other validations affected: If this validation fails, there is no effect on the execution of
subsequent data validation routines.
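The boundary check itself is a simple date range filter on the effective (posting) date. An illustrative sketch (out_of_range_entries and the effectiveDate field name are our own; the entry period itself is deliberately not checked, since period 13 entries with an in-range effective date are allowed):

```python
from datetime import date

def out_of_range_entries(entries, start, end):
    """Return journal entries whose effective (posting) date falls
    outside the dataset's [start, end] date range."""
    return [e for e in entries if not (start <= e["effectiveDate"] <= end)]
```

Entries returned here correspond to the journal entries listed in the validation exception as being outside the dataset boundaries.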
A.3. Summary of FSR Data Transformation Validations
Transformation validation routines are automated routines (i.e. performed in Data Workbench) designed
to assist the engagement team in validating the completeness and accuracy of the imported entity
data. These automated validation routines do not replace the engagement team's required procedures
over data integrity. Furthermore, they ensure that Data Workbench can consume the data for the
Advanced Capabilities.
Validation # 1 - Data Attribute Validation – Number format
Illustration
Validation type Hard stop. Process cannot continue until all validation issues are addressed
− Loans Subledger
Validation Exception Details: Scenario: The data in the Loans Subledger file has a period as a
decimal separator, except for 12 fields in the interestRateEndOfPeriod, originalLoadAmount and
openingBalance columns. These fields have an apostrophe or an empty space as a decimal separator.
Potential reasons If data in any of the number fields of the raw data files are not in the
for validation to fail supported number format the validation will not pass and affected files will
and recommended not be imported. Use the export into Microsoft Excel® functionality to export the
action exception details into file validation reports. The Exception Details field will
provide detailed information on why the validation has failed. Refer to Data
Workbench - Data Transformation guide for guidance on how to transform
existing data files into an acceptable format or contact appropriate entity
personnel and ask for a new dataset to be provided with the required number
formats
Other validations − If this file validation fails, data import will not continue with the subsequent file
affected validations for data file(s) affected.
− Only Date format validation will be triggered but other data validation routines
© 2022 Copyright owned by one or more of the KPMG International entities. KPMG International entities provide no services to clients. All rights reserved.
INTERNAL USE ONLY.
97
Validation # 1 - Data Attribute Validation – Number format
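For illustration, the decimal-separator rule this validation enforces can be sketched in a few lines of Python. This is a minimal sketch, not the Data Workbench implementation; the row layout, the column names as keys (e.g. openingBalance), and the accepted pattern (digits with an optional sign and an optional period as decimal separator) are assumptions for illustration only.

```python
import re

# Assumed supported shape: optional sign, digits, optional ".decimals".
NUMBER_PATTERN = re.compile(r"^-?\d+(\.\d+)?$")

def find_number_format_exceptions(rows, numeric_columns):
    """Return (row_index, column, value) for every field that does not
    use the supported number format."""
    exceptions = []
    for i, row in enumerate(rows):
        for col in numeric_columns:
            value = str(row.get(col, "")).strip()
            if not NUMBER_PATTERN.match(value):
                exceptions.append((i, col, value))
    return exceptions

rows = [
    {"openingBalance": "1000.50", "interestRateEndOfPeriod": "3.25"},
    {"openingBalance": "1'000.50", "interestRateEndOfPeriod": "3 25"},
]
exceptions = find_number_format_exceptions(
    rows, ["openingBalance", "interestRateEndOfPeriod"])
# The second row fails both columns: apostrophe and space separators.
```

As in the exception report, each flagged entry identifies the record, the affected column and the offending value.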
Validation # 2 - Data Attribute Check – Date format

Illustration

How it works: This automated data validation routine checks whether all date fields are in one of the following supported date formats:
− YYYY/MM/DD
− YYYY.MM.DD
− YYYY-MM-DD
− MM/DD/YYYY
− MM.DD.YYYY
− MM-DD-YYYY
− DD/MM/YYYY
− DD.MM.YYYY
− DD-MM-YYYY

Validation type: Hard stop. Process cannot continue until all validation issues are addressed.

Validation Exception Details:
− Loans Subledger
Scenario: The data in the Loan Interest Rate Changes file has the date format “yyyy-mm-dd”, except for 1 field in the transactionDate column. This field has “dd-mm-yyyy”, “dd/mm/yyyy” or “yyyy’mm’dd” as a date format. Decisive for the validation outcome is the order of the date parts day (d), month (m) and year (y), whereas the date separator (i.e. “-”, “.” or “/”) has no impact on the validation outcome.

Potential reasons for validation to fail and recommended action: If the date in any of the date fields of the raw data files is not in an allowable date format, the validation will not pass, and the affected files will not be imported. Use the export into Microsoft Excel® functionality to export the exception details into file validation reports. The Exception Details field will provide detailed information on why the validation has failed. Refer to the Data Workbench - Data Transformation guide for guidance on how to transform existing data files into an acceptable format, or contact appropriate entity personnel and ask for a new dataset to be provided with the required date formats.

Other validations affected:
− If this file validation fails, data import will not continue with the subsequent file validations for the affected data file(s).
− Only the Number format validation will be triggered; other data validation routines will not be triggered for the affected data file(s).
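The nine supported formats above can be checked with a simple sketch: three date-part orders combined with the three separators, tried in turn. This is an illustration under assumptions, not the tool's implementation; note that some values (e.g. 01/02/2021) parse under more than one part order, which is why only the order, not the separator, drives the validation outcome.

```python
from datetime import datetime

# Year-month-day, month-day-year and day-month-year orders,
# each with "/", "." and "-" as the separator (nine formats total).
ORDERS = ["%Y{s}%m{s}%d", "%m{s}%d{s}%Y", "%d{s}%m{s}%Y"]
SEPARATORS = ["/", ".", "-"]
SUPPORTED_FORMATS = [o.format(s=s) for o in ORDERS for s in SEPARATORS]

def matches_supported_format(value):
    """True if the value parses under at least one supported date format."""
    for fmt in SUPPORTED_FORMATS:
        try:
            datetime.strptime(value, fmt)
            return True
        except ValueError:
            continue
    return False
```

A value such as "2021'03'31" fails every format, matching the exception scenario above where an apostrophe was used as the separator.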
Validation # 3 – Duplicate Loan ID

Illustration

How it works: This automated data validation routine checks for the existence of duplicates in the raw data files. This validation will fail if data files do not meet the following requirement:
− For the Loans Subledger file, there should be a unique entry for the following attribute: Loan ID.

Note: If the data uploaded contains duplicate records, the duplication has to be resolved by the user before processing can continue. It is not possible to continue to the next stage of the data processing flow without having resolved this validation failure.

Validation type: Hard stop. Process cannot continue until all validation issues are addressed.

Validation Exception Details: Scenario: The data contains duplicate Loan IDs in the Loans Subledger file.

Potential reasons for validation to fail and recommended action: If any of the raw data files have duplicate entries, the validation will not pass and the affected files will not be transformed. Use the export into Microsoft Excel® functionality to export the exception details into file validation reports and contact appropriate entity personnel and ask for a new dataset to be provided with all duplicate entries cleared. Depending on the size and complexity of the file, duplicates can be removed. It is recommended to involve a Central Team member when performing transformation procedures on the data. The Exception Details field will provide detailed information on why the validation has failed.

Other validations affected: If this validation fails, the impact on subsequent data validations depends on which field contains the duplicate Loan ID. For example, if there are duplicate Loan IDs in the Loan ID column of the Loans Subledger data file, the subsequent Overlapping Period Verification validation will also raise a warning, due to the overlapping periods per Loan ID based on the dateTo and dateFrom fields from the Interest Rate Changes file.
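The uniqueness requirement above amounts to counting key occurrences. The sketch below illustrates this under assumptions (a list of row dictionaries with a hypothetical "loanId" key); it is not the Data Workbench implementation.

```python
from collections import Counter

def find_duplicate_ids(rows, key="loanId"):
    """Return, sorted, the IDs that occur more than once, as would be
    listed in the exception report."""
    counts = Counter(row[key] for row in rows)
    return sorted(id_ for id_, n in counts.items() if n > 1)

loans = [{"loanId": "L-001"}, {"loanId": "L-002"}, {"loanId": "L-001"}]
duplicates = find_duplicate_ids(loans)
# "L-001" appears twice, so the validation would fail for this file.
```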
Validation # 3 – Duplicate Financial ID Validation

Illustration

How it works: This automated data validation routine checks for the existence of duplicates in the raw data files. This validation will fail if data files do not meet the following requirement:
− For the Financial Instruments Subledger Data file, there should be a unique entry for the following attribute: Internal Instrument ID.

Note: If the data uploaded contains duplicate records, the duplication has to be resolved by the user before processing can continue. It is not possible to continue to the next stage of the data processing flow without having resolved this validation failure.

Validation type: Hard stop. Process cannot continue until all validation issues are addressed.

Potential reasons for validation to fail and recommended action: If any of the raw data files have duplicate entries, the validation will not pass and the affected files will not be transformed. Use the export into Microsoft Excel® functionality to export the exception details into file validation reports and contact appropriate entity personnel and ask for a new dataset to be provided with all duplicate entries cleared. Depending on the size and complexity of the file, duplicates can be removed. It is recommended to involve a Central Team member when performing transformation procedures on the data. The Exception Details field will provide detailed information on why the validation has failed.

Other validations affected: If this validation fails, there is no effect on the execution of subsequent data validation routines.
Validation # 4 – Loan ID match in Cash Flow Transaction and Sub Ledger Files
Illustration
How it works: This automated data validation routine checks whether all the Loan IDs in the Loans Cashflow Transaction Data file match with the Loan IDs in the Loans Subledger file.

Validation type: Warning. Not a hard stop to the process, but it is recommended to review all exception details before continuing.

Validation Exception Details: Scenario: The validation exception informs the user which Loan IDs in the Loans Cashflow Transaction file are not part of the Loans Subledger file.

Potential reasons for validation to fail and recommended action:
− A Loan ID which is present in the Loans Cashflow Transaction Data file is not present in the Loans Subledger file because of a lack of data in the Loans Cashflow Transaction Data file.
− A Loan ID which is present in the Loans Cashflow Transaction Data file does not match a Loan ID in the Loans Subledger file because an inappropriate Loan ID was put on the record in the Loans Cashflow Transaction Data file.
− Use the export into Microsoft Excel® functionality to export the exception details into validation reports and contact appropriate entity personnel and ask for a new data file to be provided with accurate and complete Loan IDs. The Exception Details field will provide detailed information on why the validation has failed.

Other validations affected: If this validation fails, the impact on subsequent data validations depends on which field contains the mismatched Loan ID. For example, if there is at least one Loan ID found in the Cash Flow Transaction file that is not matched in the Loans Subledger file, then the subsequent Verification of the presence of transaction dates within the Analysis Period validation will also raise a warning, due to the mismatched Loan ID in the analysis period.
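This cross-file match is, in effect, a set difference between the Loan IDs of the two files. A minimal sketch under assumptions (row dictionaries with a hypothetical "loanId" key), not the tool's implementation:

```python
def unmatched_loan_ids(transaction_rows, subledger_rows, key="loanId"):
    """Return, sorted, the Loan IDs that appear in the Cashflow
    Transaction file but not in the Loans Subledger file."""
    subledger_ids = {row[key] for row in subledger_rows}
    return sorted({row[key] for row in transaction_rows} - subledger_ids)

transactions = [{"loanId": "L-001"}, {"loanId": "L-003"}]
subledger = [{"loanId": "L-001"}, {"loanId": "L-002"}]
missing = unmatched_loan_ids(transactions, subledger)
# "L-003" has no Subledger counterpart, so it would appear in the
# exception details; the validation is a warning, not a hard stop.
```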
Validation # 5 – Loan ID check with respect to the Interest Rate Changes File
Illustration
How it works: This automated data validation routine checks whether all the Loan IDs that are present in the Loans Interest Rate Changes file match with the Loan IDs in the Loans Subledger file.

Validation type: Warning. Not a hard stop to the process, but it is recommended to review all exception details before continuing.

Validation Exception Details: This validation exception shows which Loan IDs in the Loans Interest Rate Changes file are not available in the Loans Subledger file.

Potential reasons for validation to fail and recommended action:
− A Loan ID which is present in the Loans Interest Rate Changes file is not present in the Loans Subledger file because of a lack of data in the Loans Interest Rate Changes file.
− A Loan ID which is present in the Loans Interest Rate Changes file does not match a Loan ID in the Loans Subledger file because an inappropriate Loan ID was put on the record in the Loans Interest Rate Changes file.
− Use the export into Microsoft Excel® functionality to export the exception details into validation reports and contact appropriate entity personnel and ask for a new data file to be provided with accurate and complete Loan IDs. The Exception Details field will provide detailed information on why the validation has failed.

Other validations affected: If this validation fails, the impact on subsequent data validations depends on which field contains the mismatched Loan ID. For example, if there is at least one Loan ID found in the Loans Interest Rate Changes file that is not matched in the Loans Subledger file, then the subsequent validations Verification of a Valid Interest Rate, Overlapping Period Verification and Skipped Period Verification will also raise warnings, due to the mismatched Loan ID in the analysis period.
Validation # 6 – New Loan Opening Balance Verification
Illustration
How it works: This automated data validation routine checks whether all loans whose origination date (‘originationDate’ from the Loans Subledger file) falls within the analysis period (the period start and end dates provided in the parameters) have either no opening balance or an opening balance equal to ‘0’.

Validation type: Warning. Not a hard stop to the process, but it is recommended to review all exception details before continuing.

Validation Exception Details: Scenario: This validation exception shows the Loan IDs whose origination date falls within the analysis period but which have an opening balance unequal to ‘0’ in the Loans Subledger file.

Potential reasons for validation to fail and recommended action:
− The identified new Loan IDs have a value in opening balance which is not equal to ‘0’ at the beginning of the analysis period.
− Use the export into Microsoft Excel® functionality to export the exception details into validation reports and contact appropriate entity personnel and ask for a new data file to be provided with the correct opening balance, which has no value or is equal to ‘0’ for every new loan originated within the analysis period. The Exception Details field will provide detailed information on why the validation has failed.

Other validations affected: If this validation fails, there is no effect on the execution of subsequent data validation routines.
Validation # 7 – Verification of a Valid Interest Rate
Illustration
How it works: This automated data validation routine checks that the fields in the interest rate end of period column are not empty and contain only numeric values. The field may be empty if the maturity date of a loan (based on Loan ID) is before the period end date and the loan termination date is empty.

Validation type: Warning. Not a hard stop to the process, but it is recommended to review all exception details before continuing.

Validation Exception Details: Scenario: This validation exception shows that the interest rate end of period column is either empty or contains values other than numeric values.

Note: The field can be empty if the maturity date of a loan (based on Loan ID) is before the period end date and the loan termination date is empty.

Potential reasons for validation to fail and recommended action:
− The interest rate end of period column is empty in the Loans Subledger file for a certain Loan ID.
− The value of the interest rate end of period column in the Loans Subledger file for a certain Loan ID is not numeric.
− Use the export into Microsoft Excel® functionality to export the exception details into validation reports and contact appropriate entity personnel and ask for a new data file to be provided with a numeric interest rate end of period value for every Loan ID. The Exception Details field will provide detailed information on why the validation has failed.

Note: The interest rate end of period column can be empty if the maturity date of a loan (based on Loan ID) is before the period end date and the loan termination date is empty.

Other validations affected: If this validation fails, there is no effect on the execution of subsequent data validation routines.
Validation #8 - Verification of the presence of transaction dates within the Analysis Period
Illustration
How it works: This automated validation checks whether all the dates found in the transaction date column of the Loans Cashflow Transaction Data file are within the analysis period defined by the period start and period end dates in the parameters.

Validation type: Warning. Not a hard stop to the process, but it is recommended to review all exception details before continuing.

Validation Exception Details: Scenario: The validation exception informs the user which transaction date(s) of the Loans Cashflow Transactions file is/are not in the scoped analysis period from the parameters.

Potential reasons for validation to fail and recommended action:
− The transaction date in the Loans Cashflow Transaction Data file is before the period start date which is provided as a parameter.
− The transaction date in the Loans Cashflow Transaction Data file is after the period end date which is provided as a parameter.
− Use the export into Microsoft Excel® functionality to export the exception details into validation reports and contact appropriate entity personnel and ask for a new data file to be provided with correct transaction dates which are within the analysis period. The Exception Details field will provide detailed information on why the validation has failed.

Other validations affected: If this validation fails, there is no effect on the execution of subsequent data validation routines.
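The check above is an inclusive range test on each transaction date. A minimal sketch under assumptions (parsed dates, a hypothetical "transactionDate" key), not the tool's implementation:

```python
from datetime import date

def out_of_period_transactions(rows, period_start, period_end):
    """Return rows whose transaction date falls outside the inclusive
    analysis period [period_start, period_end]."""
    return [row for row in rows
            if not (period_start <= row["transactionDate"] <= period_end)]

rows = [
    {"loanId": "L-001", "transactionDate": date(2021, 2, 15)},
    {"loanId": "L-002", "transactionDate": date(2022, 1, 5)},
]
exceptions = out_of_period_transactions(
    rows, date(2021, 1, 1), date(2021, 12, 31))
# Only the 2022 transaction falls outside the 2021 analysis period.
```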
Validation # 9 – Overlapping Period Validation
Illustration
How it works: This automated validation checks that there are no overlapping periods per Loan ID based on the dateFrom and dateTo fields from the Loans Interest Rate Changes file.

Validation type: Warning. Not a hard stop to the process, but it is recommended to review all exception details before continuing.

Validation Exception Details: The validation exception shows information about overlapping periods per Loan ID based on the dateTo and dateFrom fields in the Interest Rate Changes file.

Note: If the ‘From’ date of one period and the ‘To’ date of another period are the same, they are not considered overlapping periods and may be ignored.

Potential reasons for validation to fail and recommended action:
− There are overlapping periods per Loan ID based on the dateTo and dateFrom fields from the Loans Interest Rate Changes file.
− For example, if the dateFrom and dateTo of the 1st period are 1/1/2021 and 3/31/2021, and those of the 2nd period are 2/1/2021 and 4/30/2021, then the 1st and 2nd periods are considered overlapping.
− Use the export into Microsoft Excel® functionality to export the exception details into validation reports and contact appropriate entity personnel and ask for a new data file to be provided with correct values in the dateTo and dateFrom fields and no overlapping periods. The Exception Details field will provide detailed information on why the validation has failed.

Other validations affected: If this validation fails, there is no effect on the execution of subsequent data validation routines.
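The overlap rule above, including the exception for merely touching periods, can be sketched by sorting each loan's periods and comparing neighbours. This is an illustration under assumptions (a hypothetical "loanId" key alongside the dateFrom/dateTo fields), not the tool's implementation.

```python
from collections import defaultdict
from datetime import date

def find_overlapping_periods(rate_changes):
    """Per Loan ID, report consecutive (dateFrom, dateTo) periods that
    overlap. Periods that merely touch (one period's dateTo equal to
    the next period's dateFrom) are not treated as overlapping."""
    by_loan = defaultdict(list)
    for row in rate_changes:
        by_loan[row["loanId"]].append((row["dateFrom"], row["dateTo"]))
    overlaps = []
    for loan_id, periods in by_loan.items():
        periods.sort()
        for prev, nxt in zip(periods, periods[1:]):
            if nxt[0] < prev[1]:  # starts strictly before previous ends
                overlaps.append((loan_id, prev, nxt))
    return overlaps

# The worked example from the guide: 1/1-3/31 overlaps 2/1-4/30.
changes = [
    {"loanId": "L-001", "dateFrom": date(2021, 1, 1), "dateTo": date(2021, 3, 31)},
    {"loanId": "L-001", "dateFrom": date(2021, 2, 1), "dateTo": date(2021, 4, 30)},
]
overlaps = find_overlapping_periods(changes)
```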
Validation # 10 - Skipped Period Validation
Illustration
How it works: This automated validation checks whether all periods are present, without any being skipped, per Loan ID based on the dateFrom and dateTo fields from the Loans Interest Rate Changes file.

Validation type: Warning. Not a hard stop to the process, but it is recommended to review all exception details before continuing.

Validation Exception Details: Scenario: The validation exception shows information about missing periods per Loan ID based on the dateTo and dateFrom fields from the Interest Rate Changes file.

Potential reasons for validation to fail and recommended action:
− There are missing periods per Loan ID based on the dateTo and dateFrom fields from the Loans Interest Rate Changes file.
− For example, if the dateFrom and dateTo of the 1st period are 1/1/2021 and 1/31/2021, and those of the 2nd period are 3/1/2021 and 3/31/2021, then the period of the month of February is considered skipped.
− Use the export into Microsoft Excel® functionality to export the exception details into validation reports and contact appropriate entity personnel and ask for a new data file to be provided with correct values in the dateTo and dateFrom fields and no skipped periods. The Exception Details field will provide detailed information on why the validation has failed.

Other validations affected: If this validation fails, there is no effect on the execution of subsequent data validation routines.
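The skipped-period rule is the mirror image of the overlap check: sort each loan's periods and look for gaps between neighbours. A sketch under assumptions, not the tool's implementation; in particular, treating a period as contiguous when it starts no later than the day after the previous dateTo is an assumption for illustration, as is the "loanId" key.

```python
from collections import defaultdict
from datetime import date, timedelta

def find_skipped_periods(rate_changes):
    """Per Loan ID, report gaps between consecutive (dateFrom, dateTo)
    periods as (loan_id, previous_dateTo, next_dateFrom) tuples."""
    by_loan = defaultdict(list)
    for row in rate_changes:
        by_loan[row["loanId"]].append((row["dateFrom"], row["dateTo"]))
    gaps = []
    for loan_id, periods in by_loan.items():
        periods.sort()
        for prev, nxt in zip(periods, periods[1:]):
            # A gap exists if the next period starts more than one day
            # after the previous period ends (assumed convention).
            if nxt[0] > prev[1] + timedelta(days=1):
                gaps.append((loan_id, prev[1], nxt[0]))
    return gaps

# The worked example from the guide: February is skipped between
# the 1/1-1/31 and 3/1-3/31 periods.
changes = [
    {"loanId": "L-001", "dateFrom": date(2021, 1, 1), "dateTo": date(2021, 1, 31)},
    {"loanId": "L-001", "dateFrom": date(2021, 3, 1), "dateTo": date(2021, 3, 31)},
]
gaps = find_skipped_periods(changes)
```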