
KPMG Clara workflow

KPMG Clara workflow - Enhanced / Core

Non-PaaS SQL version

Data Workbench
User Guide

June 2022
Contents
1. Introduction to Data Workbench ......................................................................................................... 8
Extracting the Right Data ............................................................................................................. 8
Transferring the Data ................................................................................................................... 9
Importing the Data ........................................................................................................................ 9
Key Considerations for Data Import ........................................................................................... 10
1.4.1. Pre-processing of data ........................................................................................................ 10
1.4.2. Use of multiple company codes .......................................................................................... 10
2. Walkthrough Data Workbench .......................................................................................................... 12
Introduction to Data Workbench................................................................................................. 12
Supported Analytics and Data Import Scenarios ....................................................................... 12
2.2.1. Supported Data Scenarios for KPMG Clara analytics Routines ......................................... 13
2.2.2. Supported Data Scenarios for Financial Services Routines ............................................... 14
Accessing Data Workbench ....................................................................................................... 15
Advanced Capabilities Portal ..................................................................................................... 17
2.4.1. Creating an Analysis ........................................................................................................... 17
2.4.2. Viewing the Analysis Status ................................................................................................ 19
2.4.3. Accessing the Analytical Results ........................................................................................ 19
2.4.4. Accessing Data Workbench for Data Processing ............................................................... 20
2.4.5. Selecting Results Widgets to Display ................................................................................. 20
2.4.6. Editing an Analysis .............................................................................................................. 20
2.4.7. Resetting, deleting or requeuing an Analysis ..................................................................... 21
My Analysis/Overview ................................................................................................................ 22
Analytics Selection ..................................................................................................................... 23
2.6.1. Select Analytics for Processing........................................................................................... 23
2.6.2. View the Status and Requirements for Selected Analytics ................................................. 25
2.6.3. Delete an Analytic ............................................................................................................... 25
Upload, Import and Transform Data .......................................................................................... 26
2.7.1. Upload and Import Client Data............................................................................................ 26
2.7.2. Transform the Data Using the Data Field Mapping Functionality ....................................... 32
2.7.3. View the Analysis Details .................................................................................................... 33
2.7.4. Activate SQL Insert to Perform Manual Transformation of Data ........................................ 34
2.7.5. Review the Data Import and Data Transform Validations .................................................. 35
Account Mapping ....................................................................................................................... 36
2.8.1. Loading the Financial Statement Structure ......................................................................... 38
2.8.2. Mapping General Ledger Accounts to the Financial Statement Structure ......................... 39
2.8.3. Changing and Un-mapping Accounts ................................................................................. 40
2.8.4. Saving the Account Mapping .............................................................................................. 43
2.8.5. Using the Export and Import functionality ........................................................................... 43
© 2022 Copyright owned by one or more of the KPMG International entities. KPMG International entities provide no services to clients. All rights reserved.
INTERNAL USE ONLY.
2.8.6. Reviewing the Account Mapping Validation Report ............................................................ 46
2.8.7. Publish account mapping .................................................................................................... 48
Parameters ................................................................................................................................. 50
2.9.1. General Ledger Analysis Parameters ................................................................................. 50
2.9.2. Financial Services Routines Parameters ............................................................................ 55
Validations ................................................................................................................................ 57
2.10.1. Upload Validation Routines............................................................................................... 57
2.10.2. Import Validation Routines ................................................................................................ 57
2.10.3. Transformation Validation Routines .................................................................................. 58
2.10.4. Analytic Reports and Reconciliation ................................................................................. 59
2.10.5. Confirmation ...................................................................................................................... 61
Process .................................................................................................................................... 61
Results ..................................................................................................................................... 63
Database Archiving & Restore ................................................................................................. 64
2.13.1. Archive an analysis ........................................................................................................... 64
2.13.2. Restore an analysis .......................................................................................................... 68
Activity Log ............................................................................................................................... 68
2.14.1. Analysis Creation .............................................................................................................. 68
2.14.2 Importing, Transforming and Processing data ................................................................... 69
Central Team Portal ................................................................................................................. 70
2.15.1 Project Overview ................................................................................................................ 70
2.15.2 Project User Management ................................................................................................. 71
Data Preparation Toolbox ........................................................................................................ 75
Appendix A. Summary of Data Import and Transformation Validations ............................................ 76
A.1. Summary of Data Upload and Import Validations ................................................................. 76
A.2. Summary of Data Transformation Validations ...................................................................... 81
A.3. Summary of FSR Data Transformation Validations .............................................................. 97

How to navigate this document
To navigate between the topics included in this user guide, use the bookmarks (example illustration
included below) to jump to relevant sections.

How to select the correct version of the Data Workbench User Guide
As of the 2022.1 release, there are two different deployment modes of KPMG Clara workflow - Data
Workbench:
− PaaS mode, leveraging Azure services for the data processing back end
− Non-PaaS mode, which continues to use Microsoft SQL Server for data processing
As there are slight differences in Data Workbench functionality between PaaS and Non-PaaS
deployments, there are two separate versions of the KPMG Clara workflow - Data Workbench User Guide,
each specific to the data processing back-end technology used by Data Workbench.
To choose the right Data Workbench guidance, confirm with your local KPMG Clara deployment
team, Central Team or IT department whether PaaS or Non-PaaS is applicable for the member firm. The
following scenarios exist:
− Regional cloud environments (e.g. EMA and ASPAC Training/STG/Prod) are deployed with the PaaS
mode.
− Satellite cloud environments may be deployed with PaaS or Non-PaaS based on the member firm's
deployment strategy.
− On-Premises environments are always deployed with Non-PaaS (Microsoft SQL back-end
technologies).
Another way to identify the deployment mode is to review the analysis details directly in Data
Workbench after creating a new analysis, following these steps:
1. Open a 2022.1 (or higher version) engagement from the “Engagements Dashboard”
2. From My Engagement (the 4-square icon in the top right), select Advanced Capabilities
3. On the Advanced Capabilities Portal, create a new analysis or open an existing analysis
4. In the right-hand icon list, select the disk pack icon, which opens the Analysis Details
5. Compare the Analysis Details screen from your analysis to the screen prints below

If you see a screen like this, the environment is deployed with Non-PaaS, and therefore this document is
the applicable guidance for Data Workbench.

If you see a screen like this, the environment is deployed with PaaS; please refer to the PaaS version of
the Data Workbench User Guide.

Data Workbench

Highlights
Data Workbench is the universal data management platform that prepares and processes the data in
support of the features and activities teams perform within the KPMG Clara workflow – Enhanced / Core
(from hereon: KPMG Clara). This version of the guidance specifically addresses instances of KPMG Clara
which are implemented in a Satellite cloud environment or as local instances that are not deployed with
PaaS (e.g. using SQL Server technologies). A separate version of the user guide, addressing Data
Workbench as implemented on a regional or satellite cloud as a Platform as a Service (PaaS), is available.
This guidance covers the import, data field mapping, account mapping, setting of parameters and
processing of data for the KPMG Clara analytics general ledger routines, data-enabled working papers and
GL apps. All of these capabilities will be referred to as “KPMG Clara analytics” throughout the remainder of
this guide. This guide also pertains to the processing of the financial services routines, which are
highlighted in individual sections throughout the guide.

Purpose
KPMG Clara data workbench is the one-stop shop for extraction, transformation, load, and data processing
efforts for all capabilities requiring the use of data. It provides the ability to import and process the data
necessary to execute the KPMG Clara workflow Advanced Capabilities and Financial Services Routines.
KPMG Clara data workbench has an embedded “data management” process which includes the following
high-level activities:

− Creating and configuring an analysis
− Selecting your analytics
− Uploading, importing, and transforming data
− Mapping general ledger accounts against customizable financial statement
structures (for KPMG Clara analytics)
− Defining analytical parameters
− Reviewing and confirming validations and reconciliation reports
− Executing the processing of analytics
− Presenting and reviewing the analytics in the Advanced Capabilities Portal
− Utilizing processed analytics for the entry of balances in the KPMG Clara workflow
− Managing access to the project.

Prerequisites
− A KPMG Clara workflow engagement must be created
− The KPMG Clara workflow Engagement – Setup activities must be completed
− For certain KPMG Clara analytics (e.g. Planning Analytics, Account Analysis,
Data-Enabled Working Papers, GL Apps), a Financial Statement Structure must
have been created and Process Mapping must have been performed within the
KPMG Clara workflow – Financial Statements Module.

Why is this Important?
The use of various data-driven activities within the KPMG Clara workflow requires data to be processed
using the KPMG Clara data workbench. This includes the use of the General Ledger analyses (i.e. Journal
Entry Analysis, Planning Analytics and Account Analysis), the Financial Services routines, the
data-enabled working papers and GL apps, and the entry of balances into the Financial Statements
module. Following the steps outlined in this document, and diligently executing and reviewing the
validations within the workflow of the KPMG Clara data workbench application, will facilitate the complete
and accurate processing of analytics.

For more information …
Watch the Data Workbench “How to” videos for details relating to the Data Workbench functionality.

1. Introduction to Data Workbench
Before heading into the steps to process data in Data Workbench, it is important to understand how data
flows from the entity (i.e. the client) to KPMG. The overall process starts with the engagement team
identifying the relevant analytics. From that point, the engagement team and the client discuss the detailed
data requirements and the availability of automated extraction scripts. This process can be broken down
into three stages:

− Extracting the right data

− Transferring the data

− Importing the data.


Each of these phases is discussed in more detail below.
Depending on the complexity of the data to be obtained, the process of extracting and transforming the
data typically requires expertise from various individuals, including the engagement team and subject
matter experts (from hereon: SMEs), such as IRM professionals and Central Team members (where
Central Teams are established). In cases where the data to be obtained and transformed is more complex
in nature, it is highly recommended to involve these experts in the process.

Extracting the Right Data


There are specific requirements the entity must consider when extracting the data files and preparing them
to be transferred to KPMG’s IT environment. These details include the type of system from which data
needs to be obtained, the setup of those systems (to ensure correct extraction parameters), the type of
data that needs to be obtained, and the team’s and client’s familiarity with the process for obtaining data
from these systems. For details on the specific data requirements and corresponding data files to be
imported into Data Workbench for the selected KPMG Clara analytics, please refer to the Data
Workbench - Data Requirements guide.
As illustrated below, defining the right data to be obtained originates on the side of the engagement team
(and SMEs involved) and starts with the identification of the data extraction requirements, which are then
shared with the client. Best practice is to discuss these requirements with the client, so that it is clear to all
parties involved what data is required, and how the data can be obtained (e.g. by the client themselves, or
via available extraction scripts). In cases where extraction scripts are used, ensure that the client adheres
to internally defined change management processes to obtain the data. For further details, review the ERP
ETL Guidance on the Alex page under KPMG Clara workflow – Engagement team guidance.
For teams using the automated ERP Pipelines for SAP and Oracle data sets, please refer to the separate
“ERP Pipelines QRC”, which includes information on how to obtain data for use with this feature using the
available automated data extraction solutions.

Figure 1 – Flow of Data from the Perspective of the Engagement Team

Transferring the Data


When the entity has generated the required data files, the files need to be transferred to KPMG’s IT
environment in a secure manner. Below are some key points to be considered:

− Engagement letter: Consider the terms and conditions of the relevant engagement letter. A country’s
local Risk Management function may provide additional clauses to include in the engagement
letter(s)/contract(s).

− Country guidelines: Work with the Central Team and local Risk Management to understand the
guidelines in place regarding secure data transfer of large files or to establish local guidelines
regarding secure data file transfer when local guidelines do not exist.

− Understand the environment: Work with the Central Team and the local IT Support team to
understand how and to where the data transfer will occur.

− Work with the client: Consider whether the entity has a preference regarding an acceptable method
of transferring data from their facility to the KPMG environment.
It is recommended that an encrypted medium be used when transferring such data. Refer to the Risk
Management Considerations document, local Risk Management and local ITS team guidelines to establish
a secure data transfer process. Examples of these can include the setup and availability of a Secure File
Transfer Protocol (SFTP) that allows for a secure online transfer of files, or the use of encrypted media,
such as encrypted external hard drives to provide a secure method of transferring and protecting both
KPMG’s and the entity’s information during transit.
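Whichever transfer mechanism is chosen, it is good practice to verify that the files arrived intact. The sketch below is an illustration only (it is not part of Data Workbench, and the function name is ours): the entity computes a digest before transfer, and the engagement team recomputes it after transfer and compares the two values.

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Compute the SHA-256 digest of a file in streamed chunks,
    so large extracts do not need to fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Matching digests before and after transfer indicate the file was
# not altered or truncated in transit.
```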

Importing the Data


After the data files are transferred to the applicable folder on the KPMG Clara workflow server, the data
needs to be imported into Data Workbench. Generally, there are two ways of performing the data import
into Data Workbench. Firstly, the user can make use of the Data Import feature available within the
application (see section 2.7 Upload, Import and Transform Data). This allows users to upload, preview,
configure and import any text (TXT, CSV) and Excel (XLSX) files. Note that importing data through the
application does not have a file size restriction, but the application does display a warning if the files to be
uploaded are larger than 2 gigabytes, to alert the user to connectivity requirements and extended upload
times.
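The same size check can be performed before starting an upload. A minimal sketch, assuming the threshold is 2 GiB (the guide does not specify the exact byte count) and using function names of our own:

```python
import os

TWO_GB = 2 * 1024**3  # the guide's warning threshold, taken here as 2 GiB

def needs_warning(size_bytes: int) -> bool:
    """True when a file of this size should trigger the large-upload warning."""
    return size_bytes > TWO_GB

def check_upload(path: str) -> bool:
    """Check a file on disk before uploading it through the application."""
    return needs_warning(os.path.getsize(path))
```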

In cases where larger data files need to be imported, other ways of uploading data files are available:
- Upload the files directly to the file upload path of the engagement (as indicated in section 1.2
Transferring the Data).
- Upload Zip files containing supported data files (as indicated in section 2.7.1.2 Zip File Upload) in
order to streamline the file upload process.
Follow the guidance in section 2.2 Supported Analytics and Data Import Scenarios to import the entity’s
data files.
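The Zip upload route can be prepared with a small script. A hedged sketch that bundles the supported file types (TXT, CSV and XLSX, per the formats named above) into a single archive; the folder layout and names are illustrative only:

```python
import zipfile
from pathlib import Path

SUPPORTED = {".txt", ".csv", ".xlsx"}  # formats the guide lists as importable

def bundle_for_upload(folder: str, archive: str) -> list[str]:
    """Zip every supported data file in `folder` into `archive` and
    return the names included, to streamline the file upload."""
    included = []
    with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
        for path in sorted(Path(folder).iterdir()):
            if path.suffix.lower() in SUPPORTED:
                zf.write(path, arcname=path.name)
                included.append(path.name)
    return included
```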

Key Considerations for Data Import


1.4.1. Pre-processing of data
The data that is received from the entity may not be in the proper format to be consumed by Data
Workbench. In these instances, the data files need to be transformed or pre-processed prior to import.
There are numerous ways of pre-processing, including, but not limited to, the following tools and methods:

− Using ERP ETL Guidance (SAP, others) and the SQL Insert process (see separate SQL Insert – Data
Workbench)

− Using Microsoft Excel®

− Using Microsoft Power Query®

− Using IDEA.
The Data Workbench – Data Transformation guide provides procedures that can be used with Microsoft
Excel®, Microsoft Power Query® or IDEA® v10 (IDEA®) to accomplish data pre-processing tasks prior to
import into KPMG Clara. Note that these are just some of the tools that can be used. Best practice is that
teams use the tools they are most familiar with, whether one of the globally available solutions outlined
above or other local or third-party resources that achieve the goal of transforming the data into a KPMG
Clara compatible format.
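Simple pre-processing steps, such as trimming whitespace and normalizing date formats, can also be scripted. The sketch below is one possible approach; the column name and the source date format are assumptions for illustration and must be adjusted to the actual extract:

```python
import csv
from datetime import datetime

def preprocess(in_path: str, out_path: str) -> int:
    """Normalize a client CSV extract before import: trim whitespace
    and rewrite dates as YYYY-MM-DD. Returns the number of rows written.
    The 'PostingDate' column and DD/MM/YYYY input format are illustrative."""
    rows = 0
    with open(in_path, newline="", encoding="utf-8") as src, \
         open(out_path, "w", newline="", encoding="utf-8") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            row = {k: v.strip() for k, v in row.items()}
            # assumed source format DD/MM/YYYY; adapt to the real extract
            row["PostingDate"] = datetime.strptime(
                row["PostingDate"], "%d/%m/%Y").strftime("%Y-%m-%d")
            writer.writerow(row)
            rows += 1
    return rows
```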
Depending on the complexity of the data received and transformations needed, it is recommended that
SMEs (Central Teams, IRM professionals) are involved during this process to make the data extraction and
data transformation processes as efficient and effective as possible.
Note: When documenting data pre-processing steps, engagement teams can use the KPMG Clara
workflow Computer Assisted Audit Techniques (CAATs) Document, which provides certain pre-populated
content specifically related to the use of Data Workbench and the Advanced Capabilities.

1.4.2. Use of multiple company codes


The company code field is an optional field in the data requirements for Advanced Capabilities that, if it is
part of the entity’s data files, can be imported into Data Workbench. When processing multiple company
codes within the same dataset of KPMG Clara workflow, consideration needs to be given to ensuring that
the companies:

− Have a common reporting currency (i.e. local currency)
− Have a common chart of accounts
− Have a common reporting and financial period setup
− Are related to the same industry


Note that when using the company code attribute, it has to be used consistently across all data files being
imported (e.g. if company code is provided and mapped for the JET file, it needs to be present and mapped
for GLA and GLAB as well). This also includes ensuring that data is available for the different company
codes (e.g. data for 4 company codes in JET, assumes that data for the same 4 company codes is
available in the GLA, GLAB, and periodic GLTB files).
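This consistency requirement can be checked before import. The helper below is our own sketch, not a Data Workbench feature; it reports company codes that appear in some files but are missing from others:

```python
def missing_company_codes(files: dict) -> dict:
    """Given the company codes found in each imported file
    (file name -> set of codes), report codes that are present
    elsewhere but absent from a file. An empty result means the
    attribute is used consistently across all files."""
    all_codes = set().union(*files.values()) if files else set()
    return {name: all_codes - codes
            for name, codes in files.items()
            if all_codes - codes}
```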

For data sets where general ledger accounts have the same number but different names across company
codes (i.e. translated account names, but still within the same chart of accounts), it is important to ensure
that the account number is unique for account mapping purposes. In this scenario, a possible workaround
is to concatenate the company code to the account number, making each account unique with its own
description.
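The concatenation workaround amounts to a one-line helper; the separator character below is our own choice for readability:

```python
def unique_account_id(company_code: str, account_number: str,
                      separator: str = "-") -> str:
    """Concatenate the company code to the account number so that
    same-numbered accounts with translated names stay distinct
    for account mapping purposes."""
    return f"{company_code}{separator}{account_number}"

# e.g. account 400000 held by company codes 1000 and 2000 becomes
# "1000-400000" and "2000-400000", each keeping its own description.
```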
When reviewing the results of imported data with multiple company codes, engagement teams may need to
consider the following:

− The Excel dashboards are generally set up such that a global slicer can be used to show results by
company. If results are needed on a company code basis, the company code attribute needs to be
included within the pivot table builder.

− Engagement teams should be aware of cases where document numbers are shared (i.e. recurring)
across company codes within their imported data. The native behavior of Microsoft Excel® is to provide
aggregated information, and as such, engagement teams should review the need to include the
company code attribute within the pivot tables to provide disaggregated results.

− The Transaction viewer within the Excel dashboards will provide information by document number,
regardless of company code.

− The Journal Entry Workpaper (within the Journal Entry Analysis dashboard) will provide results at a
document number level.

− Within the Journal Entry Analysis, the Pre-Defined reports are not initially set up by company code. As
such, engagement teams using multiple company codes may want to include this attribute within the
pivot table builder.

− When using multiple company codes, the bifurcation of journal entries for Account Analysis purposes is
performed at a company code and document number level.
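The company code and document number grouping described in the last bullet can be sketched as follows; the field names are assumptions for illustration, not the actual JET schema:

```python
from collections import defaultdict

def group_entries(lines: list) -> dict:
    """Group journal entry lines at the company code and document
    number level, the level at which bifurcation for Account Analysis
    is performed when multiple company codes are used."""
    groups = defaultdict(list)
    for line in lines:
        groups[(line["company_code"], line["document_number"])].append(line)
    return dict(groups)
```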
Users are now able to allocate balances that have been processed for multiple company codes within a
single data set, through the use of the multi-opinion feature within the Financial Statements – Balances
module. Once the data set has been processed, simply go to the Balances module, select “Advanced
Capabilities” as the source of data for the balances, and subsequently allocate the identified company
codes (as per imported data) to each of the opinions.

Figure 2 – Mapping data set company codes to reporting opinions when importing account balances
1. Select the analysis database from Data Workbench
2. List of reporting opinions configured at the Engagement profile
3. Map company codes from the processed data set to reporting opinions

2. Walkthrough Data Workbench
Introduction to Data Workbench
Data Workbench is the universal data management platform that prepares and processes the data in
support of the features and activities teams perform within the workflow. This includes the import, data field
mapping, account mapping, setting of parameters and processing of data for the general ledger routines,
as well as financial services routines.
In order to support the objective of bringing client data into the workflow, most commonly through the
processing of Advanced Capabilities, the following process flow is embedded within Data Workbench:
1. Data Workbench Portal (Section 2.4)
2. My Analysis/Overview (Section 2.5)
3. Analytic Selection (Section 2.6)
4. Upload, Import and Transform Data (Section 2.7)
5. Account Mapping (Section 2.8)
6. Parameters (Section 2.9)
7. Validations (Section 2.10)
8. Processing (Section 2.11)
9. Results (Section 2.12)

Figure 3 – Data processing workflow within Data Workbench

Supported Analytics and Data Import Scenarios


The following analytics and corresponding data scenarios are supported:

− KPMG Clara General Ledger analytics routines (Section 2.2.1)

− KPMG Clara Financial services routines (Section 2.2.2)

2.2.1. Supported Data Scenarios for KPMG Clara analytics Routines
With respect to the KPMG Clara analytics, there are three general ledger analytics groups available:
1. General Ledger Analysis
   a. Journal Entry Analysis (JEA)
   b. Planning Analysis (PA)
   c. Account Analysis (AA)
2. General Ledger Apps
   a. Business Process Leadsheets (LEAD)
   b. Materiality aide (MAT)
   c. KPI dashboard (KPI)
3. Data Enabled Workpapers
   a. Trend Analysis and waterfall analysis (generic) (GTWA)
   b. Gross margin substantive analytical (trend) (GMT)
   c. Payroll expenses substantive analytical (predictive) (PEX)
   d. Gross margin percentage substantive analytical (trend in ratio) (GMTR)
The execution of these analytics is tied to the availability of four distinct data files, further outlined in the
Data Workbench - Data Requirements guide. These four files include the General Ledger Accounts (GLA)
file, the General Ledger Account Balances (GLAB) file, the Periodic General Ledger Trial Balance (GLTB)
file, and the Journal Entry Transactions (JET) file. Depending on the availability of these files, the data can
be offered to Data Workbench following three “scenarios”:

− “Compact” scenario (Only balances) – Either only the GLA and GLAB files, or the GLTB file alone, with,
at a minimum, all mandatory fields (i.e. account balance data only).
− “Full” scenario (Balances and transactions) – The GLA, GLAB and JET files, or the GLTB and JET files,
with, at a minimum, all mandatory fields.
− “JET only” scenario (Only transactions) – Only the JET file with, at a minimum, all mandatory fields.
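The three scenarios can be summarized as a small decision function. This is an illustration of the rules above, not application logic from Data Workbench:

```python
from typing import Optional

def data_scenario(available: set) -> Optional[str]:
    """Map the set of available files ('GLA', 'GLAB', 'GLTB', 'JET')
    to the import scenario named in the guide; None if no scenario fits."""
    # balances come either from the GLA+GLAB pair or from GLTB alone
    balances = {"GLA", "GLAB"} <= available or "GLTB" in available
    transactions = "JET" in available
    if balances and transactions:
        return "Full"
    if balances:
        return "Compact"
    if transactions:
        return "JET only"
    return None
```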

The following table specifies the required files by analytic and file scenario (i.e. Compact, Full or JET Only):
Table 1 - Required files by analytic

Analytics Category       Analytic                                  Compact            Full               JET Only
                                                                   GLA+GLAB   GLTB   GLA+GLAB   GLTB    JET
                                                                                     +JET       +JET
GLA Analysis             Planning Analysis                             x        x        x        x
                         Account Analysis                                                x        x
                         Journal Entry Analysis                                          x        x       x
General Ledger App       Business Process Leadsheets                   x        x        x        x
                         KPI dashboard                                 x        x        x        x
                         Materiality aide                              x        x        x        x
Data Enabled Workpapers  Trend Analysis and waterfall analysis         x        x        x        x
                         Payroll expenses substantive analytical       x        x        x        x
                         Gross margin percentage substantive           x        x        x        x
                         analytical
                         Gross margin substantive analytical           x        x        x        x

2.2.2. Supported Data Scenarios for Financial Services Routines


With respect to the Financial Services Routines, there are four analytics available for Loans and one for
Financial Instruments.
The execution of these analytics is tied to the availability of four distinct data files, further outlined in the
Data workbench - Data Requirements guide. These four files include the loans sub ledger details file (LSL),
the cash flow transactional data file (CFL), the interest rate changes data (IRC) file and the Financial
Instruments Transactions file (FIN).
The following table provides an overview of the files that are needed to process each of the respective
analytics.
Table 2 – Link between Loan Analysis and file type

 Financial Services   File Types            Loan Portfolio   New Loans        Payment Behavior   Interest Income
 Routines Scenario                          Overview (LPO)   Analysis (NLA)   Analysis (PBA)     Analysis (IIA)
 Loans Analysis       Loans Sub Ledger           X                X                 X                  X
                      (LSL)
                      Loans Cash Flow                             X                 X                  X
                      Transactions (CFL)
                      Interest rate                                                                    X
                      changes data (IRC)

Financial Instruments Analysis (FIA)
The following table provides an overview of the file that is needed to process this analytic.

Table 3 – Link between Financial Instrument Analysis and file type

 Financial Services Routines Scenario   File Types                Financial Instruments Analysis (FIA)
 Financial Instruments Analysis         Financial Instruments                     X
                                        Transactions (FIN)

Accessing Data Workbench


From the My Engagement page within KPMG Clara workflow, Data Workbench can be accessed through
the Advanced Capabilities launch point by clicking Advanced Capabilities in the right-hand side
application menu.

Figure 4 – Accessing Data Workbench

In addition to accessing Data Workbench via the engagement page, users have the ability to navigate to
the Data Workbench Portal (i.e. Central Team Portal) to see all of their data projects and analyses in one
overview. In order to access the Data Workbench via the Central Team Portal, users can either make use
of the tile on the KPMG Clara workflow home page.

Figure 5 – Data Workbench / Central Team Portal access point within KPMG Clara workflow

or access the Central Team Portal using the access point within any given Data Workbench analysis.

Figure 6 – Central Team Portal access point within a Data Workbench analysis

For more information about the Central Team Portal, please see section 2.15 Central Team Portal.

Advanced Capabilities Portal
The Advanced Capabilities Portal is a landing screen within Data Workbench where the user can access all
analyses associated with the selected KPMG Clara workflow engagement. Within the Advanced
Capabilities Portal, the user can perform the following activities:

− Creating an Analysis (Section 2.4.1)

− Viewing the Analysis Status (Section 2.4.2)

− Accessing the Analytical Results (Section 2.4.3)

− Accessing Data Workbench for Data Processing (Section 2.4.4)

− Selecting Results Widgets to Display (Section 2.4.5)

− Editing an Analysis (Section 2.4.6)

− Resetting an Analysis (Section 2.4.7)

− Deleting an Analysis (Section 2.4.8).

2.4.1. Creating an Analysis


The analysis is, from a technical perspective, the most granular level at which data processing is performed
within an engagement. Whenever a KPMG Clara workflow engagement is created, a corresponding Data
Workbench project is created. The project is the “data container” for all data-related efforts for that specific
KPMG Clara workflow engagement. Within one project, however, the user can create one or multiple
analyses that serve as containers in which different “cross-sections” of data can be analyzed, such as the
analysis of data for different entities within a group or the execution of analytics at different points in time
(e.g. for “phases” such as planning, interim and year-end). For each analysis, the user can then also select
which analytics to execute, which adds another level of flexibility to define exactly what data needs to be
analyzed.
To create an analysis, users click the +Create Analysis button, enter the following parameters in the
pop-up window, and then click Create Analysis.

Figure 7 – Creating an analysis

 Parameter           Required/Optional
 Analysis Name       Required
 Description         Optional
 Country             Required (populated from KPMG Clara Engagement Profile)
 Fiscal Year         Required
 Period Start Date   Required
 Period End Date     Required
 Start Period¹       Required
 End Period¹         Required
 Analysis Phase      Required

Table 4 – List of parameters in the Create Analysis pop-up

¹ Beginning with KPMG Clara workflow 2021.3, the user can choose to analyze a single financial period within the fiscal year.
When entering the Start and End periods, the user enters the same period number for both.
Figure 8 – Create Analysis pop-up window with available parameters

Below is an overview indicating supported and unsupported scenarios as of the current release.

Table 5 - Supported Analysis Scenarios

Parameter Scenario Status

Company codes Single & Multiple company code data sets Supported

Period / Phase Single Fiscal year (Planning, Interim, Year-End, Stub) Supported

Period / Phase Single and Multiple period Supported

Period / Phase Multiple Fiscal years Not Supported

Period / Phase Multiple reporting calendars Not Supported

Adjustment Periods Adjustment Periods at end of Fiscal Year Supported

Adjustment Periods Adjustment Periods within Fiscal Year N/A

Chart of accounts Single chart of accounts Supported

Chart of accounts Multiple chart of accounts Not Supported

After clicking the +Create Analysis button, the user is automatically returned to the Advanced Capabilities
Portal, and the created analysis will display on the screen. Information on each analysis created is
displayed on its own Analysis Card.

Figure 9 – An Analysis Card for a created analysis displayed on the Advanced Capabilities Portal
2.4.2. Viewing the Analysis Status
During data processing, the user can always return to the Advanced Capabilities Portal to see the progress
of each analysis by viewing the respective analysis subway map. The subway map indicates the current
processing stage for each analysis.

− Completed steps are indicated with green checkmarks

− Available steps (not yet completed) are shown as black “stations”

− Not-yet-available steps (the activity cannot be started yet) are shown as grey “stations”.

By clicking on the “subway stations”, the user can directly navigate to the corresponding activity:

− Analytics (to navigate to the Select Analytics screen)

− Data (to navigate to the Data Upload/Import/Transform screen)

− Account (to navigate to the Account Mapping screen)

− Parameter (to navigate to the Parameters screen)

− Validation (to navigate to the Validations screen)

− Process (to navigate to the Processing screen)

− Results (to navigate to the Results Overview screen).

Figure 10 – The “analysis subway map” displaying the data processing workflow

2.4.3. Accessing the Analytical Results


Once data processing for an analysis has been completed, the “Analysis Card” will also display a set of
results widgets, one for each analytic that has been selected.
These results widgets offer a quick look into the available results. The detailed results, in the form of the
Excel and / or Power BI dashboards for each of the analytics can also directly be accessed from the
Analysis Card or via the My Results section within the Advanced Capabilities Portal.

Figure 11 – Available Results Widgets and link to Excel dashboards


2.4.4. Accessing Data Workbench for Data Processing
The access point to performing data processing can also be found on the Advanced Capabilities Portal. For
each Analysis Card, clicking on the name of the analysis will take the user directly into the underlying data
processing screens, via the My Analysis page.

Figure 12 - Accessing Data Workbench

2.4.5. Selecting Results Widgets to Display


When clicking the filter button under the Actions heading next to the “subway map”, the user has the ability
to identify which results widgets should be displayed on the “Analysis Card” (similar to the View Results
screen). This functionality will be outlined later.
Note: This option only becomes available after an analysis has successfully processed. Until that time,
clicking the button will not trigger any action on the screen.

2.4.6. Editing an Analysis


The user has the ability to edit a limited number of parameters of an analysis by clicking the Pencil icon,
also under the Actions heading on the “Analysis Card”.

Figure 13 – Edit Button

Figure 14 – Edit Analysis

Note: After the creation of an analysis, certain parameters (such as Country, Fiscal Year, Period Start
Date, Period End Date, Start Period and End Period) cannot be edited anymore. In case these parameters
need to be corrected, the user has to create a new analysis in which the data has to be reprocessed.

2.4.7. Resetting, deleting or requeuing an Analysis


The user has the option of resetting, deleting or requeuing an analysis by selecting the context menu under
the Actions heading on the Analysis screen.

Figure 15 - Action - Context menu choices

2.4.7.1. Resetting an Analysis


The user has the ability to reset an analysis by clicking the Reset button on the Analysis Card. This
feature is useful when the user wants to make a clean start of an analysis that was started earlier and
needs to be changed for some reason. Upon pressing the Reset button, an alert is opened, asking the
user to confirm resetting the analysis progress. Upon resetting, various objects will be permanently
removed, including:

− Any analytical databases and data cubes associated with this analysis

− Any available analytic results (widgets and Excel dashboards)

− Any data imported/transformed (i.e. field mappings)

− Any account mapping efforts

− Any parameters entered.


As such, ensure that any objects that are required for the recreation of the analysis have been stored
upfront, including any pre-transformed data files, data field mappings and account mapping reports.

Figure 16 – Resetting an analysis

After confirming that the analysis needs to be reset, the previous data container will be removed, and a
new one will be created supporting current release features.

2.4.7.2. Deleting an Analysis
The user has the ability to remove an analysis permanently by clicking the Delete button on the Analysis
Card. Upon pressing the Delete button, an alert is opened, asking the user to confirm deleting the analysis.
Note that the same considerations apply as outlined for resetting an analysis, and that various
objects are permanently removed when selecting this feature.

Figure 17 – Deleting an analysis

2.4.7.3. Requeuing an Analysis


Requeuing an analysis can be helpful in the event that processing runs for a long time or fails
because of a timeout or a temporary incident related to the environment.

My Analysis/Overview
As outlined above, when clicking the Workbench button, the user is navigated to the My Analysis screen.
This screen displays the following information:

− The status (via the subway map) for the selected analysis: This shows the status of the individual
processing steps leading up to successful completion of the analysis. The user can also use the
subway map to navigate to any of the stages individually by clicking on them

− My Analytics: This screen lists out the analytics selected for this analysis, including their readiness for
processing and their respective status:

› “Preparation in Progress” means data is still being prepared

› “Ready for Processing” means that data has been prepared, validated and confirmed, and
analytics processing can be started

› “Processing Successful” means the analytics processing has been completed and results
dashboards are available.

− The Notifications window: This window displays information relevant to the analysis including the
status of analytics processing (i.e., messages of successfully completed or failed processes, and
calling the user to action) and information about changes on the users who have been added to or
deleted from the engagement in which the analysis was created.

Figure 18 – My Analysis screen

Analytics Selection
The Analytics Selection screen is the start of the Data Workbench data processing journey. Within this
screen, the user has the ability to:

− Select one or more analytics for processing

− View the Analytic Name, (Analytic) Group, (Analytic) Process and the required file types to execute the
selected analytics

− Delete one or more of the selected analytics from the current analysis

Figure 19 – Analytics Screen

2.6.1. Select Analytics for Processing


To select an Analytic, the user clicks the +Analytics button, which opens the Add Analytics pop-up
window. The pop-up window provides the user with access to the library of available analytics to scope the
required analytics of relevance to the analysis.

Figure 20 – Analytics Tab (before Analytics selected)

In addition to selecting analytics, the pop-up window provides information on the analytics, and several
options for filtering, searching and displaying the available analytics, as shown below.
Filtering can be performed on the following fields in the Analytics library:

‒ Group (i.e., a set of analytics covering the same area are brought together into the same group),

‒ Process for which business process the analytics is applicable (e.g. financial accounting)

‒ File Type(s) that are required for processing of specific analytics (i.e., the identified file types).
Once dropdown selections have been made in one or more of the library fields, using the check boxes to
pick the values on which to filter, click the Filter button to activate the filter. Filters can be removed by
selecting the “Select All” option in the affected fields and once again clicking the Filter button.
A keyword search by group, process, and file type is also available. Note that the keyword search
takes the current filters into account; if no results are returned, ensure that the filters are set appropriately
before searching.
Click Insert to add the selected Analytics to your Analysis. Once analytics have been inserted, they are
added to the My Analytics list and the “Add Analytics” pop-up closes.

Figure 21 – Selecting analytics from the analytics library


Figure 22 – Analytics Tab (after the Analytics are selected)

2.6.2. View the Status and Requirements for Selected Analytics


After having added the analytics to the analysis, the pop-up automatically closes and returns the user to the
main My Analytics screen. Within the grid, the user now has the ability to view the selected analytics from
the main page, including the data requirements (i.e. file types) for each of these analytics. Once processing
has started, this screen will also highlight the status of the analysis by means of the subway map at the top
of the page.

2.6.3. Delete an Analytic


To delete an analytic (in the Analytics tab), the user can select an analytic within the My Analytics list and
click the Delete button. As with previous delete activities, an alert is triggered asking the user to
confirm whether they want to remove the analytic.

Figure 23 – Delete an analytic

Upload, Import and Transform Data

Figure 24 – My Analysis-Status Bar

In the Data screen, the user can perform the following actions:

− Upload and import client data (including setting import parameters)

− Transform the data to the KPMG Clara data requirements using the data field mapping functionality

− View the analysis information

− Activate SQL Insert to perform manual transformation of the data

− Review the Data Import and Data Transformation validations.


These activities will be further outlined in the subsequent sections.

2.7.1. Upload and Import Client Data

2.7.1.1. Single or multiple file upload (TXT, CSV, XLSX)


To start adding data to the analysis, the user clicks the +Data Source button, and then the Data From
Hard drive option.

Figure 25 - Upload - Data Source

This opens a Windows File Explorer, where the user can select one or more files to be uploaded. The
Data Workbench allows for the upload of various file formats, including text files (TXT), comma-delimited
files (CSV), and Excel (XLSX).

Note that Excel 2003-2007 (XLS) files cannot be uploaded.

Note that this single or multiple file upload approach works best with files less than 2 GB in size. For larger
files, please refer to section 2.7.1.4 Large file upload.

2.7.1.2. Zip file upload


The user can also upload one or multiple zip file(s) by clicking the +Data Source button and then
selecting the Zip File Upload option. This opens a Windows File Explorer in which the user can select the
zip file from their computer (similar to the single file upload feature). By using this option, Data Workbench
will only allow the upload of files with “.zip” as the extension (other archive extensions are not supported).

Note: The Zip file upload feature does not support a hierarchical structure such as nested zip files or a zip
file which includes a folder structure.

After selecting the file, a message banner is displayed at the top of the screen to indicate that a
backend request has been queued to upload the file. The archive file will then be displayed on the My Data
screen in the “Upload in progress” state.

Figure 26 – Zip file upload in progress

Once the upload has completed, Data Workbench will start to unzip the files and the status then changes to
“Upload Successful and Unzip in progress. Refresh to get the latest update”.

! Do not refresh the web browser or navigate to another place in the active web browser tab until the
zip file displays the status shown below.

Figure 27 - Zip file upload successful

Afterwards, the user should refresh the screen until the standard File Upload/Import pop-up comes up,
where the user can set the parameters for the TXT, CSV, or Excel files that were within the archive
(see 2.7.1.5 File Import).
Notes:

− The upload of other archive file formats is currently not supported through Data Workbench.

− The zip-file upload function will not unzip “nested” zip files. That means, if the zip file contains another
zip-file, the underlying ones will not get automatically unzipped. As such, it is recommended to
separate zip-files (i.e. upload 2 zip-files) rather than one archive with zip-files inside.

− In cases where the zip-file contains multiple folders, ensure that the file names are distinct. In cases
where multiple files with the same name are detected, the application will overwrite the file that already
exists.
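The restrictions above (no nested zips, no folder structure, distinct file names) can be checked before uploading. A hypothetical pre-flight helper, not part of Data Workbench itself:

```python
import zipfile
from collections import Counter

def check_zip_for_upload(path):
    """Return a list of issues that would cause problems in Data Workbench's
    zip upload: nested zip files (not unzipped automatically), a folder
    structure (not supported), and duplicate file names (overwritten)."""
    issues = []
    with zipfile.ZipFile(path) as zf:
        # Skip directory entries; only actual files matter.
        names = [n for n in zf.namelist() if not n.endswith("/")]
        if any(n.lower().endswith(".zip") for n in names):
            issues.append("contains nested zip file(s)")
        if any("/" in n for n in names):
            issues.append("contains a folder structure")
        base_names = Counter(n.split("/")[-1] for n in names)
        dupes = sorted(n for n, c in base_names.items() if c > 1)
        if dupes:
            issues.append("duplicate file name(s): " + ", ".join(dupes))
    return issues
```

An empty result means the archive is flat, has no inner zips, and contains no name collisions.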

2.7.1.3. Set up an ERP Pipeline


The ability to upload an SAP or Oracle dataset, extracted via the KPMG Data Extraction Tools for these
systems, is available by clicking on the +Data Source button and then selecting the Set Up An ERP
Pipeline option. This opens a pop-up window in which the user will have to define what ERP system and
version are being uploaded and then configure the organizational setup for the data to be processed. This
feature is applicable for SAP ECC 6.0 and Oracle R12.

Figure 28 – Set up ERP Pipeline

After setting up the ERP Pipeline, organizational scope can be defined in the following screen:

Figure 29 – Organization scope

Then, in the subsequent pop-up window, the user can select the data files to be uploaded.
By clicking the Browse button, the user is directed to a window to select the data from their computer. Data
files can also be uploaded via the direct transfer to the upload folder, for which the details are shown on the
popup window, or in the Analysis Detail window. Files can be loaded either as text files or a zip file
containing the text files. Only data extracted using the official KPMG Data Extraction tools for SAP and
Oracle is compatible with the functionality of the ERP pipeline. For data obtained from SAP and Oracle
systems in different formats, please use the SQL Insert functionality, and the accompanying ERP ETL
Guidance.

Figure 30 – Import data files

After uploading data using the direct transfer method, click the Refresh button to display the files on screen
and then click the Import data button. Note that the Data screen must not be manually refreshed during the
file upload process, in order to prevent issues with the import process not completing automatically.

A Quick Reference Card detailing the functionality and operation of the ERP pipeline can be found here.

2.7.1.4. Large file upload


In cases where larger files need to be uploaded, this can be done by directly uploading these files to the file
upload path of the engagement. Unlike the process discussed above in section 2.7.1.1 “Single or multiple
file upload”, uploading larger files does not make use of the +Data Source button on the Data screen. Rather,
you copy the data file(s) from your laptop into a shared folder provided by Data Workbench to
upload data for the analysis.
The first step is to find the upload path, which is the server location where you will place the
large data files. In the My Analysis screen, on the right-hand margin, select the icon as shown in the
screenshot below.

Figure 31 - The Analysis Details icon

This icon launches the “Analysis Details” window, which is a listing of server names, including the analysis
database and cube servers and network file shares that are being used in your analysis (these will vary by
analysis).
Open the Analysis Details for your engagement by clicking the disk icon. Select the entire Upload Path,
copy it (Ctrl-C) and close the Analysis Details window. The Upload Path highlighted in the screenshot
below is where you will copy the files to upload into Data Workbench.

Figure 32 - Location of the Upload Path

Open a Windows File Explorer window and paste the Upload Path into the Address Bar of the File
Explorer. You may need to refresh the Windows File Explorer to ensure that it is pointing to the Upload
Path you copied in.

Figure 33 - Windows File Explorer with the Upload Path

From a separate File Explorer, drag or copy the files you wish to upload to the Upload Path File Explorer
window and refresh the Data Workbench browser screen.
When the upload for all of your selected files has been completed, you will have to refresh the web browser
at the Data screen and you will see the files displayed, one line for each file in the Select Data from Hard
drive screen. In this screen, you enter the import parameters discussed in section 2.7.1.5 File Import
below.
The same process can also be used for large archive (zip) files. Once the zip file is uploaded to the file
upload location, Data Workbench will automatically unzip the archive. The user will have to wait until the
unzip process is completed (i.e. once process is complete zip file is deleted automatically and only the
contents of the zip-file should be shown in the file upload folder).
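The copy-to-upload-path step is an ordinary file copy to the share listed in Analysis Details. A minimal sketch (the function is illustrative; the real UNC upload path must be taken from the Analysis Details window, and the browser screen still needs refreshing afterwards):

```python
import shutil
from pathlib import Path

def copy_to_upload_path(local_files, upload_path):
    """Copy large data files to the analysis upload share.

    `upload_path` is the path copied from the Analysis Details window
    (illustrative here); Data Workbench picks the files up after the
    Data screen in the browser is refreshed.
    """
    dest = Path(upload_path)
    copied = []
    for f in map(Path, local_files):
        target = dest / f.name
        shutil.copy2(f, target)  # copy2 also preserves timestamps
        copied.append(target)
    return copied
```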

2.7.1.5. File import


After having selected the data files to be uploaded, the files will be displayed (only the supported files) in
the File Upload/Import screen. For each of the files, it is then necessary to configure the following import
parameters:

− Column Delimiter (the text character(s) used to separate fields within the data file)

− Text Qualifier (the text character(s) that appear around text strings to differentiate them from other
types of data within the data file)

− Thousand Separator (the text character used to separate thousands places in data fields. The
character used varies in different regions of the world. Can be either comma, period, or none.)

− Decimal Separator (the text character used to separate decimals from whole numbers in data fields.
Like the Thousand Separator, the character used varies in different regions. Can be either comma,
period, or none.)

− Select file type(s) (the data requirement to which the uploaded data file relates)

− Date Format (the order and formatting of how month, day and year components of dates are formatted
in the data).
Please note that the “Select All” box (or the boxes for all individual files) must be marked before the
“Import Data” button becomes available and the upload can proceed.
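Taken together, the import parameters above amount to a parsing recipe. The following stdlib sketch shows how each parameter acts on a raw line; names and defaults are illustrative, and a plain string split is used because multi-character delimiters such as ‘#|#’ are not supported by Python's csv module:

```python
from datetime import datetime

def parse_row(line, delimiter="#|#", qualifier='"'):
    """Split one data line using the Column Delimiter, stripping the
    Text Qualifier from fields that are wrapped in it."""
    fields = []
    for raw in line.rstrip("\r\n").split(delimiter):
        raw = raw.strip()
        if qualifier and raw.startswith(qualifier) and raw.endswith(qualifier):
            raw = raw[len(qualifier):-len(qualifier)]
        fields.append(raw)
    return fields

def parse_amount(text, thousand_sep=",", decimal_sep="."):
    """Normalize an amount using the Thousand and Decimal Separators
    (which must differ within one file)."""
    if thousand_sep:
        text = text.replace(thousand_sep, "")
    if decimal_sep and decimal_sep != ".":
        text = text.replace(decimal_sep, ".")
    return float(text)

def parse_date(text, date_format="%d.%m.%Y"):
    """Dates must use one consistent Date Format within a file."""
    return datetime.strptime(text, date_format).date()
```

For example, `parse_row('1000#|#"ACME, Inc."#|#1.234,56')` keeps the comma inside the qualified company name intact, which is exactly the problem the text qualifier exists to solve.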

Figure 34 - Upload/Import Configurations

In order to populate these values correctly, a “Preview” feature has been implemented which displays the
first 10 records of each data file. This feature can be accessed by clicking the Preview button underneath
each data file.

Figure 35 – Data Preview

After having set all the parameters, the Import Data button will become available, and the user can import
the data into the analysis.
Notes:

− With respect to the column delimiter, it is recommended that a column delimiter that is not usually
found elsewhere within the client data file² be used, in order for the data import process to go
smoothly. A good example of a “strong” column delimiter is a “combined” delimiter of two or more
uncommon characters. We provide the ‘#|#’ (hash-pipe-hash) column delimiter (which is used in
other KPMG data analysis solutions) as a standard column delimiter within the interface. However,
users can define their own column delimiter by selecting Other from the drop-down menu and then
entering the delimiter as used in the data.

− The Text Qualifier option is currently only available for CSV files.

− Data files cannot have the same thousand and decimal separators within the data. Upon choosing one
of the options for either thousand separator or decimal separator, this option will be greyed out on the
subsequent selection.

− Each data file can only be mapped against one file type (i.e. data requirement) at a time. In case
a data file contains more than the data for a single file type, the user either needs to split these data
files, or upload the data file another time (with a different name) and then tag it to the other file type.

² Some special characters may appear within data fields, such as an “@” sign in a field for email addresses or price
quotes. If such a character is also used as a delimiter, Data Workbench will not be able to distinguish actual data from
delimiters, and will likely experience import issues. The use of a text qualifier in combination with a more common delimiter
can overcome these issues.

− Date formats have to be consistent within one data file. In cases where the date formats within one file
are not consistent, this will result in data import and transformation issues and failed validations. Date
formats can be different between files, as date formats are set per data file.

− The preview feature only shows the first 10 records of a data file. This is generally sufficient to identify
the common settings in the file. In case different data permutations are present in the file after
the 10th row, these will not be visible in the preview, and might still result in failed data import and
transformation processes as soon as the data is loaded.
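Because the preview stops at 10 records, a full-file check before uploading can catch problems such as inconsistent date formats that would otherwise only surface as failed validations. A hypothetical sketch (the field name `postingDate` is illustrative):

```python
from datetime import datetime

def find_bad_dates(rows, column, date_format):
    """Return (row_number, value) pairs whose date field does not match
    the single format configured for the file. The preview only shows
    the first 10 records, so later rows can still fail on import."""
    bad = []
    for i, row in enumerate(rows, start=1):
        try:
            datetime.strptime(row[column], date_format)
        except ValueError:
            bad.append((i, row[column]))
    return bad
```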
Upon clicking Import Data, the user will be returned to the Data screen. In the My Data window, the
selected data file names will be displayed. Once these names display, their status will automatically
proceed from “Import in Progress” to “Import Successful”.

Note: The Data Upload and Data Import stages have automated triggers to refresh the screen. As
such, it is of critical importance that the user does not refresh the screen manually while data files are
not yet listed as “Import in Progress”. Refreshing the screen before this status is reached for all files
means that the remaining upload progress is no longer picked up. In those cases, the user will have to
remove the data files, and then upload and import them again. Depending on the file size and system
activity, it might take some time before the status automatically moves to “Import in Progress”.

2.7.2. Transform the Data Using the Data Field Mapping Functionality
After the successful import of data, the user needs to perform the data field mapping for each imported
data file. The purpose of data field mapping is to map client data fields to the mandatory and optional fields
needed to process the selected analytics. Data field mapping is a prerequisite for being able to continue
with the subsequent data processing stages.
In order to perform the data field mapping, the status of the data files needs to be “Import Successful”.
Then, the user should select each file individually, and perform the mapping of the data fields received in
the client data against the data requirements set by KPMG Clara for each of the selected file types (i.e., the
Common Data Model). This is done by selecting the “correct” value from the source field list (a drop-down
feature), for the field that corresponds with the desired target field value, as shown below.

Figure 36 – Data transformation field mapping

During this process, the user needs to make sure all “Mandatory” fields are mapped prior to transforming
the data files. This can be reviewed in the pane on the right-hand side, where a “Priority” is indicated (an
aggregated conclusion), as well as an assessment of each data requirement (“Target Field”) against the
analytics selected. Fields for the individual analytics can take two statuses: either Mandatory or Optional.

© 2022 Copyright owned by one or more of the KPMG International entities. KPMG International entities provide no services to clients. All rights reserved.
INTERNAL USE ONLY.
32
The Priority field is the conclusion over all selected analytics for a given target field. Priority can have
one of three statuses:

− Mandatory when the field is mandatory for all selected analytics,

− Optional when the field is optional for all selected analytics, and

− Partial when the field is mandatory for at least one of the selected analytics, but not all.
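The three-way aggregation described above can be sketched in a few lines (an illustrative sketch only; the function name is hypothetical and the application's internal logic is not reproduced here):

```python
def field_priority(requirements):
    """Aggregate the per-analytic requirement ('Mandatory' or
    'Optional') for one target field into a single Priority value."""
    if all(r == "Mandatory" for r in requirements):
        return "Mandatory"
    if all(r == "Optional" for r in requirements):
        return "Optional"
    # Mandatory for at least one selected analytic, but not all
    return "Partial"

# Example: a field that is mandatory for one selected analytic
# and optional for another
print(field_priority(["Mandatory", "Optional"]))  # Partial
```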

Figure 37 – Data transformation field Priority and data field requirements per analytic

In cases where the field names in the source data file are identical to the target field names (from the
KPMG Clara data requirements, such as companyCode, documentNumber, etc.), a simple form of
auto-mapping (based on identical names) will already have taken place, leaving the user to map only
those fields whose names in the source file do not match one of the target field names. This
auto-mapping functionality reduces the effort within the Data Workbench application and promotes the
use of pre-processing stages to prepare data.
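Conceptually, the name-based auto-mapping behaves like the sketch below (illustrative only; companyCode and documentNumber come from the text above, while BUKRS_TXT and the exact matching rule, e.g. case sensitivity, are assumptions):

```python
def auto_map(source_fields, target_fields):
    """Pre-populate the field mapping for every source field whose
    name is identical to a Common Data Model target field name."""
    mapping = {target: None for target in target_fields}
    for name in source_fields:
        if name in mapping:
            mapping[name] = name  # identical names: auto-mapped
    return mapping

source = ["companyCode", "documentNumber", "BUKRS_TXT"]
target = ["companyCode", "documentNumber", "postingDate"]
result = auto_map(source, target)
# companyCode and documentNumber are auto-mapped;
# postingDate must still be mapped manually via the drop-down.
print(result["postingDate"])  # None
```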
Once the data fields for each of the imported files have been (individually) mapped, the user should then
select all the files using the checkboxes in the My Data screen and press the Transform button. This will
start the transformation process and begin moving the imported data into our Common Data Model.

Figure 38 – Data transformation functionality

2.7.3. View the Analysis Details


One feature that has relevance throughout a number of stages is the Analysis Details feature. On
every screen in Data Workbench, a database icon is displayed at the top right. Clicking this icon will
display relevant information in the following areas:
− The analysis database name: The name of the relational database (for data processing)

− The analysis server and instance name: The name of the server and SQL instance on which the
relational analysis database is located

− The download path: The location where the generated Excel dashboards are stored for this analysis

− Cube DB Name: The name of the multi-dimensional data cube (for results generation)

− Cube DB Server: The name of the server and SSAS instance on which the multi-dimensional data
cube is located

− The upload path: The location where (large) data files can directly be copied to, and will be picked up
by the application during data import

− The archive path: The location where the database backup is stored for this analysis.

Figure 39 – Analysis details pop-up window

Each of these has relevance in different stages of the data processing, such as during file import (for
larger files), the transformation of data (using SQL Insert), or the debugging of issues.

2.7.4. Activate SQL Insert to Perform Manual Transformation of Data


Other than data field mapping, there is an additional way of performing transformations in Data Workbench:
the SQL Insert option, which allows users to create their own transformation rules and execute them
within SQL Server Management Studio. This feature is the main use case for the “ERP ETL Guidance”,
which describes tailored ways for KPMG Clara to obtain the data needed from different financial systems.
Upon activating the checkbox for SQL Insert, the user does not have to perform the data field mapping,
but can instead use transformation scripts to transform more complex types of data (e.g. ERP native data
that does not directly match the KPMG Clara data requirements). For detailed guidance, see the QRC on
SQL Insert and the available ERP ETL Guidance (for available guidance documents, see the “Data
Management” section).

Figure 40 – SQL insertion selection in data transformation process

Figure 41 – SQL insertion syntax guidance

In order to make use of the SQL Insert option, the user performing the transformation must have been
given SQL access via a new role created in Management Console and also have been given access to the
engagement (managed by the engagement team). In addition, the access granted does not give full access
to the database, but is limited to the following:

− Public SQL Server Instance access

− Only Data Manipulation Language (DML) permissions

− No Data Definition Language (DDL) permissions

− Read/Write access only to the database schema in which the data transformation is happening (DBO
and FIN)

− Read access to all remaining Schemas (i.e. GEN, ERP, PTP, OTC, etc.).

2.7.5. Review the Data Import and Data Transform Validations


After data file import and transformation have been completed, the user can review the import and
transform validations for each file by clicking the check box of the file in the +Data Source section, and
then clicking the Validations button. This opens a pop-up window showing the Upload/Import Validations
as well as the Transformation Validations. For both validation types, the hyperlinked summary lines
can be expanded to view the details. For failed validations, the exception details of the first five issues
for a specific validation can be viewed on-screen; the full exception list can be downloaded to an
Excel file.

Figure 42 – Data transformation validation details pop-up

For detailed information about file import and transformation validations, refer to Appendix A. Summary of
Data Import Validations.

Account Mapping

Figure 43 – My Analysis-Status Bar

The Account Mapping feature is accessed from the Accounts tab and enables engagement teams to map
General Ledger (GL) accounts imported from the entity’s data files to the KPMG Clara Knowledge
Accounts that have been authored within the Financial Statements module in KPMG Clara workflow.
The essential activities performed within the Accounts screen are as follows:

− Loading the Financial Statement structure

− Mapping general ledger accounts to the Financial Statement structure

− Changing and un-mapping accounts

− Saving the account mapping

− Using the Export and Import features

− Reviewing the Account Mapping Validation report

− Publishing the account mapping.


These steps will be outlined below in their respective sections.
When looking at the account mapping interface, we can distinguish a number of features that are discussed
below:

Figure 44 – Account mapping interface

1 Switching between the Map Accounts feature and the Mapping Validation Report
2 The General Ledger Accounts window (also referred to as “left hand side” in account
mapping)
3 The Financial Structure window (also referred to as “right hand side” in account mapping)
4 Toggles to show current year ending balances and prior year ending balances in the mapping
grid
5 Statistical information for the account mapping being performed, including the total number of
accounts loaded, the number of mapped accounts, and the number of unmapped accounts –
split between active accounts (i.e., accounts that need to be mapped) and inactive accounts
(i.e., accounts that have no activity and are optional for mapping)
6 Account mapping features to load a financial structure (i.e., obtain from the Financial
Statements module), clear the mapping of all mapped accounts, and import and export
account mapping reports (both from/to an Excel format)
7 Filter and search features within the account mapping windows to search for certain accounts
or groups
8 A legend to identify the state of each account (i.e., unmapped accounts, mapped accounts,
and inactive accounts)
9 The ability to save the account mapping
10 Refresh changes done to the financial statements structure
11 Run mapping suggestions (mapping bot) when the general ledger account group data column
has been mapped in the GLA or GLTB files. For more information on using mapping bot,
please refer to the KPMG Clara mapping bot user guidance.
With respect to the filter functionality, the following is relevant:

− The filters on the General Ledger accounts window can be applied by account type (e.g. revenue,
trade receivables, etc.), by account number ranges or by using key words included in the account
name or account number and then clicking the Apply button. To remove all the applied filters and
display all GL accounts again, click the Reset button.

− The filters also allow the user to hide or show general ledger accounts with zero balances or
transactions in the period.

Figure 45 – Account mapping filters

− The list of general ledger accounts is grouped by Balance Sheet/Income Statement. Additionally, if the
column for account group is imported for the GLA file, these account group levels will be displayed as
well. The following information displays for each account:

− Account number

− Account name

− Ending balance for the analysis period

− Ending balance for the prior year (optional)

− The list of Financial Statement structure nodes displays all the Financial Statement captions and
workflow accounts. Totals created in the Financial Statement module are not brought over to the
account mapping, as totals per group are automatically calculated when accounts are mapped to these
nodes.

− Financial Statement captions or workflow accounts can be filtered by selecting the check boxes on
the left and then clicking the Save button. It is also possible to display the Library account mapped to
each workflow account in the Financial Statements structure by using the Show knowledge accounts
toggle button.

Figure 46 – Financial statement caption filters

Lastly, it is highly recommended to use the Account Mapping Save button frequently to avoid losing any
changes during the mapping process. Additionally, it is a best practice to download the account mapping
report at the end of the mapping stage. This report can always be reused in cases where data needs to be
reprocessed (without needing to remap all the accounts again), or for roll-forward purposes (e.g. interim to
year-end roll-forward) on the same data set.

2.8.1. Loading the Financial Statement Structure


The first time the Account Mapping screen is loaded for the analysis, the left-hand side displays the general
ledger accounts imported and transformed in the previous stage, with the main source being the GLA data

file. However, upon first entering the screen, there will be no Financial Statement structure displayed yet to
map the general ledger accounts to on the other side. To load the Financial Statement structure in the right
pane, click the Refresh button located above the Financial Statement grid:

Figure 47 – Account mapping financial statement structure

Once the financial statement structure is retrieved, the date and time of the last retrieved structure
display on the screen:

Figure 48 – Financial structure update details with the Refresh button

Note: The financial statement structure in the account mapping screen is only retrieved when the user
actively triggers the retrieval, using the functionality described above. Any changes made in the
meantime within the Financial Statement module are not automatically brought over. To reload the
financial statement structure in account mapping, click the Refresh button again. Upon doing so, an alert
message will display asking the user to confirm the reload.
Note that upon reloading the financial statement structure, the account mapping of various nodes might be
lost due to changes in the underlying structures. This includes changes such as renaming Financial
Statement Captions and Workflow Accounts, removing Financial Statement Captions and Workflow
Accounts, and adding new ones, or the mapping/un-mapping of knowledge to the Workflow Accounts.

2.8.2. Mapping General Ledger Accounts to the Financial Statement Structure


Mapping the general ledger accounts to the Financial Statement structure is arguably the most important
function of the account mapping feature. Below, the user can find the steps to perform this activity:
1. To map accounts from the left pane to the Financial Statement structure on the right pane, verify
that the GL account is highlighted in gray. This is done by clicking on the account name and
number.

Figure 49 – Account selection

It is possible to select more than one account using the mouse in combination with the Control or Shift
keys on your keyboard.

2. Once the account is selected, use the mouse to drag and drop the account to the Financial Statement
structure node on the right-hand side of the screen.

Figure 50 – Account mapping drag and drop functionality

3. Make sure the accounts are dragged and dropped to Workflow Account nodes on the right side, and
not to Financial Statement captions. Otherwise, the Unable to Map error message will display in the
upper left corner of the screen.

Figure 51 – Account mapping feedback

4. Once the account is mapped, it will be removed from the left pane and display on the right pane with
the Account mapped indicator:

Figure 52 – Mapped account

5. Upon mapping, the Workflow Account and its parent Financial Statement captions nodes will be
updated with the aggregated ending balance of the accounts, based on accounts mapped to these
nodes.

2.8.3. Changing and Un-mapping Accounts


The engagement team can change the account mapping during the mapping stage by manually
un-mapping the accounts or by dragging and dropping accounts into another workflow account.
1. Verify the account to be changed is selected (highlighted in gray).

Figure 53 – Account mapping feedback

It is possible to select more than one account by using the mouse in combination with the Control or
Shift keys.
2. Drag and drop the selected GL accounts into another workflow account. In case the accounts are
placed into a Financial Statement caption or a total, the Unable to Map error message displays in the
top left corner of the screen.

Figure 54 – Account mapping feedback

3. Once the account is moved, the balances will be updated in both the previous workflow account and
the new account, along with the corresponding parent Financial Statement captions nodes.

Figure 55 – Mapped account following remapping

Similar to moving an account from one node to another, the user can make use of the Unmap feature
within the account mapping. In order to do so, the user should:
1. Verify the account to be un-mapped is highlighted in gray.

Figure 56 – Mapped account selection

It is possible to select more than one account by using the mouse in combination with the Control or
Shift keys.
2. Right-click the selected accounts and select the Unmap menu option.

Figure 57 – Account unmap pop-up

3. The un-mapped GL account will be removed from the Financial Statement structure on the right pane
and will display again on the left-hand side (as part of the General Ledger accounts) with the
Unmapped indicator.

Figure 58 – Previously mapped account unmapped

In order to perform a “bulk remove” activity, the account mapping screen also offers the ability to un-map all
accounts with one click by using the Clear Mapping button. This option removes all the general ledger
accounts mapped to Workflow Accounts and returns them to the left-hand side of the account mapping
screen with an indication of “Unmapped” accounts.

1. Click the Clear Mapping button on the upper right side of the screen.

Figure 59 – Clear Mapping functionality

2. A confirmation message displays to confirm the un-mapping of all GL accounts.

Figure 60 – Unmapping notification

3. All the accounts will be removed from the right pane and return to the left pane with the Unmapped
account indicator. A confirmation message displays in the upper left corner of the screen.

Figure 61 – Unmapping confirmation

2.8.4. Saving the Account Mapping


This feature allows for saving the account mapping at any time during the mapping process for the analysis
by clicking the Save button. A confirmation message displays in the upper left corner of the screen.

Figure 62 – Account mapping save functionality

It is highly recommended that teams frequently save the account mapping. In addition, it is
recommended to save the mapping whenever changes are made, to ensure the latest copy of the
mapping is available and to prevent situations where data needs to be reprocessed because recent
changes were not reflected.

2.8.5. Using the Export and Import functionality


In order to support roll-forward of account mapping, the Data Workbench provides roll-forward functionality
via the import and export of previously performed account mappings.

 Note: In cases where the number of accounts exceeds 5,000 for import or 1,000 for export, the account
mapping will be handled through a back-end service. As a result, the user will have to wait for the import/
export Account Mapping task to be completed; the process can be monitored through the Activity Log.

Export Account Mapping Report


After confirming the Account Mapping is saved, click the Export button. A pop-up is displayed which has
the following buttons for managing the Account Mapping file export to Excel:

Figure 63 - Account Mapping Report - file export

− “Export”, which is active when the pop-up is initially displayed

− “Download”, which becomes active after the report is exported, and

− “Cancel Export”, which cancels an export that is in progress.

The Account Mapping report will be exported into an Excel file.
If the Account Mapping Report was previously downloaded, when the Export function is selected again, the
pop-up will include the date and time of the previous download. Progress of the task can be tracked
through the Activity log feature.
The downloaded Excel file which by default is named “Account Mapping” + (name of the Analysis) displays
the following information:

Figure 64 – Account mapping Excel file

1 Data Workbench analysis data


− Engagement name

− Total mapped, unmapped and inactive accounts


2 Country and industry selected in the Engagement Profile.

3 Template version of the account mapping report file.
4 Status of the account (Possible values: Mapped/Unmapped)
5 GL Account number
6 GL Account Name
7 GL Account type (Possible values: Income Statement (P) / Balance Sheet (B))
8 Labelled account: This is the workflow account to which the GL account was mapped.
Depending on whether the workflow account was linked to knowledge accounts, it displays
the value in one of the following ways:
− Workflow account linked to knowledge account:
(Knowledge account identifier) – Workflow account Name.
e.g. 83948 - Equity-accounted investees, 1044862 - Deferred tax assets

− Workflow account not linked to knowledge account:


Workflow account name
e.g. Administrative expenses, Equity-accounted investees.
9 Active/Inactive: Indicator that describes whether the GL account has journal entry
transactions identified during the analysis period
From the Labeled Account Name column, it is possible to change the mapping of the account to be
imported into Data Workbench.

Figure 65 – Account mapping in Excel functionality

Import Account Mapping Report


1. Click the Import button and a “Browse File” pop-up is displayed:

Figure 66 - Import Account Mapping - pop-up

2. When the applicable file is selected, the “Browse File” pop-up is displayed again, providing status of
the import and instructions to refresh the web browser to display the accounts mapped from the
imported Account Mapping Report.

Figure 67 – Account mapping import file selection

After refreshing the web browser, the account statistics are updated, and the GL accounts are mapped
to the Workflow Accounts.
If the account mapping report file cannot be read, an error message displays in the upper corner of the
screen.

Figure 68 – Account mapping error for unsupported account mapping file

2.8.6. Reviewing the Account Mapping Validation Report


The Account Mapping Validation report contains the following validations:

− Accounts with Unmatched Account Type: Cases where balance sheet accounts (as per source
data) are mapped to income statement accounts within the Financial Statement hierarchy and vice
versa. This validation is performed irrespective of whether the workflow accounts are linked to
knowledge.

− Accounts with Unexpected Balances: This validation is performed only when the workflow accounts
are linked to the Library. Based on the Library account association, the validation checks whether the
accounts mapped to each workflow account match with the expected balance of that workflow account
(debit or credit).

− Unmapped accounts with current year non-zero balances / non-zero transactions: This validation
displays any unmapped accounts that are relevant to the analysis, i.e. accounts that have balances at
the ending period or transactions during the period. Only accounts that have neither (i.e. no balance,
and no transactions) do not have to be mapped.
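The third validation amounts to a simple filter over the unmapped accounts, which can be pictured as follows (a sketch under assumed field names, not the actual validation code):

```python
def accounts_requiring_mapping(unmapped_accounts):
    """Return unmapped accounts that are relevant to the analysis:
    a non-zero ending balance or transactions during the period."""
    return [
        account for account in unmapped_accounts
        if account["ending_balance"] != 0 or account["transactions"] != 0
    ]

unmapped = [
    {"number": "1000", "ending_balance": 0,     "transactions": 0},
    {"number": "4000", "ending_balance": 12500, "transactions": 0},
    {"number": "7010", "ending_balance": 0,     "transactions": 42},
]
# Account 1000 has neither balance nor activity and may stay unmapped.
print([a["number"] for a in accounts_requiring_mapping(unmapped)])
# ['4000', '7010']
```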
The steps to take to review the Account Mapping Validation report are:
1. To access the Account Mapping Validation report, click Mapping Validation Report.

Figure 69 – Mapping validation report

2. To display the detail of the accounts falling in each validation, expand the section.

Figure 70 – Accounts identified as having unmatched account types

3. The list will display the following information:

− GL Accounts

− Account number and name

− Account type (Possible values: Balance Sheet/Income Statement)

− Accounts linked to knowledge

− Name of the workflow account where the GL account was mapped

− Type of the workflow account (Possible values: Balance Sheet/Income Statement).


There are two actions available in the Account Mapping Validation Report: users can either un-map
the accounts (which removes the mapping in the account mapping screen and moves the account
back to the unmapped accounts on the left-hand side of the screen) or ignore the validations (which
allows users to manually overrule the suggestion). Select one or all accounts under the specific action
and click the Save button.

Figure 71 – Unmap following account mapping validation functionality

When selecting the Unmap action, the selected accounts return to the left pane and the confirmation
message displays in the top left corner of the screen.

Figure 72 – Mapping validation report following correction

When selecting the Ignore action, the selected accounts will remain in the Validation report and the
confirmation message displays in the top left corner of the screen.

Figure 73 – Validation save notification

In addition, an Excel file with the account mapping validations can be downloaded by clicking the
Export button.

Figure 74 – Mapping validation report Excel export functionality

The exported file contains one sheet for each validation.

Figure 75 – Mapping validation Excel file

2.8.7. Publishing the Account Mapping


After all active accounts have been mapped and the validations have been verified and either
addressed or ignored, the account mapping can be published to process the selected analytics.
1. Navigate to the Account Mapping Validation report.

Figure 76 – Account mapping validation report navigation

2. Verify all the accounts have been mapped and there are no warnings requiring any action. Then click
the Publish button.

Figure 77 – Account mapping publish functionality

3. In the Review and Publish window, confirm the account mapping has been reviewed and is ready to
be published. Then click the Publish button.

Figure 78 – Review and publish pop-up

4. A confirmation message displays in the top left corner of the screen.

Figure 79 – Account mapping publish confirmation

Parameters

Figure 80 – Parameters

Within the Parameters screen, the user can enter processing-relevant parameters, most notably driven by
high-risk criteria within the analysis. The available parameters depend on the analytics selected and are
mostly optional in nature. Parameters can be split into two areas: KPMG Clara analytics parameters and
Financial Services Routine parameters.

2.9.1. General Ledger Analysis Parameters


As part of the KPMG Clara analytics parameters, the following configurations can be made:

− Defining the client name

− Defining the number of periods in the fiscal year (excluding the adjustment periods) for the
annualization of balances on the Planning Analytics dashboard

− Defining for which accounts Account Analysis needs to be performed

− Defining which industry drives the Account Analysis expectations

− Defining relevant weekend days for the weekend-posting analysis in the Journal Entry Analysis
dashboard

− Defining relevant holiday days for the holiday-posting analysis in the Journal Entry Analysis dashboard

− Defining period-end for analysis in the Journal Entry Analysis dashboard

− Defining whether modified Account Analysis expectations should be used or not.
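As an illustration of the weekend-day parameter, the weekend-posting analysis can be thought of along these lines (a hedged sketch; the dashboard's actual implementation is not reproduced here, and the dates below are examples only):

```python
from datetime import date

def weekend_postings(entries, weekend_days=(5, 6)):
    """Flag journal entries posted on the configured weekend days.
    Python's date.weekday(): Monday=0 ... Sunday=6; the default
    (5, 6) corresponds to Saturday and Sunday."""
    return [e for e in entries if e["posting_date"].weekday() in weekend_days]

entries = [
    {"doc": "100001", "posting_date": date(2022, 6, 4)},  # a Saturday
    {"doc": "100002", "posting_date": date(2022, 6, 7)},  # a Tuesday
]
print([e["doc"] for e in weekend_postings(entries)])  # ['100001']
```

A client with a Friday/Saturday weekend would pass `weekend_days=(4, 5)` instead; the weekend-day parameter on the Parameters screen serves the same purpose.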


In the section below, we will go deeper into the Account Analysis parameters, as these require some
additional guidance.

2.9.1.1. Client Name


The Client Name text entry is used by engagement teams to define the client name that will be displayed in
the Excel dashboards. The client name should not contain more than 250 characters.

2.9.1.2. Number of periods


Number of Periods is used by engagement teams to define the number of periods for annualization in the
Planning Analytics dashboard. In order to provide an accurate annualization, the engagement team should
not include periods such as consolidation or elimination periods in the number of periods.
The number to be entered here should correspond with the number of fiscal/financial periods the client
recognizes within the fiscal year. This is not to be confused with the number of periods provided in the data
(analysis periods).
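To illustrate why this distinction matters, annualization can be sketched as follows (illustrative only; the actual formula applied by the Planning Analytics dashboard is not reproduced in this guide):

```python
def annualize(balance_to_date, analysis_periods, fiscal_periods):
    """Scale a balance covering the analysis periods in the data to a
    full-year estimate, using the number of regular fiscal periods
    (consolidation/elimination periods excluded from the count)."""
    return balance_to_date / analysis_periods * fiscal_periods

# Entity with 12 fiscal periods; imported data covers 9 periods
print(annualize(900_000, 9, 12))  # 1200000.0
```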

Figure 81 – KPMG Clara analytics Parameters (Number of periods)

2.9.1.3. Account Analysis Scoping
The Account Analysis Scoping feature is used to select which accounts will be analyzed using Account
Analysis. Once accounts are selected, Account Analysis will only process those journal entries which
contain, in at least one of their line items, one or more of the selected accounts.
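The scoping rule above can be expressed as a short filter (an illustrative sketch; the record layout is assumed, not the application's data model):

```python
def scoped_journal_entries(journal_entries, selected_accounts):
    """Keep only journal entries that contain, in at least one of
    their line items, one of the accounts selected for scoping."""
    selected = set(selected_accounts)
    return [
        entry for entry in journal_entries
        if any(line["account"] in selected for line in entry["lines"])
    ]

entries = [
    {"doc": "100001", "lines": [{"account": "4000"}, {"account": "1200"}]},
    {"doc": "100002", "lines": [{"account": "6000"}, {"account": "2100"}]},
]
# Scoping on account 4000 keeps the whole first entry, including
# its line item on account 1200.
print([e["doc"] for e in scoped_journal_entries(entries, ["4000"])])
# ['100001']
```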

Figure 82 – KPMG Clara analytics Parameters (Account Analysis Scoping)

1 This parameter is only available when Account Analysis has been selected in the Analytics
screen.
2 Use the search functionality to assist in searching and filtering the Workflow Accounts.
3 Workflow Accounts can be selected or deselected using the checkboxes found next to the
Workflow Account name.
Refer to the KPMG Clara Account Analysis User Guide for more insights into this functionality.

2.9.1.4. Industry Selection


The industry parameter selection is used to define the industry assumptions that will be used to populate
the expectations of Account Analysis.
Currently, ten industries are available for use with Account Analysis: Automotive, Industrial Products,
Consumer Products, Life Sciences, Services, Energy and Natural Resources, Real Estate, Retail,
Technology, and Telecommunications.

Figure 83 – KPMG Clara analytics Parameters (Industry Selection)

This drop-down menu is populated with all industries selected within the engagement profile page in KPMG
Clara workflow. When attempting to select an industry not authored (see industries above), a notification
displays stating the industry cannot be used with Account Analysis:

Figure 84 – Industry selection error message

When there are multiple industries that have been mapped to the engagement profile, and one of the
industries has been authored, the “Up-Front Expectations” section should be used to view the default
account pairing expectations for the accounts mapped in the “Account Mapping” screen and to review the
appropriateness of the industry assumptions that will be applied.

2.9.1.5. Up-Front Expectation Changes


The Up-Front Expectation Changes screen is used by engagement teams to modify the default industry
expectations used in account analysis. This allows engagement teams to apply entity-specific
considerations to the expectations of the account analysis pairings prior to processing.
Engagement teams have the ability to select whether they would like to apply up-front expectation changes
by (1) making expectation changes to the current phase expectation matrix (which can be exported to the
user’s desktop) or (2) using a roll-forward file from a previous period (e.g. using expectation changes from
interim to final).

Figure 85 – Parameters (Up-Front Expectation Changes)

If there are no up-front expectation changes to be made from the standard industry template, then select
No for both questions. Engagement teams are still able to make changes to the account pairing
expectations in the Account Matrix, found within the Account Analysis portal.
If a roll-forward file is available, select Yes for “Do you want to roll forward expectations from a previous
engagement?” This activates the Import button and allows the upload of a roll-forward file (see Rolling
Forward the Expectation Matrix).
When rolling forward an “Expectation Matrix”, engagement teams review the prior phase “Expectation
Modification” report and the “Expectation Matrix” before the file is imported (e.g. during the process
walkthrough(s)), and (1) confirm that any changes made in the prior phase are still applicable and (2)
consider the need for any additional changes to be made to the “Expectation Matrix”.

Figure 86 –Parameters (Roll-forward expectations)

When up-front expectation changes are to be made without using a roll-forward file, the engagement team
should select Yes to the question “Do you want to make up-front modifications to the current phase?”. This
activates the Export button and allows the download of the “Expectation Matrix” file.
The Expectation Matrix is used to change the default Knowledge applied prior to data processing.

Figure 87 – Expectation matrix

The Expectation Matrix allows engagement teams to make the same modifications that would be
available within the account analysis portal within the Account Matrix. Engagement teams can modify the
cells using drop-down menus as follows:

− Default Expected (green) cell can be modified to Unexpected (red) or back to Expected (green).

− Default Unexpected (red) cell can be modified to Expected (green) or back to Unexpected (red).

− Default Unique (grey) cell can be modified to Expected (green), Unexpected (red), or back to Unique
(grey).

− Default Same (black) cell can be modified to Expected (green), Unexpected (red), or back to Same
(black).
This matrix uses the knowledge account mapping performed in the Financial Statement Module. Therefore,
engagement teams are unable to further modify the names within the matrix nor add/remove columns/rows.
Making a change to the expectation matrix, beyond the accounting combination expectation, will result in
the failure of the import of the modified expectation matrix.
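As an illustration of these rules, the allowed cell transitions can be expressed as a small lookup table. The following is a hypothetical Python sketch, not the product’s implementation; the state names simply mirror the colour-coded values described above.

```python
# Hypothetical sketch of the Expectation Matrix cell rules described above.
# Each default state may only be changed to the states listed here; any
# other edit would cause the import of the modified matrix to fail.
ALLOWED_TRANSITIONS = {
    "Expected":   {"Expected", "Unexpected"},            # green cell
    "Unexpected": {"Expected", "Unexpected"},            # red cell
    "Unique":     {"Expected", "Unexpected", "Unique"},  # grey cell
    "Same":       {"Expected", "Unexpected", "Same"},    # black cell
}

def is_valid_change(default_state: str, new_state: str) -> bool:
    """Return True if the cell edit would be a permitted modification."""
    return new_state in ALLOWED_TRANSITIONS.get(default_state, set())
```

For example, `is_valid_change("Unique", "Expected")` is permitted, whereas changing an Expected cell to Unique is not.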
Once the Expectation Matrix has been modified, it can be saved locally to an engagement team
member’s desktop. Once saved locally, the Expectation Matrix can then be imported by selecting Import
to complete the upload.
A confirmation message displays at the top of the page stating “Alert! Expectation modification has been
successfully imported”, verifying the file has been imported.
Once the modified Expectation Matrix has been successfully imported, an engagement team can review
and document the up-front expectation changes made by clicking Review Expectations.

Figure 88 – KPMG Clara analytics Parameters (Up-Front Expectation Changes)

Review Expectations is used to document procedures and rationale considered by the engagement team
in making modifications to the default knowledge. When modifications are made, the debit, credit, prior year

expectation, default expectation, final expectation and source columns are automatically populated. The
prior year expectation column displays the modified expectation from the roll-forward file (if applicable).
Otherwise, it displays Not Applicable.
The Procedures Performed and Rationale and WP Reference columns are required to be completed by
the engagement team, whether modifications have been made through a rolled-forward Expectation
Matrix or a newly modified Expectation Matrix. The documentation requirements for these columns are
the same as those required for the Account Analysis Expectation Modification Report.
All pre-populated and additional documentation completed are carried forward to the Expectation
Modification Report within the Account Analysis Portal.
Once the relevant text boxes have been populated, select Save and click the Next arrow to continue to
Scoping.

2.9.1.6. Rolling Forward the Expectation Matrix


The Expectation Matrix automatically updates to reflect modifications made in both the Up-Front
Expectation Changes screen, as well as the Account Matrix found in the Account Analysis Portal.
Once all necessary modifications are completed for the engagement, the Expectation Matrix can be
exported from the Expectation Modification Report found within Account Analysis Reports. The Expectation
Matrix is retained if intended to be used to roll-forward future phases (or periods).
The roll-forward logic considers first the mapped knowledge account (from the Library) mapped to the
workflow account. In the event there are multiple workflow accounts linked to the same knowledge account,
the account name is used to differentiate the expectations to be applied from the roll-forward file.
For example, the financial statement structure for the industry Industrial Products contains the following
accounts:
Workflow account name: “Global Revenue”, mapped knowledge account: Revenue
Workflow account name: “USA Revenue”, mapped knowledge account: Revenue
In the course of the analysis, “Global Revenue” does not have cash transactions, and so the expectation of
Credit: “Global Revenue” Debit: “Cash” remains as the default expectation, “unexpected”. In contrast,
“USA Revenue” does have cash transactions, and so the expectation Credit: “USA Revenue” Debit:
“Cash” is changed to “expected”.
For the purpose of the roll-forward, the logic will first use the knowledge accounts of “Revenue” and “Cash”
to identify changes to the default expectation. However, in this case, it would identify that there are two
accounts mapped to “Revenue”; therefore, it will move to using the name of the account to identify the
difference in expectations between the “Global Revenue” account and the “USA Revenue” account,
allowing the expectation change to be applied to the correct account.
In the event that the workflow account name has changed from phase to phase, for example, “USA
Revenue” is now “Domestic Revenue” in the financial statement structure, the expectation will no longer be
rolled-forward as the logic would not be able to differentiate between the two accounts.
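The roll-forward matching described above can be sketched as follows. This is an illustrative approximation of the logic, not the actual product code; the function name and data shapes are assumptions made for the example.

```python
# Illustrative sketch (not the product implementation) of the roll-forward
# matching logic: expectations are first matched on the mapped knowledge
# account; when several workflow accounts share the same knowledge account,
# the workflow account name disambiguates. If the name has changed between
# phases, the expectation is not rolled forward.
def match_rollforward(current_accounts, prior_entry):
    """current_accounts: list of (workflow_name, knowledge_account) tuples.
    prior_entry: (workflow_name, knowledge_account) from the roll-forward file.
    Returns the matching current workflow account name, or None."""
    prior_name, prior_knowledge = prior_entry
    candidates = [n for n, k in current_accounts if k == prior_knowledge]
    if len(candidates) == 1:
        return candidates[0]   # unique knowledge-account match
    if prior_name in candidates:
        return prior_name      # disambiguate by workflow account name
    return None                # name changed -> expectation not rolled forward

accounts = [("Global Revenue", "Revenue"), ("USA Revenue", "Revenue")]
match_rollforward(accounts, ("USA Revenue", "Revenue"))       # -> "USA Revenue"
match_rollforward(accounts, ("Domestic Revenue", "Revenue"))  # -> None
```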

2.9.1.7. Period End Journals


When activating the period end journal entry option, the user is presented with an additional grid for the
configuration of the financial posting periods and potential post-closing periods. Within this grid, the user
can select the start- and end-date of each period to be considered. Note that an overlap of financial periods
is not possible, and that each posting date can only be attributed to a single posting period. In order to
control these inputs, upon selecting the end date of a specific financial period, the start date of the next
period is automatically set to the next possible calendar date.
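This date constraint can be illustrated with a minimal sketch (hypothetical, not the product code): the next period’s start date is simply the day after the previous period’s end date, which guarantees that posting periods never overlap.

```python
# Minimal sketch of the posting-period constraint described above: periods
# may not overlap, so when a period's end date is set, the next period's
# start date becomes the following calendar day.
from datetime import date, timedelta

def next_period_start(previous_period_end: date) -> date:
    return previous_period_end + timedelta(days=1)

next_period_start(date(2022, 1, 31))  # date(2022, 2, 1)
```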

Figure 89 – Create Analysis pop-up window with post-closing settings
1 Set Defined Period End Analysis of JE Analysis in Scope to Yes using the radio button,
enter the number of adjustment periods and click the refresh icon to add the additional
periods to the grid
2 These checkboxes (uneditable) are the additional periods added to the grid

2.9.2. Financial Services Routines Parameters


As part of the Financial Services routines parameters, the following configurations can be made:

− Defining the client name

− Defining client loan types, and mapping the values from the client data against these customized
categories

− Defining how the client’s loan grades match with the standardized risk ratings within the Financial
Services routines

− Defining the lower- and upper-limits for the stratification buckets for loan balances, loan-to-value ratios,
and payment delinquency days.
With respect to these parameters, there are two “mapping” efforts: the mapping of the client loan types and
the mapping of risk ratings. For the mapping of the client loan types, the user can enter a description for (a
group of) loan types (e.g. commercial loans), and subsequently map the values from the client data to this
caption. In the reports, the self-authored description will appear. In addition, for each created loan type,
the user can define for which analytics this loan type should be considered by selecting the Yes/No option
under each analytic.

Figure 90 – Financial Service Routine Analysis Parameters (Client Loan Type)

Similarly, the mapping of client risk ratings works against a standardized risk rating scale used within the
Financial Services Routines (e.g. 1–10), where the user has to map the values received in the client data
(e.g. A+, A++, etc.) against these standardized categories.

Figure 91 – Financial Service Routine Analysis Parameters (Client Risk Rating)

Lastly, there are three parameters for the setting of stratification limits for various routines:

− Setting lower and upper limits for loan balances (for the Loan Portfolio Overview charts)

− Setting lower and upper limits for loan-to-value ratios (for the Loan Portfolio Overview charts)

− Setting lower and upper limits for payment delinquency days (for the Payment Behavior Analysis
charts).

Figure 92 – Financial Service Routine Analysis Parameters (Stratification limits)

Each of the buckets can be customized to fit the client data. Users should ensure that all lower and upper
limits are defined, because otherwise data processing might fail due to missing limits.
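The completeness check users should apply before processing can be sketched as follows; this is an illustrative helper (the function name and the bucket representation are assumptions for the example), not part of the product.

```python
# Hedged sketch of the check described above: every stratification bucket
# needs both a lower and an upper limit, otherwise data processing might
# fail due to missing limits.
def validate_buckets(buckets):
    """buckets: list of (lower, upper) tuples; returns a list of problems."""
    problems = []
    for i, (lower, upper) in enumerate(buckets):
        if lower is None or upper is None:
            problems.append(f"bucket {i}: missing limit")
        elif lower > upper:
            problems.append(f"bucket {i}: lower {lower} > upper {upper}")
    return problems

validate_buckets([(0, 100_000), (100_001, 500_000)])  # [] -> all limits defined
```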

Validations

Figure 93 – Validation screen navigation

The Validations screen is used to summarize the different types of validation routines performed through
data upload, import, and transformation. The Validations screen also enables manual validation of the
completeness of the import data by using the available reconciliation reports. Users can view the
validations and reports either by analytic group or by analytic (Account Analysis, Planning Analytics or
Journal Entry Analysis, etc.), which is explained in the following subsections.
The Validations screen has the following functionalities:

− Review Upload, Import and Transformation validations

− Download and review the reconciliation reports

− Confirm the data import

− Save the confirmation.

2.10.1. Upload Validation Routines


When uploading a file through Data Import in Data Workbench, the system checks whether the file has
content to be used in the Data Import. If one or more of the uploaded files has no content, the “0 KB
validation” fails and a pop-up warns the engagement team that the file cannot be uploaded in the Data
screen. The failure of an upload validation routine prevents the completion of Data Import.

Figure 94 – 0 KB Validation

2.10.2. Import Validation Routines


The import validation routines are designed to check whether the raw data files follow the required format
as specified in the Data workbench - Data Requirements. Import validation routines include checks for file
encoding, file types and column delimiters. The failure of an import validation routine prevents the
completion of data import.
For detailed information about these file validation routines, refer to Appendix A. Summary of Data Import
Validations.

Figure 95 – Import Validation on Validations screen

The status of the import validation section includes the following features.

− Validation status is presented in each validation column per data file:

-- Validation passed

-- Validation failed

-- Warning, but not a hard stop of the process

-- Specific validation not applicable to this file

-- Validation has been overridden in an ERP Pipeline (for additional information, see here)

− Exporting the validation results into Microsoft Excel® can be done by clicking the Open in Excel button
in the Validation pop-up on the Data screen. This Excel report may be used to assist in obtaining
updated data files when file validation issues have occurred.

2.10.3. Transformation Validation Routines


The transformation data validation routines consist of both screen level validation routines, as well as other
data validation routines. The status of the screen level validation routines is shown to engagement teams
instantly upon the selection of the date format, thousand separator, decimal separator, and mapping of
mandatory source fields. The screenshot below displays the data validation routines in the Validations
screen.

Figure 96 – Transformation Validation on Validations screen

The status of the transformation validation section includes the following features:

− Validation status is presented in each validation column per file type:

-- Validation passed

-- Validation failed

-- Warning, but not a hard stop of the process

-- Specific validation not applicable to this file

-- Validation has been overridden in an ERP Pipeline (for additional information, see here)

− Exporting the validation results into Microsoft Excel® can be done by clicking the Open in Excel button
in the Validation pop-up on the Data screen. This Excel report may be used to assist in obtaining
updated data files when file validation issues have occurred.
For detailed information about the above validation routines, refer to Appendix A. Summary of Data Import
Validations. Note that the results in the validation pop-up display on two pages.

2.10.4. Analytic Reports and Reconciliation
The Reconciliation Reports enable the user to perform a manual validation of the completeness of imported
data by providing the ability to reconcile imported data to the source data files.
The screen contains three access points to download the following reports:

− Account Balance Report

− Trial Balance Report

− Periodic Transaction Report.


These reports are available and completely populated for the full scenario. The completeness check on
balances should be addressed outside of Data Workbench. By default, when in a JET only scenario, the
Trial Balance Report and Periodic Transaction Report are the only reports populated without showing the
General Ledger account name.

Figure 97 – Analytics reports and Reconciliation on Validations screen

2.10.4.1. Account Balance Report


The Account Balance Report summarizes cumulative balances at the general ledger account level at the
end of each financial period by Company Code and allows the engagement team to manually validate if the
data is complete prior to confirming and processing the data through Data Import. If the balances do not
reconcile, then a new file can be uploaded and there is no downstream impact. Once the data has been
validated, this report can be used for analysis purposes (e.g. planning analytics, final analytics). This report
displays the following information:

− Opening balances, which are obtained from the Balances file

− Cumulative account balances at the end of each financial period, calculated by adding opening
account balances and total balance of transactions for each financial period (i.e. sum of debits and
credits) obtained from the Journal Entry Transactions file

− Closing balances, which display the cumulative balance calculated for the last financial period of the
year.
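The cumulative-balance calculation described above can be illustrated with a short sketch. This is an assumption-laden example, not the product’s code: credits are represented as negative amounts so that the “sum of debits and credits” is a signed net figure per period.

```python
# Illustrative recomputation of the Account Balance Report figures: each
# period's cumulative balance is the opening balance (from the Balances
# file) plus the net transaction totals of all periods to date (from the
# Journal Entry Transactions file, credits taken as negative).
def cumulative_balances(opening, period_net_activity):
    """Return the cumulative balance at the end of each financial period."""
    balances, running = [], opening
    for net in period_net_activity:
        running += net
        balances.append(running)
    return balances

cumulative_balances(1000.0, [250.0, -100.0, 50.0])  # [1250.0, 1150.0, 1200.0]
```

The last element corresponds to the closing balance for the year, which is why the closing balances across the three reports should agree.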
Opening account balances and closing account balances shown in all three reports discussed above
should agree with each other.
To export the data to Microsoft Excel®, click the Open in Excel button.

Figure 98 – Account Balance Report

2.10.4.2. Trial Balance Report
The Trial Balance Report is the second option in the Validations screen. Clicking the Open in Excel button
exports the Trial Balance Report. While the Account Balance Report and the Periodic Balance Report
provide similar information in a different format from the Trial Balance Report, the Trial Balance Report
reflects the standard format that can be used to verify the completeness of the imported data files by
Company Code. Evaluation of the completeness of the journal entries can be performed in Data
Workbench through the use of the Trial Balance Report.
The Trial Balance Report contains the following information at the general ledger account level:

− Opening Balance column, which displays current period opening account balances obtained from the
Balances file.

− Total Debits column, which displays the total amount of debit postings recorded to each account
throughout the fiscal year per the Journal Entry Transactions file.

− Total Credits column, which displays the total amount of credit postings recorded to each account
throughout the fiscal year per the Journal Entry Transactions file.

− Calculated Closing Balance column, which displays the calculation based on the Opening Balance,
Total Debits and Total Credits columns in the Trial Balance Report.

− Imported Closing Balance column, which displays the current period closing account balances
obtained from the Balances file.

− Differences column, which displays the calculation based on the difference between the calculated
closing balances and the imported closing balances.
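The reconciliation performed by the last three columns can be sketched as follows. The sign convention used here (debits increase and credits decrease the balance) is an assumption for illustration only; the example is not the product’s calculation logic.

```python
# Hedged sketch of the per-account Trial Balance Report reconciliation:
# the calculated closing balance is derived from the opening balance and
# the debit/credit totals, and the Differences column compares it with
# the imported closing balance from the Balances file.
def trial_balance_row(opening, total_debits, total_credits, imported_closing):
    calculated_closing = opening + total_debits - total_credits
    difference = calculated_closing - imported_closing
    return calculated_closing, difference

trial_balance_row(500.0, 300.0, 200.0, 600.0)  # (600.0, 0.0) -> reconciles
```

A non-zero difference indicates that the imported journal entries do not fully explain the movement between the opening and closing balances for that account.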

Figure 99 – Trial Balance Report

The engagement team should export the Trial Balance Report to Microsoft Excel® via the Open in Excel
button and use it to (a) assist with the manual reconciliation of the imported data with the current year trial
balance subject to audit and (b) document the performance of such data validation procedures; refer to
data verification in the KPMG Clara Computer Assisted Audit Techniques (CAATs) Document.

2.10.4.3. Periodic Transaction Report


The Periodic Transaction Report displays the total activity for each financial period. It contains the following
information at the General Ledger account level by Company Code:

− Opening balances, which are obtained from the Balances file.

− Total balance of transactions (i.e. sum of debits and credits) recorded at each financial period, which
are obtained from the Journal Entry Transactions file.

− Closing balances, which are calculated by adding opening balances and the total period activity for
the entire fiscal year.
This report can also be used for trend analysis and seasonal analysis, once the data has been validated.
To export the data to Microsoft Excel®, click the Open in Excel button.

Figure 100 – Periodic Transaction Report

2.10.5. Confirmation
When the Data Validations and Analytics reports have been reviewed by the engagement team, the
Confirmation section is used to manually confirm that the analytics are ready for processing by checking
the “Confirmation” checkbox for each analytic selected in the analysis. This confirmation is a mandatory
step and only confirmed analytics can be processed.

Figure 101 – Confirmation with the Confirm checkboxes highlighted

After saving the Confirmation screen, click the Next arrow to proceed to the processing stage.

Process

Figure 102 – My Analysis-Status Bar

The purpose of the Process screen is to start analytics processing once all analysis parameters have been
completed and to provide information about the progress of the process. If the Process button
is not active, as shown highlighted below, navigate to the Confirmation screen, review the analytics status
and confirm that all the analysis steps have been completed (e.g. green tick marks in the subway map).

Figure 103 - the Process screen

The Process screen displays the following information:

− Processing Business Logic: Lists all the business logic details of the analytics that are being processed

− Preparing Results Sets: List the status of the results data model preparation (e.g. Power BI data
model)

− Generating Result Visualizations: Displays the status of the results dashboards configuration (e.g.
setting up the data source connection and refreshing the dashboards)
The Processing Business Logic table shows the Process Status for each analytic, together with
timestamps for the start and end time of processing. Each analytic has a chevron that can be expanded to
view the status of the individual procedures executed on the transformed data. In case of a processing
failure, the user is able to identify the specific components that failed. There is a checkbox next to each
business logic component, which the user can select to rerun specific steps after changes to imported
data, account mapping or parameters (or any other changes in the analysis), or when there is a technical
issue with the process.

Figure 104 – Processing Business Logic table

The Preparing Results Sets table lists the underlying results data source that is used to run and display the
results of the selected analytics. Similar to the Processing Business Logic table, this
table also displays the status and the start and end time of processing of the results data model. There is
also a checkbox to re-run (reprocess) the package in case of changes to the analysis or possible
technical issues.

Figure 105 - Preparing Results Sets table


The Generating Result Visualizations table lists the steps to prepare the visualization artifacts that are
used to display the analytics results. Similar to the two tables above, this table provides information on the
status and start / end time of the process and provides the ability to reprocess if required.

Figure 106 - Generating Result Visualizations table

Results

Figure 107 – My Analysis-Status Bar

The Results tab displays the following information related to successfully processed analytics:

− Analytic Name

− Related Group

− Related Process

− Related File Type(s)

− A toggle to enable/disable the Results Viewer

− The View Results button


Note that the Results tab is only enabled if at least one analytic within the analysis has been processed
successfully.

Figure 108 – Results Overview

To access the analytics results, click on the View Results button to view the My Results tab that displays
the data widgets for the analytics selected in the analysis.

Figure 109 – Data widget

Users can view the results displayed on the data widgets and download the dashboards results:
1 Open in Excel button to download analytics dashboards.
2 Click on the hyperlink(s) to open analytics specific visualizations

Database Archiving & Restore


2.13.1. Archive an analysis
In order to manage space on the KPMG Clara workflow server environment, the analysis database can be
archived once the analysis is fully processed. This feature is available from the pencil icon in the
Action section of the analysis and then in the “Database Properties” tab of the pop-up.

Figure 110 – Database properties tab

Prior to the archiving being available, a setting in Management Console should be reviewed. Please note
that the archiving feature is by default enabled and can be disabled at a country level. Member firms who
host KPMG Clara workflow in the regional cloud can log a support ticket to change this setting and turn off
the archiving functionality.
Figure 111 – Archive feature default configuration in Management Console

Once all analytics are processed successfully (database and cube processing have completed and
database has been shrunk to release unused space), and no subsequent changes have been made to the
analysis (e.g. parameter, data, account mapping publish, etc.), archiving is initiated:

− Automatically, as a result of an implemented archiving schedule: archiving runs a predefined number
of days (defined at the instance level; 14 days by default) after the last successful processing has taken
place. This setting is managed from the Management Console in the System Configuration section and
cannot be altered or requested to be changed by country. When the archiving feature is enabled, it
takes the value defined at the instance level.

Figure 112 – System Configuration in Management Console

This information is also available in the “Analysis Detail” window for each analysis:

Figure 113 – Archive details

− Manually triggered by the user; from the “Edit Analysis” window, the user can initiate the process by
clicking the “Archive” button. This option is only available when the current status is displayed as
“Shrunk Successfully” or “Archive Failed”, and the analytic processing status is “Processed
Successfully”. This step will then execute the archiving process immediately.

Figure 114 – Archive action button in Analysis details pop-up windows

The database backup will be stored in the location given in the “Archive Path” that is defined in the Analysis
details (available from the database icon in the analysis).
Note: The Archive folder is only accessible to users after the Archive feature has been triggered.

Figure 115 – Archive path in Analysis details pop-up windows

The archiving status is available in the Edit Analysis pop-up windows for each analysis. The following
statuses can occur:

− Provisioned Successfully: The Analysis DB has been provisioned (created) successfully, archive is
not available.

− Provisioning Failed: The Analysis DB provisioning (creation) has failed, archive is not available.

− Database Shrunk Successfully: After data processing, storage space that is no longer required is
released by shrinking the database. The database is ready to be archived (if the Analytic
Processing status displays “Processed Successfully on <DATE>”).

− Database Shrink in progress: Database shrink has been started and is in progress, archive is not
available.

− Archiving in Progress: The archiving has started; no changes should be made to the analysis until
the end of the process.

− Archiving Successful: The archiving has taken place, meaning that the analysis is now in a “read
only” state.

− Archiving Failed: The archiving has failed, please check the Management Console in order to identify
the root cause.

Figure 116 – Archive status in Analysis details pop-up windows

Once the database is archived, space is reduced by removing from the analysis database all tables that
are no longer needed in read-only mode. The following information will
remain available:

Table 6 – List of reports available in Read-only Mode

Screen            Reports available
Data              Transformation validation reports
Account mapping   Account Mapping Export File
Parameter         Expectation modification report
Validation        Account Balance Report; Trial Balance Report; Periodic Transactions Report
Results           Excel Dashboards of processed analytics

After archiving is successful, the analysis will be in “read only” mode, meaning that the following actions will
be deactivated:
Table 7 – List of disabled functions in Read-only Mode

Screen              Action
Analytic Overview   Add analytics
Analytics           Add analytics; Delete analytics
Data                Add data; Delete data
Account mapping     Drag and drop functionality; Map account; Unmap account; Publish account
                    mapping; Refresh Financial Structure; Unmap/Ignore account in Mapping
                    Validation Report
Parameter           Save parameter; Enter/Update parameter; Map/Unmap parameter
Validation          Confirm/Unconfirm analytics; Save validation
Process             Process/Reprocess
Results             Slicer user entry

Note: When deleting or resetting the analysis, the backup will also be deleted from the archive folder

2.13.2. Restore an analysis


When the team needs to make changes to the analytics and then reprocess, the database will have to be
restored on the server. This feature is available from the pencil icon in the Action section of the
analysis and then in the “Database Properties” tab of the pop-up.
The restore feature is available only if the Current Status is “Archiving Successful” or “Restoring Failed”. To
restore the database, the user can click the “Restore” button available in the Analysis details pop-up
window.

Figure 117 – Restore action button in Analysis details pop-up windows

The restoring status is available in the Edit Analysis pop-up windows for each analysis. The following
statuses can occur:

− Restoring in progress: Restoring database has been triggered by the user

− Restoring Successful: Database has been restored successfully, read only mode has been
deactivated (i.e. database is “restored” including all database objects to the point before archiving,
shrinking, etc.).

− Restoring failed: Restoring the analysis database has failed; please check the Management Console
in order to identify the root cause.

Activity Log
Information contained within the Management Console is very useful for determining why an analysis fails
and for debugging and fixing the error. To display the information from the Management Console (user,
task type, created on, last update, status, server, error info) within a specific analysis, the Activity Log
screen can be accessed from two different steps:

2.14.1. Analysis Creation


Firstly, when an analysis is created, the user can click on the In-progress message under the engagement
name on the Advanced Capabilities Portal, and the Activity Log screen will pop up. From this screen the
user can check the status of the analysis creation:

Figure 118 – Access to Activity Log during Analysis Creation

2.14.2 Importing, Transforming and Processing data


Secondly, engagement teams can see information from the Management Console in the Activity Log screen during any of the steps within Data Workbench, from Analysis Creation to Results generation, by clicking the Activity Log icon in the right-margin navigation bar.

Figure 119 – Activity Log icon

Figure 120 – Activity Log screen

The Activity Log can subsequently be exported to Microsoft Excel format using the export button.
This file can be shared with individuals supporting your engagement (e.g. Central Teams, support teams) in case of issues, and contains additional information about the engagement, such as:
− Project Name
− Engagement ID
− Analysis Name
− Analysis Server Name and Instance

Central Team Portal

2.15.1 Project Overview


The Project Overview has several features that give the user a view of the projects to which they have access. The view can be customized to display the projects in a grid view or a tile view.

Figure 121 - Project Overview features

1 The project grid view can be exported to Excel (for reporting purposes)
2 The project view can be toggled between a tile view and a list view (shown here)
3 The user can filter the listed projects to which they have access by Analysis Name
4 The user can control which fields to display in the Project Overview grid, and the order in which they are displayed, from a dropdown menu launched from the list icon.
5 The user can drag and drop one or more column headings to this location (the grouping bar)
to group the Analyses by those categories. Here the column heading “Fiscal year” is the
grouping key.
The Column Options dropdown list allows the user to reorder the available fields on the grid (vertical
ellipsis), remove fields they do not wish to see (trash can), or add fields that are not displayed (+ Add New).

Figure 122 – Column options

2.15.2 Project User Management
The Project User Management tab allows the user to view which other users have access to the Data Workbench project, whether added through Team Management or through the Central Team Portal. This functionality is available not just for the currently selected project but across any projects to which the current active user has access.
When the user first selects the Project User Management functionality, a search box is displayed. It allows the user to search for Projects by Project Name or Client Name.

Figure 123 - Project User Management initial view

As soon as the user enters one or more characters contained within the Project, Client or Username, a list of projects to which the user has access and which meet the search conditions is displayed. As additional characters are entered, the list is refreshed to include only those projects which continue to meet the search criteria.

Figure 124 - Project search results

When a user selects one of the projects meeting the search criteria, its details are displayed, as shown below.

Figure 125 - Project User Management screen

1 Project Name – Hyperlinked to the Advanced Capabilities Portal for the project
2 Client Name
3 Country
4 A list of Analyses within that Project – each analysis listed has a hyperlink to My Analysis
page
5 A list of users assigned to the selected project
6 A button to add additional users to a project – note that users added from the Central Team
Portal only have access to features in the workbench module of the workflow, not the
workflow itself.
The “Users assigned to this project” listing displays information about each user with access to the project, including their first and last names, email address, alias (i.e., the user ID / name portion of their email address), their initials and an icon with their role in the project.

Figure 126 - Users assigned to this project

The user roles for projects displayed in the “Users assigned to the project” listing are defined as follows:
Table 8 - Roles defined in Central Team Portal

Role Description Icon

KCW The user was added from Team Management; this role has full
access to the Engagement and Data Workbench

Workbench The user was added on the Central Team Portal and has access
only to the workbench project and not to the Engagement (e.g.
Financial Statements, Workflow screens, etc.)

Reporting Groups The user was added on the Central Team Portal and has access
only to specific analyses and company codes in the analysis.
This role is only available on Data Workbench with the Sub-Ledger
package deployed (e.g. KPMG Clara Analytics)

2.15.2.1. Adding, deleting, and modifying users in Central Team Portal


Users can be added to a project with a Workbench role as defined in Table 8 above. Note that this means they cannot access overall workflow functionality, only the Data Workbench project and its features, such as creating an analysis, managing data, mapping accounts, etc.
To add a user, click the “+ User” button in the upper left corner of the “Users assigned to this project” screen. This launches the following pop-up:

Figure 127 – Search results for adding a user to add to a project

The “Search User” field allows you to search for a user by one of the following key words:

− First Name

− Last Name

− Email Address

− Alias

− Last Name, First Name


The search returns a list of users to choose from in the format of alias, email address. When you find the
desired user, select them and click OK.

Figure 128 - Adding the selected user

After the user has been selected, a pop-up displaying the other fields of the user’s information appears, with a dropdown for you to select the desired role for the user being added. Select the Workbench User role and the “Add User” button becomes active.

Figure 129 – Role selection

Click “Add User” (which is hidden while the Role dropdown menu is open) and the user is assigned to the project.

Figure 130 - New user assigned to the project

Users listed in the Project User Management in the Central Team Portal can also have several of their information fields in the “Add User” form edited by selecting the corresponding icon; First Name, Last Name, Initials, and Role can be changed. However, the Email address and Alias cannot be modified because these fields link the user back to their KPMG identity information. When modifications are complete, select the “Update” button that appears on the user information form.

Figure 131 - Updating user information

Users added via Project User Management can also be deleted by selecting the corresponding icon. Selecting that icon will launch a confirmation message. Accepting the confirmation will delete the user from that project, but not from other projects to which they are assigned.

Data Preparation Toolbox


Tools and guidance on the Extraction, Transformation and Loading (ETL) of ERP system data into KPMG Clara are made available directly within Data Workbench by linking through to the global Data Preparation Toolbox. The information found there aims to be useful for both Central Teams and engagement team users trying to obtain data from various systems (e.g. via the KPMG data extraction solutions for SAP and Oracle, or via the available ERP ETL guidance), transform data inputs (e.g. via Data Wrangler), or interpret and transform data to be ingested into KPMG Clara’s Data Workbench (e.g. via transformation guidance on best practices for tools like IDEA or Microsoft Excel). The user can access the Data Preparation Toolbox by pressing the corresponding button located on the right-hand side navigation bar within the Data Workbench screens.

Figure 132 – Data Preparation Toolbox access

After clicking the Data Preparation Toolbox button, the user will see a message containing directions for finding the relevant content in the global Data Preparation Toolbox portal.

Appendix A. Summary of Data Import and Transformation
Validations
This appendix summarizes the different types of validation routines performed during the data import process, grouped into import validation routines and transformation validation routines. It provides guidance on how each routine works, the sequence in which they are executed, and the downstream impact of each validation on subsequent validations in the data import process.

A.1. Summary of Data Upload and Import Validations


File validation routines are automated routines that include checks for duplicate data, validation of the encoding type, and checks for the completeness of the data fields, the date format and the number format. When a file validation fails, the data import process cannot proceed for the related data file. Consequently, other validation routines related to that data file will also be affected.

Validation # 1 – 0 KB Validation

Illustration

How it works This file validation routine checks whether raw data files are provided that are 0 KB
in size, indicating that they are blank files, without data or header lines and so
cannot be staged to the analysis database.

Validation type Hard stop. Process cannot continue until all validation issues are addressed

Data files affected − GL Accounts

− GL Account Balances

− GL Trial Balance

− Journal Entry Transactions

− Loan Sub Ledger

− Loan Interest Rate Changes

− Loan Cash Flow Transactions

− Financial Instruments

Potential reasons for validation to fail and recommended action

If the raw data files are blank, the files will not pass the validation and data import cannot proceed. Contact appropriate entity personnel and ask for a new dataset to be provided with the missing data, or proceed with processing without this file, if it is not required.

Other validations affected

− If this file validation fails, data import will not continue with the subsequent validations for the affected data file(s). Note: As there is an additional validation in the upload screen that checks when 0 KB files are uploaded (and prompts the user that this is not possible), this validation should generally not cause any failures.


− Data validation routines will not be triggered for the affected data file(s).
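The 0 KB check described above can be sketched in a few lines. The following is an illustrative Python sketch, not the actual Data Workbench implementation; the function name and file names are hypothetical.

```python
import os

def find_zero_kb_files(paths):
    """Return the raw data files that are 0 KB in size (blank files).

    Such files contain no data or header lines and cannot be staged to
    the analysis database, so the import hard-stops for them.
    """
    return [p for p in paths if os.path.getsize(p) == 0]
```

A pre-upload check like this mirrors the upload-screen validation that prevents 0 KB files from being uploaded in the first place.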

Validation # 2 - File Encoding validation

Illustration

How it works This file validation routine checks whether raw data files are provided in one of the
following encoding types: UTF-16, UTF-8 or ANSI. If any of the raw data files do
not match one of the required encoding types, the validation will fail.

Validation type Hard stop. Process cannot continue until all validation issues are addressed

Data files affected − GL Accounts

− GL Account Balances

− GL Trial Balance

− Journal Entry Transactions

− Loan Sub Ledger

− Loan Interest Rate Changes

− Loan Cash Flow Transactions

− Financial Instruments

Potential reasons for validation to fail and recommended action

If the raw data files are provided in an unsupported encoding type, the files will not pass the validation and data import cannot proceed. Contact appropriate entity personnel and ask for a new dataset to be provided in any of the three acceptable encoding types. Alternatively, ask IT Support to assist with converting the provided data files into one of the acceptable encoding formats.

Other validations affected

− If this file validation fails, data import will not continue with the subsequent validations for the affected data file(s), with the exception of the 0 KB validation, which is run together with the file encoding validation.

− Data validation routines will not be triggered for the affected data file(s).
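A rough encoding classification can be sketched as below. This is a simplified illustration under stated assumptions: BOM-based detection first, then a UTF-8 decode attempt, with anything that is not valid UTF-8 treated as ANSI. A real validator would be stricter (e.g. pure-ASCII content is classified as UTF-8 here, even though it is also valid ANSI).

```python
def guess_encoding(path):
    """Roughly classify a file as UTF-16, UTF-8 or ANSI (single-byte)."""
    with open(path, "rb") as f:
        raw = f.read()
    # UTF-16 byte order marks (little- and big-endian)
    if raw.startswith(b"\xff\xfe") or raw.startswith(b"\xfe\xff"):
        return "UTF-16"
    try:
        raw.decode("utf-8")  # valid UTF-8 (including UTF-8 with BOM)
        return "UTF-8"
    except UnicodeDecodeError:
        return "ANSI"
```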

Validation # 3 - Column Delimiter Check

Illustration

How it works This data validation routine compares the column delimiter configured in the data
upload pop-up with the delimiters found in the data files to import. The validation
returns the rows in the data file which have mismatched column delimiters.

Validation type Hard stop. Process cannot continue until all validation issues are addressed

Data files affected

− GL Accounts

− GL Account Balances

− GL Trial Balance

− Journal Entry Transactions

− Loan Sub Ledger

− Loan Interest Rate Changes

− Loan Cash Flow Transactions

− Financial Instruments

Validation Exception Details

This validation exception shows the row number which has fewer columns than the header row.

Potential reasons for validation to fail and recommended action

When the validation fails, it is probable that a row is missing a column delimiter.
If there are only a small number of differences noted, verify the row numbers for which the validation exception is shown and check whether column delimiters are missing. If a large number of validation exceptions are encountered, verify the number of columns in the header row by verifying the correct number of column delimiters.
The Exception Details field will provide detailed information on why the validation has failed.

Other validations affected

− If this file validation fails, data import will not continue with the subsequent validations for the affected data file(s), with the exception of the 0 KB validation, which is run together with the file encoding validation.

− Data validation routines will not be triggered for the affected data file(s).

Validation # 4 - Column Count Verification

Illustration

How it works This data validation routine compares the number of columns in each row with the
number of columns in the header. The validation returns the rows in the data file
which have missing delimiters.

Validation type Hard stop. Process cannot continue until all validation issues are addressed

Data files affected

− GL Accounts

− GL Account Balances

− GL Trial Balance

− Journal Entry Transactions

− Loan Sub Ledger

− Loan Interest Rate Changes

− Loan Cash Flow Transactions

− Financial Instruments

Validation Exception Details

This validation exception shows the row number which has fewer columns than the header row.

Potential reasons for validation to fail and recommended action

When the validation fails, it is probable that a row is missing a column delimiter.
If there are only a small number of differences noted, verify the row numbers for which the validation exception is shown and check whether column delimiters are missing. If a large number of validation exceptions are encountered, verify the number of columns in the header row by verifying the correct number of column delimiters.
The Exception Details field will provide detailed information on why the validation has failed.

Other validations affected

− If this file validation fails, data import will not continue with the subsequent file validations for the affected data file(s).

− Data validation routines will not be triggered for the affected data file(s).
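The column count verification (and, in essence, the delimiter check before it) amounts to comparing each row's column count against the header row. A minimal Python sketch, assuming a delimited text file; the delimiter and function name are illustrative, not the Data Workbench implementation:

```python
import csv

def check_column_counts(path, delimiter=";", encoding="utf-8"):
    """Return (row_number, column_count) for rows whose column count
    differs from the header row — the condition flagged by the delimiter
    and column-count validations. Row numbers are 1-based; the header
    is row 1.
    """
    exceptions = []
    with open(path, newline="", encoding=encoding) as f:
        reader = csv.reader(f, delimiter=delimiter)
        header = next(reader)
        expected = len(header)
        for row_no, row in enumerate(reader, start=2):
            if len(row) != expected:
                exceptions.append((row_no, len(row)))
    return exceptions
```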

Validation # 5 – Upload vs Imported Record Count

Illustration

How it works This data validation routine compares the number of records in the file with the
number of records imported. In case of failure, the validation returns the total
imported records out of the total records in the file.

Validation type Warning. Not a hard stop to the process but it’s recommended to review all
exception details before continuing

Data files affected

− GL Accounts

− GL Account Balances

− GL Trial Balance

− Journal Entry Transactions

− Loans Sub Ledger

− Loans Interest Rate Changes

− Loans Cash Flow Transactions

− Financial Instruments

Validation Exception Details

This validation exception shows the total imported records vs total records in the file.

Potential reasons for validation to fail and recommended action

When the validation fails, it is probable that a row delimiter and the last data attribute in the row are missing.
If there are only a small number of differences noted, verify the row delimiter for which the validation exception is shown and check whether row delimiters are missing. Depending on the size and complexity of the file, the missing row delimiters can be added. It is recommended to involve a Central Team member when performing transformation procedures on the data.
The Exception Details field will provide detailed information on why the validation has failed.

Other validations affected

If this validation fails, there may be an impact on subsequent data validations depending on the data rows that may be missing as per the record count.
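The record-count comparison can be sketched as follows. This is an illustrative Python sketch, assuming one record per line with a single header line; the function name is hypothetical.

```python
def verify_record_count(path, imported_count, encoding="utf-8"):
    """Compare the number of data rows in the source file (excluding the
    header) against the number of records actually imported.

    Returns (imported, total_in_file, matches), mirroring the
    'total imported records vs total records in the file' exception detail.
    """
    with open(path, encoding=encoding) as f:
        # Count non-empty lines, then subtract the header line.
        total = sum(1 for line in f if line.strip()) - 1
    return imported_count, total, imported_count == total
```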

A.2. Summary of Data Transformation Validations
Transformation validation routines are automated routines (i.e. performed in Data Workbench) designed to assist the engagement team in validating the completeness and accuracy of the imported entity data. These automated validation routines do not replace the engagement team’s required procedures over data integrity. Furthermore, they help ensure that KPMG Clara workflows can use the data for data-driven analytics and workpaper creation.

Validation # 1 - Data Attribute Validation – Number Format

Illustration

How it works This file validation routine checks whether the number fields in the uploaded files
are in the required numerical format as specified in Data Workbench – Data
Requirements.

Validation type Hard stop. Process cannot continue until all validation issues are addressed

Data files affected − Journal Entry Transactions

− GL Trial Balance

− GL Account Balances

Validation Exception Details

Scenario: The data in the file has a period as the decimal separator, except for one field in the endingBalanceLC column. This field has a comma as the decimal separator.

Potential reasons for validation to fail and recommended action

If data in any of the number fields of the raw data files is not in the supported number format, the validation will not pass and the affected files will not be imported. Use the export into Microsoft Excel® functionality to export the exception details into file validation reports. The Exception Details field will provide detailed information on why the validation has failed. Refer to the Data Workbench – Data Transformation guide for guidance on how to transform existing data files into an acceptable format, or contact appropriate entity personnel and ask for a new dataset to be provided with the required number formats.

Other validations − If this file validation fails, data import will not continue with the subsequent file
affected validations for data file(s) affected.

− Data validation routines will not be triggered for the affected data file(s).
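A number-format check like the one described can be sketched with a regular expression. The pattern below accepts a period as the decimal separator with optional comma thousands grouping; this is one plausible reading of the required format used for illustration — the Data Workbench – Data Requirements document is authoritative.

```python
import re

# Accepts e.g. 1234.56, -10.5, 1,234.56; rejects a comma decimal
# separator such as 1234,56 (an assumed interpretation of the format).
NUMBER_RE = re.compile(r"^-?\d{1,3}(,\d{3})*(\.\d+)?$|^-?\d+(\.\d+)?$")

def check_number_format(values):
    """Return the 1-based positions of values that fail the format check,
    e.g. a value using a comma as the decimal separator."""
    return [i for i, v in enumerate(values, start=1)
            if not NUMBER_RE.match(v.strip())]
```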

Validation # 2 - Data Attribute Validation – Date format

How it works This file validation routine checks whether all date fields are in one of the
following supported date formats:

− YYYY/MM/DD

− YYYY.MM.DD

− YYYY-MM-DD

− MM/DD/YYYY

− MM.DD.YYYY

− MM-DD-YYYY

− DD/MM/YYYY

− DD.MM.YYYY

− DD-MM-YYYY

Validation type Hard stop. Process cannot continue until all validation issues are addressed

Data files affected Journal Entry Transactions

Validation Exception Details

Scenario: The data in the JET file has the date format “mm/dd/yyyy”, except for one field in the creationDate column, which has “dd/mm/yyyy” as its date format. What matters for the validation is the order of the date parts day (d), month (m) and year (y); the date parts separator (i.e. “-“, “.” or “/”) of the format selection has no impact on the validation outcome.

Potential reasons for validation to fail and recommended action

If the date in any of the date fields of the raw data files is not in an allowable date format, the validation will not pass, and the affected files will not be imported. Use the export into Microsoft Excel® functionality to export the exception details into file validation reports. The Exception Details field will provide detailed information on why the validation has failed. Refer to the Data Workbench – Data Transformation guide for guidance on how to transform existing data files into an acceptable format, or contact appropriate entity personnel and ask for a new dataset to be provided with the required date formats.

Other validations − If this file validation fails, data import will not continue with the subsequent file
affected validations for data file(s) affected.

− Data validation routines will not be triggered for the affected data file(s).
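A date-format check against a selected format can be sketched with `datetime.strptime`. This is an illustrative Python sketch, not the Data Workbench implementation; note the caveat in the comment, which matches the behavior described above (order mismatches are only detectable when a component falls outside its valid range).

```python
from datetime import datetime

def check_date_format(values, fmt="%m/%d/%Y"):
    """Return the 1-based positions of values not parseable with the
    selected format.

    Note: an ambiguous value like '01/02/2022' parses under both mm/dd
    and dd/mm orders, so an order mismatch is only caught when a
    component exceeds its valid range (e.g. '13/01/2022' as mm/dd).
    """
    bad = []
    for i, v in enumerate(values, start=1):
        try:
            datetime.strptime(v.strip(), fmt)
        except ValueError:
            bad.append(i)
    return bad
```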

Validation # 3 – Blank Data

Illustration

How it works This data validation routine checks whether raw data files contain data in the
required fields (columns). Optional fields can have an actual value, be blank, or
not be provided in the data file. The required fields, however, cannot be blank
and must have an actual value. This automated validation checks whether any
data in the required fields of the raw data files is missing. In instances where
data is missing in required fields, the validation will show a warning message.

Validation type Hard stop. Process cannot continue until all validation issues are addressed

Data files affected − GL Accounts

− GL Account Balances

− GL Trial Balance

− Journal Entry Transactions

Validation Exception Details

Scenario: The column accountName in the GLA file has an empty field.

Scenario: The column accountNumber in the GLA file has an empty field.

Note: If the account number is missing in the GLA file, the following validations
may also show a failed status.

− Account Number Match in GLA and JET files

− Account Number Match in GLA and GLAB files


Scenario: The column accountType in the GLA file has an empty field.

Scenario: The column accountNumber in the GLAB file has an empty field.


Note: If the account number is missing in the GLAB file, the following
validations may also show a failed status because the system cannot match a
<blank> account number with the GLA file.
Account Number Match in GLA and GLAB files
Scenario: The column endingBalanceLC in the GLAB file has an empty field.
This will trigger the number format validation because the field does not have a
thousand or decimal separator.

Scenario: The column financialPeriod in the GLAB file has an empty field.

Scenario: The column fiscalYear in the GLAB file has an empty field.

Scenario: The column accountNumber in the JET file has an empty field.

Note: If the account number is missing in the JET file, the following validations
may also show a failed status because the system cannot match a <blank>
account number with the GLA and GLAB file.

− Account Number Match in GLA and GLAB files

− Calculated Closing Balances versus Imported Closing Balances


Scenario: The column amountLC in the JET file has an empty field. This will
trigger the number format validation because the field does not comply with the
selected number format (the field is empty).

Scenario: The column documentNumber in the JET file has an empty field.

Note: If the document number field is blank in this scenario, the validation
exception detail cannot indicate the document number which is used;
therefore, it references @DocumentNumber.
Scenario: The column financialPeriod in the JET file has an empty field.

Note: If the field financial period is blank in this scenario, the following
validations may also show a failed status.
Period Balance equal to zero
Scenario: The column fiscalYear in the JET file has an empty field.

Note: If the fiscal year field is blank in this scenario, the following validations
may also show a failed status.

− Period Balance equal to zero

− Calculated Closing Balances versus Imported Closing Balance


Scenario: The column postingDate in the JET file has an empty field. This will
trigger the date format validation because the field does not comply with the
selected date format (the field is empty).


Potential reasons for validation to fail and recommended action

If the raw data files do not have data in all required fields (columns), the validation will not pass. When the number of validation failures is large, the export to Microsoft Excel® may take a long time to generate. Instead of exporting to Microsoft Excel®, the Exception Details field can be used, as it provides detailed information on why the validation has failed. It is recommended to contact appropriate entity personnel and ask for a new dataset that contains all necessary data in the required fields.

Other validations affected

If this validation fails, the impact on subsequent data validations depends on which field contains the missing data. For example, if (required) data in the GL account number column is missing in the JE Transactions data file, the subsequent Account Number Match between Trial Balance files and Journal Entry Transaction files will also fail, due to the discrepancy between the respective GL account number values in the GL Accounts (or GLTB) and Journal Entry Transactions data files.
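A blank-required-field check over row data can be sketched as below. This is an illustrative Python sketch; the field names shown in the docstring follow the scenarios above, but the function itself and its signature are hypothetical.

```python
def check_blank_required(rows, required):
    """Return (row_number, field) pairs for required fields that are blank.

    `rows` is a list of dicts (e.g. from csv.DictReader); `required` lists
    the mandatory column names (for the GLA file e.g. accountNumber,
    accountName, accountType). Optional fields may be blank or absent.
    Row numbers are 1-based with the header as row 1.
    """
    exceptions = []
    for row_no, row in enumerate(rows, start=2):
        for field in required:
            if not (row.get(field) or "").strip():
                exceptions.append((row_no, field))
    return exceptions
```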

Validation # 4 - Duplicate Record Validation

How it works This data validation routine checks for the existence of duplicates in the raw data
files. This validation will fail if data files do not meet the following requirements:

− For the GL Accounts data file, there should be a unique entry for the following
attributes: language code, company code, account name, chart of account ID
and account number.

− For the GL Account Balances data file, there should be a unique entry for the
following attributes: account number, document currency, local currency,
company code, fiscal year, financial period, and Debit/Credit indicator.

− For the JE Transactions data file, there should be a unique entry for the
following attributes: Debit/Credit indicator, company code, fiscal year,
document number and line item.

− For the GL Trial Balances data file, there should be a unique entry for the
following attributes: account number, fiscal year, company code
Note: If the data uploaded contains duplicate records, the duplication has to be
resolved by the user before being able to continue processing. It is not possible to
continue to the next stage of the data processing flow without having resolved this
validation failure.

Validation type Hard stop. Process cannot continue until all validation issues are addressed

Data files affected − GL Accounts

− GL Account Balances

− GL Trial Balance

− Journal Entry Transactions


Validation Exception Details

Scenario: Duplicate in GLA file

Scenario: Duplicate in GLAB file

Note: If a balance is duplicated in this scenario, the following validation may
also show a failed status.
Calculated Closing Balances versus Imported Closing Balance
Scenario: Duplicate in JET file

Note: If a transaction is duplicated in this scenario, the following validations may
also show a failed status.

− Period balance equal to zero

− Calculated Closing Balances versus Imported Closing Balance


Scenario: Duplicate in GLTB file

Potential reasons for validation to fail and recommended action

If any of the raw data files have duplicate entries, the validation will not pass, and the affected files will not be transformed. Use the export into Microsoft Excel® functionality to export the exception details into file validation reports, and contact appropriate entity personnel to ask for a new dataset with all duplicate entries cleared. Depending on the size and complexity of the file, duplicates can be removed. It is recommended to involve a Central Team member when performing transformation procedures on the data. The Exception Details field will provide detailed information on why the validation has failed.

Other validations affected

If this validation fails, subsequent data validation routines will not be triggered.
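Duplicate detection on a composite key, as described by the uniqueness requirements above, can be sketched as follows. This is an illustrative Python sketch; the field names used in the docstring and test are plausible placeholders, not confirmed Data Workbench column names.

```python
from collections import Counter

def find_duplicates(rows, key_fields):
    """Return the composite keys that occur more than once.

    For the JE Transactions file the uniqueness key would combine the
    Debit/Credit indicator, company code, fiscal year, document number
    and line item, per the requirements above.
    """
    keys = Counter(tuple(row[f] for f in key_fields) for row in rows)
    return [k for k, n in keys.items() if n > 1]
```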

Validation # 5 – Current Year Closing Balance Verification

Illustration

How it works This data validation routine checks the current year closing balance period number
in the balance file against the period end parameter provided during Dataset
creation. For example, a 2019 full year analysis (periods 1-12) has a PY closing
period of 12-2018, an opening period of 0-2019 and a CY closing period of 12-2019.

Validation type Warning. Not a hard stop to the process but it’s recommended to review all
validation details before continuing

Data file affected − GL Account Balances

− GL Trial Balance

Validation Exception Details

Scenario: This validation exception shows that the current year closing balance is not available in the balance file.

Potential reasons for validation to fail and recommended action

The current year closing balance is not imported in the balance file.
When the CY closing balance is not imported, a user can continue to process, but the reports in the Validate & Confirm screen, the Account Mapping screen and the Advanced Capabilities will not show CY closing balances. It is strongly recommended not to continue without CY closing balances, as downstream results cannot be relied upon.
Following this, contact the appropriate entity personnel and ask for a new dataset with CY closing balances. The Exception Details field will provide detailed information on why the validation has failed.

Other validations affected
If this validation fails, there is no effect on execution of subsequent data validation routines.

Validation # 6 - Account Number Match between Trial Balance files

Illustration

How it works This automated validation checks whether the general ledger accounts present in
the account balance file are included in the entity’s chart of accounts (i.e. the
general ledger accounts data file). For the GL Trial Balance, this validation
checks the consistency between the chart of accounts and the balances created
based on the uploaded file.
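The core of this check is a set difference between the account numbers in the two files. A minimal sketch, assuming simple lists of account-number strings (illustrative only, not the actual Data Workbench implementation):

```python
# Illustrative sketch: report account numbers present in the balance file
# (GLAB) that are missing from the chart of accounts (GLA).

def unmatched_accounts(balance_accounts, chart_of_accounts):
    """Return account numbers in the balance file missing from the chart of accounts."""
    return sorted(set(balance_accounts) - set(chart_of_accounts))
```

Each returned account number would appear in the exception report; the account name cannot be resolved for these rows, which is why the guide notes it shows #NA#.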

Validation type Warning. Not a hard stop to the process but it’s recommended to review all
validation details before continuing

Data files affected − GL Accounts

− GL Account Balances

− GL Trial Balance

Validation Exception Details
The validation exception informs the user which account number in the GLAB file is not part of the chart of accounts (GLA file). The reference to the account name is a known limitation and will show #NA#.

Potential reasons for validation to fail and recommended action
− The imported chart of accounts does not contain all general ledger accounts which are currently in use (i.e. the imported chart of accounts is not complete).
− The general ledger accounts in the General Ledger account balances file have been extracted incorrectly (e.g. account numbers are incomplete).
− Account balances are presented in “dummy” or discontinued accounts not in the chart of accounts.
Use the export into Microsoft Excel® functionality to export the exception details into validation reports and contact appropriate entity personnel and ask for a new data file to be provided with matching account numbers between the data files. The Exception Details field will provide detailed information on why the validation has failed.

Other validations affected
If this validation fails, there is no effect on execution of subsequent data validation routines.

Validation # 7 – Account Number Match between Trial balances and Journal Entry Transaction Files

Illustration

How it works This automated data validation checks whether the general ledger accounts
present in the journal entry transactions data file are included in the entity’s
chart of accounts (i.e. the general ledger accounts data file or general trial
balance file).

Validation type Warning. Not a hard stop to the process but it’s recommended to review all
validation details before continuing

Data files affected
- GL Accounts
- GL Trial Balance
- Journal Entry Transactions

Validation Exception Details
The validation exception informs the user which account number in the JET file is not part of the chart of accounts (GLA file) or General Ledger Trial Balance (GLTB file). The reference to the account name is a known limitation and will show #NA#.

Potential reasons for validation to fail and recommended action
− The imported chart of accounts or the trial balance does not contain all general ledger accounts which are currently in use (i.e. the imported chart of accounts or trial balance is not complete).
− The general ledger accounts in the journal entry transactions data file have been extracted incorrectly (e.g. account numbers are incomplete).
− Journal entries have been posted to rarely used or discontinued accounts not in the imported chart of accounts.
Use the export into Microsoft Excel® functionality to export the exception details into validation reports and contact appropriate entity personnel and ask for a new data file to be provided with matching account numbers between the data files. The Exception Details field will provide detailed information on why the validation has failed.

Other validations affected
If this validation fails, there is no effect on the execution of subsequent data validation routines.

Validation # 8 – Current year opening balance verification

Illustration

How it works This data validation routine checks the current year opening balance period
number in the balance file with the period start parameter provided for in the
Dataset creation. For example, a 2019 interim analysis (period 7-9) has a PY
closing balance of 12-2018, an opening balance of 6-2019 and a CY closing
balance of 9-2019.

Validation type Warning. Not a hard stop to the process but it’s recommended to review all
validation details before continuing

Data file affected GL Account Balances

Validation Exception Details
Scenario: This validation exception shows that the current year opening balance is not available in the account balance file.

Potential reasons for validation to fail and recommended action
The current year opening balance is not imported in the account balance file. When the CY opening balance is not imported, a user can continue to process, but the reports in the Validation screen and the Advanced Capabilities will not show CY opening balances. It is strongly recommended not to continue without CY opening balances as results downstream cannot be relied upon.
Following this, contact the appropriate entity personnel and ask for a new dataset with CY opening balances. The Exception Details field will provide detailed information on why the validation has failed.

Other validations affected
If this validation fails, there is no effect on execution of subsequent data validation routines.

Validation # 9 – Period Balance equal to zero Verification

Illustration

How it works This automated validation calculates the sum of account balances per each general
ledger account for each month (for a particular year) and determines whether all
balances sum to zero.
The same automated validation is performed independently on the current period
journal entry transactions data file. Summation is not performed at the journal entry
level but at the general ledger account level per period.
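The summation described above can be sketched as follows. This is an illustrative Python sketch, not the actual Data Workbench implementation; the `(period, account, amount)` row layout is a simplifying assumption.

```python
# Illustrative sketch: sum net movement across all GL accounts per period
# and report any period whose total does not net to zero (debits != credits).
from collections import defaultdict

def out_of_balance_periods(balances, tolerance=0.0):
    """Return {period: net_total} for periods whose balances do not sum to zero."""
    totals = defaultdict(float)
    for period, account, amount in balances:
        totals[period] += amount
    return {p: t for p, t in totals.items() if abs(t) > tolerance}
```

A period appearing in the result corresponds to the "net balance of the period that is out of balance" shown in the Exception Details.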

Validation type Warning. Not a hard stop to the process but it’s recommended to review all
validation details before continuing

Data files affected
− GL Account Balances
− GL Trial Balance

− Journal Entry Transactions

Validation Exception Details
The validation exception shows information about the net balance of the period that is out of balance.

Potential reasons for validation to fail and recommended action
If data is not extracted accurately (e.g. data has duplicated or missing records), use the export into Microsoft Excel® functionality to export the exception details into validation reports.
Following this, contact the appropriate entity personnel and ask for a new dataset with balanced general ledger accounts per period(s). The Exception Details field will provide detailed information on why the validation has failed.

Other validations affected
If this validation fails, there is no effect on execution of subsequent data validation routines.

Validation # 10 – Prior Year Closing Balance Verification Check

Illustration

How it works This data validation routine checks if the prior year closing balance period number in
the account balance file equals current year -1, period 12. For example, a 2019
interim analysis (period 7-9) has a PY closing balance of 12-2018, an opening
balance of 6-2019 and a CY closing balance of 9-2019.

Validation type Warning. Not a hard stop to the process but it’s recommended to review all
validation details before continuing

Data file affected − GL Account Balances

− GL Trial Balance

Validation Exception Details
This validation exception shows that the prior year closing balance is not available in the GLAB/GLTB file.

Potential reasons for validation to fail and recommended action
The prior year closing balance is not imported in the GLAB/GLTB file. When the PY closing balance is not imported, a user can continue to process, but it is strongly recommended not to continue without PY closing balances as results downstream cannot be relied upon (e.g., the Planning Analytics Dashboard will not populate).
Following this, contact the appropriate entity personnel and ask for a new dataset
with PY closing balances. The Exception Details field will provide detailed
information on why the validation has failed.

Other validations affected
If this validation fails, there is no effect on execution of subsequent data validation routines.

Validation # 11 – Calculated Closing Balances versus Imported Closing Balance Verification

Illustration

How it works This data validation routine reconciles the imported general ledger account closing
balances with the sum of the imported opening balances and the activity (total debits
and total credits) from the Journal Entry Transactions file. The reconciliation is done
at the level of company code, fiscal year, and GL account number.
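The reconciliation can be sketched as opening balance plus net journal-entry activity compared against the imported closing balance, keyed by (company code, fiscal year, GL account). This is an illustrative sketch only; the dictionary-based file layout is a simplifying assumption, not the actual Data Workbench data model.

```python
# Illustrative sketch: recalculate closing balances from opening balances plus
# journal-entry activity, and report keys where the recalculated value differs
# from the imported closing balance.
from collections import defaultdict

def reconcile_closing_balances(opening, jet_lines, imported_closing):
    """Return {key: (calculated, imported)} for mismatched closing balances.

    key = (company_code, fiscal_year, gl_account); jet amounts are signed
    (debits positive, credits negative).
    """
    activity = defaultdict(float)
    for key, amount in jet_lines:
        activity[key] += amount
    diffs = {}
    for key, imported in imported_closing.items():
        calculated = opening.get(key, 0.0) + activity.get(key, 0.0)
        if abs(calculated - imported) > 1e-9:
            diffs[key] = (calculated, imported)
    return diffs
```

Any key in the result corresponds to a difference the user should investigate with the Trial Balance Report in Data Management.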

Validation type Warning. Not a hard stop to the process but it’s recommended to review all
validation details before continuing

Data files affected
− GL Account Balances
− GL Trial Balance

− Journal Entry Transactions

Validation Exception Details
This validation exception shows the balance calculated based on the JET file and the balance imported using the GLAB/GLTB file.

Potential reasons for validation to fail and recommended action
If there are any differences between the recalculated closing balance and the imported closing balance, the validation will fail. However, the user can proceed and should use the Trial Balance Report available in Data Management to investigate such differences.
The Central Team or IT Local Support may use the export into Microsoft Excel® functionality to export the exception details into file validation reports and contact appropriate entity personnel and ask for a new dataset to be provided. The Exception Details field will provide detailed information on why the validation has failed.

Other validations affected
If this validation fails, there is no effect on execution of subsequent data validation routines.

Validation # 12 - Verify that the data aligns with the phase start date and end date

Illustration

How it works The start date and end date of the dataset specified during the dataset creation in
the Dataset Overview screen defines the time span of the Data Workbench analysis.
Data files received from the entity should be related to this specified time span.
However, data files can contain additional period(s) of data as long as the effective
date (i.e. posting date) of those entries are within the specified start date and end
date (e.g. when adjusting entries are entered in period 13 while their effective date is
in period 12, the Journal Entry Transactions file can contain journal entries entered
in all 13 periods). If the effective date of the journal entry included in the raw data file
is not aligned with the specified dataset start and end date this automated validation
will fail.
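The boundary check described above amounts to filtering journal entries by their effective (posting) date. A minimal sketch, assuming each entry is a dictionary with an `effective_date` field (an illustrative assumption, not the actual Data Workbench implementation):

```python
# Illustrative sketch: flag journal entries whose effective (posting) date
# falls outside the dataset's start/end dates. Entries entered in period 13
# still pass as long as their effective date is within the range.
from datetime import date

def entries_outside_dataset(journal_entries, start_date, end_date):
    """Return entries whose effective date is outside [start_date, end_date]."""
    return [e for e in journal_entries
            if not (start_date <= e["effective_date"] <= end_date)]
```

Every entry returned would appear in the Exception Details as a journal entry outside the dataset boundaries.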

Validation type Warning. Not a hard stop to the process but it’s recommended to review all
validation details before continuing

Data files affected
Journal Entry Transactions

Validation Exception Details
The validation exception shows every journal entry which is outside of the dataset boundaries.

Potential reasons for validation to fail and recommended action
The data extracted by the entity contains journal entries with effective dates (i.e. posting date) outside of the date range specified in the dataset. Use the export into Microsoft Excel® functionality to export the exception details into validation reports and contact appropriate entity personnel and ask for a new dataset with corrected information. Depending on the file size and complexity of the file, the additional journals can be removed. It is recommended to involve a Central Team member when performing transformation procedures on the data. The Exception Details field will provide detailed information on why the validation has failed.

Other validations affected
If this validation fails, there is no effect on execution of subsequent data validation routines.

A.3. Summary of FSR Data Transformation Validations
Transformation validation routines are automated (i.e. performed in Data Workbench) routines designed to
assist the engagement team in the validation of the completeness and accuracy of the imported entity data.
These automated validation routines do not replace the engagement team’s required procedures over data
integrity. Furthermore, they ensure that Data Workbench is able to consume the data for the Advanced
Capabilities.

Validation # 1 - Data Attribute Validation – Number format

Illustration

How it works This automated data validation checks whether the data types in the Loans Interest
Rate Changes, Loans Subledger, Loans Cashflow Transaction Data and Financial
Instruments Subledger Data files are in the required numerical format as specified
in the Data Requirement Appendix.

Validation type Hard stop. Process cannot continue until all validation issues are addressed

Data files affected − Loans Interest Rate Changes

− Loans Subledger

− Loans Cashflow Transaction

− Financial Instruments Subledger

Validation Exception Details
Scenario: The data in the Loans Subledger file has a “period” as a decimal separator except for 12 fields in the interestRateEndOfPeriod, originalLoadAmount and openingBalance columns. These fields have an apostrophe or an empty space as a decimal separator.

Potential reasons for validation to fail and recommended action
If data in any of the number fields of the raw data files is not in the supported number format, the validation will not pass and affected files will not be imported. Use the export into Microsoft Excel® functionality to export the exception details into file validation reports. The Exception Details field will provide detailed information on why the validation has failed. Refer to the Data Workbench - Data Transformation guide for guidance on how to transform existing data files into an acceptable format, or contact appropriate entity personnel and ask for a new dataset to be provided with the required number formats.

Other validations affected
− If this file validation fails, data import will not continue with the subsequent file validations for data file(s) affected.
− Only Date format validation will be triggered but other data validation routines will not be triggered for the affected data file(s).

Validation # 2 - Data Attribute Check – Date format

Illustration

How it works This automated data validation routine checks whether all date fields are in one of
the following supported date formats:

− YYYY/MM/DD

− YYYY.MM.DD

− YYYY-MM-DD

− MM/DD/YYYY

− MM.DD.YYYY

− MM-DD-YYYY

− DD/MM/YYYY

− DD.MM.YYYY

− DD-MM-YYYY
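The nine supported formats above can be checked by attempting to parse each value against every pattern. This is an illustrative Python sketch, not the actual Data Workbench implementation; because all three separators are enumerated for each day/month/year ordering, only the ordering of the date parts determines whether a value matches, consistent with the note below that the separator has no impact.

```python
# Illustrative sketch: test whether a string matches one of the nine
# supported date formats (three part-orderings x three separators).
from datetime import datetime

SUPPORTED_DATE_FORMATS = [
    f"%{a}{sep}%{b}{sep}%{c}"
    for (a, b, c) in (("Y", "m", "d"), ("m", "d", "Y"), ("d", "m", "Y"))
    for sep in ("/", ".", "-")
]

def matches_supported_format(value):
    """Return True when the value parses under at least one supported format."""
    for fmt in SUPPORTED_DATE_FORMATS:
        try:
            datetime.strptime(value, fmt)
            return True
        except ValueError:
            pass
    return False
```

A value such as "2019'12'31" fails every pattern and would be reported in the Exception Details.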

Validation type Hard stop. Process cannot continue until all validation issues are addressed

Data files affected − Loans Interest Rate Changes

− Loans Subledger

− Loans Cashflow Transaction

− Financial Instruments Subledger

Validation Exception Details
Scenario: The data in the Loans Interest Rate Changes file has the date format “yyyy-mm-dd” except for 1 field in the transactionDate column. These fields have “dd-mm-yyyy”, “dd/mm/yyyy” or “yyyy'mm'dd” as a date format. Crucial for the validation to fail is the order of the date parts day (d), month (m) and year (y), whereas the date separator (i.e. “-”, “.” or “/”) of the format selection has no impact on the validation outcome.


Potential reasons for validation to fail and recommended action
If the date in any of the date fields of the raw data files is not in an allowable date format, the validation will not pass, and affected files will not be imported. Use the export into Microsoft Excel® functionality to export the exception details into file validation reports. The Exception Details field will provide detailed information on why the validation has failed. Refer to the Data Workbench - Data Transformation guide for guidance on how to transform existing data files into an acceptable format, or contact appropriate entity personnel and ask for a new dataset to be provided with the required date formats.

Other validations affected
− If this file validation fails, data import will not continue with the subsequent file validations for data file(s) affected.
− Only Number format validation will be triggered but other data validation routines will not be triggered for the affected data file(s).

Validation # 3 – Duplicate Loan ID

Illustration

How it works This automated data validation routine checks the existence of duplicates in the
raw data files. This Validation will fail if data files do not meet the following
requirements:

− For the Loans Subledger file, there should be a unique entry for the following attribute: Loan ID.
Note: If the data uploaded contains duplicate records, the duplication has to be
resolved by the user before being able to continue processing. It is not possible to
continue to the next stage of the data processing flow without having resolved this
validation failure.
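The uniqueness requirement above reduces to counting occurrences of each Loan ID. A minimal sketch, assuming subledger rows are dictionaries with a `loanId` key (an illustrative assumption, not the actual Data Workbench field name):

```python
# Illustrative sketch: report Loan IDs that appear more than once in the
# Loans Subledger file; any result makes this hard-stop validation fail.
from collections import Counter

def duplicate_loan_ids(subledger_rows):
    """Return Loan IDs occurring more than once in the Loans Subledger file."""
    counts = Counter(row["loanId"] for row in subledger_rows)
    return sorted(loan_id for loan_id, n in counts.items() if n > 1)
```

The same approach applies to the Duplicate Financial ID validation below, keyed on Internal Instrument ID instead.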

Validation type Hard stop. Process cannot continue until all validation issues are addressed

Data files affected − Loans Subledger

Validation Exception Details
Scenario: The data contains duplicate Loan IDs in the Loans Subledger file.

Potential reasons for validation to fail and recommended action
If any of the raw data files have duplicate entries, the validation will not pass and affected files will not be transformed. Use the export into Microsoft Excel® functionality to export the exception details into file validation reports and contact appropriate entity personnel and ask for a new dataset to be provided with all duplicate entries cleared. Depending on the file size and complexity of the file, duplicates can be removed. It is recommended to involve a Central Team member when performing transformation procedures on the data. The Exception Details field will provide detailed information on why the validation has failed.

Other validations affected
If this validation fails, the impact on subsequent data validations depends on which field contains the duplicate Loan ID. For example, if there are duplicate Loan IDs in the Loans Subledger data file, the subsequent Overlapping Period Verification validation will also raise a warning, due to the overlapping periods per Loan ID based on the dateTo and dateFrom fields from the Interest Rate Changes file.

Validation # 3 – Duplicate Financial ID Validation

Illustration

How it works This automated data validation routine checks the existence of duplicates in the
raw data files. This Validation will fail if data files do not meet the following
requirements:

− For the Financial Instruments Subledger Data file, there should be a unique entry for the following attribute: Internal Instrument ID.
Note: If the data uploaded contains duplicate records, the duplication has to be
resolved by the user before being able to continue processing. It is not possible to
continue to the next stage of the data processing flow without having resolved this
validation failure.

Validation type Hard stop. Process cannot continue until all validation issues are addressed

Data files affected − Financial Instruments Subledger

Validation Exception Details
Scenario: The data contains duplicate Internal Instrument IDs in the Financial Instruments Subledger file.

Potential reasons for validation to fail and recommended action
If any of the raw data files have duplicate entries, the validation will not pass and affected files will not be transformed. Use the export into Microsoft Excel® functionality to export the exception details into file validation reports and contact appropriate entity personnel and ask for a new dataset to be provided with all duplicate entries cleared. Depending on the file size and complexity of the file, duplicates can be removed. It is recommended to involve a Central Team member when performing transformation procedures on the data. The Exception Details field will provide detailed information on why the validation has failed.

Other validations affected
If this validation fails, there is no effect on execution of subsequent data validation routines.

Validation # 4 – Loan ID match in Cash Flow Transaction and Sub Ledger Files

Illustration

How it works This automated data validation routine checks whether all the Loan IDs in the
Loans Cashflow Transaction Data file match with the Loan IDs in the Loans
Subledger file.

Validation type Warning. Not a hard stop to the process but it’s recommended to review all
exception details before continuing

Data files affected
− Loans Subledger
− Loans Cashflow Transaction Data

Validation Exception Details
Scenario: The validation exception informs the user which Loan IDs in the Loans Cashflow Transaction file are not part of the Loans Subledger file.

Potential reasons for validation to fail and recommended action
− The Loan ID which is present in the Loans Cashflow Transaction Data file is not present in the Loans Subledger file because of a lack of data in the Loans Cashflow Transaction Data file.
− The Loan ID which is present in the Loans Cashflow Transaction Data file does not match the Loan ID in the Loans Subledger file because an inappropriate Loan ID is put on the record in the Loans Cashflow Transaction Data file.
Use the export into Microsoft Excel® functionality to export the exception details into validation reports and contact appropriate entity personnel and ask for a new data file to be provided with accurate and complete Loan IDs. The Exception Details field will provide detailed information on why the validation has failed.

Other validations affected
If this validation fails, the impact on subsequent data validations depends on which field contains the mismatched Loan ID. For example, if there is at least one Loan ID found in the Cash Flow Transaction file that is not matched in the Loan Subledger file, then the subsequent Verification of the presence of transaction dates within the Analysis Period validation will also raise a warning, due to the mismatched Loan ID in the analysis period.

Validation # 5 – Loan ID check with respect to the Interest Rate Changes File

Illustration

How it works This automated data validation routine checks whether all the Loan IDs that are
present in the Loans Interest Rate Changes file type match with Loan IDs in the
Loans Subledger file.

Validation type Warning. Not a hard stop to the process but it’s recommended to review all
exception details before continuing

Data file affected − Loans Subledger

− Loans Interest Rate Changes

Validation Exception Details
This validation exception shows the Loan IDs in the Loans Interest Rate Changes file that are not available in the Loans Subledger file.

Potential reasons for validation to fail and recommended action
− The Loan ID which is present in the Loans Interest Rate Changes file is not present in the Loans Subledger file because of a lack of data in the Loans Interest Rate Changes file.
− The Loan ID which is present in the Loans Interest Rate Changes file does not match the Loan ID in the Loans Subledger file because an inappropriate Loan ID is put on the record in the Loans Interest Rate Changes file.
Use the export into Microsoft Excel® functionality to export the exception details into validation reports and contact appropriate entity personnel and ask for a new data file to be provided with accurate and complete Loan IDs. The Exception Details field will provide detailed information on why the validation has failed.

Other validations affected
If this validation fails, the impact on subsequent data validations depends on which field contains the mismatched Loan ID. For example, if there is at least one Loan ID found in the Loans Interest Rate Changes file that is not matched in the Loans Subledger file, then the subsequent Verification of Valid Interest Rate, Overlapping Period Verification and Skipped Period Verification validations will also raise warnings, due to the mismatched Loan ID in the analysis period.

Validation # 6 – New Loan Opening Balance Verification

Illustration

How it works This automated data validation routine checks that loans whose origination date
(‘originationDate’ from the Loans Subledger file) falls within the analysis period
(the period start and end dates provided in the parameters) either have no
opening balance or have an opening balance equal to ‘0’.
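The rule above can be sketched as follows. This is an illustrative Python sketch, not the actual Data Workbench implementation; the dictionary keys (`loanId`, `originationDate`, `openingBalance`) are assumptions used only for demonstration.

```python
# Illustrative sketch: flag loans originated within the analysis period whose
# opening balance is present and not equal to 0.
from datetime import date

def new_loans_with_opening_balance(loans, period_start, period_end):
    """Return Loan IDs of new loans with a non-zero opening balance."""
    return [
        loan["loanId"]
        for loan in loans
        if period_start <= loan["originationDate"] <= period_end
        and loan.get("openingBalance") not in (None, 0, 0.0)
    ]
```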

Validation type Warning. Not a hard stop to the process but it’s recommended to review all
exception details before continuing

Data file affected Loans Subledger

Validation Exception Details
Scenario: This validation exception shows the Loan IDs whose origination date falls within the analysis period and which have an opening balance that is unequal to ‘0’ in the Loans Subledger file.

Potential reasons for validation to fail and recommended action
− The identified new Loan IDs have a value in the opening balance which is not equal to ‘0’ for the beginning of the analysis period.
− Use the export into Microsoft Excel® functionality to export the exception details into validation reports and contact appropriate entity personnel and ask for a new data file to be provided with the correct opening balance, which has no value or is equal to ‘0’ for every new loan originated within the analysis period. The Exception Details field will provide detailed information on why the validation has failed.

Other validations affected
If this validation fails, there is no effect on execution of subsequent data validation routines.

Validation # 7 – Verification of a Valid Interest Rate

Illustration

How it works This automated data validation routine checks whether the fields in the interest
rate end of period column are not empty and contain only numeric values, unless
the maturity date of a loan (based on Loan ID) is before the period end date and
the loan termination date is empty.

Validation type Warning. Not a hard stop to the process but it’s recommended to review all
exception details before continuing

Data file affected Loans Subledger

Validation Exception Details
Scenario: This validation exception shows that the interest rate end of period column is either empty or contains values other than numeric values.
Note: The field can be empty if the maturity date of a loan (based on Loan ID) is before the period end date and the loan termination date is empty.

Potential reasons for validation to fail and recommended action
− The interest rate end of period column is empty in the Loans Subledger file for a certain Loan ID.
− The value of the interest rate end of period column in the Loans Subledger file for a certain Loan ID is not numeric.
− Use the export into Microsoft Excel® functionality to export the exception details into validation reports and contact appropriate entity personnel and ask for a new data file to be provided with a correct, numeric interest rate end of period value for every Loan ID. The Exception Details field will provide detailed information on why the validation has failed.
Note: The interest rate end of period column can be empty if the maturity date of a loan (based on Loan ID) is before the period end date and the loan termination date is empty.

Other validations affected
If this validation fails, there is no effect on execution of subsequent data validation routines.

Validation #8 - Verification of the presence of transaction dates within the Analysis Period

Illustration

How it works This automated validation checks whether all the dates found in the transaction
date column of the Loans Cashflow Transaction Data file are within the analysis
period defined by the period start and period end dates in the parameters.

Validation type Warning. Not a hard stop to the process but it’s recommended to review all
exception details before continuing

Data files affected - Loans Cashflow Transaction Data

Validation Exception Details
Scenario: The validation exception informs the user which transaction date(s) of the Loans Cashflow Transaction file is/are not in the scoped analysis period from the parameters.

Potential reasons for validation to fail and recommended action
− The transaction date in the Loans Cashflow Transaction Data file is before the period start date which is provided as a parameter.
− The transaction date in the Loans Cashflow Transaction Data file is after the period end date which is provided as a parameter.
− Use the export into Microsoft Excel® functionality to export the exception details into validation reports and contact appropriate entity personnel and ask for a new data file to be provided with correct transaction dates which are within the analysis period. The Exception Details field will provide detailed information on why the validation has failed.

Other validations affected

If this validation fails, there is no effect on the execution of subsequent data validation routines.
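The date-range check above reduces to a simple comparison per transaction. The sketch below is an assumption-laden illustration (field names loan_id and transaction_date are invented for the example), not the actual Data Workbench implementation:

```python
from datetime import date

def dates_outside_period(transactions, period_start, period_end):
    """Return (loan_id, transaction_date) pairs that fall outside the
    analysis period. Field names are illustrative assumptions only."""
    return [
        (t["loan_id"], t["transaction_date"])
        for t in transactions
        if not (period_start <= t["transaction_date"] <= period_end)
    ]

txns = [
    {"loan_id": "L1", "transaction_date": date(2021, 2, 15)},
    {"loan_id": "L2", "transaction_date": date(2020, 12, 31)},  # before period start
    {"loan_id": "L3", "transaction_date": date(2022, 1, 1)},    # after period end
]
print(dates_outside_period(txns, date(2021, 1, 1), date(2021, 12, 31)))
# L2 and L3 are reported; L1 lies inside the analysis period.
```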
Validation #9 – Overlapping Period Validation
Illustration
How it works

This automated validation checks that there are no overlapping periods per Loan ID, based on the dateFrom and dateTo fields in the Loans Interest Rate Changes file.
Validation type

Warning. Not a hard stop to the process, but it is recommended to review all exception details before continuing.
Data files affected

- Loans Interest Rate Changes
Validation Exception Details

The validation exception shows information about overlapping periods per Loan ID, based on the dateFrom and dateTo fields in the Loans Interest Rate Changes file.

Note: If the 'To' date of one period is the same as the 'From' date of another period, the two are not considered overlapping periods, and the exception may be ignored.
Potential reasons for validation to fail and recommended action

− There are overlapping periods for a Loan ID based on the dateFrom and dateTo fields in the Loans Interest Rate Changes file. For example, if the first period runs from 1/1/2021 (dateFrom) to 3/31/2021 (dateTo) and the second period runs from 2/1/2021 to 4/30/2021, the two periods are considered overlapping.

− Use the export into Microsoft Excel® functionality to export the exception details into validation reports, then contact the appropriate entity personnel and ask for a new data file with correct dateFrom and dateTo values and no overlapping periods. The Exception Details field will provide detailed information on why the validation failed.
Other validations affected

If this validation fails, there is no effect on the execution of subsequent data validation routines.
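The overlap rule can be sketched as follows: sort each Loan ID's periods by dateFrom and compare neighbours, treating periods that merely touch (one period's dateTo equal to the next period's dateFrom) as non-overlapping, per the note above. This is an illustrative assumption-based sketch, not the Data Workbench routine; field names are invented for the example.

```python
from collections import defaultdict
from datetime import date

def find_overlaps(periods):
    """Report overlapping (dateFrom, dateTo) pairs per Loan ID.

    Periods that only touch at a boundary date are not treated as
    overlapping, matching the note above. Field names are
    illustrative assumptions only.
    """
    by_loan = defaultdict(list)
    for p in periods:
        by_loan[p["loan_id"]].append((p["date_from"], p["date_to"]))
    overlaps = []
    for loan_id, spans in by_loan.items():
        spans.sort()
        for (f1, t1), (f2, t2) in zip(spans, spans[1:]):
            if f2 < t1:  # strict comparison: equal boundary dates do not count
                overlaps.append((loan_id, (f1, t1), (f2, t2)))
    return overlaps

periods = [
    {"loan_id": "L1", "date_from": date(2021, 1, 1), "date_to": date(2021, 3, 31)},
    {"loan_id": "L1", "date_from": date(2021, 2, 1), "date_to": date(2021, 4, 30)},
]
print(find_overlaps(periods))
# L1's two periods overlap (February through March appears in both).
```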
Validation #10 - Skipped Period Validation
Illustration
How it works

This automated validation checks that all periods per Loan ID are present, with none of them skipped, based on the dateFrom and dateTo fields in the Loans Interest Rate Changes file.
Validation type

Warning. Not a hard stop to the process, but it is recommended to review all exception details before continuing.
Data files affected

- Loans Interest Rate Changes
Validation Exception Details

Scenario: The validation exception shows information about missing periods per Loan ID based on the dateFrom and dateTo fields in the Loans Interest Rate Changes file.
Potential reasons for validation to fail and recommended action

− There are missing periods for a Loan ID based on the dateFrom and dateTo fields in the Loans Interest Rate Changes file. For example, if the first period runs from 1/1/2021 (dateFrom) to 1/31/2021 (dateTo) and the second period runs from 3/1/2021 to 3/31/2021, the month of February is considered skipped.

− Use the export into Microsoft Excel® functionality to export the exception details into validation reports, then contact the appropriate entity personnel and ask for a new data file with correct dateFrom and dateTo values and no skipped periods. The Exception Details field will provide detailed information on why the validation failed.
Other validations affected

If this validation fails, there is no effect on the execution of subsequent data validation routines.
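A gap check is the mirror image of the overlap check: sort each Loan ID's periods and flag any neighbour whose dateFrom starts more than one day after the previous dateTo. The one-day tolerance is an assumption (contiguous periods such as 1/31 followed by 2/1 should not be flagged), and all field names are invented for this sketch; it is not the actual Data Workbench routine.

```python
from collections import defaultdict
from datetime import date

def find_gaps(periods, tolerance_days=1):
    """Report skipped periods per Loan ID.

    Assumes consecutive periods are contiguous when the next dateFrom
    starts at most `tolerance_days` after the previous dateTo
    (e.g. 1/31 followed by 2/1). Field names are illustrative
    assumptions only.
    """
    by_loan = defaultdict(list)
    for p in periods:
        by_loan[p["loan_id"]].append((p["date_from"], p["date_to"]))
    gaps = []
    for loan_id, spans in by_loan.items():
        spans.sort()
        for (_, t1), (f2, _) in zip(spans, spans[1:]):
            if (f2 - t1).days > tolerance_days:
                gaps.append((loan_id, t1, f2))  # gap lies between t1 and f2
    return gaps

periods = [
    {"loan_id": "L1", "date_from": date(2021, 1, 1), "date_to": date(2021, 1, 31)},
    {"loan_id": "L1", "date_from": date(2021, 3, 1), "date_to": date(2021, 3, 31)},
]
print(find_gaps(periods))
# February 2021 is reported as skipped for L1.
```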