XI Tutorials PDF
This is about those forms of middleware that are used to connect applications to other
applications. We generally refer to the use of this type of middleware as Enterprise Application
Integration or EAI. EAI middleware mediates between applications in a number of ways, but most
commonly we think in terms of the transformation and routing of data and the orchestration of
business process flows.
There is the implication here that these applications reside in a heterogeneous world--different
operating platforms, disparate data models and data stores, and heterogeneous network and
communications protocols.
Overview of XI
Reduced maintenance costs: As the client replaced multiple integration technologies with SAP
XI, it gained immediate cost savings by reducing associated maintenance costs. Additionally, the
client gained the capability to utilize its manpower more effectively.
Enhanced flexibility: The ability to modify applications and add new functionalities without
impacting other systems and businesses gave the client tremendous flexibility in reacting to
dynamic changes in the market.
Standardization on the technology integration layer used for connectivity between all systems
Central monitoring of message flow, instead of having to monitor several systems
Enterprise IT motto
Yesterday
Most conventional or traditional enterprise applications were custom built to address a specific
business need.
Today
Today's advanced enterprise applications address multiple business tasks, with each landscape having a higher degree of heterogeneous complexity.
Evolution of SAP
At first, integration took place through a single database with a single centralized data model: one system with several applications and one database (e.g. an R/3 system with MM, SD, CO, FI, HR), with the applications having access to the data structures across the components. Integration in this case was fairly easy.
Then SAP and third-party vendors provided other solutions, e.g. CRM, SRM, etc. These solutions and their respective systems needed to be integrated with the ERP environment (e.g. an R/3 backend system). This brought added complexity and the beginning of many individual point-to-point connections.
With the SAP Exchange Infrastructure and collaborative business, SAP approaches the integration challenge from a different angle. The basic idea is to provide a runtime infrastructure which allows heterogeneous systems to be tied together with fewer connections and which, at the same time, offers a centralized storage of the integration knowledge needed to connect those applications and let messages flow from one application to the other.
Business drivers of Integration Projects:
Data Coordination
Mainly deals with transactional data between applications.
Mainly focuses on modeling and orchestrating the workflow between individual functions and
applications.
Goal is to provide management with immediate awareness of changing business events across
the enterprise.
Combines the data and functionality of an enterprise's existing applications with new business process logic, custom code and user-facing front ends.
Middleware flow:
Component view of XI
XI is not a single component, but rather a collection of components that work together flexibly to
implement integration scenarios.
Integration Builder: A client-server framework for accessing and editing two stores of shared collaboration knowledge. It has two parts, which act as fat clients to the SLD, into which we can import objects and use them locally. The basic reason for separating the Integration Repository from the Integration Directory is that, by separating design-time activities from configuration-time activities, SAP can ship content in the Integration Repository which each customer can then implement for their specific landscape in the Integration Directory.
Integration Repository: It is used for the design and development of interface, Process and Mapping
objects that are used to implement Integration Scenarios. Usually they contain static objects, which can
be used for different landscapes by defining the routing rules in Integration Directory.
Integration Directory: It contains dynamic objects, wherein we configure scenarios using the objects from the Integration Repository and route the messages between systems.
Integration Server: This component provides the runtime for XI. It is the central processing engine of XI.
Business Process Engine: The Business Process Engine provides SAP NetWeaver with BPM capability by processing integration processes at runtime. The BPE uses functions of the workflow engine and generates a workflow from the integration process at runtime.
Integration Engine: The Integration Engine enables the processing of XML messages that are exchanged between applications in heterogeneous system landscapes. Using adapters such as the IDoc and HTTP adapters, it can process IDocs (intermediate documents), HTTP requests and Remote Function Calls. It is the runtime environment of the SAP Exchange Infrastructure and has the task of receiving, processing and forwarding XML messages. Processing is done by evaluating the collaboration agreements, determining the receivers and executing the mapping programs.
Adapter Engine: The Adapter Engine is used to connect the Integration Engine to SAP systems and external systems. Various types of adapters are provided to convert XML- and HTTP-based messages to the specific message protocol and format required by the partner systems, and vice versa. It is based on the Adapter Framework, which in turn is based on the SAP J2EE Engine (as part of the SAP Web Application Server) and the J2EE Connector Architecture (JCA).
This document gives you a clear understanding of ABAP mapping: how to enable the option to use an ABAP mapping class in XI interface mappings, how to parse and read the input XML source structure, and how to build the output XML target structure.
Below are the different ways of achieving this conversion in XI:
1. Graphical Mapping
2. ABAP Mapping
3. JAVA Mapping
4. XSLT Mapping
The rest of the document gives you pre-requisites and steps that are necessary for making use of ABAP
Mapping class.
Open http://<host>:<port>/exchangeProfile/index.html
Choose IntegrationBuilder -> IntegrationBuilder.Repository -> com.sap.aii.repository.mapping.additionaltypes
Maintain the entry R3_ABAP|Abapclass;R3_XSLT|XSL (ABAPEngine)
Check whether the data has been successfully read from the exchange profile:
Open http://<host>:<port>/rep/support/admin/index.html
Choose Administration of Properties -> All Properties. If the value associated with the parameter com.sap.aii.repository.mapping.additionaltypes is not correct, use the option at the top of the page to refresh.
2.
3.
The method EXECUTE of this interface needs to be implemented; this method will be called during mapping and has the following parameters:
The SOURCE parameter contains the input XML structure in XSTRING format.
The mapping program needs to construct the RESULT xstring, which will be used for the
construction of output XML structure.
4.
The explanation of the ABAP Mapping is provided later in the document in the section Steps to be
followed in the ABAP Mapping Class.
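A minimal skeleton of such a mapping class could look as follows (the class name ZCL_EMPLOYEE_MAPPING is only illustrative; IF_MAPPING is the standard XI mapping interface whose EXECUTE method receives the SOURCE xstring and must fill the RESULT xstring):

class zcl_employee_mapping definition public create public.
  public section.
*   standard XI ABAP mapping interface
    interfaces if_mapping.
endclass.

class zcl_employee_mapping implementation.
  method if_mapping~execute.
*   SOURCE contains the input XML as an xstring; RESULT must be filled with
*   the output XML as an xstring. The parsing and rendering steps are shown
*   in the following sections; here the payload is simply passed through.
    result = source.
  endmethod.
endclass.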
The scenario here involves the conversion of a customer-format XML file containing employee information from one specific format to another. The source file is available on the FTP server and the sender file adapter picks it up; the file is then deleted and archived in a different directory on the FTP server. The source XML file is converted into the target XML file by the ABAP mapping program, and this target file is placed on the FTP server again, in a different directory.
Here is the source XML structure:
Here is the target XML structure:
The following are the object details that are developed for this scenario:
Objects developed in the Integration Repository
o
DATA TYPES
DT_Emp_Det
DT_Employees
MESSAGE TYPE
MT_Emp_Det
MT_Employees
MESSAGE INTERFACES
MI_Emp_Det_OB
MI_Employees_IB
MESSAGE MAPPING
INTERFACE MAPPING
IM_MI_Emp_Det_OB_MI_Employees_IB
Business Service YASH_FTP (the same business service is used as Sender and
Receiver )
Communication Channels
CC_XML_Sender_FIle
CC_XML_Receiver_FIle
Receiver Agreement
Sender Agreement
Interface Determination
Receiver Determination
The first step is to parse the input XML document, for this we need to create a new parser
instance and implement the DOM generating interface to the created parser instance.
The below are the interfaces and their methods used in creating the parser instance.
Parse input document
* Initialize Input Document
data: idocument type ref to if_ixml_document.
* creates a new Document instance and returns an interface pointer to this instance.
idocument = ixml_factory->create_document( ).
data: iparser type ref to if_ixml_parser.
* creates a new Parser instance and returns an interface pointer to this instance.
iparser = ixml_factory->create_parser( stream_factory = stream_factory
                                       istream        = istream
                                       document       = idocument ).
* implements the DOM-generating interface to the parser
iparser->parse( ).
From the above we understand that the pre-requisites to create a new parser instance are a
Stream factory, an XML Input stream to parse and a document instance into which the XML input
stream is to be parsed.
To create a stream factory, first step is to create a Main factory, i.e. instance of IXML class using
the method create of the class if_ixml.
* Create Main Factory
data: ixml_factory type ref to if_ixml.
* creates an instance of the iXML class and returns an interface pointer to the instance.
ixml_factory = cl_ixml=>create( ).
* creates a new StreamFactory instance and returns an interface pointer to this instance
data: stream_factory type ref to if_ixml_stream_factory.
stream_factory = ixml_factory->create_stream_factory( ).
To create Input stream we need the above Stream Factory instance created
* Create Input Stream
data: istream type ref to if_ixml_istream.
* creates a new XML input stream for the given ABAP xstring
istream = stream_factory->create_istream_xstring( SOURCE ).
Where SOURCE is the input XML string.
Now for the second step: traverse through the nodes of the input XML. We need to traverse through the nodes of the input XML string, get the value of each node and store the values in ABAP work areas or internal tables, which will later be used in constructing the target XML structure. The following interfaces are used:
if_ixml_node
if_ixml_node_list
if_ixml_node_iterator
if_ixml_node_collection
The iXMLNode object is the primary datatype for the entire Document Object
Model. It represents a single node in the document tree.
Here our source XML structure contains a number of nodes in a multi-level hierarchical structure, but we know the parent node, so we should start traversing from the parent node.
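The collection of nodes to iterate over can, for example, be obtained from the parsed document by tag name (a sketch; the element name EMPLOYEE follows the source structure used in this scenario):

* collect all EMPLOYEE nodes of the parsed input document
data: emp_node_collection type ref to if_ixml_node_collection.
emp_node_collection = idocument->get_elements_by_tag_name( name = 'EMPLOYEE' ).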
The next step is to create a node iterator, as below, which will traverse the above node collection from left to right:
emp_node_iterator = emp_node_collection->CREATE_ITERATOR( ).
WHERE emp_node_iterator is of type IF_IXML_NODE_ITERATOR
Loop through each node and get the name of the structure node
emp_node = emp_node_iterator->GET_NEXT( ).
WHERE emp_node is of type IF_IXML_NODE
To traverse through the child nodes under this node, create a node list instance using the following method:
emp_node_list = emp_node->GET_CHILDREN( ).
WHERE emp_node_list is of type ref to IF_IXML_NODE_LIST
Now get the number of nodes in the above NODE LIST structure as below:
emp_node_list_length = emp_node_list->GET_LENGTH( ).
WHERE emp_node_list_length is of type I
Now, to traverse to each node inside the above node list collection and get the name and value of each node, the following methods need to be used. Since there are emp_node_list_length child nodes, we need to loop emp_node_list_length times:
* Loop through as many child nodes as there are for the structure
DO emp_node_list_length TIMES.
* Here we get access to the subnodes, i.e. PERSONAL and JOB. As explained earlier, we need to
* again collect the node list of each of these nodes, get their children node lists, and finally
* get access to the element nodes present in these child nodes.
emp_subnode = emp_node_list_iterator->get_next( ).
emp_subnode_name = emp_subnode->get_name( ).
emp_subnode_list = emp_subnode->get_children( ).
emp_subnode_list_length = emp_subnode_list->get_length( ).
DO emp_subnode_list_length TIMES.
w_index = sy-index - 1.
w_node = emp_subnode_list->get_item( w_index ).
w_node_name = w_node->get_name( ).
w_node_value = w_node->get_value( ).
* Check the name of the node and move the value into the corresponding ABAP work area.
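* For example (illustrative; the element names follow the source XML structure of this scenario):
case w_node_name.
  when 'EMPID'.
    fs_personal-empid = w_node_value.
  when 'FIRSTNAME'.
    fs_personal-firstname = w_node_value.
* ... further fields are filled analogously ...
endcase.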
ENDDO.
ENDDO.
As described above we could traverse through the nodes and get those values, once the values
are moved to our ABAP work areas, we need to construct the target XML string
Now to create an element in the above created output document instance , use the following
method:
Creates a simple element with the given name (and namespace) and the specified value as text content. Note
that the instance returned implements the iXMLElement interface, so attributes can be specified directly on the
returned object.
PARENTNODE = odocument->create_simple_element( name = 'EMPLOYEES' parent = odocument ).
As above we could continue adding elements with desired names and values and hierarchy by
looping through the internal tables that we collected by parsing the input document.
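For example, a PERSONAL node with a FULLNAME child could be added under the root node like this (a sketch; the element names and the value are taken from the example structures of this scenario):

data: personal_node type ref to if_ixml_node,
      rnode         type ref to if_ixml_node.
personal_node = odocument->create_simple_element( name   = 'PERSONAL'
                                                  parent = PARENTNODE ).
rnode = odocument->create_simple_element( name   = 'FULLNAME'
                                          value  = 'SantoshKumarK'
                                          parent = personal_node ).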
Once the ODOCUMENT is constructed, we need to create the output stream and render the
document
* render document
* create output stream
data: ostream type ref to if_ixml_ostream.
ostream = stream_factory->create_ostream_xstring( result ).
WHERE result is the output XML string
* Create renderer
data: renderer type ref to if_ixml_renderer.
renderer = ixml_factory->create_renderer( ostream  = ostream
                                          document = odocument ).
renderer->render( ).
data: emp_node_length         type i,
      emp_node_iterator       type ref to if_ixml_node_iterator,
      emp_node                type ref to if_ixml_node,
      emp_node_list           type ref to if_ixml_node_list,
      emp_node_list_length    type i,
      emp_node_list_iterator  type ref to if_ixml_node_iterator,
      emp_subnode             type ref to if_ixml_node,
      emp_subnode_list        type ref to if_ixml_node_list,
      emp_subnode_list_length type i,
      emp_subnode_name        type string,
      w_node_name             type string,
      w_node_value            type string,
      w_fieldname             type string,
      w_tablename             type string,
      w_index                 type i,
      w_node                  type ref to if_ixml_node.

data: odocument     type ref to if_ixml_document,
      fs_output     type string,
      employee_node type ref to if_ixml_node,
      personal_node type ref to if_ixml_node,
      job_node      type ref to if_ixml_node,
      rnode         type ref to if_ixml_node.

data: begin of fs_job,
        empid       type string,
        company     type string,
        department  type string,
        designation type string,
        begindate   type string,
        enddate     type string,
        salary      type string,
      end of fs_job,
      begin of fs_personal,
        empid      type string,
        firstname  type string,
        middlename type string,
        lastname   type string,
        age        type i,
        gender     type string,
      end of fs_personal,
      t_personal like standard table of fs_personal,
      t_job      like standard table of fs_job.

field-symbols: <fs>       type any,
               <fs_table> type table.

* Initialize iXML
type-pools ixml.
class cl_ixml definition load.

* create main factory
data: ixml_factory type ref to if_ixml.
ixml_factory = cl_ixml=>create( ).

* create stream factory
data: stream_factory type ref to if_ixml_stream_factory.
stream_factory = ixml_factory->create_stream_factory( ).

* create input stream
data: istream type ref to if_ixml_istream.
istream = stream_factory->create_istream_xstring( source ).

* (parsing of the input document and filling of the internal tables t_personal
*  and t_job is done here, following the steps described in the previous section)

* create the output document and its root element EMPLOYEES
odocument = ixml_factory->create_document( ).

CALL METHOD ODOCUMENT->CREATE_SIMPLE_ELEMENT_NS
  EXPORTING
    NAME   = 'EMPLOYEES'
    PARENT = ODOCUMENT
  RECEIVING
    RVAL   = EMPLOYEE_NODE.
loop at t_personal into fs_personal.
clear fs_output.
CALL METHOD ODOCUMENT->CREATE_SIMPLE_ELEMENT_NS
  EXPORTING
    NAME   = 'PERSONAL'
    PARENT = EMPLOYEE_NODE
    PREFIX = ''
    URI    = ''
    VALUE  = fs_output
  RECEIVING
    RVAL   = PERSONAL_NODE.

concatenate fs_personal-firstname
            fs_personal-middlename
            fs_personal-lastname
       into fs_output.

CALL METHOD ODOCUMENT->CREATE_SIMPLE_ELEMENT_NS
  EXPORTING
    NAME   = 'FULLNAME'
    PARENT = personal_node
    PREFIX = ''
    URI    = ''
    VALUE  = fs_output
  RECEIVING
    RVAL   = RNODE.

move fs_personal-age to fs_output.

CALL METHOD ODOCUMENT->CREATE_SIMPLE_ELEMENT_NS
  EXPORTING
    NAME   = 'AGE'
    PARENT = personal_node
    PREFIX = ''
    URI    = ''
    VALUE  = fs_output
  RECEIVING
    RVAL   = RNODE.

CALL METHOD ODOCUMENT->CREATE_SIMPLE_ELEMENT_NS
  EXPORTING
    NAME   = 'GENDER'
    PARENT = personal_node
    PREFIX = ''
    URI    = ''
    VALUE  = fs_personal-gender
  RECEIVING
    RVAL   = RNODE.

loop at t_job into fs_job where empid = fs_personal-empid.

  AT NEW EMPID.
    CALL METHOD ODOCUMENT->CREATE_SIMPLE_ELEMENT_NS
      EXPORTING
        NAME   = 'JOB'
        PARENT = EMPLOYEE_NODE
        PREFIX = ''
        URI    = ''
        VALUE  = ''
      RECEIVING
        RVAL   = JOB_NODE.
  ENDAT.

  concatenate fs_job-company
              fs_job-department
              fs_job-designation
              fs_job-begindate
              fs_job-enddate
              fs_job-salary
         into fs_output
    separated by '*'.

  CALL METHOD ODOCUMENT->CREATE_SIMPLE_ELEMENT_NS
    EXPORTING
      NAME   = 'COMPANYDATA'
      PARENT = JOB_NODE
*     PREFIX = ''
*     URI    = ''
      VALUE  = fs_output
    RECEIVING
      RVAL   = RNODE.

endloop.
endloop.
* create output stream
data ostream type ref to if_ixml_ostream.
ostream = stream_factory->create_ostream_xstring( result ).
* create renderer
data renderer type ref to if_ixml_renderer.
renderer = ixml_factory->create_renderer( ostream = ostream
document = odocument ).
* implements DOM based interface to the renderer.
renderer->render( ).
endmethod.
</PERSONAL>
<JOB>
<COMPANY>XYZ</COMPANY>
<DEPARTMENT>SAP</DEPARTMENT>
<DESIGNATION>Consultant</DESIGNATION>
<BEGINDATE>20050606</BEGINDATE>
<ENDDATE />
<SALARY>600000</SALARY>
</JOB>
<PERSONAL>
<EMPID>002</EMPID>
<FIRSTNAME>Bala</FIRSTNAME>
<MIDDLENAME>Krishna</MIDDLENAME>
<LASTNAME>Reddy</LASTNAME>
<AGE>25</AGE>
<GENDER>M</GENDER>
</PERSONAL>
<JOB>
<COMPANY>XYZ</COMPANY>
<DEPARTMENT>SAP</DEPARTMENT>
<DESIGNATION>Consultant</DESIGNATION>
<BEGINDATE>20050606</BEGINDATE>
<ENDDATE>20061206</ENDDATE>
<SALARY>600000</SALARY>
</JOB>
<JOB>
<COMPANY>XYZ</COMPANY>
<DEPARTMENT>SAP</DEPARTMENT>
<DESIGNATION>Consultant</DESIGNATION>
<BEGINDATE>20070106</BEGINDATE>
<ENDDATE />
<SALARY>800000</SALARY>
</JOB>
</EMPLOYEE>
</ns0:MT_Emp_Det>
Target output XML File generated:
<?xml version="1.0" ?>
<EMPLOYEES>
<PERSONAL>
<FULLNAME>SantoshKumarK</FULLNAME>
<AGE>25</AGE>
<GENDER>M</GENDER>
</PERSONAL>
<JOB>
<COMPANYDATA>XYZ*SAP*Consultant*20050606**600000</COMPANYDATA>
</JOB>
<PERSONAL>
<FULLNAME>BalaKrishnaReddy</FULLNAME>
<AGE>25</AGE>
<GENDER>M</GENDER>
</PERSONAL>
<JOB>
<COMPANYDATA>XYZ*SAP*Consultant*20050606*20061206*600000</COMPANYDATA>
<COMPANYDATA>XYZ*SAP*Consultant*20070106**800000</COMPANYDATA>
</JOB>
</EMPLOYEES>
Mapping templates extend the reusability of mapping objects in XI. They are created based on data types and can be saved as mapping templates.
A mapping template can be reused or loaded into other message mappings or mapping templates.
It is a Standard schema for describing the message structure at runtime.
Features:
Mapping templates can be defined for:
Data type
Complex type in IDOC & RFC
Complex type in external Definition
Mapping template can be in any SWCV.
Mapping Templates
Select the mapping for which you want to create the mapping template, choose the source and target nodes, choose the mapping, and click Create Template based on mapping.
Enter the name and description of the mapping template and click the Create button.
Choose the templates suitable for the mapping from the source to the destination data types.
The mapping will be displayed as in the mapping template.
Additional:
Mapping templates are data-type based; we will make a change in the data type and add a new field in the structure to demonstrate this:
I will also make a change in the source message mapping (the mapping from which we created the templates, or in a template-based mapping).
Check the template which was created based on the above message mapping: although there is a change in the message mapping, those changes do not affect the mapping template. But when we make changes in the data type that is used in the message mapping, those changes do affect the mapping template, because mapping templates are based on the data types from which the message mappings are created.
Also check the mapping where we used this template; only the data type change will have an effect there.
Alert Configuration in XI
Alert:
An alert is a notification informing its recipients that a critical or very important situation has arisen. The situation is so severe that an action must be taken immediately in order to resolve it. The system recognizes the situation and sends the alert.
Alerts can be used to prevent delays in the processing of critical situations, because the time between discovering and responding to such situations is reduced considerably.
Now we will go through the steps required for Alert Configuration.
1. Alert Classification
To be able to send different types of alerts under specific conditions, different alert categories have to be defined. A category contains various properties and other specifications that define the alerts within that category, for example the expiry date or the escalation recipient.
Alert categories can be assigned to an alert classification. If you do not want to create a classification of your own, you can always create categories within the default classification folder Unclassified. However, for a better overview, it is recommended to create different alert classifications to group alert categories that belong to the same topic.
Procedure to Create an Alert Classification:
Go to transaction code ALRTCATDEF and make sure that you are in change mode.
2. Alert Category
An alert category contains various properties and other specifications that define the alerts within that category. The category defines the conditions under which a specific alert is sent and to whom.
Alert categories can be defined by applications or customers using the alert category definition environment, which is accessed in transaction ALRTCATDEF. You can define an alert category to suit your business requirements. When the critical situation defined in the alert category arises, the system recognizes this and sends an alert instance of this category to the recipients determined.
Procedure to Create an Alert Category
Ensure that you are in change mode in the alert category definition environment (transaction
ALRTCATDEF).
Choose Create Alert Category.
On the Properties tab page:
In the Description field, enter a description for the alert category. Choose a description that is
informative with respect to the content of the alert category. The description is language-dependent.
If required, you can select a classification in the Classification field. If you do not choose a specific
classification, the category is stored in the classification folder Unclassified. For more information on
classifications, see Alert Classification.
In the Max. No. of Dels field, specify the maximum number of times that an alert of this category is to be delivered if it is not confirmed. This refers to delivery using a communication channel other than the recipient's display program (UWL, application-specific program, or alert inbox).
Select Dynamic Text if the texts of the alert category cannot be defined at this stage. This refers to
situations in which the texts are not known until runtime.
o No translation can be performed for alerts with dynamic text. System messages can be
entered manually in several languages.
In the Expiry Time in Min. field, you can enter a life span for alerts of this category if the alerts will no
longer be relevant after a specific period of time. If the expiry time elapses, the alert is removed from
the alert inbox and is no longer delivered using any other channel.
On the Container tab page:
Define any variables that you may want to use in the short text or long text. You can also define other application-specific variables, such as company code or material number. These variables are then replaced at runtime with values from the application. The container is therefore the interface between the application that triggers the alert and the central Alert Framework.
Internal Usage of the Container
The Alert Framework uses the alert container not only for the exchange of application-specific variables, but also for the exchange of internal information. The following variables are used for this purpose.
Name                       Meaning                                          Typing
_ALERT_RECIPIENTS          Recipient list                                   type salrttrcp
_ALERT_ACTIVITIES          Subsequent activities                            type table of salrtsacti
_ALERT_EXPIRATION          Expiry date/time (time stamp)                    type timestamp
_ALERT_DYNAMIC_SHORTTEXT   Short text                                       type salrtdcatd (CHAR60)
_ALERT_DYNAMIC_LONGTEXT    Long text                                        type table of CHAR255
_EVT_OBJECT                Triggering object                                type BORIDENT
_ALERT_LOGICAL_SYSTEM      Logical system in which the alert is triggered   type RFCDEST
On the Short and Long Text tab page:
Enter texts for the alert category. You can include text variables referring to elements of the alert container or system symbols. In the case of a container element, the variable must be defined in the alert container. The entry in the text must be in the form &<ElementName>&.
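For example, assuming the alert container defines an element MSG_ID (an illustrative name), the long text could contain: The message &MSG_ID& could not be processed on the Integration Server.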
Recipient Determination
Alert Management must know who the recipients of alerts of a particular category are, so that it
can inform the correct parties. There are various ways of determining the recipients of alerts.
Below you can see three ways of specifying recipients.
A system administrator determines the recipients of a particular alert category in the definition
environment (transaction ALRTCATDEF). The administrator can define individual recipients
(using Fixed Recipients) or roles (using Recipients Via User Roles). If a role is specified, all
recipients who have the assigned role receive the alerts of the category in question.
With the above steps we are done with the alert configuration. Now define any integration process where, after a Receive step, you include a Switch step or Fork from which an alert can be sent to the required recipient by specifying the alert category.
To see whether the alert has been triggered or not, go to RWB -> Alert Inbox and check whether a message has arrived in the Alert Inbox, or run transaction SOST in the SAP system, where you will see the details of all send requests.
Following is a snapshot of the mail that a recipient typically receives in his inbox.
Useful information regarding SAP XI
XI transactions
SXMB_IFR        Start Integration Builder
SXMB_MONI       Integration Engine Monitoring
SXI_MONITOR     XI: Message Monitoring
SLDCHECK        Test SLD Connection
SLDAPICUST      SLD API Customizing
SXMB_ADM        Integration Engine Administration
SXI_CACHE       XI Directory Cache
SXMB_MONI_BPE   Process Engine Monitoring
XI URLs
(<host> is the host name of the XI server and <sys#> is its system number)
http://<host>:5<sys#>00/rep
http://<host>:5<sys#>00/sld
http://<host>:5<sys#>00/rwb
http://<host>:5<sys#>00/MessagingSystem
http://<host>:5<sys#>00/mdt/amtServlet
http://<host>:5<sys#>00/exchangeProfile
http://<host>:5<sys#>00/CPACache
http://<host>:5<sys#>00/CPACache/refresh?mode=delta
http://<host>:5<sys#>00/CPACache/refresh?mode=full
ABAP transactions
SE38            ABAP Editor
SE11            ABAP Dictionary
ST22            ABAP dump analysis
SPROXY          ABAP Proxy Generation
SE80            Object Navigator
ABAPDOCU        ABAP Documentation and Examples
SE37            ABAP Function Modules
SE24            Class Builder
Administrative transactions
SM21            Online System Log Analysis
SMQ1            qRFC Monitor (Outbound Queue)
SMQ2            qRFC Monitor (Inbound Queue)
RZ70            SLD Administration
SM58            Asynchronous RFC Error Log
SM59            RFC Destinations (Display/Maintain)
SMICM           ICM Monitor
IDoc transactions
WE60            Documentation for IDoc types
BD87            Status Monitor for ALE Messages
IDX1            Port Maintenance in IDoc Adapter
IDX2            Metadata Overview in IDoc Adapter
WE05            IDoc Lists
WE02            Display IDoc
WE19            Test tool
WE09            Search for IDocs by Content
WE20            Partner Profiles
WE21            Port definition
Integration Engine Configuration
This document gives you a clear understanding of the global and specific configuration data of a business system or client, i.e. the configuration data of the Integration Engine. It is necessary to configure the Integration Engine, which enables the exchange of messages at runtime.
This is an administrative activity that helps to optimize the use of the Integration Engine. As a PI consultant, an understanding of the Integration Engine configuration helps you make the best use of the resources available with the Integration Engine.
To understand the value of configuring the Integration Engine, one needs to know about the Integration Engine.
The Integration Server consists of three engines:
Business Process Engine: The BPE executes and persists integration processes.
Integration Engine: The IE executes the integration logic of the Integration Builder. Messages in XI are passed through a series of pipeline steps belonging to the XI pipeline.
Adapter Engine: The Adapter Engine converts XML- and HTTP-based messages to/from the specific protocol configured for the target/source system.
Salient Points:
Configuration data is client specific. This allows you to configure multiple systems (clients) in
different ways in one SAP system.
Global configuration data defines the role of the business system. This data can be
configured in two ways. By loading the configuration data from the System landscape
directory or configuring locally.
Prerequisite for the first method: Business system details about the current business system
should be available in SLD.
Steps for loading Global Configuration data:
o Go to the SXMB_ADM transaction. Choose Integration Engine Configuration.
o Save the loaded data.
Note: The global configuration data loaded from the SLD can be revoked by using Undo Global Changes. Also note that the system displays a message if no data is maintained in the System Landscape Directory.
Steps for Changing Global Configuration data:
o Go to the SXMB_ADM transaction. Choose Integration Engine Configuration.
o Choose Edit -> Change Global Configuration Data.
o In the Role of Business System field, select either Application System or Integration Server.
Integration Server
The Integration Server executes only the integration logic defined in the Integration Builder; these steps are also known as pipeline steps. It receives the XML message, determines the receiver, executes the mappings, and routes the XML message to the corresponding receiver systems. An Integration Engine configured in this way is identified as the central Integration Engine.
Note: Only one client of an SAP system can be configured as the Integration Server.
Application system
The application system does not execute the integration logic itself. It calls the Integration Server to execute the integration logic if required. It acts as a sender or receiver of XML messages. An application system with a local Integration Engine therefore requires the Integration Server to execute the integration logic.
There are two ways, in the global configuration, of configuring your business system to call an Integration Server (other than your own business system):
o Specify the URL of the Integration Server directly: http://<host>:<port>/sap/xi/engine?type=entry
o Specify an RFC destination: dest://<Integration Server-Destination>. Create an RFC destination of type H in the current client and provide the host name and port of the Integration Server and the path prefix /sap/xi/engine?type=entry. Also provide the user and the password.
o Go to the SXMB_ADM transaction. Choose Integration Engine Configuration.
o In the Category field, choose the parameter category.
o Choose Change Specific Configuration Data.
o Go to change mode; this allows you to define new values for the parameters or change the existing values. Use the input help to select the parameters and values.
Use the above procedure to define the parameters of the following categories.
1. IDOC Adapter
2. Sync/Async Communication
3. Business Process Engine
4. Monitoring
5. Performance Monitoring
6. Runtime
Find below the screenshot illustrating a sample business system configured for the above parameters.
An effort is made to explain the values that should be set for the categories and meaning for the
configured values. Please click here to view the same.
Description: File Content Conversion is used to convert a non-XML (flat) file into an XML file and vice versa. In this blog, we will see how to convert a flat file to an XML file when the file structure is a bit complex, for example when the same columns show different information in different rows.
Business Case: Let's take an example with the file shown below. The file contains employee details. The first row contains the employee header information (Employee ID and Name), the second and third rows contain the weekly details (Week No, Working Hours in the Week and Wage for the Week), there can be any number of rows for weekly details, and the last row contains the employee monthly summary (Total Hours Worked and Total Wage).
Screen shot
Here I have added a field Key in all the records; we will discuss this field while configuring the sender file adapter.
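To anticipate that discussion, the sender File Content Conversion parameters for such a file might look roughly as follows (a sketch only; the recordset and field names are illustrative and must match the data type actually used):

Recordset Structure:    Header,1,Week,*,Summary,1
Key Field Name:         Key
Header.fieldNames:      Key,EmpID,Name
Header.keyFieldValue:   H
Header.fieldSeparator:  ,
Week.fieldNames:        Key,WeekNo,Hours,Wage
Week.keyFieldValue:     W
Week.fieldSeparator:    ,
Summary.fieldNames:     Key,TotalHours,TotalWage
Summary.keyFieldValue:  S
Summary.fieldSeparator: ,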
2. Create Message Type.
3. Create Message Interface for Sender and Receiver. In this scenario I am using the same Message Type for both Sender and Receiver, as we are converting a flat file to an XML file.
4. Create Message Mapping.
1. Configuration of Sender File Adapter
2. Receiving File Adapter
3. Configure
5. Receiver Agreement
6.
On Executing the Scenario we will get the output file in XML Format.
The message header of an XI message contains a header for adapter-specific message attributes that
the sender adapter can use to write additional information to the message header. This enables sender
adapters to write information that is not known until runtime to the message.
Features
The key for accessing the value of an adapter-specific attribute comprises a namespace belonging to the
adapter and an attribute name.
The adapter namespace comprises the namespace in the Integration Repository in which the adapter
metadata for the adapter is saved and the name of the adapter metadata object. The adapter
namespaces for the adapters shipped by SAP therefore have the following format:
http://sap.com/xi/XI/System/<Adapter Metadata Object Name of Adapter>.
The adapter metadata objects are located in the namespace http://sap.com/xi/XI/System of software
component SAP BASIS.
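For example, the attributes written by the file adapter use the namespace http://sap.com/xi/XI/System/File; the source file name used later in this document is available there under the attribute name FileName.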
Mapping API
The classes for accessing the adapter-specific attributes are part of the mapping API (package
com.sap.aii.mapping.api):
com.sap.aii.mapping.api.DynamicConfigurationKey
Class used to create a key object for an adapter-specific attribute. The key object comprises the adapter
namespace and the attribute name.
com.sap.aii.mapping.api.DynamicConfiguration
Class used to read, change, or delete the value of an adapter-specific attribute. In a method, you use
objects of type DynamicConfigurationKey to access the attributes.
Let's have a look at a simple scenario where the sender puts a file containing customer details on an FTP server. The file can have different names. The file needs to be moved to another location, where Target File Name = Customer Name from Source File + Source File Name.
Integration Repository
1.Create Data Type, Message Type and Message Interface for Sender and Receiver service.
2. Create Message Mapping.
GetTargetFile is a user-defined function which extracts the file name at runtime; its code reads the FileName attribute of the message header using the DynamicConfiguration class described above.
Configuration in ID:
When the parameter FileName, under Adapter-Specific Message Attributes, is checked in the communication channel, the source file name will be present in the header of the XI message.
Configure Receiver Communication Channel.
1. Check the checkbox Variable Substitution. The value from the field ofile of message MT_Output is copied into the variable var. This variable is used as the file name for the output file.
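A possible variable-substitution entry for this scenario could look like this (a sketch; the reference points to the ofile element of the MT_Output payload, and the file name scheme then uses the variable):

Variable Name:    var
Reference:        payload:MT_Output,1,ofile,1
File Name Scheme: %var%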
Configure the Receiver Determination, Interface Determination, Sender agreement, Receiver agreement.
Customer name ABC is concatenated to the input file name.
Reference: Refer to the following link for information on classes and methods:
http://help.sap.com/saphelp_nw2004s/helpdata/en/43/03612cdecc6e76e10000000a422035/content.htm
This document provides information on how to install the J2EE Adapter Engine and the configuration settings that need to be made after the installation.
The document is divided into two parts. In part 1 we discuss installing the J2EE Adapter Engine, and in part 2 the steps needed after installation, as well as checking the decentralized adapter engine in the central instance.
Reasons for a decentral SAP J2EE-based AE:
When one of the above-mentioned situations occurs, we can use either the Plain J2SE Adapter Engine or the decentralized J2EE Adapter Engine.
Here is a simple diagram of the J2EE Adapter Engine. This diagram shows only the case when firewalls sit between the Integration Server and the business systems. The reason for using the J2EE adapter can be any one of the reasons mentioned above. In most cases, when the customer is not interested in FTP, we can go for this option.
Let's see how to install the local J2EE Adapter Engine and the prerequisites needed while installing it.
Important UserIds and Passwords from XI Server
1. SAPJSF
2. J2EE_ADMIN
3. J2EE_GUEST
4. XISUPER
5. XILDUSER
Now select SAP NetWeaver 2004s Support Release 2 --> Optional Standalone Units --> J2EE Adapter Engine --> Oracle (select the database on which the adapter is going to be installed) --> click on J2EE Adapter Installation.
Step2:
Provide the password for Administrator (this is for local J2EE engine)
Step3:
Provide the password for XILDUSER (this must match the password on the SAP XI host; if not, XI will become unstable).
One more point to remember: give the full host name of XI, including the domain.
Step4:
Make sure the password matches the password in XI system
Now start the J2EE engine from the SAP Management Console.
If you have a dynamic IP address, the hosts file on the server should be changed before starting the decentral Adapter Engine.
I followed the above-mentioned steps and installed the Decentralized Adapter Engine Successfully.
This document is a continuation of part 1 (installation of the Decentralized Adapter Engine). For part 1, click here.
This document provides information on post-installation activities.
Following three Steps are mandatory to ensure proper working of the decentralized adapter engine.
1.
2.
3.
Port: specify the HTTP standard access port of the SLD where this is the HTTP port of the J2EE
engine.
User: Specify a Java user that already exists on the host where the SLD Bridge runs (XISUPER, for
example).
Password: Enter the user password.
5. You can test your settings by sending test data to the SLD; click the blue arrow with the quick info
text Trigger data transfer to System Landscape Directory.
1. Enter the following URL in a browser for entering the values in exchange profile of decentralized
adapter engine: http://<AE_server>:<AE_HTTP_Port>/exchangeProfile.
2.Choose Connections.
3.Enter the respective values.
Note : You can open the following url : http://<PI_server>:<_HTTP_Port>/exchangeProfile and choose
Connections here and copy the same values in decentralized adapter engine exchange profile
parameters.
2. Select your Adapter engine, from there you can ping the server and send a test message.
4. After all the above settings the Decentralized Adapter Engine should appear in drop down of Adapter Engine
in Communication channel in ID.
Caution: The Integration Server stores the SLD data in the cache. To update this: In the Integration
Server transaction SXI_CACHE, start ---> Go ----> Adapter Engine Cache. Select the relevant row with
the obsolete port and delete the cache entry (Delete symbol). All cache entries are deleted. The next time
a message is sent to the relevant adapter engine, the updated data is read from SLD.
Check: After you restart the J2EE applications in the SLD, choose Content Maintenance ---> XI Adapter
Framework to navigate to your Adapter Engine.
For example, under the associated instances and XI Adapters Service XIRA ---> Associated Instances ---> Port for XIRA of af.<SID>.<hostname>, you will find the URL that is used by the Integration Server to
send messages to the AE. After the message is resent, check whether the Adapter Engine cache was
also updated in the Integration Server (SXI_CACHE ---> Go to---> Adapter Engine Cache).
Note:
1.
2.
3.
4.
Hope this document will be helpful for working on the Decentralized J2EE Adapter Engine.
Apart from the Decentralized J2EE Adapter Engine we have another option called the Plain J2SE Adapter Engine. This document gives you the knowledge of how to use the Plain J2SE adapter.
The J2SE Adapter Engine is a lightweight setup which does not occupy much hard disk space.
Prerequisites needed for J2SE Adapter Engine.
1) A Java Development Kit (JDK) 1.3 or 1.4
2) The optional package Java Servlet Version 2.3 or higher (servlet.jar); you can copy the package directly to the adapter installation directory (tech_adapter), otherwise it must be located in the Java CLASSPATH.
After the setup of the J2SE Adapter Engine, click on install_adapter and then run_adapter in the folder tech_adapter to start the J2SE engine.
Click on one of the adapters and click the Configure button to configure it.
After configuring the adapter, click on Store Configuration Data to save the configured properties, and then do Restart and Reload.
For each adapter a properties file named <name of the adapter>.properties is created in the Configuration folder.
Now the adapter Receiver_test is running successfully.
When an adapter has failed, it is shown in red; to check the log of that adapter, click on View Log.
To stop the adapter click the Stop button. To terminate the adapter, click the Terminate button.
1) We have already seen the adapters. Next is the documentation on Plain J2SE adapters.
3) Password Management:
4) Certificate Management:
5) Services:
Now in the Services we have SLDaccessor and HttpServer.
a) Click on HttpServer. Choose the authentication and the protocol here: plain (http) or SSL (https).
b) SLDaccessor is used to connect to the SLD to which our Plain J2SE engine needs to be connected. Enter SLD.host, SLD.port, SLD.user and SLD.password.
6) Traces:
7) Test Environment:
Before starting this document, please refer to the topic Working with Plain J2SE Adapters, which I posted earlier.
This document gives an idea of how Plain J2SE adapters work with XI.
Step1: Steps in PI server
Step2: Configuring Plain J2SE File Adapter for both Sender and Receiver.
Step3: Test the Scenario.
Step1: Steps in PI (XI)
Create Data types, Message Types, Message Interfaces, Message Mapping and Interface Mappings in
IR.
e) Create Interface Mapping:
Steps in ID:
A. Create a communication channel for the receiver business service. There is no need to configure a sender communication channel.
B. No Sender Agreements.
C. Create Receiver Agreements
D. Create Receiver Determination.
E. Create Interface Determination.
a)Create a communication Channel for the Receiver Business Service
Adapter: XI
Type: Receiver
From the above screenshot Target Host will be the IP address of the Target system to which PI has to
send.
Service Number can be anything.
e) Create Interface Determination.
Finally, the configured steps in ID (highlighted in color):
Step2: Configuring Plain J2SE File Adapter for both Sender and Receiver
a)
Below mentioned Parameters are Mandatory while configuring a Sender PlainJ2SE Adapter.
classname=com.sap.aii.messaging.adapter.ModuleFile2XMB
version=30
##Here we are using a Simple Row Conversion
mode=FILE2XMBWITHROWCONVERSION
## Integration Engine address and document settings (example, see docu)
XI.TargetURL=<Integration Engine URL of XI to which the Plain J2SE adapter needs to connect>
XI.User=<XI UserId>
The user must have the authorizations of group SAP_XI_APPL_SERV_USER on the Integration Server.
XI.Password=<Password>
XI.Client=<XI Client >
XI.Language=EN
#XI.SenderParty=<Sender Party>
XI.SenderService=<Sender Business Service>
XI.Interface=<Sender Message interface>
XI.InterfaceNamespace=<Namespace>
#XI.ReceiverParty=<Receiver Party>
XI.ReceiverService=<Receiver Business Service>
The parameters below have to be specified according to our requirement. Here we are doing row conversion.
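In addition, the file-handling parameters of the sender adapter would typically be set along these lines (values are only illustrative for this scenario; the exact row-conversion parameters should be taken from the J2SE adapter documentation):

file.sourceDir=D:/Input
file.sourceFilename=PlainJ2se.txt
file.pollInterval=30
file.processingMode=archive
file.archiveDir=D:/Archive
XI.QualityOfService=EO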
After configuring the adapter, click Store Configuration Data, then click Restart, and then click Reload.
If the adapter is working fine it should be in green status. Check View Log if it shows red.
b)
After configuring the adapter, click Store Configuration Data, then click Restart, and then click Reload.
If the adapter is working fine it should be in green status. Check View Log if it shows red.
Step3: Test the Scenario:
Now place a file named PlainJ2se.txt in the Input folder. After the poll interval of 30 seconds that we specified in the sender file adapter, it should be archived in the Archive folder, and the target file named PlainJ2se<timestamp>.txt should be placed in the Output directory.
Check the View Log of the sender Plain J2SE file adapter after placing a file in the Input folder.
The above screenshot shows the successful processing by the sender adapter: it picked up the file from the Input folder, sent it to the Integration Engine, and archived it to D:/Archive.
This screenshot shows that the receiver J2SE file adapter also processed the message successfully.
Note:
1) We can check this Adapter Engine in RWB of PI system.
Mapping is nothing but a set of rules, which are used to transform the source XML to target XML, based
on the business requirements.
To understand the mapping concepts let me first explain the basics taking a simple example:
A simple mapping in the graphical editor looks as below. The Age XML tag in the target structure is
created for each Age tag in the source structure.
Example 1
The color indicators for the XML messages have specific meanings:
1. Green indicates that we can map these XML tags according to our requirements.
2. Red indicates that these tags are mandatory and must not be left unmapped.
Example 2
In Example 2 we can see that Age is an Element. Element is nothing but a simple XML tag. MT_Target is
a Node. Node is a collection of XML tags.
What is Occurrence: Occurrence is nothing but the number of times a given xml tag appears in the xml
document. Occurrences are defined in the datatype editor and a minimum and maximum occurrence is
specified for each field.
Referring to Example 1, the occurrence of the source Age field should be similar to the occurrence of the target Age field. An occurrence of 1:1 indicates a minimum and a maximum occurrence of 1, i.e. the tag must be present exactly once.
For the Address field the occurrence is 1:unbounded, which means the minimum occurrence is once but the maximum can be any number of times.
Context and Context Handling:
Context is nothing but the level or the hierarchy at which an XML tag is present in the xml document with
respect to its parent node.
Context Handling can be achieved using Node functions like RemoveContexts and SplitByValue.
Context Changing can be done to shift the level of the xml tag during runtime. Example 3 shows this
below.
Initially, by default, the element first is under the node name; this can be seen by right-clicking the element and choosing Context. Now I would like to change the context of the first tag to the root MT_first, as shown:
The Queue gets filled as and when the values in the message arrive. If at any particular level there are no
further elements present then a Context Change is inserted.
The following Scenario explains in depth:
Mapping is the way of transforming one XML structure to another XML Structure. As a part of it we do
certain operations like breaking child nodes and attaching them to its parent node and more in an XML
structure.
In XI/PI we have the following mapping techniques:
1. Graphical Mapping
2. ABAP Mapping
3. JAVA Mapping
4. XSLT Mapping
Among all the above mapping techniques, Java mappings improve performance and are preferred, as they are executed directly on the J2EE engine. In the case of graphical mapping, XI/PI internally converts the graphical design into executable Java code and executes it on the J2EE engine, which makes it comparatively less efficient. Java mappings are most useful when the runtime performance of the Integration Server is a concern. Java mappings are parsing programs that can be developed in NWDS (NetWeaver Developer Studio), imported as .jar files, and used in the mapping part of the Integration Repository. NWDS provides a suitable Java environment and higher-level tools to parse XML documents through the Simple API for XML (SAX) and the Document Object Model (DOM) interfaces. The SAX and DOM parsers are standards that are implemented in several different languages. In the Java programming language, you can instantiate the parsers by using the Java API for XML Parsing (JAXP).
Java mapping can be done in two ways:
1. DOM Parsing
2. SAX Parsing
Parsing is a technique of reading and identifying XML tags from an XML document. DOM and SAX parsers can be either validating or non-validating. A validating parser checks the XML file against the rules imposed by a DTD or XML Schema. A non-validating parser does not validate the XML file against a DTD or XML Schema. Both validating and non-validating parsers check the XML document for well-formedness.
DOM Parsing Technique:
The DOM (Document Object Model) API is based on an entirely different model of document processing than the SAX API. Instead of reading a document one piece at a time (as with SAX), a DOM parser reads an entire document. It then makes a tree for the entire document available to program code for reading and updating.
At the core of the DOM API are the Document and Node interfaces. A Document is a top-level object that
represents an XML document. The Document holds the data as a tree of Nodes, where a Node is a base
type that can be an element, an attribute, or some other type of content. The Document also acts as a
factory for new Nodes. Nodes represent a single piece of data in the tree, and provide all of the popular
tree operations. You can query nodes for their parent, their siblings, or their children. You can also modify
the document by adding or removing Nodes.
SAX Parsing Technique:
SAX (Simple API for XML) is an event-driven model for processing XML. The SAX model is quite different: rather than building a complete representation of the document, a SAX parser fires off a series of events as it reads the document from beginning to end. Those events are passed to event handlers, which provide access to the contents of the document.
Event Handlers
There are three types of event handlers: DTD handlers, for accessing the contents of XML Document Type Definitions; error handlers, for low-level access to parsing errors; and, by far the most often used, document handlers, for accessing the contents of the document. A SAX processor will pass the following events to a document handler:
1.
2.
3.
SAX
1. Parses node by node
DOM
1. Stores the entire XML document into memory before processing
Objective:
SAP XI produces the XML documents at the inbound/outbound of the integration pipeline, there is a need
for us to map the source XML doc to the target XML doc. Message Mapping editor is a built in feature,
which helps to map the source and target XML messages graphically. In this document we discuss
various Node Functions related to it. Following Node Functions are covered in the document:
createIf
removeContexts
replaceValue
exists
splitByValue
collapseContexts
useOneAsMany
sort and sortByKey
mapWithDefault
formatByExample
createIf
Description: It is used when you want to create a target node or element only when some condition is fulfilled.
Fig.1
Figure depicts createIf node functionality. ID is mapped to MT_Reciever_Minor using createIf function.
Fig.2
Figure depicts the result of the mapping in Fig. 1. As the value of ID is less than 25, the MT_Reciever_Minor node is created on the target side.
Fig.3
Figure depicts createIf node functionality. ID is mapped to MT_Reciever_Major using createIf function.
Fig.4
Figure depicts the result of the mapping in Fig. 3. As the value of ID is greater than 25, the MT_Reciever_Major node is created on the target side.
removeContext
Description: Removes all immediate level contexts of a source field. In this way, you can delete all
hierarchy levels and generate a list.
Fig.5
Figure depicts removeContexts functionality.
Fig.6
Figure depicts the result of the mapping in Fig. 5. By applying removeContexts, all the Cust_NAME fields come into one context.
replaceValue
Description: Replaces the value of element with a value that you can define in the dialog for the function
properties.
Fig.7
Figure depicts replaceValue functionality.
Fig.8
replaceValue will replace the value of the Cust_NAME element with the value defined, i.e. 20Name.
exists
Description: This is most frequently needed when mapping an IDoc structure to a file structure. We often come across a scenario where fields that are not mandatory in the IDoc (occurrence = 0) are not populated in the source XML but are required in the target XML (occurrence = 1), which gives a mapping runtime exception that the target element cannot be created. We can handle the error by checking whether the source tag exists; if it does not, we can pass an empty value, which generates the required target field.
Fig.9
Figure depicts exists functionality. ID (source field) is checked for its existence by using exists function.
Fig.10
Figure depicts the existence of the ID source field, so the same value is populated in the target file.
Fig.11
Figure depicts the non-existence of the ID source field, so the constant value defined is populated in the target file.
splitByValue
Description: splitByValue() is the counterpart to removeContexts(): Instead of deleting a context, you can
insert a context change in the source value queue.
You can insert a context change in the queue after each value or after each change to the value, or after
each tag without a value.
splitByValue can be achieved in following three ways:
1. EACH VALUE
Fig.12
Figure depicts the splitByValue(Each Value) functionality.
Fig.13
Figure depicts the splitByValue (Each Value) mapping result.
2. VALUE CHANGE
Fig.14
Fig.15
Figure depicts the splitByValue (Value Change) mapping result.
3. EMPTY VALUE
Fig.16
Fig.17
collapseContext
Description: It takes the first value from each context and puts them into one context on the target side, so that all values come under one context.
Empty contexts are replaced by empty strings.
Fig.18
Figure depicts the collapseContexts functionality.
Fig.19
Figure depicts the collapseContexts mapping result.
useOneAsMany
Description: As shown in the figure below, the maximum occurrence of the header node in the source is 1, while in the target it is unbounded. So we have only one occurrence of MatNo & MatDesc, which has to be replicated for every line item.
Fig.20
Above figure depicts, both source & target structures.
Fig.21
The figure above depicts the mapping of the useOneAsMany function.
In the above figure, useOneAsMany takes three arguments:
1. The value that occurs once and is to be repeated.
2. A field that determines how many times the value is repeated.
3. A field that determines the context changes in the target.
Fig.22
The Header node appears on the target side as many times as the Item node appears (repeats) in the source. So the source Item node is mapped to the target Header node.
Fig.23
In above fig, Item node is duplicated.
Fig.24
In the above fig, the Item node is duplicated twice in the source structure. Accordingly, the Header (i.e. MatNo & MatDesc) is repeated twice on the target side.
Fig.25
Figure depicts the sort functionality.
As the functions sort and sortByKey only sort the elements within the same context, we have to use the function removeContexts before sort. After sort we have to restore the original contexts; we do this with the node function splitByValue.
Fig.26
Figure depicts the sortByKey functionality.
In the above fig, Cust_Name is sorted based on Cust_ID.
sortByKey takes two arguments: the first argument acts as the key for sorting the second argument.
Fig.27
In the above fig Cust_Name is sorted based on Cust_ID.
mapWithDefault
Description: This function provides a default value, whenever the source element is not available.
Fig.28
Figure depicts the mapWithDefault functionality.
Fig.29
Figure depicts the mapWithDefault mapping result.
formatByExample
Description: This function has two arguments, which must both have the same number of values. To
generate the target, the function takes the values from the first argument and combines them with the
context changes from the second argument.
This function allows grouping of values of a tag (Cust_NAME) according to values of another tag
(Cust_ID).
Fig.30
In the example, Cust_ID is repeated 5 times in 5 different contexts. Using the removeContexts function, all 5 contexts are removed. Then the Cust_IDs are split into a new context whenever the value changes, which is achieved using splitByValue (Value Change). From each context the first Cust_ID is taken and put into one context using collapseContexts. The Customers nodes are then created based on the result of collapseContexts.
Fig.31
In the example, Cust_ID is repeated 5 times in 5 different contexts. Using the removeContexts function, all 5 contexts are removed. Then the Cust_IDs are split into a new context whenever the value changes, which is achieved using splitByValue (Value Change). From each context the first Cust_ID is taken and put into one context using collapseContexts. These are then split again into different contexts based on each value using splitByValue (Each Value). The Cust_ID fields are then created based on the result of splitByValue.
Fig.32
The figure above depicts formatByExample functionality.
formatByExample takes two arguments:
1. First argument: Field that is to be grouped.
2. Second argument: Base for grouping of first argument.
Fig.33
In the fig above, Cust_NAME is grouped into 2 contexts based on Cust_ID in the target structure.