Data Flows
Reference: What is Data Flow? How to use Data Flow in PEGA Platform? - OneStopPega
Data flow:
We use a data flow whenever the volume of data being processed is huge and performance is a major consideration.
Whenever we create a data flow, it has one source and one destination.
Source configurations:
To receive data from an activity or from a data flow with a destination that refers to your
data flow, select Abstract.
To receive data from a different data flow, select Data flow. Ensure that the data flow that
you select has an abstract destination defined.
Destination configurations:
Abstract: If you want other data flows to use your data flow as their source.
Activity: If you want an activity to use the output data from your data flow.
Case: If you want to start a case as the result of a completed data flow.
Data flow: If you want to send output data to a different data flow.
Data set: If you want to save the output data into a data set.
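As a rough mental model of how records move from the source through the shapes to the destination, here is a plain-Python sketch (this is only an analogy, not Pega code; the record fields and helper names are made up):

```python
# Rough mental model only: records are dicts, shapes are functions that take
# one record and return zero or more records, the destination is a list.
def run_data_flow(source_records, shapes, destination):
    for record in source_records:
        results = [record]
        for shape in shapes:
            # Each shape processes every record produced so far.
            results = [out for rec in results for out in shape(rec)]
        for rec in results:
            destination.append(rec)

# Hypothetical example: one pass-through shape and a list as the destination.
source = [{"CustomerID": "C-1"}, {"CustomerID": "C-2"}]
passthrough = lambda rec: [rec]
dest = []
run_data_flow(source, [passthrough], dest)
print(dest)   # both records reach the destination unchanged
```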
1. Filter:
By using the Filter shape we can add conditions; only the records that satisfy the condition are passed on to the destination.
Testing:
Run the data flow and see the results.
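To illustrate what the Filter shape does, here is a plain-Python sketch (the field names and condition are invented, not Pega code):

```python
# Illustrative only: a Filter shape passes on just the records that satisfy the condition.
records = [
    {"Name": "Asha",  "Age": 34},
    {"Name": "Ravi",  "Age": 17},
    {"Name": "Meena", "Age": 52},
]

def age_filter(record):
    # Example condition: Age >= 18 (any when-style condition would go here).
    return record["Age"] >= 18

filtered = [rec for rec in records if age_filter(rec)]
print(filtered)   # only the records that satisfy the condition reach the destination
```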
2. Compose:
If we want to combine data from two sources into a page or page list property, we will use compose
shape.
The data from the secondary source is appended to the incoming data record as an embedded data
page
For compose we need two sources right, so I have created another dataset, but that dataset is in
different class.
Create property in another class.
Testing:
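Roughly, Compose looks up the matching records in the secondary source and attaches them to the primary record as an embedded page list. A plain-Python sketch (class, field, and property names are made up, not Pega code):

```python
# Illustrative only: Compose appends secondary-source data to each primary record
# as an embedded page / page list property.
customers = [
    {"CustomerID": "C-1", "Name": "Asha"},
    {"CustomerID": "C-2", "Name": "Ravi"},
]
orders = [  # secondary source, in a different class with its own properties
    {"CustomerID": "C-1", "OrderID": "O-10"},
    {"CustomerID": "C-1", "OrderID": "O-11"},
    {"CustomerID": "C-2", "OrderID": "O-12"},
]

for customer in customers:
    # Attach the matching secondary records as an embedded page list property.
    customer["Orders"] = [o for o in orders if o["CustomerID"] == customer["CustomerID"]]

print(customers[0])   # the primary record now carries its two orders embedded
```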
3. Convert:
You change the class of the incoming data pages to another class when you need to make the data
available elsewhere.
For example, you want to store data in a data set that is in a different class than your data flow and
contains different names of properties than the source data set. You might also want to propagate
only a part of the incoming data to a branched destination, like strategy results (without customer
data) to the Interaction History data set.
Here I want to convert my source data to another class, i.e., the customer info class.
Testing:
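Conceptually, Convert maps each incoming record into a page of another class, and the target class can use different property names than the source. A hedged Python sketch (the class and property names here are invented for illustration):

```python
# Illustrative only: Convert changes the class of incoming pages and can map
# source properties onto differently named target properties.
source_records = [
    {"pxObjClass": "MyOrg-Data-Person", "FullName": "Asha K", "EmailAddr": "asha@example.com"},
]

def convert_to_customer_info(record):
    # Target class uses different property names than the source class.
    return {
        "pxObjClass": "MyOrg-Data-CustomerInfo",
        "CustomerName": record["FullName"],
        "Email": record["EmailAddr"],
        # Properties that are not mapped are simply not propagated.
    }

converted = [convert_to_customer_info(rec) for rec in source_records]
print(converted[0])
```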
4. Merge:
If we want to merge data from two different sources, we use the Merge shape.
Whenever the merge criteria match, the resulting combined record is passed to the destination.
Here every record matches the condition, so all 6 records arrive at the destination.
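A rough Python sketch of the Merge idea (invented fields, not Pega code): when the merge condition matches, the properties from the second source are merged into the primary record and the combined record continues to the destination.

```python
# Illustrative only: Merge combines records from two sources when the merge
# condition (here, equal CustomerID) is satisfied.
primary = [
    {"CustomerID": "C-1", "Name": "Asha"},
    {"CustomerID": "C-2", "Name": "Ravi"},
]
secondary = [
    {"CustomerID": "C-1", "City": "Chennai"},
    {"CustomerID": "C-2", "City": "Pune"},
]
secondary_by_id = {rec["CustomerID"]: rec for rec in secondary}

merged = []
for rec in primary:
    match = secondary_by_id.get(rec["CustomerID"])
    if match:                              # merge criteria satisfied
        merged.append({**rec, **match})    # combined record goes to the destination

print(merged)   # every record matched, so every combined record reaches the destination
```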
5. Data transform:
If we want to manipulate the data as it passes through the flow, we can include a Data Transform shape.
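Conceptually, a data transform applies the same per-record manipulation (setting or deriving properties) to every record that passes through. A minimal Python sketch with made-up fields, not an actual Pega data transform:

```python
# Illustrative only: a Data Transform shape manipulates each record,
# e.g. deriving new properties from existing ones.
records = [
    {"FirstName": "asha", "LastName": "k", "Spend": 1200},
    {"FirstName": "ravi", "LastName": "s", "Spend": 300},
]

def data_transform(record):
    record["FullName"] = f"{record['FirstName'].title()} {record['LastName'].title()}"
    record["Segment"] = "Gold" if record["Spend"] > 1000 else "Standard"
    return record

transformed = [data_transform(rec) for rec in records]
print(transformed)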
6. Strategy:
If we want to invoke the strategy, for strategy we need the input data right, for that we will use
dataflow and we have to include the strategy shape so that strategy result will be processed and will
pass into destination.
Make decision - The strategy that is executed by the data flow is designed only to issue a decision.
Make decision and store data for later response capture - The strategy that is executed by the data
flow is designed to issue a decision and you want to store the decision results for a specified period
of time. You can use this data for delayed adaptive model learning and issuing a response capture at
a later time.
Capture response for previous decision by interaction ID - The strategy that is executed by the data
flow is designed to retrieve the adaptive inputs and strategy results for the interaction ID.
Capture response for previous decision in the past period - The strategy that is executed by the data
flow is designed to retrieve the adaptive inputs and strategy results from a particular period of
time.
Testing:
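As a very rough sketch of what happens around the Strategy shape (plain Python, nothing like the real decisioning engine): each incoming record is turned into a strategy result, and in the "store data for later response capture" mode the result is kept so a response can be matched to it later by interaction ID. All names and the decision logic below are invented for illustration.

```python
# Illustrative only: a Strategy shape turns each input record into strategy
# results; results can be stored so a response can be captured later.
import uuid

decision_store = {}   # stand-in for the stored decision results

def run_strategy(customer):
    # Hypothetical decision logic; real strategies are built as decision rules.
    offer = "PremiumCard" if customer["Spend"] > 1000 else "BasicCard"
    interaction_id = str(uuid.uuid4())
    result = {"CustomerID": customer["CustomerID"], "Offer": offer,
              "InteractionID": interaction_id}
    decision_store[interaction_id] = result    # "store data for later response capture"
    return result

def capture_response(interaction_id, outcome):
    # "Capture response for previous decision by interaction ID"
    decision = decision_store.get(interaction_id)
    if decision:
        decision["Outcome"] = outcome          # e.g. Accepted / Rejected
    return decision

decision = run_strategy({"CustomerID": "C-1", "Spend": 1500})
print(capture_response(decision["InteractionID"], "Accepted"))
```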
7. Text analyser:
Reference Text Analyzer rules to apply text analysis in your data flow. Build data flows that can
analyze text data to derive business information from it. For example, you can analyze the text-based
content, such as emails and chat messages.
If we want to send an email, we can use the Send Email shape; similarly, there is also a Send SMS shape.
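A toy illustration of the idea behind text analysis in a data flow (plain-Python keyword matching, nothing like the real Text Analyzer rules): free text such as an email body is analyzed and the derived business information is attached to the record.

```python
# Illustrative only: a Text Analyzer shape derives structured information
# (topic, sentiment, entities) from free text such as emails or chat messages.
messages = [
    {"From": "asha@example.com", "Body": "I am unhappy, my card payment failed again"},
    {"From": "ravi@example.com", "Body": "Thanks, the new mortgage offer looks great"},
]

NEGATIVE = {"unhappy", "failed", "angry"}
POSITIVE = {"thanks", "great", "happy"}

def analyze_text(record):
    words = set(record["Body"].lower().split())
    record["Sentiment"] = ("Negative" if words & NEGATIVE
                           else "Positive" if words & POSITIVE
                           else "Neutral")
    record["Topic"] = "Cards" if "card" in words else "Lending" if "mortgage" in words else "General"
    return record

print([analyze_text(m) for m in messages])
```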
Types of Data Flows:
Batch data flows: Go over a finite set of data and eventually complete processing. Batch data flows are mainly used for processing large volumes of data.
Real-time data flows: Go over an infinite set of data. Real-time data flows are always active and continue to process incoming stream data and requests.
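A hedged Python sketch of the difference (only an analogy): a batch run iterates over a finite set of records and completes, while a real-time run keeps consuming an unbounded stream of events.

```python
# Illustrative only: batch vs real-time processing.
import itertools

def batch_run(records, process):
    # Finite input: process everything, then the run completes.
    for rec in records:
        process(rec)
    return "Completed"

def realtime_run(stream, process, max_events=5):
    # Unbounded input: stays active and processes events as they arrive.
    # (max_events keeps this demo finite; a real stream never ends.)
    for rec in itertools.islice(stream, max_events):
        process(rec)

def event_stream():
    n = 0
    while True:            # infinite source, e.g. a stream data set
        n += 1
        yield {"EventID": n}

print(batch_run([{"ID": i} for i in range(3)], print))
realtime_run(event_stream(), print)
```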