
Data Flows

Followed links: What is Data Flow? How to use Data Flow in PEGA Platform? - OneStopPega

Creating a data flow | Pega

Data flow:

We use a data flow whenever the volume of data to be processed is huge and performance is a primary concern.

Whenever we create a data flow, there is one source and one destination.

Data is passed from the source to the destination.

Source configurations:
 To receive data from an activity, or from a data flow whose destination refers to your data flow, select Abstract.

 To receive data from a different data flow, select Data flow. Ensure that the data flow you select has an abstract destination defined.

 To receive data from a data set, select Data set.

 To retrieve and sort information from the PegaRULES database, an external database, or an Elasticsearch index, select Report definition.

Destination configurations:

Abstract: If you want other data flows to use your data flow as their source.

Activity: If you want an activity to use the output data from your data flow.

Case: If you want to start a case as the result of a completed data flow.

Data flow: If you want to send output data to a different data flow.

Data set: If you want to save the output data into a data set.

Between the source and the destination, we can manipulate the data.

For that, the data flow provides several shapes, which we cover below; the sketch that follows gives a rough mental model of the overall pipeline.
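This is a plain-Python illustration only, not Pega syntax; in the platform, all of this is configured on the Data Flow rule form, and the function names here are invented for the sketch.

def run_data_flow(source, shapes, destination):
    # Records move from the source, through zero or more shapes,
    # into the destination.
    for record in source:              # data set, report definition, ...
        for shape in shapes:           # Filter, Compose, Convert, Merge, ...
            record = shape(record)
            if record is None:         # e.g. a Filter dropped the record
                break
        else:
            destination(record)        # data set, activity, case, ...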


Before looking at the shapes, we will create a data set to use as the source.

I am creating a Database data set.


Now we will discuss the shapes:

1. Filter:

By using Filter, we can add conditions; only the records that satisfy a condition are passed to the destination.

Here I am adding the condition Type of customer = "existing".

I have four records with Type of customer = "existing". Conceptually, the shape behaves like the sketch below.
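A minimal Python sketch of the Filter behaviour; the property name TypeOfCustomer and the sample records are illustrative only, not the actual rule configuration.

records = [
    {"Name": "A", "TypeOfCustomer": "existing"},
    {"Name": "B", "TypeOfCustomer": "new"},
]

def filter_shape(record):
    # Pass the record through only when the condition is satisfied.
    return record if record["TypeOfCustomer"] == "existing" else None

passed = [r for r in records if filter_shape(r) is not None]
print(passed)   # only the "existing" customers reach the destination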

Testing:
Run the data flow and check the results: click Actions -> Run.

In the destination we get only the four matching records.

2. Compose:

If we want to combine data from two sources into a page or page list property, we use the Compose shape.

The two sources must share a common property to match on.

The data from the secondary source is appended to the incoming data record as an embedded data page.

Compose needs two sources, so I have created another data set; note that this data set is in a different class.

Create the matching property in that other class as well. Conceptually, the shape behaves like the sketch below.
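A minimal Python sketch of the Compose behaviour; the property names CustomerID and CustomerInfo are illustrative only.

primary = [{"CustomerID": 1, "Name": "A"}]
secondary = [{"CustomerID": 1, "City": "Hyderabad"}]

def compose_shape(record):
    # Look up matching records in the secondary source by the common
    # property, and append them as an embedded page list.
    matches = [s for s in secondary if s["CustomerID"] == record["CustomerID"]]
    record["CustomerInfo"] = matches
    return record

print([compose_shape(r) for r in primary])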
Testing:

All the records in the customer data set are written to the destination.

The full set of source records passes into the destination, each with the composed data embedded.

3. Convert:

Use Convert to change the class of the incoming data pages when you need to make the data available elsewhere.

For example, you want to store data in a data set that is in a different class than your data flow and
contains different names of properties than the source data set. You might also want to propagate
only a part of the incoming data to a branched destination, like strategy results (without customer
data) to the Interaction History data set.

Here I want to convert my source data to another class, i.e., the customer info class. Conceptually, the shape behaves like the sketch below.
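A minimal Python sketch of the Convert behaviour; the class names and property mappings are illustrative only.

def convert_shape(record):
    # Re-map the record from the source class to the target class,
    # renaming properties along the way.
    return {
        "pxObjClass": "Org-App-Data-CustomerInfo",   # target class
        "FullName": record["Name"],                  # renamed property
        "CustType": record["TypeOfCustomer"],
    }

src = {"pxObjClass": "Org-App-Data-Customer",
       "Name": "A", "TypeOfCustomer": "existing"}
print(convert_shape(src))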
Testing:

The customer class data is converted into the customer info class.

4. Merge:

If we want to merge data from two different sources, we use Merge.

Here the sources must be of the same class, whereas in Compose the sources can be of different classes.

Whenever the merge criteria match, the combined record is passed to the destination.

Here I want to merge data from a second source, so I am creating a Monte Carlo data set.

If the condition is not satisfied, we can exclude that source component's results by selecting the corresponding checkbox. Conceptually, the shape behaves like the sketch below.
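A minimal Python sketch of the Merge behaviour; the property names and sample records are illustrative only.

source_a = [{"CustomerID": 1, "Name": "A"}]
source_b = [{"CustomerID": 1, "Email": "a@example.com"},
            {"CustomerID": 2, "Email": "b@example.com"}]

merged = []
for a in source_a:
    for b in source_b:
        if a["CustomerID"] == b["CustomerID"]:   # merge condition
            merged.append({**a, **b})            # matched records combine
# Non-matching records can be excluded or passed through, depending on
# the exclude checkbox on the shape.
print(merged)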
Testing:

Here every record matches the condition, so six records arrive at the destination.

5. Data transform:

If we want to manipulate the data with a data transform, we can include the Data Transform shape, as in the sketch below.
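A minimal Python sketch of what a data transform step might do to each record; the property names are illustrative only.

def data_transform_shape(record):
    # Set or derive properties on each record as it flows past.
    record["FullName"] = record.get("FirstName", "") + " " + record.get("LastName", "")
    record["IsExisting"] = record.get("TypeOfCustomer") == "existing"
    return record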
6. Strategy:

If we want to invoke a strategy, the strategy needs input data. We supply that input through the data flow and include the Strategy shape, so that the strategy results are processed and passed to the destination.

Make decision - The strategy that is executed by the data flow is designed only to issue a decision.

Make decision and store data for later response capture - The strategy that is executed by the data flow is designed to issue a decision, and you want to store the decision results for a specified period of time. You can use this data for delayed adaptive model learning and for capturing a response at a later time.

Capture response for previous decision by interaction ID - The strategy that is executed by the data flow is designed to retrieve the adaptive inputs and strategy results for the interaction ID.

Capture response for previous decision in the past period - The strategy that is executed by the data flow is designed to retrieve the adaptive inputs and strategy results from a particular period of time.

Testing:

7. Text Analyzer:

Reference Text Analyzer rules to apply text analysis in your data flow. Build data flows that can analyze text data to derive business information from it. For example, you can analyze text-based content such as emails and chat messages. A rough sketch of the idea follows.
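A minimal Python sketch of the kind of structured output a text-analysis step produces; the keyword check here is purely illustrative and far simpler than a real Text Analyzer rule.

def text_analyzer_shape(record):
    # Derive business information from free text on the record.
    text = record.get("Message", "").lower()
    record["Sentiment"] = "negative" if "complaint" in text else "positive"
    return record

print(text_analyzer_shape({"Message": "I have a complaint about my bill"}))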
If we want to send an email, we can use the Send Email shape; similarly, there is a Send SMS shape.

Types of Data Flows:

Batch data flows

Go over a finite set of data and eventually complete processing. Batch data flows are mainly used for
processing large volumes of data.

Real-time data flows

Go over an infinite set of data. Real-time data flows are always active and continue to process
incoming stream data and requests.

Single case data flows

Executed on request, with the data flow source set to Abstract. Single case data flows are mostly used to process inbound data. The sketch below contrasts the batch and real-time processing loops.
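A minimal Python sketch contrasting the batch and real-time loops; the queue-based stream is an illustrative stand-in for a real stream source such as Kafka.

import queue

def process(record):
    print("processed", record)

def run_batch(source):
    # Batch: a finite set of data; the run eventually completes.
    for record in source:
        process(record)

def run_realtime(stream):
    # Real-time: an infinite set of data; the flow stays active and
    # keeps consuming records as they arrive.
    while True:
        record = stream.get()   # blocks until the next record arrives
        process(record)

run_batch([{"id": 1}, {"id": 2}])      # terminates
# run_realtime(queue.Queue())          # would run forever, like a stream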
