Mule 2
======
subflow
---------
a sub flow does not have an event source or its own error handling
a sub flow can be called by the main flow (flow reference)
if you drag and drop a Try block into the sub flow, you can have error handling inside the sub flow
payload
-----------
using Set Payload -- data can be passed from the main flow to the sub flow and read with #[payload]
using Set Variable -- a variable can be passed from the main flow to the sub flow and read with
#[vars.varName]
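a minimal sketch of passing data to a sub flow (flow and variable names here are made up):

<flow name="mainFlow">
    <!-- payload travels into the sub flow -->
    <set-payload value="#[{'emp_id': 100}]"/>
    <!-- variables travel into the sub flow as well -->
    <set-variable variableName="varName" value="demo"/>
    <flow-ref name="employeeSubFlow"/>
</flow>
<sub-flow name="employeeSubFlow">
    <logger level="INFO" message="#[payload.emp_id]"/>
    <logger level="INFO" message="#[vars.varName]"/>
</sub-flow>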
For asynchronous
--------------
drag and drop the Async scope and put the flow reference to the sub flow inside it
the Async scope runs its processors on a separate thread, so the calling flow does not wait
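a minimal sketch of the Async scope (flow names made up):

<flow name="asyncCaller">
    <async>
        <!-- runs on its own thread -->
        <flow-ref name="employeeSubFlow"/>
    </async>
    <logger level="INFO" message="continues without waiting for the async block"/>
</flow>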
private flow
-------------
it does not have a source, but it has error handling.
dragging and dropping any connector onto an empty canvas creates a flow automatically (without a source it is a private flow)
database
------------
database insert --
e.g. input parameters for the insert, mapped from the payload:
{
    "emp_id": payload.emp_id
}
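a minimal sketch of the Insert operation (table and config names are made up):

<db:insert config-ref="Database_Config">
    <db:sql><![CDATA[INSERT INTO emp (emp_id) VALUES (:emp_id)]]></db:sql>
    <!-- named input parameters taken from the payload -->
    <db:input-parameters><![CDATA[#[{ emp_id: payload.emp_id }]]]></db:input-parameters>
</db:insert>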
select
------------
database select -- the output is a result set (an array of rows)
in a Transform Message after the select:
prepare a sample JSON and create a metadata type from it (upload the JSON, format application/json)
then create a mapping between the payload (which comes from the previous select) and the
metadata, and output it as the payload
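a minimal sketch of a Select followed by a Transform Message (table, columns, and config name are made up):

<db:select config-ref="Database_Config">
    <db:sql><![CDATA[SELECT emp_id, emp_name FROM emp]]></db:sql>
</db:select>
<ee:transform>
    <ee:message>
        <ee:set-payload><![CDATA[%dw 2.0
output application/json
---
payload map (row) -> {
    empId: row.emp_id,
    empName: row.emp_name
}]]></ee:set-payload>
    </ee:message>
</ee:transform>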
response
------------
transform message
secure properties
====================
add the Secure Configuration Properties module to Studio
encrypt a password:
set up a key and an algorithm
add the Anypoint Enterprise Security plugin to Studio,
or use java -jar secure-properties-tools.jar
global elements > Secure Configuration Properties -- there you can refer to the yaml file
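a minimal sketch of the global element (file name and key property are made up; encrypted values are then read as ${secure::property.name}):

<secure-properties:config name="Secure_Properties"
        file="properties/app.${env}.yaml" key="${secure.key}">
    <!-- algorithm and mode must match what was used to encrypt -->
    <secure-properties:encrypt algorithm="Blowfish" mode="CBC"/>
</secure-properties:config>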
error handling
-----------
on error propagate
on error continue
raise error
error handler
validation
----
drag and drop a Validation component from the mule palette
error.description holds the validation failure message
in the error response you can set the payload to the custom message we created
variables can also be injected, like a status code or headers
on error propagate ==> it catches the error type defined (a specific error) and throws the
error back to the
main source (with a custom payload or message)
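a rough sketch of validation plus On Error Propagate (path, field, and variable names are made up; vars.httpStatus would be wired into the listener's error response status code):

<flow name="employeeValidationFlow">
    <http:listener config-ref="HTTP_Listener_config" path="/employees"/>
    <validation:is-not-null value="#[payload.emp_id]" message="emp_id is required"/>
    <error-handler>
        <!-- no type attribute: catches any error; a specific type could be set -->
        <on-error-propagate>
            <set-variable variableName="httpStatus" value="400"/>
            <set-payload value="#[{'error': error.description}]"/>
        </on-error-propagate>
    </error-handler>
</flow>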
Jenkins
-----------------
install the Git plugin
and the GitHub integration plugin
create only one step in Jenkins for the Git checkout / build trigger, and in the
Script Path mention the Jenkinsfile in the pipeline section
("Pipeline script from SCM")
after deployment, you can see the application deployed in MuleSoft CloudHub > Runtime Manager
salesforce
--------
put a Transform Message before the connector, so that it resolves the expected output metadata
automatically
Create operation
Query operation for the Contact object
token endpoint: salesforce.com/services/oauth2/token?grant_type=&client_id=&client_secret=
(POST, with Content-Type application/x-www-form-urlencoded)
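a minimal sketch of the Query operation (config name and fields are made up):

<salesforce:query config-ref="Salesforce_Config">
    <!-- SOQL query against the Contact object -->
    <salesforce:salesforce-query><![CDATA[SELECT Id, FirstName, LastName FROM Contact]]></salesforce:salesforce-query>
</salesforce:query>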
authentication
------------------
generate a separate private/public key pair for the Mule server, then add the
public key to the account's authorized_keys file (~/.ssh/authorized_keys) on the SSH
server, and put the private key on the Mule server (create a keystore and refer to it as the
identity file).
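a minimal sketch of an SFTP connection using an identity file (host, user, and paths are made up):

<sftp:config name="SFTP_Config">
    <!-- the identity file is the private key generated for the Mule server -->
    <sftp:connection host="sftp.example.com" port="22" username="muleuser"
            identityFile="keys/id_rsa" passphrase="${sftp.passphrase}"/>
</sftp:config>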
design API
----------
OpenAPI
Apiary
RAML - the interface for the API implementation, mocked endpoints
e.g.
/flights:
  get:
    queryParameters:
      destination:
        required: false
        enum:
          - SFO
          - LAX
          - CLE
  /{ID}:
    get:
    put:
Mule flow
=========
Mule event source ---> Mule event processors ---> connector endpoint
proxy
========
create a proxy API using API Manager (select the API from Exchange)
the API gateway enforces policies on the proxy (rate limiting, spike control, security)
mule structure
-------------
Mule event = Mule message (attributes, payload) + variables
asyncs
---------------------------
when using a flow reference, events are passed synchronously between flows
select VM in the mule palette under Modules, select Publish consume, and drag it to the canvas
(add a queue (transient))
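a minimal sketch of a VM queue plus the Publish consume operation (queue and config names are made up):

<vm:config name="VM_Config">
    <vm:queues>
        <!-- transient queues live in memory only -->
        <vm:queue queueName="employeeQueue" queueType="TRANSIENT"/>
    </vm:queues>
</vm:config>

<vm:publish-consume config-ref="VM_Config" queueName="employeeQueue"/>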
domain project ==> can be used to share global configuration elements between
applications:
expose multiple services on the same port, call flows in other applications, etc.
global
---------
encapsulate all global elements in a separate configuration file (global.xml)
maven
------
groupid
artifact id
version
dependency
plugin
structure
-------
src/main/mule
src/main/java
src/main/resources
src/test/munit
e.g.
/add-employee:
  post:
    body:
      application/json:
        example:
          {
          }
    responses:
      200:
        body:
          application/json:
            example:
              {
              }
      500:
  get:
    queryParameters:
      empId:
        required: true
        type: integer
        example: 100
you can build a mock service and give it to the product owner and the business before development
best practices
==============
global configuration file for all configurations (global elements)
global error handler
then the flow for each resource is defined in a separate configuration file under an
implementations folder
then a flow reference to each of them is used in the main flow
under the resources folder > create a yaml or properties file for each environment's
configuration properties
under resources (yaml, wsdl, logger config, keystore, api specification) [log with Splunk
or Elasticsearch - centralized logging]
as you add any connector from Modules, a module reference is added to the
project, and a dependency is added to pom.xml
src/main/mule
src/main/java
src/main/resources
src/test/munit
pom.xml
API management
-------------
steps:
1) publish the API specification to Exchange from Design Center of Anypoint Platform
2) import it to API Manager: API Administration > get it from Exchange. you can
choose a basic endpoint or an endpoint with proxy; also specify the implementation URL
then take the API ID, go to Global Configuration > Autodiscovery, and specify it there
along with the flow name
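a minimal sketch of the autodiscovery global element (the API ID property and flow name are made up):

<api-gateway:autodiscovery apiId="${api.id}" flowRef="mainFlow"/>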
deploy the app again to Runtime Manager
the API ID (autodiscovery) is the link between the instance running in API Manager
with policies and the application running in Runtime Manager.
go to Access Management > add a business group: take the client ID and client secret ...
copy them to the API Manager window of Anypoint Studio
and deploy the app to CloudHub from Studio
apply policies
(client ID enforcement (create a client application in Exchange, get the client ID and
client secret, and pass them in the headers while consuming the API), OAuth 2, basic auth,
header injection, rate limiting, etc.)
API fragments
---------
traits
security schemes
data types
examples
trait - reusable code, like x-transaction-id or whatever the client needs to pass
in the headers
security scheme - like OAuth
move all of these into separate files and inject them into the RAML API spec for
transaction headers, security scheme, base URI parameters, query parameters, data
type, payload, etc.
!include <.json>
traits:
  responsemessage
then, using the "is" keyword, you can refer to the trait
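a minimal RAML sketch of defining and applying a trait (file and trait names are made up; the x-transaction-id header comes from the note above):

traits:
  clientHeaders: !include traits/clientHeaders.raml

/employees:
  get:
    is: [clientHeaders]

where traits/clientHeaders.raml could contain:

headers:
  x-transaction-id:
    required: true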
OAUTH
======
when we register our application with the OAuth provider, they give a client ID, a client
secret, and the grant type (client credentials)
when we consume the API, we request a token by passing the client ID, client
secret, and grant type; this generates an access token
after obtaining the access token, pass it in the Authorization header as a Bearer token
in API Administration, while applying the API policy, select OAuth 2.0 and then give the
token-validation URL for the access token
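a rough sketch of requesting a token from inside a flow (the provider URL, config, and property names are made up):

<http:request config-ref="HTTP_Request_config" method="POST" url="https://oauth.example.com/token">
    <!-- form-encoded body with client credentials -->
    <http:body><![CDATA[#['grant_type=client_credentials&client_id=' ++ p('client.id')
        ++ '&client_secret=' ++ p('client.secret')]]]></http:body>
    <http:headers><![CDATA[#[{'Content-Type': 'application/x-www-form-urlencoded'}]]]></http:headers>
</http:request>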
proxy
--------
in API Administration, select either a basic endpoint or an endpoint with proxy
scatter gather
------
scatter - invokes the different routes (requests) in parallel
gather - gathers the different responses into one combined payload
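a minimal sketch (the request config and the two URLs are made up; each route's response ends up in the combined result):

<scatter-gather>
    <route>
        <http:request config-ref="HTTP_Request_config" method="GET" url="https://service-a.example.com/data"/>
    </route>
    <route>
        <http:request config-ref="HTTP_Request_config" method="GET" url="https://service-b.example.com/data"/>
    </route>
</scatter-gather>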
batch job
----------
a batch job has one or more batch steps (and an on-complete phase)
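a minimal sketch of a batch job (job and step names are made up):

<batch:job jobName="employeeBatch">
    <batch:process-records>
        <batch:step name="processStep">
            <!-- each record passes through here -->
            <logger level="INFO" message="#[payload]"/>
        </batch:step>
    </batch:process-records>
    <batch:on-complete>
        <!-- runs once, with the job result summary -->
        <logger level="INFO" message="#[payload]"/>
    </batch:on-complete>
</batch:job>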
file
-------
The File, FTP, and SFTP connectors provide a listener (called On New or Updated
file in Studio and Design Center) that polls a directory for files that have been
created or updated.
Set the autoDelete parameter to true. This setting deletes each file after it has
been processed so that all files found in the next poll are new.
Use the watermark parameter to only pick files that have been created or updated
after the last poll was executed.
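a minimal sketch of an SFTP listener using autoDelete and watermark (flow name, directory, and frequency are made up):

<flow name="newFileFlow">
    <sftp:listener config-ref="SFTP_Config" directory="inbox"
            autoDelete="true" watermarkEnabled="true">
        <scheduling-strategy>
            <fixed-frequency frequency="10" timeUnit="SECONDS"/>
        </scheduling-strategy>
    </sftp:listener>
    <logger level="INFO" message="#[attributes.fileName]"/>
</flow>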
Instead of a list of elements that you receive with a fixed-size batch commit, the
streaming functionality – an iterator – ensures that you receive all the records in
a batch job without running out of memory. By combining streaming batch commit and
DataMapper streaming in a flow, you can transform large datasets in one single
operation and one single write to disk.
security
---------
<secure-property-placeholder:config key="${prod.key}" location="test.${env}.properties"/>
When you add data to the properties file, Mule gives you the option to encrypt the
data. You then choose an encryption algorithm (of the 19 available), and enter an
encryption key. That encryption key is the only way to decrypt the properties in
the properties file. In other words, the encryption key is the only thing that
unlocks the Credentials Vault.
Next, create a global Secure Property Placeholder element which locates the
Credentials Vault (that is, the properties file), and retrieve encrypted
properties. However, the Secure Property Placeholder can only access the
Credentials Vault (that is, to decrypt the data) if it has the key.
Therefore, you need to configure the Secure Property Placeholder to use the key
that Mule collects from the user at runtime (see code below). In this context, the
key to decrypt the properties becomes a runtime password.
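for example (a sketch; standalone Mule accepts system properties with -M-D, and the key name matches the placeholder above):

mule -M-Dprod.key=myEncryptionKey -M-Denv=test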
deployment
-------
Encrypted credentials are available in all platform deployments: CloudHub, Runtime
Fabric, and Runtime Manager. To use encrypted credentials when deploying, you need
to set up your Maven master encrypted password and your settings-security.xml file.
Encrypt your master password (mvn --encrypt-master-password); the output looks like:
{l9vZ2uM5SdgHy+H12z4pX7LEOZn3Kbnqmt3kIquLjnQ=}
Create a settings-security.xml file in your ~/.m2 repository, and add your
encrypted master password.
<settingsSecurity>
<master>{l9vZ2uM5SdgHy+H12z4pX7LEOZn3Kbnqmt3kIquLjnQ=}</master>
</settingsSecurity>
Encrypt your Anypoint Platform password (mvn --encrypt-password); the output looks like:
{HTWFGH5BG9QmvJ1B=}
Add your encrypted Anypoint Platform password to your settings.xml file in the
<server> element:
<settings>
  ...
  <servers>
    ...
    <server>
      <id>my.anypoint.credentials</id>
      <username>my.anypoint.username</username>
      <password>{HTWFGH5BG9QmvJ1B=}</password>
    </server>
    ...
  </servers>
  ...
</settings>
In your configuration deployment, reference the credentials injecting the server id
configured in your settings.xml file. Below is an example of a CloudHub Deployment
using encrypted credentials:
<plugin>
  ...
  <configuration>
    <cloudHubDeployment>
      <uri>https://anypoint.mulesoft.com</uri>
      <muleVersion>${mule.version}</muleVersion>
      <server>my.anypoint.credentials</server>
      <applicationName>${cloudhub.application.name}</applicationName>
      <environment>${environment}</environment>
      <properties>
        <key>value</key>
      </properties>
    </cloudHubDeployment>
  </configuration>
</plugin>
throttling
--------
number of requests per minute
configure that in an SLA tier in API Administration
then add Rate Limiting in API policies
correlation
-----------
MULE_CORRELATION_ID: a single ID for the entire group (all output fragments of the same
original message share the same value).
unit test
--
org.mule.tck.AbstractMuleTestCase
questions
===========
mule FTP - authentication
how do you connect to Salesforce cloud
backout queue and dead letter queue
private flow and subflow
ssh-keygen generates both public and private keys -- the private key is kept in a keystore
under resources, which is referenced as the identity file in the
connector config (the public key goes to the SSH server's authorized_keys)
PGP for encryption and decryption
Global error handler
------------
error handling > reference exception strategy
once we have imported the jar as a Maven dependency, we need to specify the Mule config file
name present in the JAR in our Mule project:
go to Global Elements > Create > expand Global Configuration > Import
in the error handling part, "On Error Continue" checks whether the retry count
has reached its max. inside the error flow of "On Error Continue", the retry count
is incremented and, after a few seconds of sleep, a flow reference
calls HTTPFlow again
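a rough sketch of that retry pattern (the flow name HTTPFlow comes from the note; the config, URL, and max-retry count are made up, and the sleep step is omitted; vars survive the flow-ref within the same event):

<flow name="HTTPFlow">
    <http:request config-ref="HTTP_Request_config" method="GET" url="https://api.example.com/data"/>
    <error-handler>
        <on-error-continue>
            <choice>
                <!-- retry up to 3 times -->
                <when expression="#[(vars.retryCount default 0) &lt; 3]">
                    <set-variable variableName="retryCount"
                            value="#[(vars.retryCount default 0) + 1]"/>
                    <flow-ref name="HTTPFlow"/>
                </when>
            </choice>
        </on-error-continue>
    </error-handler>
</flow>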
Drag and Drop the JSON Validate Schema from Mule Palette to validate the input
payload.
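a minimal sketch of the operation (the schema path is made up):

<json:validate-schema schema="schemas/employee.json"/>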
resourceTypes:
  collection:   # use an "item" resourceType instead for a single item in the collection
    usage: Use this resourceType to represent any collection of items
    description: A collection of <<resourcePathName>>
    get:
      description: Get all <<resourcePathName>>, optionally filtered
      responses:
        200:
          body:
            application/json:
              type: <<typeName>>[]
/foos:
  type: { collection: { "typeName": "Foo" } }
securitySchemes:
  customTokenSecurity: !include securitySchemes/security.raml
apply it to APIs with:
securedBy: [customTokenSecurity]
database polling
----------------
create a Poll scope
drop a Database Select inside the Poll scope
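a rough Mule 3-style sketch of polling a database (frequency, query, and config name are made up):

<poll>
    <fixed-frequency-scheduler frequency="10000"/>
    <db:select config-ref="Database_Config">
        <db:parameterized-query><![CDATA[SELECT * FROM emp]]></db:parameterized-query>
    </db:select>
</poll>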
correlation/aggregation pattern
-----------
flatten / reduce
JMS
flow processing strategy
https://www.youtube.com/watch?v=i6hX8WeVtD4
traits vs resource types