Informatica
Introduction
Course Objectives
By the end of this course you will:
Understand how to use the major PowerCenter components for development
Be able to build basic ETL mappings and mapplets*
Be able to create, run and monitor workflows
Understand available options for loading target data
Be able to troubleshoot most problems
Note: The course does not cover PowerCenter optional features or XML support.
* A mapplet is a subset of a mapping
3
About Informatica
Founded in 1993
Leader in enterprise solution products
Headquarters in Redwood City, CA
Public company since April 1999 (INFA)
Worldwide distributorship
Informatica Products
PowerCenter: ETL batch and real-time data integration
PowerAnalyzer: BI reporting via a web-browser interface with reports, dashboards, indicators and alerts; handles real-time metrics
SuperGlue*: centralized, cross-enterprise metadata browsing, including PowerCenter, PowerAnalyzer, DBMS, BI tools, and data modeling tools
PowerExchange: data access to mainframe, mid-size system and complex files
PowerCenter Connect products: data access to transactional applications and real-time services
Informatica Resources
www.informatica.com provides information (under Services) on:
Professional Services
Education Services
my.informatica.com (sign up) provides access to:
Technical Support
Product documentation (under Tools, online documentation)
Velocity Methodology (under Services)
Knowledgebase
Webzine
Mapping templates
devnet.informatica.com (sign up) is the Informatica Developers Network:
Discussion forums
Web seminars
Technical papers
6
Decision Support
Data Warehouse
Transaction systems hold transaction-level data, optimized for transaction response time; data is current, normalized or de-normalized
The ETL process (Extract, Transform, Load) aggregates data, cleanses data, consolidates data, applies business rules, and de-normalizes data
8
PowerCenter 7 Architecture
Client tools: Repository Manager, Designer, Workflow Manager, Workflow Monitor, Repository Server Administration Console
The client tools communicate over TCP/IP with the Informatica Server and the Repository Server
The Informatica Server uses native connections to read from Sources and write to Targets
The Repository Server uses a native connection to the Repository
Not Shown: Client ODBC connections from Designer to sources and targets for metadata
10
Platforms:
Client tools run on Windows
Informatica Servers and Repository Servers run on AIX, HP-UX, Solaris, Red Hat Linux, Windows
Repository databases, sources and targets can be on any major RDBMS
11
12
Demonstration
13
15
Source definitions can be imported from: relational database, flat file, XML object; or created manually
The definition (DEF) travels via TCP/IP to the Repository Server and then, through the Repository Agent's native connection, into the Repository
16
17
18
Flat File
A flat file definition (DEF) is imported into the Repository the same way: via TCP/IP to the Repository Server, then through the Repository Agent's native connection
19
21
22
Data Previewer
Preview data in:
Relational database sources
Flat file sources
Relational database targets
Flat file targets
23
24
25
Metadata Extensions
Allow developers and partners to extend the metadata stored in the Repository
Metadata extensions can be:
User-defined: PowerCenter users can define and create their own metadata
Vendor-defined: metadata created by third-party application vendors
For example, applications such as Ariba or PowerCenter Connect for Siebel can add information such as contacts, version, etc.
26
Metadata Extensions
Can be reusable or non-reusable
Can promote non-reusable metadata extensions to reusable; this is irreversible (except by Administrator)
Reusable metadata extensions are associated with all repository objects of that object type
A non-reusable metadata extension is associated with a single repository object
Administrator or Super User privileges are required for managing reusable metadata extensions
27
28
30
31
View Synonym
32
33
34
35
36
Mappings
Mappings
By the end of this section you will be familiar with:
38
Mapping Designer
Iconized Mapping
39
Midstream XML Parser: reads XML from database table or message queue
Midstream XML Generator: writes XML to database table or message queue
More Source Qualifiers: read from XML, message queues and applications
41
Transformation Views
A transformation has three views:
Iconized: shows the transformation in relation to the rest of the mapping
Normal: shows the flow of data through the transformation
Edit: shows transformation ports (= table columns) and properties; allows editing
42
Usage
Modify the SQL statement
User Defined Join
Source Filter
Sorted ports
Select DISTINCT
Pre/Post SQL
43
44
To use a semicolon outside of quotes or comments, escape it with a backslash (\)
45
Expression Transformation
Perform calculations using non-aggregate functions (row level)
Ports: mixed; variables allowed; create expressions in an output or variable port
Usage: perform the majority of data manipulation
46
Expression Editor
An expression formula is a calculation or conditional statement for a specific port in a transformation
Performs calculation based on ports, functions, operators, variables, constants and return values from other transformations
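For illustration only (SALES is a hypothetical input port), an output-port expression might look like:
IIF( SALES > 100, SALES * 0.9, SALES )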
47
Expression Validation
The Validate or OK button in the Expression Editor will:
Parse the current expression
Remote port searching (resolves references to ports in other transformations)
Parse default values
Check spelling, correct number of arguments in functions, and other syntactical errors
48
Character Functions
Used to manipulate character data
CHRCODE returns the numeric value (ASCII or Unicode) of the first character of the string passed to this function
CONCAT is for backward compatibility only; use || instead
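A small hedged sketch (FIRST_NAME and LAST_NAME are hypothetical string ports):
CHRCODE( FIRST_NAME )            -- numeric value of the first character
FIRST_NAME || ' ' || LAST_NAME   -- preferred over nested CONCAT calls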
49
Conversion Functions
Used to convert datatypes
50
METAPHONE and SOUNDEX create indexes based on English pronunciation (2 different standards)
51
Date Functions
Used to round, truncate, or compare dates; extract one part of a date; or perform arithmetic on a date
To pass a string to a date function, first use the TO_DATE function to convert it to a date/time datatype
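An illustrative sketch (ORDER_DATE_STR, ORDER_DATE and SHIP_DATE are hypothetical ports):
TO_DATE( ORDER_DATE_STR, 'MM/DD/YYYY' )    -- convert a string to date/time first
ADD_TO_DATE( ORDER_DATE, 'MM', 3 )         -- add three months
DATE_DIFF( SHIP_DATE, ORDER_DATE, 'DD' )   -- difference in days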
52
Scientific Functions
Used to calculate geometric values of numeric data
53
Test Functions
IIF(Condition, True, False)
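A hedged example combining a test function with IIF (QTY_STR is a hypothetical string port):
IIF( IS_NUMBER( QTY_STR ), TO_DECIMAL( QTY_STR ), 0 )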
54
Variable Ports
Use in another variable port or an output port expression
Local to the transformation (a variable port cannot also be an input or output port)
55
Use for temporary storage
Variable ports can remember values across rows; useful for comparing values
Variables are initialized (numeric to 0, string to '') when the Mapping logic is processed
Variable ports are not visible in Normal view, only in Edit view
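A minimal sketch of the classic previous-row comparison (port names are hypothetical; within a row, variable ports evaluate top to bottom, so v_NEW_GROUP reads v_PREV_CUST before it is reassigned):
v_NEW_GROUP (variable port): IIF( CUST_ID = v_PREV_CUST, 0, 1 )
v_PREV_CUST (variable port): CUST_ID
o_NEW_GROUP (output port):   v_NEW_GROUP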
56
Selected port
57
Informatica Datatypes
NATIVE DATATYPES: specific to the source and target database types; displayed in source and target tables within Mapping Designer
TRANSFORMATION DATATYPES: allow mix and match of source and target database types
When connecting ports, native and transformation datatypes must be compatible (or must be explicitly converted)
58
For further information, see the PowerCenter Client Help > Index > port-to-port data conversion
59
Mapping Validation
60
Connection Validation
Examples of invalid connections in a Mapping:
Connecting ports with incompatible datatypes
Connecting output ports to a Source
Connecting a Source to anything but a Source Qualifier (or Normalizer) transformation
61
Mapping Validation
Mappings must:
Be valid for a Session to run
Be end-to-end complete and contain valid expressions
Pass all data flow rules
Mappings are always validated when saved; they can also be validated without being saved
62
63
Workflows
Workflows
By the end of this section, you will be familiar with:
The Workflow Manager GUI interface
Creating and configuring Workflows
Workflow properties
Workflow components
Workflow tasks
65
Workspace
Status Bar
Output Window
66
Task Developer
Create Session, Shell Command and Email tasks
Tasks created in the Task Developer are reusable
Worklet Designer
Creates objects that represent a set of tasks
Worklet objects are reusable
67
Workflow Structure
A Workflow is a set of instructions for the Informatica Server to perform data transformation and load
Combines the logic of Session Tasks, other types of Tasks and Worklets
The simplest Workflow is composed of a Start Task, a Link and one other Task
Link
Start Task
Session Task
68
Session Task
Server instructions to run the logic of ONE specific mapping
e.g. source and target data location specifications, memory allocation, optional Mapping overrides, scheduling, processing and load instructions
Becomes a component of a Workflow (or Worklet)
If configured in the Task Developer, the Session Task is reusable (optional)
69
Command, Email, Decision, Assignment, Timer, Control, Event Wait, Event Raise
70
Sample Workflow
Session 1
Command Task
Session 2
71
Concurrent
Combined
Note: Although only Session tasks are shown, these can be any type of task
72
Creating a Workflow
Select a Server
73
Workflow Properties
Customize Workflow Properties
Workflow log displays
74
Workflow Scheduler
75
76
Workflow Links
Required to connect Workflow Tasks
Can be used to create branches in a Workflow
All links are executed unless a link condition is used which makes a link false
Link 1 Link 3
Link 2
77
Conditional Links
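As a hedged sketch, a link condition typically tests a pre-defined task variable (s_LoadCustomers is a hypothetical Session name); the link is followed only if the condition evaluates to true:
$s_LoadCustomers.Status = SUCCEEDED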
78
Workflow Variables 1
Used in Decision tasks and conditional links (edit the task or link):
Pre-defined variables
User-defined variables (see separate slide)
Task-specific variables
Workflow Variables 2
User-defined variables are set in the Workflow properties, Variables tab; they can persist across sessions
80
Workflow Summary
1. Add Sessions and other Tasks to the Workflow
2.
3.
4.
81
Session Tasks
Session Tasks
After this section, you will be familiar with:
83
Session Tasks can be created in the Task Developer (reusable) or the Workflow Designer (Workflow-specific)
To create a Session Task
Select the Session button from the Task Toolbar
Or Select menu Tasks | Create and select Session from the drop-down menu
84
85
86
Set properties
87
Monitoring Workflows
Monitoring Workflows
By the end of this section you will be familiar with:
The Workflow Monitor GUI interface
Monitoring views
Server monitoring modes
89
Workflow Monitor
The Workflow Monitor is the tool for monitoring Workflows and Tasks
Choose between two views: Gantt chart and Task view
90
Displays real-time information from the Informatica Server and the Repository Server about current workflow runs
91
Monitoring Operations
Perform operations in the Workflow Monitor
Stop, Abort, or Restart a Task, Workflow or Worklet
Resume a suspended Workflow after a failed Task is corrected
Reschedule or Unschedule a Workflow
Abort has a timeout period; if the Server has not finished processing and committing data during the timeout period, the threads and processes associated with the Session are killed
92
Status Bar
93
Monitoring filters can be set using drop down menus. Minimizes items displayed in Task View
Right-click on Session to retrieve the Session Log (from the Server to the local PC Client)
94
Filter Toolbar
Select type of tasks to filter
Select servers to filter
Filter tasks by specified criteria
Display recent runs
95
The Repository Manager's Truncate Log option clears the Workflow Monitor logs
96
97
98
Debugger
Debugger
By the end of this section you will be familiar with:
100
Debugger Features
Wizard driven tool that runs a test session
Initialize variables
Manually change variable values
Data can be loaded or discarded
The debug environment can be saved for later use
101
Debugger Interface
Edit Breakpoints
Debugger Mode indicator
A solid yellow arrow is the current transformation indicator
102
Set Breakpoints
1. Edit breakpoint
2. Choose global or specific transformation
3. Choose to break on data condition or error; optionally skip rows
4. Add breakpoint(s)
5. Add data conditions
Debugger Tips
The Server must be running before starting a Debug Session
When the Debugger is started, a spinning icon displays; spinning stops when the Debugger Server is ready
The flashing yellow/green arrow points to the current active Source Qualifier; the solid yellow arrow points to the current Transformation instance
Next Instance proceeds a single step at a time; one row moves from transformation to transformation
Step to Instance examines one transformation at a time, following successive rows through the same transformation
104
105
Filter Transformation
Filter Transformation
Drops rows conditionally
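A sample filter condition (port names hypothetical); rows for which the condition evaluates to FALSE are dropped:
ORDER_AMOUNT > 0 AND NOT ISNULL( CUSTOMER_ID )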
107
108
Sorter Transformation
Sorter Transformation
Can sort data from relational tables or flat files
110
Sorter Transformation
Sorts data from any source, at any point in a data flow
Sort Keys
Ports: input/output
Define one or more sort keys
Define the sort order for each key
Example of Usage Sort data before Aggregator to improve performance
Sort Order
111
Sorter Properties
112
Aggregator Transformation
Aggregator Transformation
By the end of this section you will be familiar with:
Aggregator properties
Using sorted data
114
Aggregator Transformation
Performs aggregate calculations
Ports: mixed; I/O ports allowed; variable ports allowed; Group By allowed
Create expressions in variable and output ports
Usage: standard aggregations
115
Aggregate Expressions
Aggregate functions are supported only in the Aggregator Transformation
Conditional Aggregate expressions are supported: Conditional SUM format: SUM(value, condition)
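For example (ORDER_AMOUNT and STATE are hypothetical ports), an Aggregator output-port expression could be:
SUM( ORDER_AMOUNT, STATE = 'MA' )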
116
Aggregator Functions
AVG COUNT FIRST LAST MAX MEDIAN MIN PERCENTILE STDDEV SUM VARIANCE
117
Aggregator Properties
Sorted Input Property
Instructs the Aggregator to expect the data to be sorted
Set Aggregator cache sizes for the Informatica Server machine
118
Sorted Data
The Aggregator can handle sorted or unsorted data
Sorted data can be aggregated more efficiently, decreasing total processing time
The Server will cache data from each group and release the cached data upon reaching the first record of the next group
Data must be sorted according to the order of the Aggregator's Group By ports
The performance gain will depend upon varying factors
119
No rows are released from Aggregator until all rows are aggregated
120
Each separate group (one row) is released as soon as the last row in the group is aggregated
121
Active transformation:
Can operate on groups of data rows AND/OR can change the number of rows in the data flow
122
Passive T T T
Active
Example holds true with Normalizer instead of Source Qualifier. Exceptions are: Mapplet Input and sorted Joiner transformations
123
Joiner Transformation
Joiner Transformation
By the end of this section you will be familiar with:
Nested joins
125
126
127
Joiner Transformation
Performs heterogeneous joins on different data flows
Active Transformation
Ports: all input or input/output; M denotes a port that comes from the master source
Examples: join two flat files; join two tables from different databases; join a flat file with a relational table
128
Joiner Conditions
129
Joiner Properties
Join types:
Normal (inner)
Master outer
Detail outer
Full outer
Set Joiner Caches
Joiner can accept sorted data (configure the join condition to use the sort origin ports)
130
Nested Joins
Used to join three or more heterogeneous sources
131
132
133
Lookup Transformation
Lookup Transformation
By the end of this section you will be familiar with:
Lookup principles
Lookup properties
Lookup conditions
Lookup techniques
Caching considerations
Persistent caches
135
Return value(s)
136
Lookup Transformation
Looks up values in a database table or flat file and provides data to other components in a mapping
Ports: mixed; L denotes a Lookup port; R denotes a port used as a return value (unconnected Lookup only, see later)
Specify the Lookup Condition
Usage: get related values; verify whether a record exists or whether data has changed
137
Lookup Conditions
138
Lookup Properties
Lookup table name
Lookup condition
139
Policy on multiple match: Use first value, Use last value, or Report error
140
Lookup Caching
Caching can significantly impact performance
Cached:
Lookup table data is cached locally on the Server
Mapping rows are looked up against the cache
Uncached:
Each Mapping row needs one SQL SELECT
Rule Of Thumb: Cache if the number (and size) of records in the Lookup table is small relative to the number of mapping rows requiring the lookup
141
Persistent Caches
By default, Lookup caches are not persistent; when the session completes, the cache is erased
When the cache is made persistent and the Session runs again, the cached data is loaded fully or partially into RAM and reused
A named persistent cache may be shared by different sessions
142
143
145
Target Options
Target Options
By the end of this section you will be familiar with:
Constraint-based loading
147
148
Target Properties
Edit Task: Mappings tab of the Session Task
Select target instance
Target load type
Row loading operations
Error handling
149
Delete SQL
DELETE from <target> WHERE <primary key> = <pkvalue>
Constraint-based Loading
PK1
FK1 PK2
FK2
To maintain referential integrity, primary keys must be loaded before their corresponding foreign keys; here, in the order Target1, Target2, Target3
151
152
Active transformation: can change the number of rows in the data flow
Examples: Source Qualifier, Aggregator, Joiner, Sorter, Filter
Active source
Active transformation that generates rows
Cannot match an output row with a distinct input row
Examples: Source Qualifier, Aggregator, Joiner, Sorter (the Filter is NOT an active source)
Active group
Group of targets in a mapping being fed by the same active source
153
Example 1
With only one Active source, rows for Targets1, 2, and 3 will be loaded properly and maintain referential integrity
FK1 PK2
FK2
PK1
Example 2
With two Active sources, it is not possible to control whether rows for Target3 will be loaded before or after those for Target2
FK1 PK2
FK2
154
155
Ports: all input/output
Specify the Update Strategy Expression: IIF or DECODE logic determines how to handle the record
Example Updating Slowly Changing Dimensions
157
Appropriate SQL (DML) is submitted to the target database: insert, delete or update
DD_REJECT means the row will not have SQL written for it; the target will not see that row
Rejected rows may be forwarded through the Mapping
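A hedged sketch of an Update Strategy Expression (the lookup and flag ports are hypothetical): insert new rows, update changed ones, and reject the rest:
IIF( ISNULL( lkp_CUST_ID ), DD_INSERT, IIF( SRC_CHANGED = 'Y', DD_UPDATE, DD_REJECT ) )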
158
159
160
161
Router Transformation
Router Transformation
Rows sent to multiple filter conditions
Ports: all input/output
Specify filter conditions for each Group
Usage: link source data in one pass to multiple filter conditions
163
Router Groups
Input group (always one)
User-defined groups
Default group (always one): can capture rows that fail all Group conditions
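For example (REGION is a hypothetical port), the user-defined group filter conditions might be:
NORTH group: REGION = 'N'
SOUTH group: REGION = 'S'
Rows satisfying neither condition can be captured by the Default group.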
164
165
Lab 13 Router
166
Sequence Generator Transformation
Ports: two predefined output ports, NEXTVAL and CURRVAL; no input ports allowed
Usage: generate sequence numbers; shareable across mappings
168
169
System variables
Mapping parameters and variables
Parameter files
171
System Variables
SYSDATE: returns the current date and time, using the system clock on the machine hosting the Informatica Server
SESSSTARTTIME: returns the session start time; has a constant value throughout the session
$$$SessStartTime: returns the session start time as a string; the format of the string is database type dependent; used in SQL overrides
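As an illustrative expression (ORDER_DATE is a hypothetical port), SESSSTARTTIME can be used directly in transformation logic:
DATE_DIFF( SESSSTARTTIME, ORDER_DATE, 'DD' )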
172
173
Set datatype
User-defined names
Set aggregation type
Set optional initial value
SETMINVARIABLE($$Variable, value): sets the specified variable to the lower of the current value or the specified value
SETVARIABLE($$Variable, value): sets the specified variable to the specified value
SETCOUNTVARIABLE($$Variable): increases or decreases the specified variable by the number of rows leaving the function (+1 for each inserted row, -1 for each deleted row, no change for updated or rejected rows)
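A common sketch (assuming a user-defined mapping variable $$LastRunTime): assign this expression to an output or variable port so the post-session value can seed the next run:
SETVARIABLE( $$LastRunTime, SESSSTARTTIME )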
176
Parameter Files
You can specify a parameter file for a session in the session editor
The parameter file contains the folder.session name and initializes each parameter and variable for that session. For example:
[Production.s_m_MonthlyCalculations]
$$State=MA
$$Time=10/1/2000 00:00:00
$InputFile1=sales.txt
$DBConnection_target=sales
$PMSessionLogFile=D:/session logs/firstrun.txt
177
Order of precedence for initializing a parameter or variable:
1. Parameter file value
2. Repository value
3. Declared initial value
4. Default value
178
Unconnected Lookups
Unconnected Lookups
By the end of this section you will know:
180
Unconnected Lookup
Physically unconnected from other transformations: NO data flow arrows leading to or from an unconnected Lookup
Lookup data is called from the point in the Mapping that needs it
The Lookup function can be set within any transformation that supports expressions
Function in the Aggregator calls the unconnected Lookup
181
IIF ( ISNULL(customer_id),:lkp.MYLOOKUP(order_no))
Lookup function
The condition is evaluated for each row, but the Lookup function is called only if the condition is satisfied
182
You must check a Return port in the Ports tab, or the lookup fails at runtime
184
Connected Lookup:
Part of the mapping data flow
Returns multiple values (by linking output ports to another transformation)
Executed for every record passing through the transformation
More visible: shows where the lookup values are used
Default values are used
185
Unconnected Lookup:
Separate from the mapping data flow
Returns one value, by checking the Return (R) port option for the output port that provides the return value
Only executed when the lookup function is called
Less visible, as the lookup is called from an expression within another transformation
Default values are ignored
186
187
Heterogeneous Targets
Heterogeneous Targets
By the end of this section you will be familiar with:
189
Supported target types: relational database, flat file, XML, and targets supported by PowerCenter Connect products
Heterogeneous targets are targets within a single Session Task that have different types or different database connections
190
Oracle table
The tables are EITHER in two different databases, or require different (schema-specific) connect strings
One target is a flat file load
Oracle table
Flat file
191
The two database connections are different
The flat file requires separate location information
192
The following overrides are supported:
Relational target to flat file target
Relational target to any other relational database type
CAUTION: If the target definition datatypes are not compatible with the datatypes in the newly selected database type, modify the target definition
193
194
Mapplets
Mapplets
By the end of this section you will be familiar with:
Mapplet Designer
Mapplet advantages
Mapplet types
Mapplet rules
Active and Passive Mapplets
Mapplet Parameters and Variables
196
Mapplet Designer
197
Mapplet Advantages
Useful for repetitive tasks / logic
198
199
200
Unsupported Transformations
Do not use the following in a mapplet:
201
External Sources
Mapplet contains a Mapplet Input transformation
Mixed Sources
Mapplet contains one or more Mapplet Input transformations AND one or more Source Qualifiers
Receives data from the Mapping it is used in, AND from within the Mapplet
202
Passive Transformation
Connected Ports: output ports only
Usage: only those ports connected from an Input transformation to another transformation will display in the resulting Mapplet
203
Connecting the same Input transformation port to more than one transformation is disallowed; pass it through an Expression transformation first
The resulting Mapplet HAS input ports
When used in a Mapping, the Mapplet may occur at any point in mid-flow
204
Mapplet
Source Qualifier
Mapplet
Usage: only those ports connected to an Output transformation (from another transformation) will display in the resulting Mapplet
One (or more) Mapplet Output transformations are required in every Mapplet
206
208
CAUTION: Changing a passive Mapplet into an active Mapplet may invalidate Mappings which use that Mapplet, so do an impact analysis in Repository Manager first
209
Passive
Active
Multiple Active Mapplets or Active and Passive Mapplets cannot populate the same target instance
210
Lab 17 Mapplets
212
Reusable Transformations
Reusable Transformations
By the end of this section you will be familiar with:
Transformation Developer
Reusable transformation rules
Promoting transformations to reusable
Copying reusable transformations
214
Transformation Developer
Make a transformation reusable from the outset, or test it in a mapping first
Reusable transformations
215
Reusable Transformations
Define once, reuse many times
Can be a copy or a shortcut
Edit Ports only in the Transformation Developer
Can edit Properties in the mapping
216
217
3. Drop the transformation into the mapping
4. Save the changes to the Repository
218
219
221
Error Types
Transformation error
Data row has only passed partway through the mapping transformation logic
An error occurs within a transformation
Data reject
Data row is fully transformed according to the mapping logic
Due to a data issue, it cannot be written to the target
A data reject can be forced by an Update Strategy
222
Logging ON
Appended to flat file or relational tables; only fatal errors written to the session log
Data rejects
Appended to the reject file (one .bad file per target) or written to row error tables or file
223
Error Log Type
Log Row Data
Log Source Row Data
224
225
Transformation Error
226
Session metadata
Reader, transformation, writer and user-defined errors
For errors on input, logs row data for I and I/O ports
For errors on output, logs row data for I/O and O ports
228
229
PMERR_DATA: The row data of the error row as well as the source row data is logged here. The row data is in a string format such as [indicator1: data1 | indicator2: data2]
230
231
232
233
234
Workflow Configuration
236
237
(Custom)
(External Database Loaders)
238
239
240
FTP Connection
Create an FTP connection
Instructions to the Server to FTP flat files
Used in Session Tasks
241
242
243
244
245
246
Session Configuration
Define properties to be reusable across different sessions
Defined at the folder level
Must have one of these tools open in order to access
247
248
249
250
251
Reusable Tasks
Reusable Tasks
Three types of reusable Tasks
Session: a set of instructions to execute a specific Mapping
Command: specific shell commands to run during any Workflow
Email: sends email during the Workflow
253
Reusable Tasks
Use the Task Developer to create reusable tasks
These tasks will then appear in the Navigator and can be dragged and dropped into any workflow
254
Reusable
Non-reusable
255
Command Task
Specify one or more Unix shell or DOS commands to run during the Workflow
Runs in the Informatica Server (UNIX or Windows) environment
Command task status (successful completion or failure) is held in the pre-defined task variable $command_task_name.STATUS
Each Command Task shell command can execute before the Session begins or after the Informatica Server executes a Session
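For example (cmd_ArchiveFiles is a hypothetical Command task name), a downstream link condition could test:
$cmd_ArchiveFiles.STATUS = SUCCEEDED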
256
Command Task
Specify one (or more) Unix shell or DOS (NT, Win2000) commands to run at a specific point in the workflow
Becomes a component of a workflow (or worklet)
If created in the Task Developer, the Command task is reusable
257
258
259
Email Task
Configure to have the Informatica Server send email at any point in the Workflow
Emails can also be invoked under the Components tab of a Session task to run pre- or post-session
260
261
262
263
Non-Reusable Tasks
Non-Reusable Tasks
Six additional Tasks are available in the Workflow Designer
Decision, Assignment, Timer, Control, Event Wait, Event Raise
265
Decision Task
Specifies a condition to be evaluated in the Workflow
Use the Decision Task in branches of a Workflow
Use link conditions downstream to control execution flow by testing the Decision result
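A hedged sketch (session and task names hypothetical): the Decision condition evaluates both loads, and a downstream link tests the stored result:
Decision condition: $s_LoadDim.Status = SUCCEEDED AND $s_LoadFact.Status = SUCCEEDED
Downstream link condition: $dec_LoadsOK.Condition = TRUE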
266
Assignment Task
Assigns a value to a Workflow Variable Variables are defined in the Workflow object
General Tab
Expressions Tab
267
Timer Task
Waits for a specified period of time to execute the next Task
General Tab Timer Tab
Relative Time
268
Control Task
Stop or ABORT the Workflow
Properties Tab General Tab
269
270
271
Events Tab
272
General Tab
Properties Tab
273
Worklets
Worklets
An object representing a set or grouping of Tasks
Can contain any Task available in the Workflow Manager
Worklets expand and execute inside a Workflow
A Workflow which contains a Worklet is called the parent Workflow
Worklets CAN be nested
Reusable Worklets: create in the Worklet Designer
Non-reusable Worklets: create in the Workflow Designer
275
Reusable Worklet
In the Worklet Designer, select Worklets | Create
276
277
Non-Reusable Worklet
1. Create worklet task in Workflow Designer
Right-click on the new worklet and select Open Worklet; the workspace switches to the Worklet Designer
2.
3.
278
279
280
281
282
Built-in, pre-defined variables
User-defined variables: set in Workflow or Worklet properties; reset in Assignment tasks; or set in a parameter file
Session parameters (e.g. $DBConnectionORCL, $InputFile1): set in a parameter file; constant for the session
283
284
PowerCenter options include:
Name and address cleansing functionality, including directories for the US and certain international countries
Server group management, automatic workflow distribution across multiple heterogeneous servers
ZL Engine, always-on non-stop sessions, JMS connectivity, and real-time Web Services provider
Data smart parallelism, pipeline and data parallelism, partitioning
Version control, deployment groups, configuration management, automatic promotion
PowerCenter base product: server engine, metadata repository, unlimited designers, workflow scheduler, all APIs and SDKs, unlimited XML and flat file sourcing and targeting, object export to XML file, LDAP authentication, role-based object-level security, metadata reporter, centralized monitoring
285
Virtual Classes
Watch for short web-based virtual classes on most PowerCenter options and XML support
286
287
Allows development of new PowerCenter Connect products
Available on the Informatica Developer Network
288
289