ETL Resume
7+ years of total IT experience, including business requirements analysis, data modeling,
development, implementation, and testing of data warehousing solutions across various industries.
5+ years of data warehousing experience using Informatica PowerCenter 9.1/9.0/8.6,
Informatica PowerExchange 9.5, IDQ 9.1, Warehouse Designer, data marts, OLAP, and OLTP.
Data processing experience in designing and implementing data mart applications, mainly
transformation processes, using Informatica.
Strong in data warehousing concepts and Star Schema and Snowflake Schema methodologies.
Proficient in using Informatica Workflow Manager and Workflow Monitor to create, schedule, and
control workflows, tasks, and sessions.
Extensive work in the ETL process, consisting of data sourcing, mapping, transformation,
conversion, and loading.
Experience in resolving on-going maintenance issues and bug fixes; monitoring Informatica
sessions as well as performance tuning of mappings and sessions
5+ years of database experience using Oracle 11g/10g/9i and SQL Server 2008/2010.
Extensively used SQL and PL/SQL to write stored procedures, functions, packages, and triggers.
Extensively worked with SQL*Loader for bulk loading into the Oracle database.
Experience in the complete life cycle of test case design, test plans, test execution, and defect
management.
Outstanding skills in analyzing business process and end-user needs, detail oriented and
committed to delivering superior quality work.
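The star-schema modeling listed above can be illustrated with a minimal sketch. This is a hypothetical example (table and column names are invented, and SQLite stands in for Oracle): a fact table keyed to a dimension, with the typical join-and-aggregate query pattern.

```python
import sqlite3

# In-memory database with a hypothetical star schema:
# one fact table (sales_fact) keyed to one dimension (product_dim).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE product_dim (product_key INTEGER PRIMARY KEY, product_name TEXT);
CREATE TABLE sales_fact  (product_key INTEGER, amount REAL);
INSERT INTO product_dim VALUES (1, 'Widget'), (2, 'Gadget');
INSERT INTO sales_fact  VALUES (1, 100.0), (1, 50.0), (2, 75.0);
""")

# Typical star-schema query: join the fact to the dimension, aggregate a measure.
rows = conn.execute("""
SELECT d.product_name, SUM(f.amount)
FROM sales_fact f
JOIN product_dim d ON d.product_key = f.product_key
GROUP BY d.product_name
ORDER BY d.product_name
""").fetchall()
print(rows)  # [('Gadget', 75.0), ('Widget', 150.0)]
```

A snowflake schema would further normalize `product_dim` into sub-dimension tables; the fact-to-dimension join pattern stays the same.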
PROFESSIONAL SUMMARY
KPMG
INFORMATICA IDQ DEVELOPER
KPMG implemented SAP CRM 7.0 EHP1 as part of its global template, which was further adapted by local member
firms, including KPMG US, and customized per business requirements. The Sales and Marketing modules have been
implemented, and the system is integrated with other SAP modules (HCM, SD, FI, and BI) along with other third-party applications. Informatica PowerConnect for SAP is used to pull data from SAP R/3.
Client: KPMG
Tool: Informatica 9.1.0, SQL Server 10.50
Role: Sr. Informatica Developer
DW/ETL RESPONSIBILITIES:
Performed data profiling and analysis using Informatica Data Quality (IDQ).
Used IDQ transformations (Match, Standardization, Labeler) to generate duplicate-record
notification files.
Worked with the staging area to perform data cleansing, data filtering, and data standardization
processes.
Handled user and data security issues and automation of schedules and sessions.
Worked on different types of financial products: shares, equities, bonds, options, etc.
Strong working experience with the Informatica PowerCenter (PWC) data integration tools:
Repository Manager, Designer, Workflow Manager, and Workflow Monitor.
Worked closely with DBA team to tune the performance of the Database.
Designed, Developed, Deployed and implemented ETL mappings using Informatica
Worked on data validation.
Interacted with the client to understand requirements and explain the downstream impact of
issue fixes.
Extensively worked on all the transformations: Filter, Aggregator, Expression, Router, Lookup,
Update Strategy, Sequence Generator, Rank, Union, Joiner, Source Qualifier, etc.
Assisted in handling production support issues.
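The IDQ standardization and duplicate-detection work above can be sketched in miniature. This is a hypothetical rendering (record layout and rules are invented, and real IDQ matching is fuzzier than the exact match used here): standardize each record, then flag records that collide after standardization.

```python
# Minimal sketch of a cleanse-then-match pass, analogous to IDQ's
# Standardization and Match transformations (hypothetical record layout).
def standardize(rec):
    # Trim whitespace, collapse internal runs of spaces, uppercase for matching.
    return {k: " ".join(v.split()).upper() for k, v in rec.items()}

def find_duplicates(records):
    # Exact match on the standardized record; IDQ also supports fuzzy scoring.
    seen, dups = {}, []
    for i, rec in enumerate(records):
        key = tuple(sorted(standardize(rec).items()))
        if key in seen:
            dups.append((seen[key], i))  # (first occurrence, duplicate)
        else:
            seen[key] = i
    return dups

records = [
    {"name": "John  Smith", "city": "new york"},
    {"name": "john smith",  "city": "New York"},   # duplicate after cleansing
    {"name": "Jane Doe",    "city": "Boston"},
]
print(find_duplicates(records))  # [(0, 1)]
```

The pairs returned here correspond to the duplicate-notification file the IDQ mapping would emit.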
Credit-Suisse
INFORMATICA DEVELOPER
LCDB is a centralized global data repository of securities, trades, positions, and reference data for most of the
investment bank's product types. This data is consumed by several General Counsel IT (GCIT) systems for Legal,
Risk, and Regulatory purposes. The need for a dedicated task force is in response to a series of data quality issues
uncovered in LCDB, which necessitated a comprehensive review to mitigate the potential risks
that inaccurate data would pose to any Legal, Regulatory, or Control Room function.
Client: Credit-Suisse
Tool: Informatica 8.6.1, PL/SQL Developer 10.2, Control-M, SVN
Role: Sr. Informatica Developer
DW/ETL RESPONSIBILITIES:
Worked with the staging area to perform data cleansing, data filtering, and data standardization
processes.
Handled user and data security issues and automation of schedules and sessions.
Worked on different types of financial products: shares, equities, bonds, options, etc.
Worked closely with DBA team to tune the performance of the Database.
Did design and code reviews for the EDW Production Support project
Interacted with the client to understand requirements and explain the downstream impact of
issue fixes.
Tune SQL Query using explain plans, hints, indexes, and partitions.
Extensively worked on all the transformations: Filter, Aggregator, Expression, Router, Lookup,
Update Strategy, Sequence Generator, Rank, Union, Joiner, Source Qualifier, etc.
Extensively used the Debugger to apply breakpoints and modify data while a session is
running.
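The explain-plan-driven SQL tuning mentioned above follows a simple workflow: inspect the access path for a slow query, add an index, and confirm the plan changes from a full scan to an index search. A minimal sketch (hypothetical table and index names; SQLite's EXPLAIN QUERY PLAN stands in for Oracle's EXPLAIN PLAN):

```python
import sqlite3

# Compare the access path for a filtered query before and after indexing.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (trade_id INTEGER, ticker TEXT, qty INTEGER)")
conn.executemany("INSERT INTO trades VALUES (?, ?, ?)",
                 [(i, "T%d" % (i % 100), i) for i in range(1000)])

query = "SELECT qty FROM trades WHERE ticker = 'T7'"

def plan(conn, sql):
    # EXPLAIN QUERY PLAN is SQLite's analogue of Oracle's EXPLAIN PLAN.
    return " ".join(str(row) for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

before = plan(conn, query)          # full table scan
conn.execute("CREATE INDEX idx_trades_ticker ON trades (ticker)")
after = plan(conn, query)           # index search

print("SCAN" in before, "USING INDEX" in after)  # True True
```

In Oracle the same check would use `EXPLAIN PLAN FOR ...` plus `DBMS_XPLAN.DISPLAY`, with hints and partitioning as further levers.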
New Jersey Motor Vehicle Commission, Trenton, New Jersey, Aug 2013 – Aug 2014
INFORMATICA DEVELOPER
The MATRX application is intended to replace the existing 30-year-old mainframe-based New Jersey MVC (Motor
Vehicle Commission) applications. MATRX consists of intranet and internet versions of the application, featuring all
of the Motor Vehicle Commission's needs, including the State Police interface, customer management, vehicle titles and
registrations, web-based online support, and third-party interfaces for external organizations to address their business
needs.
The objective of the project is to re-architect and modernize the functionality based on current regulations. The
system is based on real-time processing, so the current mainframe database will be replaced by an Oracle database,
using Informatica with Change Data Capture. Being a government project, it is highly critical and must
maintain the highest level of confidentiality with zero tolerance for errors.
Designed and developed Oracle PL/SQL procedures and wrote SQL and PL/SQL scripts for
extracting data to the system.
Handled user and data security issues and automation of schedules and sessions.
Worked with the staging area to perform data cleansing, data filtering, and data standardization
processes.
Worked closely with DBA team to tune the performance of the Database.
Developed several Mappings and Mapplets using corresponding Source, Targets and
Transformations.
Used Informatica Workflow Manager for creating, running the Workflows and Sessions and
scheduling them to run at specified time.
Used various partitioning schemes to improve overall session performance.
Worked on the performance tuning of databases (dropping & re-building indexes, partitioning on
tables)
Involved in developing UNIX shell scripts to invoke the workflows and Informatica batches,
including pre- and post-load analysis.
Analyzing the source data and deciding on appropriate extraction, transformation and loading
strategy
Extensively worked on all the transformations: Filter, Aggregator, Expression, Router, Lookup,
Update Strategy, Sequence Generator, Rank, Union, Joiner, Source Qualifier, etc.
Developed slowly changing dimensions mapping to accommodate the passive mode phase.
Developed a number of complex Informatica mappings, mapplets, and reusable transformations for
weekly data loads.
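The slowly-changing-dimension mapping mentioned above typically implements Type-2 history: when a tracked attribute changes, the current dimension row is expired and a new current version is inserted, rather than updated in place. A minimal sketch (hypothetical dimension layout; the natural key and attribute names are invented):

```python
from datetime import date

# Type-2 SCD sketch over an in-memory "dimension table" of dicts.
dimension = []  # rows: natural key, tracked attribute, validity window, current flag

def apply_scd2(dimension, natural_key, attribute, as_of):
    current = next((r for r in dimension
                    if r["key"] == natural_key and r["is_current"]), None)
    if current is None:
        # New entity: insert the first version.
        dimension.append({"key": natural_key, "attr": attribute,
                          "valid_from": as_of, "valid_to": None,
                          "is_current": True})
    elif current["attr"] != attribute:
        # Changed attribute: expire the old version, insert a new one.
        current["valid_to"] = as_of
        current["is_current"] = False
        dimension.append({"key": natural_key, "attr": attribute,
                          "valid_from": as_of, "valid_to": None,
                          "is_current": True})
    # Unchanged attribute: no-op.

apply_scd2(dimension, 42, "Trenton", date(2013, 8, 1))
apply_scd2(dimension, 42, "Newark",  date(2014, 1, 1))
print(len(dimension), dimension[-1]["attr"])  # 2 Newark
```

In PowerCenter this branching is usually done with a Lookup against the dimension plus an Update Strategy transformation routing rows to insert or update.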
Wells Fargo, AZ
INFORMATICA DEVELOPER
Wells Fargo Financial Services supports a service provider's business expansion by
offering and remarketing services. The decision support system enhances the value of technology by
providing data warehousing and business intelligence capability for banks to make strategic decisions
supported by real-life data. The target warehouse is built by accessing data from a geographically
spread, heterogeneous data environment and assimilating it into an information view for decision making.
DW/ETL RESPONSIBILITIES:
Understood the overall functional architecture of the pending-orders process and coordinated with the IM
Business group to gather functional and business requirements.
Developed logical and physical data models using Erwin, following a Star Schema to build the
Datamart.
Parsed high-level design specs into simple ETL coding and mapping standards.
Extracted data from different sources, including flat files, XML, and Oracle and MS SQL Server
relational tables.
Used different transformations for Extraction/Transformation, data cleansing and loading data into
staging areas and Presentation Tables.
Worked with Workflow Manager and Workflow Monitor to schedule batches, run the workflows,
and monitor session logs.
Worked on Data Extraction, Data Transformations, Data Loading, Data Conversions and Data
Analysis.
Extensively designed data mappings using Filter, Expression, and Update Strategy transformations in
PowerCenter Designer.
Created target load order group mappings and scheduled them for Daily Loads.
Preparation of Unit Test Plans and verification of functional specifications and review of
deliverables.
Migrated Mappings, Sessions, Workflows and Common Objects from Development to Test and to
Production.
Wrote Unix Scripts and SQL Commands to test and analyze the data
Extensively used UNIX commands within Informatica for the pre-session and post-session data loading
process.
Did design and code reviews for the EDW Production Support project
Developed QA/QC processes using shell scripts to help identify data quality issues prior to the
production load and provide a pre-processed data source.
Extensive use of Informatica Metadata Manager for data lineage and where-used analysis,
metadata browsing, metadata reporting, and metadata documentation.
Performed various testing processes: unit testing, end-to-end testing, and regression testing.
Extensively worked with production support and end-user teams on migrating, promoting, and
validating data in production servers.
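The pre-load QA/QC checks described above were shell scripts; the same idea can be sketched in Python (the field names and validation rules here are hypothetical): run simple rules over the incoming rows and report failures before anything is loaded.

```python
# Pre-load data quality gate: return (row index, reason) for every bad row.
def qc_check(rows):
    errors = []
    for i, row in enumerate(rows):
        if not row.get("account_id"):
            errors.append((i, "missing account_id"))
        try:
            float(row.get("balance", ""))
        except ValueError:
            errors.append((i, "non-numeric balance"))
    return errors

rows = [
    {"account_id": "A1", "balance": "100.50"},
    {"account_id": "",   "balance": "20.00"},   # fails: empty key
    {"account_id": "A3", "balance": "n/a"},     # fails: bad number
]
print(qc_check(rows))  # [(1, 'missing account_id'), (2, 'non-numeric balance')]
```

A non-empty error list would abort the production load, mirroring how the shell-script gate sat in front of the Informatica workflow.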
The main purpose of this project was to design and build a Datamart for Sales and Marketing
department to analyze sales of Prescription Drugs. The Datamart was designed and implemented to
analyze the sales growth and market share for the products. Reports were created for Sales Managers
and end users to give information on Sales growth categorized as per product and different modes of
sales (dealers, distributors, physicians, healthcare Organization, etc.,)
Another objective of the Datamart was to determine the market share of these products compared to
competitors' products and generate relevant reports. This Datamart also helped in making decisions
pertaining to the Sales Force alignment by specifically looking at different locations and the
performance of the Sales Force. Based on this analysis, the Sales Managers forecasted the future
product sales. This Datamart also helped in designing a comprehensive sales force compensation
package.
DW/ETL RESPONSIBILITIES:
Analyzed the business systems, gathered requirements from the users and documented business
needs for decision support data.
Interpreted logical and physical data models for Business users to determine common data
definitions and establish referential integrity of the system.
Created the (ER) Entity Relationship diagrams & maintained corresponding documentation for
corporate data dictionary with all attributes, table names and constraints.
Extensively used Erwin for data modeling and Dimensional Data Modeling.
Migrated Workflows, Mappings, and other repository objects from Development to QA and then to
production.
Created Informatica sessions in workflow manager to load the data from staging to Target
database.
Used the Aggregator transformation to calculate the SUM and AVG of monthly sales for different products.
Designed and Developed several mappings to Load the Dimensions and the fact tables.
Created Informatica mappings to extract data from sources and staged in Oracle and populated
the warehouse
Created sessions and workflows for processing and to populate the dimensions and facts in the
star schema.
Extensively used various transformations like XML, Union, Expression, Filter, Aggregator, Lookup
and Router Transformations.
Generated XML files as target to load into the vendor customized application to generate the
reports.
Worked with connected and unconnected Lookup transformations to implement complex logic.
Identified bottlenecks and tuned the mappings and sessions to improve performance.
Worked closely with DBA team to tune the performance of the Database.
Analyzed IMS data for Sales & Marketing analysis and reporting.
Tuned code packages using explain plans, hints, indexes, and partitions.
Used Control-M for scheduling jobs.
Designed the ETL strategy for initial, incremental, and CDC loads.
Responsible for writing procedures to drop and create partitions of large-volume tables for the
archival process.
Developed unit test cases and ensured that the results are right before moving to QA and
Production.
Designed and developed Oracle PL/SQL procedures and wrote SQL and PL/SQL scripts for
extracting data to the system.
Performed integration testing for various mappings. Tested the data and data integrity among
various sources and targets.
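The Aggregator-driven monthly sales rollup described above reduces to a group-by with SUM and AVG. A minimal sketch (hypothetical product names and amounts; SQLite standing in for the Oracle warehouse):

```python
import sqlite3

# Aggregator-style rollup: group sales by product and month,
# computing SUM and AVG as the mapping did.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (product TEXT, month TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", [
    ("DrugA", "2014-01", 100.0),
    ("DrugA", "2014-01", 300.0),
    ("DrugB", "2014-01", 50.0),
])

rollup = conn.execute("""
SELECT product, month, SUM(amount), AVG(amount)
FROM sales
GROUP BY product, month
ORDER BY product
""").fetchall()
print(rollup)  # [('DrugA', '2014-01', 400.0, 200.0), ('DrugB', '2014-01', 50.0, 50.0)]
```

In the PowerCenter mapping, `product` and `month` would be the Aggregator's group-by ports and the SUM/AVG expressions its output ports, feeding the fact table load.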
CARDINAL HEALTH
DW DATABASE DEVELOPER / ETL DEVELOPER
Responsibilities: