Informatica Developer

This professional summary describes an experienced ETL developer with extensive experience developing and maintaining ETL processes using Informatica PowerCenter to extract, transform, and load data from various sources like Oracle, SQL Server, and flat files into data warehouses. They have experience in data modeling, requirements analysis, mapping design, performance tuning, testing, documentation, and supporting BI applications. Key skills include Informatica, Oracle, SQL Server, PL/SQL, Unix scripting, and QlikView.


PROFESSIONAL SUMMARY:

 Extensively worked on data extraction, transformation, and loading of data from various sources
such as Oracle, SQL Server, and flat files.
 Responsible for all activities related to the development, implementation, administration and
support of ETL processes for large scale data warehouses using Informatica Power Center.
 Strong experience in Data Warehousing and ETL using Informatica Power Center 8.6.
 Experienced in data modeling using Erwin, including star schema and snowflake modeling,
fact and dimension tables, and physical and logical modeling.
 Strong skills in Data Analysis, Data Requirement Analysis and Data Mapping for ETL
processes.
 Knowledgeable in the Kimball and Inmon data warehousing methodologies.
 Hands-on experience in tuning mappings and identifying and resolving performance bottlenecks
at various levels: sources, targets, mappings, and sessions.
 Extensive experience in ETL design, development and maintenance using Oracle SQL,
PL/SQL, SQL Loader, Informatica Power Center v 5.x/6.x/7.x/8.x.
 Experience in testing the Business Intelligence applications developed in Qlikview.
 Well versed in developing complex SQL queries with unions and multi-table joins, and
experienced with views.
 Experience in database programming in PL/SQL (Stored Procedures, Triggers and
Packages).
 Well versed in UNIX shell scripting.
 Experienced in creating effective test data and developing thorough unit test cases to
ensure successful data loads, and used pager notifications to send alerts after successful
completion.
 Excellent communication, documentation and presentation skills using tools like Visio and
PowerPoint.

TECHNICAL SKILLS:
Data warehousing Tools : Informatica Power Center 8.6/8.1, Data Stage
Databases : Oracle10g/9i/ 8i/ 8.0/ 7.x, MS SQL Server 2005/ 2000/ 7.0/ 6.0, MS Access, MySQL,
Sybase.
Programming GUI : SQL, PL/SQL, SQL Plus, Java, HTML, C and UNIX Shell Scripting
BI Tools : QlikView 8.x
Tools/Utilities : TOAD, Benthic Golden, PL/SQL Developer
Operating Systems : Windows XP/NT/2003, UNIX
Configuration Management Tool : Surround SCM, Visual Source Safe
EDUCATION:
Master of Science in Computer Science.

PROFESSIONAL EXPERIENCE:
Confidential
sanofi-aventis - NJ Oct ’11 - present
Informatica Developer
The USMM implementation project is the upgrade of the current sanofi-aventis 1.x-series MCO medical
reps Quest application to the latest 4.x .NET series of applications. In this project the database was
upgraded and an enterprise data warehouse was implemented for the MCO reps. Distributed data
comes from heterogeneous sources such as SQL Server, Oracle, and client-supplied flat files.
Responsibilities:

 Analyzed the business requirements and functional specifications.


 Extracted data from Oracle databases and spreadsheets, staged it in a single location, and
applied business logic to load it into the central Oracle database.
 Used Informatica Power Center 8.6 for extraction, transformation and load (ETL) of data in the
data warehouse.
 Extensively used transformations such as Router, Aggregator, Normalizer, Joiner, Expression,
Lookup, Update Strategy, Sequence Generator, and Stored Procedure.
 Developed complex mappings in Informatica to load the data from various sources.
 Implemented performance tuning logic on targets, sources, mappings, sessions to provide
maximum efficiency and performance.
 Parameterized the mappings and increased the re-usability.
 Used Informatica Power Center Workflow manager to create sessions, workflows and batches
to run with the logic embedded in the mappings.
 Created procedures to truncate data in the target before the session run.
 Extensively used Toad utility for executing SQL scripts and worked on SQL for enhancing the
performance of the conversion mapping.
 Used PL/SQL procedures within Informatica mappings to truncate data in target tables
at run time.
 Extensively used Informatica debugger to figure out the problems in mapping. Also involved in
troubleshooting existing ETL bugs.
 Created a list of inconsistencies in the data load on the client side so the client could
review and correct the issues.
 Created the ETL exception reports and validation reports after the data is loaded into the
warehouse database.
 Wrote documentation describing program development, logic, coding, testing, changes, and
corrections.
 Created Test cases for the mappings developed and then created integration Testing
Document.
 Followed Informatica recommendations, methodologies and best practices.
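The truncate-before-load step listed above (PL/SQL procedures truncating target tables before the session run) can be sketched as a minimal Python illustration. The table names are hypothetical; in the project the statements were issued by PL/SQL stored procedures invoked ahead of the session:

```python
def truncate_statements(target_tables):
    """Build the TRUNCATE statements a pre-session step would execute."""
    return [f"TRUNCATE TABLE {table}" for table in target_tables]

# Hypothetical staging targets, for illustration only.
for stmt in truncate_statements(["STG_CUSTOMER", "STG_ORDERS"]):
    print(stmt)
```

Truncating rather than deleting keeps the pre-load step fast and avoids rollback overhead on large staging tables.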

Environment: Informatica Power Center 8.6.1, Oracle 10g/ 9i, MS-SQL Server, Toad, HP Quality
Center, Windows XP and MS Office Suite
Confidential, Aug ’07 - Sep ’10
sanofi-aventis - NJ
Informatica Developer
The Sales Force Automation (SFA) system is a CRM solution that provides sales forces with a robust set
of customer relationship management capabilities promoting team selling, multi-channel
customer management, information sharing, field reporting, and analytics, all within an easy-to-use
mobile application tailored to the life sciences. The purpose of this project is to maintain a data
warehouse that enables the home office to make corporate decisions. A decision support
system was built to compare and analyze the company's products against competitor products, along
with sales information at the territory, district, region, and area levels.
Responsibilities:

 Created mappings and sessions to implement technical enhancements for the data warehouse by
extracting data from sources like Oracle and delimited flat files.
 Development of ETL using Informatica 8.6.
 Applied Slowly Changing Dimensions (Type 1 and Type 2) effectively to handle delta loads.
 Prepared various mappings to load the data into different stages like Landing, Staging and
Target tables.
 Used various transformations like Source Qualifier, Expression, Aggregator, Joiner, Filter,
Lookup, and Update Strategy while designing and optimizing mappings.
 Developed Workflows using task developer, worklet designer, and workflow designer in
Workflow manager and monitored the results using workflow monitor.
 Created various tasks like Session, Command, Timer and Event wait.
 Modified several of the existing mappings based on the user requirements and maintained
existing mappings, sessions and workflows.
 Tuned the performance of mappings by following Informatica best practices, and applied
several methods to reduce workflow run times.
 Prepared SQL Queries to validate the data in both source and target databases.
 Worked on TOAD and Oracle SQL Developer to develop queries and create procedures and
packages in Oracle.
 Worked extensively on PL/SQL as part of the process to develop several scripts to handle
different scenarios.
 Created Test cases for the mappings developed and then created integration Testing
Document.
 Prepared the error handling document to maintain the error handling process.
 Automated the Informatica jobs using UNIX shell scripting.
 Closely worked with the reporting team to ensure that correct data is presented in the reports.
 Interacted with the offshore team daily on development activities.
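The Type 2 slowly changing dimension handling applied above for delta loads can be sketched as a minimal Python illustration. The keys, attributes, and dates are hypothetical; in the project this logic was built with Informatica Lookup and Update Strategy transformations rather than code:

```python
from datetime import date

HIGH_DATE = date(9999, 12, 31)  # open-ended end date marking the current row

def apply_scd2(dimension, incoming, today):
    """Expire changed current rows and append new versions (SCD Type 2).

    dimension: list of dicts with keys "key", "attrs", "eff_start", "eff_end"
    incoming:  dict mapping natural key -> latest attribute values
    """
    for row in list(dimension):  # snapshot: we append while scanning
        is_current = row["eff_end"] == HIGH_DATE
        changed = row["key"] in incoming and incoming[row["key"]] != row["attrs"]
        if is_current and changed:
            row["eff_end"] = today  # Type 2: close out the old version
            dimension.append({"key": row["key"],
                              "attrs": incoming[row["key"]],
                              "eff_start": today,
                              "eff_end": HIGH_DATE})
    return dimension

# Illustrative delta load: customer 101 moved territory.
dim = [{"key": 101, "attrs": {"territory": "NJ"},
        "eff_start": date(2007, 8, 1), "eff_end": HIGH_DATE}]
apply_scd2(dim, {101: {"territory": "NY"}}, date(2010, 9, 1))
```

Keeping both versions preserves history, which is what lets downstream reports show sales against the territory that was in effect at the time.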

Environment: Informatica Power Center 8.1, Oracle 9i, MS-SQL Server, PL/SQL Developer,
Bourne shell, Windows XP, TOAD, MS Office, and delimited flat files
Confidential, Dec’05-Jul’07
Chicago- IL 
Data warehouse Developer 

The American Medical Association (AMA) plays a key information management role by collecting,
maintaining, and disseminating primary source physician data. This data supports the development
and implementation of AMA policy and a variety of data-driven products and services. This
repository of physician information is created, maintained, and customized for the DEA.
Responsibilities:

 As a member of the warehouse design team, assisted in creating fact and dimension tables based on
specifications provided by managers.
 Loaded operational data from Oracle, SQL Server, flat files, and Excel worksheets into various data
marts such as PMS and DEA.
 Designed and created complex source to target mappings using various transformations
inclusive of but not limited to Aggregator, Look Up, Joiner, Source Qualifier, Expression,
Sequence Generator, and Router Transformations.
 Implemented effective date range mapping (Slowly Changing dimension type2) methodology
for accessing the full history of accounts and transaction information.
 Designed complex mappings involving constraint-based loading and target load order.
 Used the Debugger to identify bugs in existing mappings by analyzing data flow and evaluating
transformations, and created mapplets that provide reusability in mappings.
 Involved in enhancement and maintenance activities of the data warehouse, including
performance tuning and rewriting stored procedures for code enhancements.
 Designed workflows with many sessions with decision, assignment task, event wait, and event
raise tasks, used Informatica scheduler to schedule jobs.
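The effective-date-range (SCD Type 2) mapping above, which gives access to the full history of accounts, can be sketched as an "as-of" lookup. The surrogate keys and statuses below are illustrative only:

```python
from datetime import date

def version_as_of(history, as_of):
    """Return the dimension row whose [eff_start, eff_end) range covers as_of.

    This is the lookup that lets a fact record join to the account version
    that was in effect on the transaction date.
    """
    for row in history:
        if row["eff_start"] <= as_of < row["eff_end"]:
            return row
    return None

# Hypothetical account dimension history with two versions.
history = [
    {"sk": 1, "status": "ACTIVE",  "eff_start": date(2005, 1, 1), "eff_end": date(2006, 6, 1)},
    {"sk": 2, "status": "DORMANT", "eff_start": date(2006, 6, 1), "eff_end": date(9999, 12, 31)},
]
print(version_as_of(history, date(2005, 12, 31))["sk"])
```

Using a half-open range (start inclusive, end exclusive) ensures every date resolves to exactly one version with no gaps or overlaps.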
Environment: Informatica Power Center 6.2, Oracle, Business Objects 6.x, Windows 2000, SQL
Server 2000, Microsoft Excel, SQL * Plus
Confidential, Sep’04-Nov’05
AXA - NY 
ETL Consultant
A single electronic solution gives employees of the AXA Pacific and AXA Assurance surety
companies access to a centralized system. It also provides an intranet interface to all
AXA surety bond users across Canada.
Responsibilities:

 Designed and developed the data transformations for source system data extraction; data
staging, movement and aggregation; information and analytics delivery; and data quality
handling, system testing, performance tuning.
 Created Informatica mappings to build business rules to load data, using transformations like
Source Qualifier, Aggregator, Expression, Joiner, Connected and Unconnected Lookup,
Filter, Sequence Generator, External Procedure, Router, and Update Strategy.
 Loaded reformatted data from relational, flat-file, and XML sources using Informatica (ETL).
 Worked on Dimensional modeling to design and develop STAR schemas using ER-win 4.0,
Identifying Fact and Dimension Tables.
 Created various batch scripts for scheduling data cleansing scripts and loading processes.
 Extensively worked on Mapping Variables, Mapping Parameters and Session Parameters.
 Created post-session and pre-session shell scripts and mail notifications.
 Created Data Breakpoints and Error Breakpoints for debugging the mappings using Debugger
Wizard.
 Created several stored procedures to update tables and insert audit records as part of
the process.
 Extensively used Joins, Triggers, Stored Procedures and Functions in Interaction with
backend database using PL/SQL.
 Wrote UNIX shell scripts to move data from various source systems to the data warehouse.
 Performed performance tuning by optimizing sources, targets, mappings, and sessions.
 Tested mappings and sessions using various test cases in the test plans.
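The mapping and session parameters used above are typically supplied through a parameter file. A minimal sketch of generating one follows; the section-header layout mirrors the usual PowerCenter parameter-file convention, and the folder, workflow, and parameter names are hypothetical:

```python
def build_param_file(folder, workflow, session, params):
    """Render a PowerCenter-style parameter file section.

    Sections follow the common [folder.WF:workflow.ST:session] header
    convention, with $$-prefixed mapping parameters underneath.
    All names here are placeholders for illustration.
    """
    lines = [f"[{folder}.WF:{workflow}.ST:{session}]"]
    lines += [f"$${name}={value}" for name, value in params.items()]
    return "\n".join(lines)

print(build_param_file("SURETY", "wf_daily_load", "s_m_load_accounts",
                       {"LoadDate": "2005-01-31", "SourceSystem": "AXA_PACIFIC"}))
```

Generating the file from a script lets the same mapping run against different dates or source systems without editing the repository objects.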

Environment: Informatica 6.2, Oracle 8i, TOAD, Windows NT and UNIX


Confidential, India Nov’02-Aug’04 
QA Consultant
HDFC Bank offers various products, such as Accounts & Deposits, Loans, Insurance, and Premium
Banking.
This project mainly tested the functionality of the Accounts & Deposits module, which covers
account types such as Savings, Salary, Current, Demat, Deposits, Rural, and Safe Deposit
Locker.
Responsibilities:

 Analyzed the business requirements and functional specifications.


 Understood the requirements specification and use case documents.
 Created the test plan, test strategy, and test approach.
 Created test scripts and a traceability matrix, and mapped the requirements to the test cases.
 Performed peer reviews of test scripts and prepared the QA measurement forms.
 Attended QA audit meetings and test artifact reviews.
 Participated in the Integration testing and Unit Testing along with the Development team.
 Conducted system, GUI, smoke, and regression testing; identified application errors and
interacted with developers to resolve technical issues.
 Extensively used SQL scripts/queries for data verification at the backend.
 Executed SQL queries, stored procedures and performed data validation as a part of backend
testing.
 Used SQL to test various reports and ETL job loads in development, testing, and production.
 Performed negative testing to test the application for invalid data, verified and validated for the
error messages generated.
 Generated weekly progress reports and updates for the project lead, including test scenario
status, concerns, and outstanding functionality.
 Involved in Daily and Weekly Status meetings.
 Created the test summary report, traceability matrix, and test script index.
 Monitored the testing project in Quality Center, ensuring defects were entered, tested,
and closed.
 Analyzed, documented and maintained Test Results and Test Logs.
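The backend data verification described above (SQL checks against source and target) can be sketched as a row-count reconciliation. The demo below uses an in-memory SQLite database purely for illustration; the actual checks ran as SQL scripts against Oracle, and the table names are hypothetical:

```python
import sqlite3

def rowcount_matches(conn, source_table, target_table):
    """Compare row counts between a source and target table (backend validation)."""
    cur = conn.cursor()
    src = cur.execute(f"SELECT COUNT(*) FROM {source_table}").fetchone()[0]
    tgt = cur.execute(f"SELECT COUNT(*) FROM {target_table}").fetchone()[0]
    return src == tgt, src, tgt

# In-memory demo standing in for the real source/target databases.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_accounts (id INTEGER);
    CREATE TABLE tgt_accounts (id INTEGER);
    INSERT INTO src_accounts VALUES (1), (2), (3);
    INSERT INTO tgt_accounts VALUES (1), (2), (3);
""")
ok, src, tgt = rowcount_matches(conn, "src_accounts", "tgt_accounts")
print(ok, src, tgt)
```

Row counts are only the first line of defense; column-level checksums or sampled value comparisons catch transformation errors that counts alone miss.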

Environment: Java, HTML, DHTML, Oracle8i, Data stage, Clarify and Benthic Golden.

PROFESSIONAL SUMMARY:

 Eight plus (8+) years of IT experience in the Analysis, Design, Development, Testing and
Implementation of business application systems for Health care, Pharmaceutical, Financial,
Telecom and Manufacturing Sectors.
 Strong experience in the Analysis, design, development, testing and Implementation of
Business Intelligence solutions using Data Warehouse/Data Mart Design, ETL, OLAP, BI,
Client/Server applications.
 Strong Data Warehousing ETL experience using Informatica 9.1/8.6.1/8.5/8.1/7.1
PowerCenter client tools (Mapping Designer, Repository Manager, Workflow
Manager/Monitor) and server tools (Informatica Server, Repository Server Manager).
 Expertise in Data Warehouse/Data mart, ODS, OLTP and OLAP implementations teamed with
project scope, Analysis, requirements gathering, data modeling, Effort Estimation, ETL
Design, development, System testing, Implementation and production support.
 Extensive ETL testing experience using Informatica 9.1/8.6.1/8.5/8.1/7.1/6.2/5.1 (Power
Center/Power Mart: Designer, Workflow Manager, Workflow Monitor, and Server Manager),
Teradata, and Business Objects.
 Strong experience in Dimensional Modeling using Star and Snow Flake Schema, Identifying
Facts and Dimensions, Physical and logical data modeling using ERwin and ER-Studio.
 Expertise in working with relational databases such as Oracle 11g/10g/9i/8x, SQL Server
2008/2005, DB2 8.0/7.0, UDB, MS Access and Teradata.
 Strong experience in Extraction, Transformation and Loading (ETL) data from various sources
into Data Warehouses and Data Marts using Informatica Power Center (Repository Manager,
Designer, Workflow Manager, Workflow Monitor, Metadata Manger), Power Exchange, Power
Connect as ETL tool on Oracle, DB2 and SQL Server Databases.
 Experience using the SAS ETL tool, the Talend ETL tool, and SAS Enterprise Data Integration
Server.
 Extensive experience in developing stored procedures, functions, views, and triggers, and
complex SQL queries using SQL Server T-SQL and Oracle PL/SQL.
 Experience in resolving on-going maintenance issues and bug fixes; monitoring Informatica
sessions as well as performance tuning of mappings and sessions.
 Experience in all phases of Data warehouse development from requirements gathering for the
data warehouse to develop the code, Unit Testing and Documenting.
 Extensive experience in writing UNIX shell scripts and automation of the ETL processes using
UNIX shell scripting.
 Proficient in the Integration of various data sources with multiple relational databases like
Oracle11g /Oracle10g/9i, MS SQL Server, DB2, Teradata, VSAM files and Flat Files into the
staging area, ODS, Data Warehouse and Data Mart.
 Experience in using Automation Scheduling tools like Autosys and Control-M.
 Worked extensively with slowly changing dimensions.
 Hands-on experience across all stages of Software Development Life Cycle (SDLC) including
business requirement analysis, data mapping, build, unit testing, systems integration and user
acceptance testing.
 Excellent interpersonal and communication skills; experienced in working with senior-level
managers, business people, and developers across multiple disciplines.

Education:
Friends University, Wichita, KS

 B.S. in Computer Information Systems with a minor in Mathematics.


 MBA Student

Technical Skills:
Operating Systems: Windows 2008/2007/2005/NT/XP, UNIX, MS-DOS
ETL Tools: Informatica Power Center 9.1/8.6/8.5/8.1/7.1 (Designer, Workflow Manager, Workflow
Monitor, Repository manager and Informatica Server)
Databases: Oracle 11g/10g/9i/8i, MS SQL Server 2008/2005, DB2 v8.1, Teradata.
Data Modeling tools: Erwin, MS Visio
OLAP Tools: Cognos 8.0/8.1/8.2/8.4/7.0/, Business Objects XI r2/6.x/5.x, OBIEE 10.1.3.4/ 10.1.3.3
Languages: SQL, PL/SQL, UNIX, Shell scripts, C++
Scheduling Tools: Autosys, Control-M
Testing Tools: QTP, WinRunner, LoadRunner, Quality Center, Test Director
Professional Experience:
Confidential, Denver, CO Sept 2011 to May 2012
Sr. ETL/Informatica Developer
Description: Qwest is a large telecommunications carrier. Qwest Communications provides long-
distance services and broadband data, as well as voice and video communications globally. This
project includes developing Data warehouse from different data feeds and other operational data
sources.
Built a central database where data comes from different sources such as Oracle, SQL Server, and flat
files. Actively involved as an analyst, preparing design documents and interacting with the data
modelers to understand the data model and design the ETL logic.
Responsibilities:

 Responsible for Business Analysis and Requirements Collection.


 Worked on Informatica Power Center tools- Designer, Repository Manager, Workflow
Manager, and Workflow Monitor.
 Parsed high-level design specification to simple ETL coding and mapping standards.
 Designed and customized data models for the data warehouse, supporting data from multiple
sources in real time.
 Involved in building the ETL architecture and Source to Target mapping to load data into Data
warehouse.
 Created mapping documents to outline data flow from sources to targets.
 Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to
design the business process, dimensions and measured facts.
 Extracted data from flat files and other RDBMS databases into the staging area and
populated it into the data warehouse.
 Maintained stored definitions, transformation rules, and target definitions using Informatica
Repository Manager.
 Used various transformations like Filter, Expression, Sequence Generator, Update Strategy,
Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
 Developed mapping parameters and variables to support SQL override.
 Created mapplets to use them in different mappings.
 Developed mappings to load into staging tables and then to Dimensions and Facts.
 Used existing ETL standards to develop these mappings.
 Worked on different tasks in workflows such as Session, Event Raise, Event Wait, Decision,
E-mail, Command, Worklet, Assignment, and Timer, and scheduled the workflows.
 Created sessions and configured workflows to extract data from various sources, transform it,
and load it into the data warehouse.
 Used Type 1 SCD and Type 2 SCD mappings to update slowly Changing Dimension Tables.
 Extensively used SQL*Loader to load data from flat files into Oracle database tables.
 Modified existing mappings for enhancements of new business requirements.
 Used Debugger to test the mappings and fixed the bugs.
 Wrote UNIX shell Scripts & PMCMD commands for FTP of files from remote server and
backup of repository and folder.
 Involved in Performance tuning at source, target, mappings, sessions, and system levels.
 Prepared migration document to move the mappings from development to testing and then to
production repositories.
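The PMCMD commands scripted above for starting workflows can be sketched as follows. The command is only constructed, not executed; the service, domain, folder, and workflow names are placeholders, and the -uv/-pv flags name environment variables expected to hold the credentials:

```python
def pmcmd_start_workflow(service, domain, folder, workflow):
    """Assemble a pmcmd startworkflow command line (constructed, not run here).

    The flag layout mirrors typical pmcmd usage; in a real wrapper script the
    list would be handed to the shell or a process launcher, and credentials
    would live in the INFA_USER/INFA_PWD environment variables, not in code.
    """
    return ["pmcmd", "startworkflow",
            "-sv", service, "-d", domain,
            "-uv", "INFA_USER", "-pv", "INFA_PWD",
            "-f", folder, "-wait", workflow]

# Placeholder names, for illustration only.
cmd = pmcmd_start_workflow("Int_Svc", "Domain_Dev", "QWEST_DW", "wf_daily_load")
print(" ".join(cmd))
```

The -wait flag makes the call block until the workflow finishes, so the wrapper's exit status can drive the FTP and backup steps that follow it.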

Environment: Informatica Power Center 8.6.1, Workflow Manager, Workflow Monitor, Informatica
Power Connect / Power Exchange, Data Analyzer 8.1, PL/SQL, Oracle 10g/9i, Erwin, Autosys, SQL
Server 2005, Sybase, UNIX AIX, Toad 9.0, Cognos 8.
Confidential, Stamford, CT Nov 2010 to July 2011
Role: ETL Consultant
XL Global Services Inc. provides the backbone Information Technology support to the XL Capital
group of companies, a leading provider of insurance and reinsurance coverage, innovative risk
management and financial solutions. As part of providing financial solutions, XL Global Services Inc
generates various reports for presenting a comprehensive Credit and Risk analysis for its customers.
The project was designed to develop and maintain data marts. Data from various centers, held in
different systems, had to be uploaded using ETL tools.
Responsibilities:

 Performed logical and physical data modeling using Erwin for the data warehouse database in
a star schema.
 Using Informatica PowerCenter Designer, analyzed the source data to extract and transform it
from various source systems (Oracle 10g, DB2, SQL Server, and flat files), incorporating
business rules using the different objects and functions that the tool supports.
 Using Informatica PowerCenter created mappings and mapplets to transform the data
according to the business rules.
 Used various transformations like Source Qualifier, Joiner, Lookup, SQL, Router, Filter,
Expression, and Update Strategy.
 Implemented slowly changing dimensions (SCD) for some of the Tables as per user
requirement.
 Developed stored procedures and used them in the Stored Procedure transformation for data
processing; also used data migration tools.
 Documented Informatica mappings in Excel spread sheet.
 Tuned the Informatica mappings for optimal load performance.
 Used the Teradata utilities BTEQ, FastExport (FEXP), FastLoad (FLOAD), and MultiLoad (MLOAD)
to export and load data to/from flat files.
 Created and Configured Workflows and Sessions to transport the data to target warehouse
Oracle tables using Informatica Workflow Manager.
 Generated reports using OBIEE 10.1.3 for future business use.
 Carried primary responsibility for problem determination and resolution for each SAP
application system's database server and application server.
 Worked along with UNIX team for writing UNIX shell scripts to customize the server
scheduling jobs.
 Constantly interacted with business users to discuss requirements.

Environment: Informatica PowerCenter Designer 8.6/8.1, Informatica Repository Manager,
Oracle 10g/9i, DB2 6.1, Erwin, TOAD, SAP 3.1.H, UNIX (SunOS), PL/SQL, SQL Developer
Confidential, Pittsburgh, PA Aug 2008 - Sept 2009
Sr. ETL/Informatica Developer
Project(s): EPC BI / EQUITRANS BI/ EGC BI / ETRM BI
Description: EQT Corporation is an integrated energy company supplying natural gas, crude oil, and
gas-related services to its customers. The main objective of the project was to help the
organization's decision-making team monitor and improve sales and explore avenues for new
business opportunities. The DW team is responsible for building the Global Data Warehouse and
providing reports for the Production and Midstream groups. Worked on four capital projects: EPC BI,
EQUITRANS BI, EGC BI, and ETRM BI. Data is extracted from flat files, Oracle, SQL Server, and DB2
into the Operational Data Store (ODS); data from the ODS was then extracted, transformed, and
loaded with business logic into the Global Data Warehouse using Informatica PowerCenter 9.1.0
tools.
Responsibilities:

 Interacted with Data Modelers and Business Analysts to understand the requirements and the
impact of the ETL on the business.
 Designed ETL specification documents for all the projects.
 Created Tables, Keys (Unique and Primary) and Indexes in the SQL server.
 Extracted data from flat files, DB2, SQL Server, and Oracle to build an Operational Data Store,
and applied business logic to load the data into the Global Data Warehouse.
 Extensively worked on Facts and Slowly Changing Dimension (SCD) tables.
 Maintained source and target mappings, transformation logic and processes to reflect the
changing business environment over time.
 Used various transformations like Filter, Router, Expression, Lookup (connected and
unconnected), Aggregator, Sequence Generator, Update Strategy, Joiner, Normalizer, Sorter
and Union to develop robust mappings in the Informatica Designer.
 Extensively used the Add Currently Processed Flat File Name port to load the flat file name
and to load contract number coming from flat file name into Target.
 Worked on complex Source Qualifier queries, Pre and Post SQL queries in the Target.
 Worked on different tasks in Workflow Manager like Sessions, Events raise, Event wait,
Decision, E-mail, Command, Worklets, Assignment, Timer and Scheduling of the workflow.
 Extensively used workflow variables, mapping parameters and mapping variables.
 Created sessions, batches for incremental load into staging tables and scheduled them to run
daily.
 Used shortcuts to reuse objects without creating multiple objects in the repository and inherit
changes made to the source automatically.
 Implemented Informatica recommendations, methodologies and best practices.
 Implemented performance tuning logic on Targets, Sources, Mappings and Sessions to
provide maximum efficiency and performance.
 Involved in Unit, Integration, System, and Performance testing levels.
 Wrote documentation describing program development, logic, coding, testing, changes, and
corrections.
 Migrated the code into QA (Testing) and supported QA team and UAT (User).
 Created detailed Unit Test Document with all possible Test cases/Scripts.
 Conducted code reviews developed by my team mates before moving the code into QA.
 Provided support to develop the entire warehouse architecture and plan the ETL process.
 Modified existing mappings for enhancements of new business requirements.
 Prepared migration document to move the mappings from development to testing and then to
production repositories.
 Involved in production support.
 Worked as a fully contributing team member, under broad guidance, with independent planning
and execution responsibilities.
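The "Add Currently Processed Flat File Name" technique above, used to load the contract number carried in the file name, can be sketched as a parsing step. The naming convention shown is hypothetical; the real convention was project-specific:

```python
import re

def contract_from_filename(path):
    """Pull the contract number out of a flat-file name.

    Assumes an illustrative naming convention like CONTRACT_<number>_YYYYMMDD.txt;
    in the mapping, the file name arrived through the Currently Processed
    Flat File Name port and the extraction was done in an Expression.
    """
    match = re.search(r"CONTRACT_(\d+)_\d{8}\.txt$", path)
    return match.group(1) if match else None

print(contract_from_filename("/data/in/CONTRACT_4471_20090115.txt"))
```

Deriving the contract number from the file name keeps the flat-file layout unchanged while still landing the identifier in the target.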

Environment: Informatica PowerCenter 9.1.0, Oracle 11g, SQL Server 2008, IBM iSeries (DB2), MS
Access, Windows XP, Toad, Tidal, Cognos 8.4.1, SQL Developer.
Confidential, NJ October 2007 - July 2008
Role: Sr ETL Developer
NYK Lines is one of the world's premier full-service intermodal carriers. The company utilizes a vast
network of ocean vessels, barges, railroads, and motor carriers to link international shippers
with consignees. Services offered include intermodal services, terminals and warehousing,
insurance, and repair and maintenance.
Modules: Shipment Data Mart, Job order Cost Mart, Net Contribution Mart, DnD Mart (Detention &
Demurrage)
Responsibilities:
 Involved in the analysis of the user requirements and identifying the sources.
 Created technical specification documents based on the requirements by using S2T
Documents.
 Involved in the preparation of High level design documents and Low level design documents.
 Involved in Design, analysis, Implementation, Testing and support of ETL processes for
Stage, ODS and Mart.
 Prepared ETL standards, Naming conventions and wrote ETL flow documentation for Stage,
ODS and Mart.
 Followed the Ralph Kimball approach (a bottom-up data warehouse methodology in which
individual data marts such as the Shipment, Job Order Cost, Net Contribution, and
Detention & Demurrage marts provide views into organizational data and are later combined
into a Management Information System (MIS)).
 Prepared a Level 2 update plan to assign work to team members and track the status of
each task.
 Administered the repository by creating folders and logins for the group members and
assigning necessary privileges.
 Designed and developed Informatica mappings and sessions based on business user
requirements and business rules to load data from source flat files and Oracle tables into
target tables.
 Worked on various kinds of transformations like Expression, Aggregator, Stored Procedure,
Lookup, Filter, Joiner, Rank, Router and Update Strategy.
 Developed reusable Mapplets and Transformations.
 Used debugger to debug mappings to gain troubleshooting information about data and error
conditions.
 Involved in monitoring the workflows and in optimizing the load times.
 Used Change Data Capture (CDC) to simplify ETL in data warehouse applications.
 Involved in writing procedures, functions in PL/SQL.
 Developed mappings in Informatica using BAPI and ABAP function calls in SAP.
 Used Remote Function Calls (RFC) as the SAP interface for communication between
systems.
 Implemented RFCs for the caller and the called function modules running in the same
system.
 Involved in extensive performance tuning by determining bottlenecks at various points like
targets, sources, mappings, sessions or system. This led to better session performance.
 Worked with SQL*Loader tool to load the bulk data into Database.
 Prepared UNIX shell scripts that were scheduled in AUTOSYS for automatic execution at
specific times.
 Used Rational ClearCase to control versions of all files and folders (check-out, check-in).
 Prepared test Scenarios and Test cases in HP Quality Center and involved in unit testing of
mappings, system testing and user acceptance testing.
 Defect tracking and reporting were done with Rational ClearQuest.
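The Change Data Capture (CDC) step used above to simplify the ETL can be sketched as a snapshot comparison. A real CDC implementation reads database logs or change tables; this illustration only classifies rows by diffing two snapshots keyed on the natural key:

```python
def detect_changes(previous, current):
    """Classify rows as inserts, updates, or deletes between two snapshots.

    Both inputs map a natural key to the row's attribute values. The three
    result sets map directly onto Update Strategy decisions in the mapping.
    """
    inserts = {k: v for k, v in current.items() if k not in previous}
    deletes = {k: v for k, v in previous.items() if k not in current}
    updates = {k: v for k, v in current.items()
               if k in previous and previous[k] != v}
    return inserts, updates, deletes

# Illustrative shipment snapshots.
prev = {1: "BOOKED", 2: "IN_TRANSIT"}
curr = {2: "DELIVERED", 3: "BOOKED"}
ins, upd, dele = detect_changes(prev, curr)
print(ins, upd, dele)
```

Processing only the changed rows is what lets the nightly loads stay small even as the marts grow.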
Environment: Informatica Power Center 8.6/8.1, SQL*Loader, IDOC, RFC, HP Quality Center,
Oracle9i/10g, AUTOSYS, Rational Clear case, Rational Clear Quest, Windows XP, TOAD, UNIX.
Confidential, Chicago June 2006 to Sept 2007
Role: Sr. ETL Developer
This project's responsibility was to develop an Enterprise Data Warehouse (EDW) that completely
integrates the business into a single environment. The data warehouse provides easy access to
detailed data on a single platform and facilitates enterprise-wide data analysis and reporting
within the business environment. It was built using Informatica Power Center 8.6.1, extracting
data from various sources including flat files, SAP ABAP, Teradata, and Oracle.
Responsibilities:

 Analyzed the requirements and framed the business logic for the ETL process.
 Extracted data from Oracle as one of the source databases.
 Involved in JAD sessions for the requirements gathering and understanding.
 Involved in the ETL design and its documentation.
 Interpreted logical and physical data models for business users to determine common data
definitions and establish referential integrity of the system using ER-STUDIO.
 Followed Star Schema to design dimension and fact tables.
 Experienced in handling slowly changing dimensions.
 Collected and linked metadata from diverse sources, including Oracle relational databases, XML and flat files.
 Responsible for the development, implementation and support of the databases.
 Extensive experience with PL/SQL in designing, developing functions, procedures, triggers
and packages.
 Developed mappings in Informatica to load the data including facts and dimensions from
various sources into the Data Warehouse, using different transformations like Source
Qualifier, JAVA, Expression, Lookup, Aggregate, Update Strategy and Joiner.
 Developed reusable Mapplets and Transformations.
 Used a data integrator tool to support batch and real-time integration, and worked on the staging and integration layers.
 Optimized the performance of the mappings by various tests on sources, targets and
transformations
 Designed and developed Informatica mappings and workflows; identified and removed bottlenecks to improve the performance of mappings and workflows.
 Reviewed existing code and led efforts to tweak and tune the performance of existing Informatica processes.
 Scheduled sessions to extract, transform and load data into the warehouse database per business requirements.
 Scheduled the tasks using Autosys.
 Loaded the flat files data using Informatica to the staging area.
 Created SHELL SCRIPTS for generic use.
 Created high level design documents, technical specifications, coding, unit testing and
resolved the defects using Quality Center 10.
 Developed unit/assembly test cases and UNIX shell scripts to run along with
daily/weekly/monthly batches to reduce or eliminate manual testing effort.
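The dimension-and-fact loading described above hinges on resolving natural keys to surrogate keys, which is what a Lookup transformation does during a fact load. A minimal Python sketch (table and column names are invented for illustration):

```python
def load_fact_rows(source_rows, customer_dim):
    """Resolve each source row's natural key to the dimension surrogate
    key before loading the fact table; rows with no dimension match are
    routed to a reject list for reprocessing (Router-style behavior)."""
    facts, rejects = [], []
    for row in source_rows:
        surrogate_key = customer_dim.get(row["customer_id"])
        if surrogate_key is None:
            rejects.append(row)
        else:
            facts.append({"customer_sk": surrogate_key,
                          "amount": row["amount"]})
    return facts, rejects

customer_dim = {"C001": 101, "C002": 102}   # natural key -> surrogate key
source = [{"customer_id": "C001", "amount": 250.0},
          {"customer_id": "C999", "amount": 80.0}]   # no dimension match
facts, rejects = load_fact_rows(source, customer_dim)
```

Rejecting unmatched rows instead of dropping them silently is what preserves referential integrity between facts and dimensions in a star schema.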

Environment: Windows XP/NT, Informatica Powercenter 9.1/8.6, UNIX, Teradata V-14, Oracle 11g,
Oracle Data Integrator, SQL, PL/SQL,SQL Developer, ER-win, Oracle Designer, MS VISIO,
Autosys, Korn Shell, Quality Center 10.
Confidential, NJ Feb 2005 to Mar 2006
Role: Informatica Developer
Merrill Lynch is the wealth management division of Bank of America, providing corporate finance
& investment banking services. The objective of the project was to build a data warehouse for
customers' investment deposits, funding accounts and corporate services. Data for customers,
accounts and transactions was extracted from multiple sources, transformed and loaded into the
target database using the ETL tool.
Responsibilities:

 Understood the business requirements based on functional specifications to design the ETL methodology in technical specifications.
 Developed data conversion/quality/cleansing rules and executed data cleansing activities such as consolidation, standardization and matching using Trillium for the unstructured flat file data.
 Responsible for developing, support and maintenance for the ETL (Extract, Transform and
Load) processes using Informatica Power Center 8.5.
 Experience in integration of heterogeneous data sources like Oracle, DB2, SQL Server and
Flat Files (Fixed & delimited) into Staging Area.
 Wrote SQL-Overrides and used filter conditions in source qualifier thereby improving the
performance of the mapping.
 Designed and developed mappings using Source Qualifier, Expression, Lookup, Router,
Aggregator, Filter, Sequence Generator, Stored Procedure, Update Strategy, joiner and Rank
transformations.
 Managed the Metadata associated with the ETL processes used to populate the Data
Warehouse.
 Implemented complex business rules in Informatica Power Center by creating re-usable
transformations, and robust Mapplets.
 Implemented performance tuning of Sources, Targets, Mappings and Sessions by identifying
bottlenecks and used Debugger to debug the complex mappings and fix them.
 Improved session performance by enabling the incremental aggregation property to load incremental data into the target table.
 Worked with Functional team to make sure required data has been extracted and loaded and
performed the Unit Testing and fixed the errors to meet the requirements.
 Copied/Exported/Imported the mappings/sessions/ worklets /workflows from development to
Test Repository and promoted to Production.
 Used Session parameters, Mapping variable/parameters and created Parameter files for
imparting flexible runs of workflows based on changing variable values.
 Worked with Static, Dynamic and Persistent Cache in lookup transformation for better
throughput of Sessions.
 Used PMCMD command to automate the Power Center sessions and workflows through
UNIX.
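The parameter files mentioned above follow PowerCenter's convention of a bracketed section header per workflow/session plus `$$NAME=value` lines. A small generator sketch (the folder, workflow, session and parameter names here are invented, and the exact header syntax should be checked against the PowerCenter documentation for the version in use):

```python
def build_param_file(folder, workflow, session, params):
    """Render one section of an Informatica-style parameter file:
    a [Folder.WF:Workflow.ST:Session] header followed by $$NAME=value
    mapping-parameter lines."""
    lines = [f"[{folder}.WF:{workflow}.ST:{session}]"]
    lines += [f"$${name}={value}" for name, value in params.items()]
    return "\n".join(lines) + "\n"

text = build_param_file(
    "FIN_DW", "wf_daily_load", "s_m_load_accounts",
    {"RunDate": "2024-01-31", "SourceSystem": "ORCL"})
```

Generating the file per run (e.g. stamping `$$RunDate`) is what makes "flexible runs of workflows based on changing variable values" automatable from a scheduler.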

Environment: Informatica Power Center 8.5, Oracle 10g, SQL Server 2005, DB2,
SQL*Plus, SQL Loader, SQL Developer, Autosys, Flat files, UNIX, Windows 2000
Confidential, Seattle, WA Aug 2003 to Jan 2005
Role: ETL DEVELOPER
T-Mobile is one of the largest telecom companies in the USA. Joined the existing onshore BI
team as an ETL Developer and successfully designed and developed business solutions. The project
aimed to fulfill T-Mobile's need for reporting to better understand market trends, behavior and
future opportunities, and to improve their decision-making process. Coordinated with the business
and P&A teams to understand the system requirements, then analyzed and designed ETL solutions to
accomplish the same. Involved in various successful releases to meet T-Mobile's reporting needs
under the order-activation (OA) functional area.
Responsibilities:

o Gathered business requirements from Business Analyst.
o Designed and implemented appropriate ETL mappings to extract and transform data
from various sources to meet requirements.
o Designed and developed Informatica ETL mappings to extract master and transactional
data from heterogeneous data feeds and load it into the target.
o Installed and Configured the Informatica Client tools.
o Worked on loading of data from several flat files to XML Targets.
o Designed the procedures for getting the data from all systems to Data Warehousing
system.
o Created the environment for Staging area, loading the Staging area with data from
multiple sources.
o Analyzed business process workflows and assisted in the development of ETL
procedures for moving data from source to target systems.
o Used workflow manager for session management, database connection management
and scheduling of jobs.
o Created UNIX shell scripts for Informatica ETL tool to automate sessions.
o Monitored sessions using the workflow monitor, which were scheduled, running,
completed or failed. Debugged mappings for failed sessions.
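The flat-file-to-XML loading described above can be approximated in Python (a sketch only; PowerCenter would use an XML target definition, and the field and tag names here are invented):

```python
import csv
import io
import xml.etree.ElementTree as ET

def flat_file_to_xml(delimited_text, record_tag="order"):
    """Parse a delimited flat file and emit an XML document, mirroring a
    flat-file-source-to-XML-target mapping: one element per record, one
    child element per field."""
    root = ET.Element("orders")
    for row in csv.DictReader(io.StringIO(delimited_text)):
        record = ET.SubElement(root, record_tag)
        for field, value in row.items():
            ET.SubElement(record, field).text = value
    return ET.tostring(root, encoding="unicode")

flat = "order_id,amount\n1001,59.99\n1002,120.00\n"
xml_out = flat_file_to_xml(flat)
```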

Environment: Informatica Power Center 5.1.2/7.1, Erwin 4.5, Oracle 9i, Windows NT, Flat
files, SQL, Relational Tools, Clear Case, UNIX (HP-UX, Sun Solaris, AIX) and UNIX Shell
Scripts.

Informatica ETL Developer
 Extensively developed various mappings to perform ETL of source data into the OLAP/data warehouse.
 Worked with SQL, PL/SQL procedures and functions, stored procedures and packages within
the mappings.
 Tuned Informatica mappings and sessions for optimum performance.
 Assisted the other ETL developers in solving complex scenarios and coordinated with source
systems owners with day-to-day ETL progress monitoring.

The PDI project was built in the process of expanding the scope of the Information Factory.
The project acquires and integrates Member, Eligibility, Claims, Benefits and Accumulations
data from Proclaim into the Information Factory. The effort included staging, integrating
and positioning the data, making it available for downstream data consumers.
 Created Data Maps / Extraction groups in Power Exchange Navigator for legacy IMS Parent
sources.
 Staged Data from legacy Proclaim IMS system into Oracle 11g Master Tables
 Performed CDC capture registrations
 Assisted in building the ETL source to Target specification documents by understanding the
business requirements
 Developed mappings that perform Extraction, Transformation and load of source data into
Derived Masters schema using various power center transformations like Source Qualifier,
Aggregator, Filter, Router, Sequence Generator, look up, Rank, Joiner, Expression, Stored
Procedure, SQL, Normalizer and update strategy to meet business logic in the mappings
 Built reusable transformations and Mapplets wherever redundancy was needed.
 Performed performance tuning at the mapping level as well as the database level to increase data throughput.
 Designed the Process Control Table that would maintain the status of all the CDC jobs and
thereby drive the load of Derived Master Tables.
 Used Teradata utilities like BTEQ, FastLoad, FastExport and MultiLoad for data conversion.
 Created Post UNIX scripts to perform operations like gunzip, remove and touch files.
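The Process Control Table design above can be sketched as a status map that gates downstream loads until the upstream CDC capture has finished (job names and status values are invented for illustration; in the project this would be a database table, not a dict):

```python
def runnable_jobs(control_table):
    """Return the CDC jobs whose upstream capture has finished, the way a
    process-control table gates the load of Derived Master tables."""
    return [job for job, status in control_table.items()
            if status == "CAPTURE_COMPLETE"]

def mark_complete(control_table, job):
    """Record that the downstream load for a job has finished."""
    control_table[job] = "LOAD_COMPLETE"

control = {"member_cdc": "CAPTURE_COMPLETE",
           "claims_cdc": "CAPTURE_RUNNING"}
ready = runnable_jobs(control)
for job in ready:
    mark_complete(control, job)   # only gated jobs are loaded
```

Driving loads off recorded status (rather than a fixed schedule) is what lets a failed capture hold back its dependent loads without blocking unrelated ones.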

Candidate Info: 8 years in workforce, 1 year at this job; Computer Science

3. INFORMATICA ETL Developer
 Worked on Informatica Power Center tool - Source Analyzer, Data warehousing designer,
Mapping & Mapplet Designer and Transformation Designer.
 Extracted data from SAP system to Staging Area and loaded the data to the target database by
ETL process using Informatica Power Center.
 Performed performance tuning at the source and target levels using indexes, hints and partitioning in DB2, Oracle and Informatica.
 Designed and developed various PL/SQL stored procedures to perform various calculations
related to fact measures.
 Converted the PL/SQL Procedures to Informatica mappings and at the same time created
procedures in the database level for optimum performance of the mappings.
 Investigated and fixed bugs that occurred in the production environment and provided on-call support.
 Performed Unit testing and maintained test logs and test cases for all the mappings.
 Maintained warehouse metadata, naming standards and warehouse standards for future
application development.
 Parsed high-level design specifications into simple ETL code along with mapping standards.

Candidate Info: 3 years in workforce, 1 year at this job; BS, Computer Science

4. Veterans Administration, D.C. - Lead SQL/Informatica ETL Developer
The five lines of business for VBA are Compensation and Pension, Loan Guaranty,
Education, Vocational Rehabilitation/Education and Insurance Services having
individual data marts. The OLTP systems are VETSNET, BDN and the various data
marts are CPMR, BIRLS, and VOR.

 Worked as an ad hoc developer and created reports for ad hoc requests from individual senators and congressmen, the press, GAO, inspector general offices and the VA secretary using the EDW.
 Worked on end-to-end analysis of the EDW to report the questions/clarifications of ad hoc requests with supporting data and findings.
 Created high-performance SQL scripts to reduce the time taken to produce ongoing monthly and weekly ad hoc reports.
 Thoroughly tested the SQL scripts by validating the data and cross-comparing it across the five lines of business for ad hoc reports.
 Shared the business/technical knowledge gained within the team in ad hoc meetings, peer code reviews and VOR/BIRLS code reviews.
 Worked on defect fixes in VOR to produce business solutions using the ETL (Informatica) tool.
Offered various approaches in brainstorming sessions to decide the best approach for a fix.
 Worked on converting some of the PL/SQL scripts into Informatica mappings.
 Worked on building Informatica mappings for the HAIISS project - VHA Essence data mart.

Candidate Info: 5 years in workforce, 2 years at this job; MS, Business Intelligence; Electronics and Instrumentation

5. Informatica ETL Developer
Designed extensive ETL mappings with Informatica Designer

 Worked with filter transformations and flat files to identify source and target bottlenecks
 Worked with various transformations including router transformation, update strategy,
expression transformation, lookup transformation, sequence generator, aggregator
transformation and sorter transformation
 Used Oracle to write SQL queries that create/alter/delete tables and to extract the necessary
data
 Used UNIX to navigate around the system and check for specific files, the files’ content, change
permissions and see who the current users are

Candidate Info: 7 years in workforce, 4 years at this job; BA, Information Technology and Informatics

6. Informatica ETL Developer
Automate PL/SQL scripts using Informatica within a framework

 Parameterize hard-coded values
 Implement PL/SQL requirements or rules into the ETL tool
 Participate in Design Reviews for the overall Automation Framework and Architecture

Candidate Info: 10 years in workforce, 6 months at this job; BS, Computer Information Systems; MBA, Business Administration

7. Senior Informatica ETL Developer
Develop and maintain data marts on an enterprise data warehouse to support various
UHC defined dashboards such as Imperative for Quality (IQ) program.
 Designated owner and accountable for major tasks and took responsibility for actions and
outcomes to ensure timely and cost-effective results for our team.
 Coach new team members on technical tools such as Informatica Designer, Workflow Manager,
Workflow Monitor and UHC Data Models.
 Analyze data and build reports using Informatica data profiling tool & Toad for Data Analyst tool
so that UHC members can make informed decisions.
 Set and follow Informatica best practices, such as creating shared objects in the Shared
folder for reusability and standard naming conventions for ETL objects; design complex
Informatica transformations, mapplets, mappings, reusable sessions, worklets and workflows.
 Evaluate business requirements to come up with Informatica mapping design that adheres to
Informatica standards.
 Implement performance tuning on a variety of Informatica maps to improve their throughput.
 Work with peers from various functional areas to define IT wide processes like code reviews,
unit test documentation and knowledge sharing sessions.
 Collaborate and work with business analysts and data analysts to support their data
warehousing and data analysis needs.

Sr. Informatica Developer
Sr. Informatica Developer - T-Mobile USA
• 7+ years of experience in analysis, design, development and maintenance in IT industry. 
• 6 years of extensive experience in Data Warehouse applications using Informatica, Oracle, DB2, MS
SQL server on Windows, IBM and UNIX platforms. 
• Developed tactical and strategic plans to implement technology solutions and effectively manage client
expectations. 
• Developed effective working relationships with client team to understand support requirements. 
• Experienced in working on development and production support teams, handling critical situations to
meet deadlines for successful completion of tasks/projects. 
• Excellent interpersonal and communication skills, technically competent and result-oriented with
problem solving skills and ability to work independently and use sound judgment. 
• Strong expertise in designing and developing Business Intelligence solutions in staging, populating
Operational Data Store (ODS), Enterprise Data Warehouse (EDW), Data Marts / Decision Support
Systems using Informatica Power Center 9.x/8.x/7.x/6.x ETL tool. 
• Expertise in Data Modeling using Star Schema/Snowflake Schema, OLAP/ROLAP tools, Fact and
Dimensions tables, Physical and logical data modeling using ERWIN 4.x/3.x 
• Experience in documenting High Level Design, Low level Design, STM's, Unit test plan, Unit test cases
and Deployment documents. 
• Experienced in Repository Configuration/using Transformations, creating Informatica Mappings,
Mapplets, Sessions, Worklets, Workflows, Processing tasks using Informatica Designer / Workflow
Manager to move data from multiple source systems into targets. 
• Experienced in Installation, Configuration, and Administration of Informatica Power Center 8.x/7.x/6.x. 
• Experienced in Performance tuning of Informatica (sources, mappings, targets and sessions) and tuning
the SQL queries. 
• Experienced in integration and transforming of various data sources from Databases like MS Access,
Oracle, DB2, SQL Server and formats like flat-files, COBOL files, XML, etc. 
• Experienced in using ETL tools like Informatica (Power Center) Designer, Repository Manager,
Administration console and Workflow Manager. 
• Experience in Oracle and MS SQL Server environments using Triggers, functions, SQL, T-SQL and
PL/SQL. 
• Expertise in scheduling Informatica jobs using Informatica, Windows scheduler and with Unix. 
• Expertise in creating Unix shell scripts. 
• Experienced in working for the post development cycle and applications in Production Support.
Work Experience
Sr. Informatica Developer
T-Mobile USA 

Bellevue, WA
June 2012 to Present
Description: 
T-Mobile USA is a national provider of wireless voice, messaging and data services capable of reaching
over 293 million Americans where they live, work and play. The project ventures to enhance the
performance and controls of the accounting processes associated with the pre-paid business by
enhancing or replacing some of the underlying systems, such as PRS (Prepaid Reporting System),
currently used to create the accounting entries. 
This project is focused exclusively on improving the IT ecosystem within the Prepaid business, which
includes the subscriber types Prepaid, Flex Pay and Wal-Mart Family Mobile. Prepaid subscribers are in
scope, as well as subscribers that are part of a hybrid account in Samson; the in/out-of-scope
account type/subtype distinction refers to Postpaid (Samson) accounts only. 
 
Responsibilities: 
• Involved in gathering and analyzing the requirements and preparing business rules. 
• Designed and developed complex mappings by using Lookup, Expression, Update, Sequence
generator, Aggregator, Router, Stored Procedure, etc., transformations to implement complex logics while
coding a mapping. 
• Worked with Informatica power center Designer, Workflow Manager, Workflow Monitor and Repository
Manager. 
• Developed and maintained ETL (Extract, Transformation and Loading) mappings to extract the data
from multiple source systems like Oracle, SQL server and Flat files and loaded into Oracle. 
• Developed Informatica Workflows and sessions associated with the mappings using Workflow
Manager. 
• Involved in creating new table structures and modifying existing tables and fit into the existing Data
Model. 
• Extracted data from different databases like Oracle and external source systems like flat files using ETL
tool. 
• Involved in debugging Informatica mappings, testing of Stored Procedures and Functions, Performance
and Unit testing of Informatica Sessions, Batches and Target Data. 
• Developed Mapplets, Reusable Transformations, Source and Target definitions, mappings using
Informatica 9.1.0. 
• Generated queries using SQL to check for consistency of the data in the tables and to update the tables
as per the Business requirements. 
• Involved in Performance Tuning of mappings in Informatica. 
• Good understanding of source to target data mapping and Business rules associated with the ETL
processes. 
 
Environment: Informatica 9.1, Oracle 11g, SQL server 2008 R2, SQL, T-SQL, PL/SQL, Toad 10.6, SQL
Loader, Tidal Enterprise Scheduler 5.3.1, Accurev, Unix, Flat files.
Informatica Developer
ING US 
-

West Chester, PA
January 2011 to May 2012
Description: 
ING Group's operations in the U.S. are performed through ING U.S., which provides financial services
to retail and institutional clients, including annuities, retirement plans, mutual funds, life insurance
and direct banking. This project involved extracting data from multiple sources, transforming it per
the business logic, and loading it into the data warehouse, which is then used for reporting. 
 
Responsibilities: 
• Prepared technical design/specifications for data Extraction, Transformation and Loading. 
• Worked on Informatica Utilities Source Analyzer, warehouse Designer, Mapping Designer, Mapplet
Designer and Transformation Developer. 
• Analyzing the sources, transforming data, mapping the data and loading the data into targets using
Informatica Power Center Designer. 
• Created reusable transformations to load data from operational data source to Data Warehouse and
involved in capacity planning and storage of data. 
• Developed complex mappings such as Slowly Changing Dimensions Type II-Time stamping in the
Mapping Designer. 
• Used various transformations like Stored Procedure, Connected and Unconnected lookups, Update
Strategy, Filter transformation, Joiner transformations to implement complex business logic. 
• Used Informatica Workflow Manager to create workflows, database connections, sessions and batches
to run the mappings. 
• Used Variables and Parameters in the mappings to pass the values between mappings and sessions. 
• Created Stored Procedures, Functions, Packages and Triggers using PL/SQL. 
• Implemented restart strategy and error handling techniques to recover failed sessions. 
• Used Unix Shell Scripts to automate pre-session and post-session processes. 
• Did performance tuning to improve Data Extraction, Data process and Load time. 
• Wrote complex SQL Queries involving multiple tables with joins. 
• Implemented best practices as per the standards while designing technical documents and developing
Informatica ETL process. 
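The Type II time-stamping approach above can be sketched in Python (illustrative only; in PowerCenter this is built from Lookup and Update Strategy transformations, and the column names here are invented):

```python
from datetime import date

def apply_scd2(dim_rows, natural_key, incoming, today):
    """Type II change with time stamping: expire the current version of
    the row (set its end date) and append a new current version, so the
    dimension preserves full history."""
    for row in dim_rows:
        if row["key"] == natural_key and row["end_date"] is None:
            if row["attrs"] == incoming:
                return dim_rows            # no change, nothing to do
            row["end_date"] = today        # expire the old version
    dim_rows.append({"key": natural_key, "attrs": incoming,
                     "eff_date": today, "end_date": None})
    return dim_rows

dim = [{"key": "C001", "attrs": {"tier": "Silver"},
        "eff_date": date(2011, 1, 1), "end_date": None}]
apply_scd2(dim, "C001", {"tier": "Gold"}, date(2011, 6, 1))
```

A NULL end date marks the current version; any as-of query simply filters on the effective/end date range.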
 
Environment: Informatica 8.6/9.1, Oracle 10g, SQL server 2005, SQL, T-SQL, PL/SQL, Toad, Erwin4.x,
Unix, Tortoise SVN, Flat files.
Informatica Developer
Bank Of America 

Charlotte, NC
October 2009 to December 2010
Description: 
Bank of America is one of the largest bank holding companies in the United States, with its headquarters
located in Charlotte, NC. Bank of America operates in all states of the U.S. and 40 other countries. It
covers approximately 57 million consumer and small business relationships through 5,900 banking
centers and 18,000 ATMs. This project was developed to maintain a catalogue of all the products offered
online to customers. It involved extracting data from multiple source systems and loading it through an
ETL process into staging tables and then into target tables. 
 
Responsibilities: 
• Involved in creating Detail design documentation to describe program development, logic, coding,
testing, changes and corrections. 
• Extensively involved in writing ETL Specifications for Development and conversion projects. 
• Created shortcuts for reusable source/target definitions, Reusable Transformations, mapplets in Shared
folder. 
• Involved in requirement definition and analysis in support of Data Warehouse. 
• Worked extensively on different types of transformations like Source qualifier, expression, Aggregator,
Router, filter, update strategy, lookup, sorter, Normalizer, sequence generator, etc. 
• Worked with XSD and XML files generation through ETL process. 
• Defined and worked with mapping parameters and variables. 
• Designed and developed transformation rules (business rules) to generate consolidated (fact/summary)
data using Informatica ETL tool. 
• Performed the performance evaluation of the ETL for full load cycle. 
• Checked Sessions and error logs to troubleshoot problems and also used debugger for complex. 
• Worked on Parameterize of all variables, connections at all levels in UNIX. 
• Created test cases for unit testing and functional testing. 
• Coordinated with testing team to make testing team understand Business and transformation rules
being used throughout ETL process. 
 
Environment: Informatica Power center 8.6.1, Oracle 10g, Windows XP, Unix Shell Scripts, SQL, PL/SQL,
Flat files.
Informatica Developer
FedEx 

Memphis, TN
July 2008 to September 2009
Description: 
FedEx is the world's leading delivery company for overnight packages, with its headquarters located in
Memphis, TN. It is a global company, present in 220 countries and about 75,000 cities across the world.
The main aim of the project is to have the details of all the orders and transactions of FedEx. This
project was developed for executive business management, financial and profitability analysis, sales
force management, marketing campaigns and product analysis. 
 
Responsibilities: 
• Used Informatica ETL to load data from flat files, which includes fixed-length as well as delimited files
and SQL Server to the Data mart on Oracle database. 
• Used reverse engineering in Erwin 4.x to understand the existing data model of the data warehouse. 
• Worked with creating Dimensions and Fact tables for the data mart. 
• Created Informatica mappings, sessions, workflows, etc., for loading fact and dimension tables for data
mart presentation layer. 
• Have implemented SCD (Slowly Changing Dimensions) Type I and II for data load. 
• Did performance tuning of Informatica components for daily and monthly incremental loading tables. 
• Developed Mapplets, reusable transformations, source and target definitions and mappings using
Informatica 7.1. 
• Developed mapping using parameters and variables. 
• Created complex workflows, with multiple sessions, worklets with consecutive or concurrent sessions. 
• Used Timer, Event Raise, Event Wait, Decisions, and Email tasks in Informatica Workflow Manager. 
• Used Workflow Manager for creating validating, testing and running sequential and concurrent batches. 
• Implemented source and target based partitioning for existing workflows in production to improve
performance so as to cut back the running time. 
• Analyzed workflow, session, event and error logs for trouble shooting Informatica ETL process. 
• Worked with Informatica Debugger to debug the mappings in Informatica Designer. 
• Involved in creating test plans, test cases to unit test Informatica mappings, sessions and workflows. 
• Involved in migrating Informatica ETL application and Database objects through various environments
such as Development, Testing, UAT and Production environments. 
• Documented and presented the production/support documents for the components developed when
handing-over the application to the production support team. 
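The source/target-based partitioning mentioned above rests on distributing rows so multiple session partitions can load concurrently. A toy Python sketch of hash partitioning (the key and column names are made up; PowerCenter configures this declaratively on the session):

```python
def partition_rows(rows, n_partitions):
    """Hash-partition rows by key so each partition can be processed by a
    separate session thread, analogous to session partitioning; rows with
    the same key always land in the same partition."""
    partitions = [[] for _ in range(n_partitions)]
    for row in rows:
        partitions[hash(row["id"]) % n_partitions].append(row)
    return partitions

rows = [{"id": i, "qty": i * 10} for i in range(8)]
parts = partition_rows(rows, 3)
```

Because every row lands in exactly one partition, the partitions can be loaded in parallel and their union still equals the full source, which is where the run-time reduction comes from.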
 
Environment: Informatica Power Center 8.1, Workflow Manager, Workflow Monitor, Erwin 4.0/3.5.2,
TOAD 8.6.1.0, PL/SQL, Flat files, XML, Oracle 10g/9i
ETL Developer
Janus Capital Group 

Denver, CO
June 2007 to June 2008
Description: 
Janus Capital Group is the 16th largest mutual fund company in the US, with more than 4 million mutual
fund investors. For nearly four decades the Denver-based firm has taken a long-term, company-by-company
investment approach to gain a differentiated view in the marketplace. In addition to growth, core and
international equity funds, Janus manages balanced, alternative, fixed-income and money market funds.
The project involved creation of a data warehouse, including extraction and transformation of data from
various sources such as flat files, SQL Server and Oracle, and loading them back into the target
tables. 
 
Responsibilities: 
• Participated in requirement gathering meetings with business analysts and ETL architect to understand
Source Data, Data Warehouse data model, Technical Metadata, etc. 
• Documented required technical metadata such as source and target definitions. 
• Created technical design specifications for mappings, sessions, workflows, etc. 
• Worked on different OLTP data sources such as Oracle, SQL Server and Flat files for data extraction. 
• Created complex mappings in Informatica Power Center Designer using Aggregate, Expression, Filter,
Sequence Generator, Update Strategy, Rank, Sorter, Lookup, Joiner transformations etc., 
• Implemented Type II slowly changing dimensions using date-time stamping. 
• Created reusable transformations and mapplets to reuse while creating ETL mappings. 
• Configured and used the Debugger to troubleshoot the Informatica mappings. 
• Proficient in using Informatica workflow manager, workflow monitor, pmcmd (Informatica command line
utility) to create, schedule and control workflows, tasks, and sessions. 
• Worked with DBA team to fix performance issues at Database. 
• Wrote SQL, PL/SQL, stored procedures for implementing business rules and transformations. 
• Worked with export and import utilities in Repository manager. 
• Involved in upgrading from Informatica Power Center 6.1 to Informatica Power Center 7.1 which is used
as ETL tool for loading data into EDW from transactional databases. 
 
Environment: Informatica Power center 7.1, Oracle 9i, SQL Server 2005, SQL, PL/SQL, TOAD, Windows
NT, Unix.
Informatica Developer
Sprint Mobile 
-

Kansas City, MO
June 2006 to May 2007
Description: 
Sprint Nextel is one of the largest telecommunication companies in the world. With around 53.7 million
subscribers, Sprint Nextel operates the third largest wireless telecommunications network in the United
States (based on total wireless customers). The company grew steadily through acquisitions and changed
its name to United Telecommunications, at which time it provided local telephone service in many areas
of the Midwest and Southern part of the USA. 
Responsibilities: 
• Involved in Relational and Dimensional Data Modeling Techniques to design ERWIN data models. 
• Worked on Informatica power center tool - Source Analyzer, Warehouse Designer, Mapping Designer,
Mapplets and Transformation Developer 
• Extraction, Transformation and Load was performed using Informatica Power Center to build Data
warehouse. 
• Accomplished automated data extraction from various RDBMS via UNIX shell scripts, ETL processing
using Informatica and loading into Data Mart. 
• Developed complex mappings in Informatica Power Center to load the data from various sources using
different transformations like Source Qualifier, Look up (connected and unconnected), Expression,
Aggregate, Update Strategy, Joiner, Filter and Router. 
• Created and scheduled Sessions and Batches through the Informatica Server Manager. 
• Designed and documented validation rules, error handling and test strategy of ETL process. 
• Tuned Informatica mappings/sessions for better ETL performance by eliminating bottlenecks. 
 
Environment: Informatica Power center 6.1/7.1, Oracle 9i, SQL Server 2000, SQL, PL/SQL, TOAD,
Windows NT, Unix.
ETL Developer
Eli Lilly and Company 

Indianapolis, IN
October 2005 to May 2006
Description: 
Eli Lilly and Company is a leading pharmaceutical company in Indianapolis. The project involved data
warehouse development for the sales division, which contains the sales data for pharmaceutical and
medical products of the company. This data warehouse project will enable management to better
leverage information collected within current operational systems to help in their decision-making
process. 
 
Responsibilities: 
• Involved in analyzing and development of the Data Warehouse. 
• Worked on Informatica power center tool - Source Analyzer, Warehouse Designer, Mapping Designer,
Mapplets and Transformation Developer 
• Created various mappings using Aggregate, Filter, Join, Expression, Lookup, Update Strategy and
Router. 
• Extensively used ETL to load data from different databases and flat files to Oracle. 
• Involved in the development of Informatica mappings and also tuned them for better performance. 
• Created and scheduled Sessions and Batches through the Informatica Server Manager. 
• Worked with sessions and batches using Server Manager to load data into the target database. 
• Testing for Data Integrity and Consistency. 
 
Environment: Informatica Power Center 6.1, Oracle 8i, SQL, Windows 98/2000, Shell scripts.
Additional Information
Technical Skills: 
 
ETL Tools Informatica Power Center 9.x/8.x/7.x/6.x 
Data Modeling Erwin 4.0/3.5, Toad, SQL Station, MS Visio, SQL Developer 
DBMS Oracle […] IBM DB2, MS SQL Server 2008 […] Microsoft Access, Excel, ODBC 
Programming Languages SQL, PL/SQL, T-SQL, Unix Shell scripting 
Operating Systems […] UNIX, LINUX

ETL Informatica Developer Resume



PROFESSIONAL SUMMARY:

 6+ years of IT experience in Analysis, Design, Development, Implementation, Testing and Support of Data

Warehousing and Data Integration Solutions using Informatica Powercenter.

 4+ years of experience in using Informatica PowerCenter (7.1.3/8.6.1)

 1+ years of experience in the reporting tool COGNOS (8.4)

 Proficiency in developing SQL with various relational databases like Oracle, SQL Server.

 Knowledge in Full Life Cycle development of Data Warehousing.

 Extensively worked on developing ETL programs supporting data extraction, transformation

and loading using Informatica Power Center.

 Experience with dimensional modeling using star schema and snowflake models.

 Understood the business rules based on High Level document specifications and

implemented the data transformation methodologies.

 Created UNIX shell scripts to run the Informatica workflows and control the ETL flow.

 Developed OLAP applications using Cognos 8BI - (Frame Work Manager, Cognos Connection, Report

Studio, Query Studio, and Analysis Studio) and extracted data from the enterprise data warehouse to

support the analytical and reporting for Corporate Business Units.

 Strong with relational database design concepts.

 Extensively worked with Informatica performance tuning involving source level, target level and map

level bottlenecks.
 Strong business understanding of verticals like Banking, Brokerage, Insurance, Mutual funds and

Pharmaceuticals.

 Independently perform complex troubleshooting, root-cause analysis and solution development.

 Ability to meet deadlines and handle multiple tasks, decisive with strong leadership qualities, flexible in

work schedules and possess good communication skills.

 Team player, Motivated, able to grasp things quickly with analytical and problem solving skills.

 Comprehensive technical, oral, written and communicational skills

SOFTWARE KNOWLEDGE:

Operating Systems: Windows, Linux, HP-UX


Software / Applications: MS XP, MS 2000, MS Word, MS Excel, MS Access, Outlook, PowerPoint 
Database: SQL Server 2008/2005/2000, Oracle 11g/10g/9i 
ETL: Informatica PowerCenter 7.1.3/8.6.1, Informatica PowerExchange 8.6.1
Modeling: Framework Manager, PowerPlay Transformer
OLAP/BI Tools: Cognos 8 Series 
Languages: Java, HTML, XML, SQL, PL/SQL.
Web/Apps Servers: IBM WebSphere 4.x, Sun iPlanet Server 6.0, IIS, Tomcat
Tools: TOAD, Visio, Eclipse

Client: Confidential, Louisville, KY


Duration: October 2011 till date.
Role: ETL Informatica Developer

Description:
Humana Inc., headquartered in Louisville, Kentucky, is a leading health care company that offers a wide
range of insurance products and health wellness services. Humana provides Medicare Advantage plans and
prescription drug coverage to more than 3.5 million members throughout the US. 
The main objective of this project, the Shared Data Repository, is to capture new Vitality program customer data,
policies, group policies, and HumanaOne and non-HumanaOne Medicare plans.
Data comes from various sources such as SQL Server and Mainframe, and is loaded into the EDW at
different frequencies as per the requirement. The entire ETL process consists of source systems, the staging
area, the data warehouse and the data mart.

Responsibilities:

 Developed ETL programs using Informatica to implement the business requirements.

 Communicated with business customers to discuss the issues and requirements.

 Created shell scripts to fine tune the ETL flow of the Informatica workflows.

 Used Informatica file watch events to poll the FTP sites for the external mainframe files.

 Provided production support to resolve ongoing issues and troubleshoot problems.

 Performed performance tuning at the functional level and map level. Used relational SQL wherever

possible to minimize the data transfer over the network.


 Effectively used Informatica parameter files for defining mapping variables, workflow variables, FTP

connections and relational connections.

 Involved in enhancements and maintenance activities of the data warehouse including tuning, modifying

of stored procedures for code enhancements.

 Effectively worked in Informatica version based environment and used deployment groups to migrate

the objects.

 Used debugger in identifying bugs in existing mappings by analyzing data flow, evaluating

transformations.

 Effectively worked on Onsite and Offshore work model.

 Pre and post session assignment variables were used to pass the variable values from one session to

other.

 Designed workflows with many sessions using decision, assignment, event-wait and event-raise

tasks; used the Informatica scheduler to schedule jobs.

 Reviewed and analyzed functional requirements, mapping documents, problem solving and trouble

shooting.

 Performed unit testing at various levels of the ETL and actively involved in team code reviews.

 Identified problems in existing production data and developed one time scripts to correct them.

 Fixed invalid mappings and troubleshot technical problems in the database.
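
The parameter files mentioned above are plain text read by the Integration Service at session start; a minimal illustrative sketch (folder, workflow, connection and file names are all hypothetical):

```
[EDW_FOLDER.WF:wf_load_vitality_members]
$$LOAD_DATE=2012-01-15
$$SOURCE_SYSTEM=VITALITY
$DBConnection_Source=CONN_STG_SQLSERVER
$DBConnection_Target=CONN_EDW_ORACLE
$InputFile_Members=/data/inbound/members_daily.dat
```

The workflow picks this up when its Parameter Filename property (or the pmcmd command line) points at the file.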

Environment: Informatica 8.6.1, SQL Server 2008 R2, HP-UX

Client: Confidential, Chicago, IL 


Duration: August 2010 to Sep 2011
Role: ETL Informatica Developer

Description:
Allstate is one of the fastest-growing auto, property and life insurance companies. It serves its customers by
offering a range of innovative products to individual and group customers at more than 600 locations
through its company-owned offices.

The primary objective of this project is to capture different Customers, Policies, Claims Agents, Products
and financial related data from multiple OLTP systems and flat files. Data was extracted, transformed and loaded
into the data warehouse using Informatica PowerCenter, and various reports were generated on a daily, weekly,
monthly and yearly basis. These reports give details of the various Allstate insurance products that are
sold. The reports are used for identifying agents for rewards, awards and performance, and for risk
analysis reports for Business Development Managers.

Responsibilities:
 Involved in all phases of SDLC from requirement gathering, design, development, testing, Production,

user training and support for production environment.

 Created new mapping designs using various tools in Informatica Designer like Source Analyzer,

Warehouse Designer, Mapplet Designer and Mapping Designer.

 Developed the mappings using the needed transformations in the Informatica tool according to technical

specifications.

 Created complex mappings that involved implementation of Business Logic to load data into the staging

area.

 Used Informatica reusability at various levels of development.

 Developed mappings/sessions using Informatica Power Center 8.6 for data loading.

 Performed data manipulations using various Informatica Transformations like Filter, Expression, Lookup

(Connected and Un-Connected), Aggregate, Update Strategy, Normalizer, Joiner, Router, Sorter and

Union.

 Developed Workflows using task developer, Worklet designer and workflow designer in Workflow

manager and monitored the results using workflow monitor.

 Built reports according to user requirements.

 Extracted data from Oracle and SQL Server, then used Teradata for data warehousing.

 Implemented slowly changing dimension methodology for accessing the full history of accounts.

 Wrote shell scripts to run workflows in the UNIX environment.

 Optimized performance by tuning at the source, target, mapping and session level.

 Participated in weekly status meetings, conducted internal and external reviews as well as formal

walkthroughs among various teams, and documented the proceedings.
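
The slowly changing dimension logic above (Type 2, full history) was built with Informatica Lookup and Update Strategy transformations; the equivalent effect in plain SQL, against a hypothetical DIM_ACCOUNT table, is roughly:

```sql
-- Expire the current version of a changed account (sketch; names are illustrative)
UPDATE dim_account
   SET eff_end_date = SYSDATE,
       current_flag = 'N'
 WHERE account_id   = :src_account_id
   AND current_flag = 'Y'
   AND account_status <> :src_status;

-- Insert the new version as the current row
INSERT INTO dim_account
       (account_key, account_id, account_status,
        eff_start_date, eff_end_date, current_flag)
VALUES (dim_account_seq.NEXTVAL, :src_account_id, :src_status,
        SYSDATE, DATE '9999-12-31', 'Y');
```

Each change therefore adds a row rather than overwriting one, which is what preserves the full account history.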

Environment: Informatica 8.6.1, Oracle 11g, SQL Server 2005, HP-UX.

Client: Confidential, San Jose, CA 


Duration: April 2009 to July 2010 
Role: ETL Developer

Description:
This position requires implementing data warehouse for Forecasting, Marketing, Sales performance reports.
The data is obtained from Relational tables and Flat files. I was involved in cleansing and transforming the
data in the staging area and then loading it into Oracle data marts. This data mart/data warehouse is an
integrated data mine that provides the feed for extensive reporting.

Responsibilities:
 Used Informatica Power Center for (ETL) extraction, transformation and loading data from

heterogeneous source systems into target database.

 Created mappings using Designer and extracted data from various sources, transformed data according

to the requirement.

 Involved in extracting the data from the Flat Files and Relational databases into staging area.

 Migrated mappings, sessions and workflows from Development to Test and then to the UAT environment.

 Developed Informatica Mappings and Reusable Transformations to facilitate timely Loading of Data of a

star schema.

 Developed the Informatica Mappings by usage of Aggregator, SQL overrides usage in Lookups, source

filter usage in Source qualifiers, and data flow management into multiple targets using Router.

 Created Sessions and extracted data from various sources, transformed data according to the

requirement and loading into data warehouse.

 Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner,

Router and Aggregator to create robust mappings in the Informatica Power Center Designer.

 Imported various heterogeneous files using Informatica Power Center 8.x Source Analyzer.

 Developed several reusable transformations and mapplets that were used in other mappings.

 Prepared Technical Design documents and Test cases.

 Involved in Unit Testing and resolution of various bottlenecks that were encountered.

 Implemented various Performance Tuning techniques.

 Used Teradata as a source system.

Environment: 
Informatica 8.1.1 Power Center, Teradata, Oracle 11g, Windows NT.

Client: Confidential, Newark, NJ 


Duration: July 2008 to March 2009 
Role: ETL Informatica Developer

Description:
Prudential Financial companies serve individual and institutional customers worldwide and include The
Prudential Insurance Company of America, one of the largest life insurance companies in the U.S. These
companies offer a variety of products and services, including mutual funds, annuities, real estate brokerage
franchises, relocation services, and more. Involved in the development and implementation of goals,
policies, priorities, and procedures relating to financial management, budget, and accounting. Analyzed
monthly actual results versus plan and forecast.

Responsibilities:

 Involved in design, development and maintenance of database for Data warehouse project.
 Involved in Business Users Meetings to understand their requirements.

 Designed, Developed and Supported Extraction, Transformation and Load Process (ETL) for data

migration with Informatica 7.X.

 Developed various mappings using Mapping Designer and worked with Aggregator, Lookup, Filter,

Router, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter and Sequence Generator

transformations.

 Created complex mappings which involved Slowly Changing Dimensions, implementation of Business

Logic and capturing the deleted records in the source systems.

 Worked extensively with the connected lookup Transformations using dynamic cache.

 Worked with complex mappings having an average of 15 transformations.

 Created and scheduled sessions and jobs to run on demand, on schedule, or only once.

 Monitored Workflows and Sessions using Workflow Monitor.

 Performed Unit testing, Integration testing and System testing of Informatica mappings

 Coded PL/SQL scripts.

 Wrote UNIX scripts and Perl scripts for the business needs.

 Coded UNIX scripts to capture data from different relational systems into flat files, to use them as source

files for the ETL process.

 Created Universes and generated reports using the Star Schema.

Environment: Informatica PowerCenter 7.1.3, Oracle 11g, UNIX

Client: Confidential, Dearborn, MI 


Duration: April 2007 to July 2008
Role: Cognos Developer 

Description:
The Oakwood Healthcare System serves 35 different communities in southeastern Michigan with over 40
primary and secondary care locations. Responsibilities include working with the clinical analytics team on
the measurement of provider performance, quality improvement initiatives, and various ad-hoc requests. The
reports are created, distributed and published using various Cognos BI tools like ReportNet, Impromptu,
Power Play, IWR, and UpFront to the end-users. The application had OLAP features like Drill Down analysis,
Multidimensional analysis, Prompts, Exception Highlighting and User Privileges.

Responsibilities:

 Developed models in Framework Manager.

 Published packages and managed the distribution / setup of the environment.

 Used Query Studio for creating Ad-hoc Reports.

 Created complex and multi-page reports using Report Studio


 Performed migration from Impromptu to Reportnet.

 Used Schedule Management in Cognos Connection.

 Performed Bursting Reports and Multilingual Reports using Report Studio

 Developed Layout, Pages, Object Containers and Packages using Report Studio.

 Created reports using ReportNet with multiple Charts and Reports.

 Responsible for assigning user Sign-Ons for the new users.

 Provided guidance to report creators for enhancement opportunities.

 Created Multidimensional Cubes using PowerPlay and published on the UpFront Portal using PowerPlay

Enterprise Server.

 Developed PowerPlay Cubes, used multiple queries, calculated measures, customized cube content and

optimized cube creation time.

 Fine-tuned the Cubes and checked the database space issue and cube growth periodically.

 Responsible in the creation of new User Groups and User Classes using Access Manager.

Environment: Cognos BI (Framework Manager, Cognos Connection, Report Studio, Query Studio), Oracle
11g, SQL Server 2005.

Client: Confidential, Bridgewater, NJ 


Duration: Nov 2006 to April 2007
Role: ETL Analyst

Description:
Aventis is a Pharmaceutical company, which provides new and improved biotech drugs for various diseases
and their symptoms. The objective of the project is to extract data stored in different databases and load it into
the Oracle staging area, where the business logic is applied to transform the tables in the
required way. The data warehouse is fed by marketing data, sample data, market (competitor) data,
prescription data and others.

Responsibilities:

 Extensively used ETL to load data from Flat Files, XML and Oracle into Oracle 8i.

 Involved in Designing of Data Modeling for the Data warehouse

 Involved in Requirement Gathering and Business Analysis

 Developed data Mappings between source systems and warehouse components using Mapping

Designer

 Worked extensively on different types of transformations like source qualifier, expression, filter,

aggregator, rank, update strategy, lookup, stored procedure, sequence generator, joiner, XML.

 Setup folders, groups, users, and permissions and performed Repository administration using

Repository Manager.
 Involved in the performance tuning of the Informatica mappings, stored procedures and the SQL

queries inside the Source Qualifier.

 Created, launched & scheduled sessions.

 Involved in the Performance Tuning of Database and Informatica. Improved performance by identifying

and rectifying the performance bottlenecks.

 Used Server Manager to schedule sessions and batches.

 Involved in creating Business Objects Universe and appropriate reports

 Wrote PL/SQL Packages and Stored procedures to implement business rules and validations.
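
A validation package of the kind mentioned above typically centralizes rule checks so mappings and reports share one definition; a minimal hypothetical sketch (the NDC format rule is illustrative only):

```sql
CREATE OR REPLACE PACKAGE rx_validate AS
  FUNCTION is_valid_ndc (p_ndc IN VARCHAR2) RETURN VARCHAR2;
END rx_validate;
/
CREATE OR REPLACE PACKAGE BODY rx_validate AS
  -- Returns 'Y' when a drug code matches the assumed 5-4-2 NDC layout
  FUNCTION is_valid_ndc (p_ndc IN VARCHAR2) RETURN VARCHAR2 IS
  BEGIN
    IF p_ndc IS NULL
       OR NOT REGEXP_LIKE(p_ndc, '^[0-9]{5}-[0-9]{4}-[0-9]{2}$') THEN
      RETURN 'N';
    END IF;
    RETURN 'Y';
  END is_valid_ndc;
END rx_validate;
/
```

REGEXP_LIKE assumes Oracle 10g or later, which matches the environment used on this project.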

Environment: Informatica 7.1.3, ORACLE 10g, UNIX, Windows NT 4.0, UNIX Shell Programming, PL/SQL,
TOAD Quest Software

Client: Confidential, Bangalore, India 


Duration: May 2005 to Oct 2006
Role: Oracle Developer

Description:
Core project focused on designing and implementing scalable solutions that would support the company's
continued dramatic growth, undergirded by the corporate data warehouse.

Responsibilities:

 Developed Oracle PL/SQL packages, procedures and functions

 Coded Oracle SQL to create ad-hoc reports on an as-needed basis

 Used Oracle Warehouse Builder to implement changes to the operational data store, as well as create

data marts

 Involved in the data analysis for source and target systems. Good understanding of Data warehousing

concepts, Star schema and Snow-flake schema.

 Involved in supporting and maintaining Oracle Import, Export and SQL*Loader jobs

 Involved in supporting and maintaining Unix shell script jobs

Technical Details: Oracle 9, PL/SQL, Windows 98


Professional Summary:
 Over 7 years of programming experience as an Oracle PL/SQL Developer in Analysis,
Design and Implementation of Business Applications using the Oracle Relational Database
Management System (RDBMS).
 Involved in all phases of the SDLC (Software Development Life Cycle) from analysis,
design, development, testing, implementation and maintenance with timely delivery against
aggressive deadlines.
 Experience with Data flow diagrams, Data dictionary, Database normalization theory
techniques, Entity relation modeling and design techniques.
 Expertise in Client-Server application development using Oracle 11g/10g/9i/8i, PL/SQL, SQL
*PLUS, TOAD and SQL*LOADER.
 Effectively made use of Table Functions, Indexes, Table Partitioning, Collections,
Analytical functions, Materialized Views, Query Re-Write and Transportable table
spaces.
 Strong experience in Data warehouse concepts, ETL.
 Good knowledge on logical and physical Data Modeling using normalizing Techniques.
 Created Tables, Views, Constraints, Index (B Tree, Bitmap and Function Based).
 Developed Complex database objects like Stored Procedures, Functions, Packages and
Triggers using SQL and PL/SQL.
 Developed materialized views for data replication in distributed environments.
 Excellent technical and analytical skills with clear understanding of design goals of ER
modeling for OLTP and dimension modeling for OLAP.
 Experience in Oracle supplied packages, Dynamic SQL, Records and PL/SQL Tables.
 Loaded Data into Oracle Tables using SQL Loader.
 Partitioned large Tables using range partition technique.
 Experience with Oracle Supplied Packages such as DBMS_SQL,DBMS_JOB and UTL_FILE.
 Created Packages and Procedures to automatically drop table indexes and create indexes
for the tables.
 Worked extensively on Ref Cursor, External Tables and Collections.
 Expertise in Dynamic SQL, Collections and Exception handling.
 Experience in SQL performance tuning using Cost-Based Optimization (CBO).
 Good knowledge of key Oracle performance related features such as Query Optimizer,
Execution Plans and Indexes.
 Experience with Performance Tuning for Oracle RDBMS using Explain Plan and HINTS.
 Experience in ETL techniques and Analysis and Reporting including hands on experience with
the Reporting tools such as Cognos.
 Created Shell Scripts for invoking SQL scripts and scheduled them using crontab.
 Excellent communication, interpersonal, analytical skills and strong ability to perform as part of
a team.

Technical Skills:
Databases: Oracle 7.x/8i/9i/10g/11g (SQL, PL/SQL, Stored Procedures, Triggers), MS SQL
SERVER 2000/2005/2008, DB2/UDB, Teradata, SAP Tables and MS Access.
ETL Tools: Informatica (PowerCenter 5.1/6.2/7.1/8.6.1/9.1.0, PowerMart 5.1, Power Connect/Power
Exchange for SAP R/3, MainFrame and Oracle Change Data Capture (CDC), AB Initio 1.8 and
SQL*Loader.
Reporting Tools: Business Objects Developer Suite 5.1/BO XIR2, Cognos Suite, Cognos Report
Net 1.1MR2, Crystal Reports, Oracle Reports 2.5
Operating Systems: UNIX(Sun Solaris, LINUX, HP UNIX, AIX), Windows NT/98/95/2000 &
Windows XP.
Data Modeling: Erwin 3.5.2,4.0
Languages/Utilities: SQL, PL/SQL, Unix shell scripts, Java, XML, C and Cobol.
Other Tools: AutoSys, Control-M, PVCS, WIN CVS, Informatica Data Quality, B2B Data
Transformation, Informatica Power Exchange Informatica 9.1.0 Developer/Analyst, TPump, Fast
Load, BTEQ
Education Qualifications:
Bachelor’s in Computer Science
Professional Experience:
Project Name: Confidential
Client: Confidential, NJ
Role: Oracle PL/SQL Developer
Duration: Sep’ 11 – Present
Description:
The PIAS (Property Insurance Application System) is used by the Personal Lines department and
supports the homeowners insurance application process. The product did not enable users to
capture Dwelling applications beyond creation of a reference number. In this release the product is
enabled to capture Dwelling new business applications through PIAS.
Responsibilities:

 Coordinated with the front end design team to provide them with the necessary stored
procedures and packages and the necessary insight into the data.
 Worked on SQL*Loader to load data from flat files obtained from various facilities every day.
 Created and modified several UNIX shell scripts according to the changing needs of the
project and client requirements.
 Wrote UNIX shell scripts to process the files on a daily basis: renaming the file, extracting the
date from the file, unzipping the file and removing the junk characters from the file before
loading them into the base tables.
 Involved in the continuous enhancements and fixing of production problems.
 Generated server side PL/SQL scripts for data manipulation and validation and materialized
views for remote instances.
 Developed PL/SQL triggers and master tables for automatic creation of primary keys.
 Created PL/SQL stored procedures, functions and packages for moving the data from
staging area to data mart.
 Created scripts to create new tables, views, queries for new enhancement in the application
using TOAD.
 Created indexes on the tables for faster retrieval of the data to enhance
database performance.
 Involved in data loading using PL/SQL and SQL*Loader calling UNIX scripts to download and
manipulate files.
 Performed SQL and PL/SQL tuning and Application tuning using various tools like EXPLAIN
PLAN, SQL*TRACE, TKPROF and AUTOTRACE.
 Extensively involved in using hints to direct the optimizer to choose an optimum query
execution plan.
 Used Bulk Collections for better performance and easy retrieval of data, by reducing
context switching between SQL and PL/SQL engines.
 Created PL/SQL scripts to extract the data from the operational database into simple flat text
files using UTL_FILE package.
 Creation of database objects like tables, views, materialized views, procedures and
packages using Oracle tools like TOAD, PL/SQL Developer and SQL*Plus.
 Partitioned the fact tables and materialized views to enhance the performance.
 Extensively used bulk collection in PL/SQL objects for improving performance.
 Created records, tables, collections (nested tables and arrays) for improving Query
performance by reducing context switching.
 Used Pragma Autonomous Transaction to avoid the mutating-table problem in database triggers.
 Extensively used the advanced features of PL/SQL like Records, Tables, Object
types and Dynamic SQL.
 Handled errors using Exception Handling extensively for the ease of debugging and
displaying the error messages in the application.
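
The Bulk Collections and UTL_FILE techniques above can be combined to spool a table to a flat file with few SQL/PL-SQL context switches; a condensed sketch (the directory object, table and columns are hypothetical):

```sql
DECLARE
  CURSOR c_pol IS SELECT policy_no, status FROM stg_policy;
  TYPE t_pol IS TABLE OF c_pol%ROWTYPE;
  l_rows t_pol;
  l_file UTL_FILE.FILE_TYPE;
BEGIN
  l_file := UTL_FILE.FOPEN('EXTRACT_DIR', 'policy_extract.txt', 'w');
  OPEN c_pol;
  LOOP
    FETCH c_pol BULK COLLECT INTO l_rows LIMIT 1000;  -- batched fetch
    FOR i IN 1 .. l_rows.COUNT LOOP
      UTL_FILE.PUT_LINE(l_file, l_rows(i).policy_no || '|' || l_rows(i).status);
    END LOOP;
    EXIT WHEN c_pol%NOTFOUND;
  END LOOP;
  CLOSE c_pol;
  UTL_FILE.FCLOSE(l_file);
EXCEPTION
  WHEN OTHERS THEN
    IF UTL_FILE.IS_OPEN(l_file) THEN
      UTL_FILE.FCLOSE(l_file);
    END IF;
    RAISE;
END;
/
```

The LIMIT clause bounds memory use while still cutting the per-row context-switch cost.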

Environment: Oracle 11g, SQL * Plus, TOAD, SQL*Loader, SQL Developer, Shell Scripts, UNIX,
Windows XP
Project Name: Confidential
Client: Confidential, DE
Role: Oracle PL/SQL Developer
Duration: Jun’ 07 to Aug’ 11
Description:
Bank of America is the leading financial institution in USA. Customer information systems hold
customer, account, address, offer, service, statement related details in IMS and DB2 databases. CIS
systems pass customer information to ATM, call center, online banking and mobile banking
systems in online and batch mode.
Responsibilities:

 Developed advanced PL/SQL packages, procedures, triggers, functions,


indexes and collections to implement business logic using SQL Navigator. Generated
server-side PL/SQL scripts for data manipulation and validation, and materialized views for
remote instances.
 Created management analysis reporting using parallel queries, Java stored procedures, the
HTP package and WEB.SHOW_DOCUMENT. Participated in change and code reviews to
understand the testing needs of the change components. Worked on troubleshooting defects
in a timely manner.
 Involved in creating UNIX shell scripts. Performed defragmentation of tables, partitioning, compression
and indexing for improved performance and efficiency. Involved in table redesign with
implementation of partitioned tables and partitioned indexes to make the database faster and easier
to maintain.
 Experience in database application development, query optimization, performance
tuning and DBA solutions, with implementation experience across the complete
System Development Life Cycle.
 Used the SQL Server SSIS tool to build high-performance data integration solutions,
including extraction, transformation and load packages for data warehousing. Extracted
data from the XML file and loaded it into the database.
 Designed and developed Oracle forms & reports generating up to 60 reports.
 Performed modifications on existing form as per change request and maintained it.
 Used Crystal Reports to track logins, mouse overs, click-through, session durations and
demographical comparisons with SQL database of customer information.
 Worked on SQL*Loader to load data from flat files obtained from various facilities every day.
Used standard packages like UTL_FILE, DBMS_SQL and PL/SQL Collections, and
used BULK binding when writing database procedures, functions and packages for
the front-end module.
 Used principles of normalization to improve performance. Involved in ETL code
using PL/SQL in order to meet requirements for extraction, transformation, cleansing and loading
of data from source to target data structures.
 Involved in the continuous enhancements and fixing of production problems. Designed,
implemented and tuned interfaces and batch jobs using PL/SQL. Involved in data replication
and high availability design scenarios with Oracle Streams. Developed UNIX shell scripts to
automate repetitive database processes.
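
A SQL*Loader control file for the daily flat-file loads described above might look like the following sketch (paths, table and columns are hypothetical):

```
LOAD DATA
INFILE  '/data/inbound/customers_daily.dat'
BADFILE '/data/log/customers_daily.bad'
APPEND
INTO TABLE stg_customer
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
( cust_id     INTEGER EXTERNAL,
  cust_name   CHAR(100),
  open_date   DATE "YYYY-MM-DD",
  branch_code CHAR(10)
)
```

It would be invoked from a shell job such as `sqlldr userid=stg_user control=customers_daily.ctl log=customers_daily.log`.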

Environment: VB 6, JAVA, Oracle 10g/11g, PL/SQL, SQL*LOADER, Oracle Streams 10g


(Replication), SQL PLUS, HTML, SQL Server SSIS, TOAD, XML, HP-UNIX shell scripting.
Project Name: Confidential
Client: Confidential
Role: Oracle Developer/Analyst
Duration: Apr’ 06 to May’ 07
Description:
Worked on an application that is a corporate customer transaction facility created for RBS
customers to make payments. The project covered the addition of new enhancements to the application,
such as financial transfers and Inter Account Transfers (IAT), and development of the same.
Responsibilities:

 Built complex queries using SQL and wrote stored
procedures using PL/SQL for various APIs like Java and .NET, and databases like
Oracle and Access.
 Developed and modified a number of Forms for various modules. Also responsible for
following up on bugs reported by various users and suggesting possible patches to be applied.
 Wrote Shell Scripts for Data loading and DDL Scripts.
 Worked in Production Support Environment as well as QA/TEST environments for projects,
work orders, maintenance requests, bug fixes, enhancements, data changes, etc.
 Used Oracle JDeveloper to support JAVA, JSP and HTML codes used in modules.
 Wrote conversion scripts using SQL, PL/SQL, stored procedures,
functions and packages to migrate data from SQL server database to Oracle database.
 Performed Database Administration of all database objects including tables, clusters,
indexes, views, sequences packages and procedures.
 Implemented 11g and upgraded the existing database from Oracle 9i to Oracle 11g.
 Involved in Logical & Physical Database Layout Design.
 Set-up and Design of Backup and Recovery Strategy for various databases.
 Performance tuning of Oracle Databases and User applications.
 Used SQL*Loader as an ETL tool to load data into the staging tables.
 Used DTS Packages as ETL tool for migrating Data from SQL Server 2000 to Oracle 10g.
 Provided user training and production support.
 Improved the performance of the application by rewriting the SQL queries.
 Wrote packages to fetch complex data from different tables in remote databases using joins,
sub queries and database links.

Environment: VB 6, Oracle 9i/10g/11g SQL, PL/SQL, Forms 9i, Reports 9i, SQL*Loader, SQL
Navigator, Crystal Reports, TOAD.
Project Name: Confidential, Mumbai, India (Contractor to Accenture)
Client: Confidential
Role: Oracle Developer/Analyst
Duration: Feb’ 04 to Mar’ 06
Description:
This system keeps track of day-to-day bank operations like deposits, withdrawals, demand drafts,
different types of loans (mortgage business) for the customers as well as the employees.
Responsibilities:

 Involved in full development cycle of Planning, Analysis, Design, Development,


Testing and Implementation.
 Designed logical and physical data models for star and snowflake schemas using Erwin.
 Wrote sequences for automatic generation of unique keys to support primary and foreign
key constraints in data conversions.
 Created and modified SQL*Plus, PL/SQL and SQL*Loader scripts for data conversions.
 Upgraded Oracle 9i to 10g software in different environments for latest features and also
tested databases.
 Developed and modified triggers, packages, functions and stored procedures for data
conversions and PL/SQL procedures to create database objects dynamically based on user
inputs.
 Wrote SQL, PL/SQL, SQL*Plus programs required to retrieve data using cursors and
exception handling.
 Worked on XML along with PL/SQL to develop and modify web forms.
 Designed Data Modeling, Design Specifications and to analyze Dependencies.
 Created indexes on tables to improve performance by eliminating full table scans, and created
views to hide the actual tables and reduce the complexity of large queries.
 Fine-tuned procedures and SQL queries for maximum efficiency in various databases
using Oracle hints for rule-based optimization.
 Used Discoverer 2000 to provide end users easy access to data and help in data analysis.
 Created some Custom reports and Forms for the end users to check the details and errors.
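
The sequences for automatic key generation mentioned above are usually paired with a BEFORE INSERT trigger in Oracle 9i/10g; a minimal sketch (table and column names hypothetical):

```sql
CREATE SEQUENCE deposit_seq START WITH 1 INCREMENT BY 1;

CREATE OR REPLACE TRIGGER trg_deposit_pk
BEFORE INSERT ON deposit
FOR EACH ROW
WHEN (NEW.deposit_id IS NULL)
BEGIN
  -- Assign the surrogate key only when the caller did not supply one
  SELECT deposit_seq.NEXTVAL INTO :NEW.deposit_id FROM dual;
END;
/
```

(From 11g onward the SELECT can be replaced by a direct `:NEW.deposit_id := deposit_seq.NEXTVAL;` assignment.)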

Environment: Oracle 9i, 10g, SQL*Plus, PL/SQL, Erwin 4.1, Oracle Designer 2000, Windows 2000,
Toad.
Oracle PLSQL Developer Resume - Sample 1

Aayush Tandon

Email ID: ***@gmail.com 


Contact No.: +91-********

Career Objective

To obtain a position where I can utilize my technical skills for the enhancement of
Oracle database systems.

Career Summary

A logical, analytical thinker with exceptional database skills, possessing rich


experience of 2+ years as an Oracle PL/SQL developer.
Expertise in Database designing and creation.
Excellent exposure to Microsoft Technologies, JAVA, ORACLE 9i, PL/SQL and
STRUTS.
Expertise in Software Development Life Cycle.
Expertise in requirement analysis, designing, developing, implementing and
executing a project.
Personal Qualities

Strong motivational and leadership skills.


Excellent programming skills.
Excellent communication skills in writing and verbal both.
Ability to handle pressure situations.
Key Responsibilities Handled

Extensive use of UNIX shell scripts, cron jobs and AutoSys to automate processes.
Used PL/SQL to create Packages, Functions, and Procedures.
Used PL/SQL and SQL*Loader to create ETL packages for flat file loading and
error capturing into log tables.
Involved in testing the application for Oracle 9i to 10g upgrade.
Tuning of the SQL queries, which takes long time to process the request using
Explain Plan, Hints to reduce the response time.
Involved in all phases of database development, from needs assessment to
QA/QC, design, and support.
Train the new recruiters.
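The flat-file loading with error capture, and the Explain Plan tuning, mentioned in the responsibilities above might look roughly like this; the file, table, column, and index names are hypothetical:

```sql
-- Hypothetical SQL*Loader control file (load_sales.ctl):
--   LOAD DATA INFILE 'sales.dat'
--   APPEND INTO TABLE stg_sales
--   FIELDS TERMINATED BY ','
--   (sale_id, sale_date DATE "YYYY-MM-DD", amount)

-- Move staged rows to the target, capturing bad rows in a log table:
BEGIN
  DBMS_ERRLOG.CREATE_ERROR_LOG(dml_table_name => 'SALES');
END;
/
INSERT INTO sales (sale_id, sale_date, amount)
SELECT sale_id, sale_date, amount
  FROM stg_sales
  LOG ERRORS INTO err$_sales ('nightly load') REJECT LIMIT UNLIMITED;

-- Inspect the execution plan of a slow query; the index hint
-- below names a hypothetical index:
EXPLAIN PLAN FOR
  SELECT /*+ INDEX(s sales_date_idx) */ *
    FROM sales s
   WHERE sale_date > SYSDATE - 7;
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
```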
Technical Experience

Databases: Oracle 7/8/9 (RAC/OPS), MS-SQL, MySQL, DB2


Operating Systems: Windows 2000/2003/NT/XP, UNIX, Linux, Solaris
Applications: SQL-Loader, SQL Navigator, SSH, Forte for Java
Languages: Java, JSP, JDBC, XML, HTML, C, C++, Perl, VB.NET, ASP.NET

Achievements

Built a Hotel Management System using VB.NET with Postgres as the database.
Built a Java Chat Server using JUnit as the testing tool for the product.
Received appreciation and awards for excellent work.
Employer

Working as an Oracle PL/SQL Developer at ASD Solutions from 2010 to present.


Academia

Certification in Oracle 8i and Oracle 9i


B. Tech. in Information Technology
Personal Details

Date Of Birth: xx/xx/19xx
Languages Known: Hindi, English
Address: JKSJKDS
Oracle PLSQL Developer Resume - Sample 2

QMC

15 street, Maharaja road,


Kerala-123456
Contact no.: 9769******
Email ID: qmc***@gmail.com

Career Objective

To associate with an organization that allows me to utilize my competencies to the best use and contributes to my overall growth as an individual.

Technical Skills

Operating System: Windows NT 4.0, Solaris, Red Hat Linux 5.0, 5.4


RDBMS: SQL Server - 2005, 2008, 2008 R2, Oracle - v5, v6, 7, 8i, 9i, 10g, 11g
Language: SQL, PL/SQL, Java, VB
Web Technology: HTML, XML
Virtual Machine: VMware Server 2.0 to 2.2, Oracle VM VirtualBox 4.0
BPM: PEGA 5.5v

Skill Sets

Confident
Detail oriented
Time management skills
Analytical skills
Excellent communication and interpersonal skills
Key Responsibilities Handled

Coordinated physical changes to computer databases; coded, tested, and implemented Oracle and SQL databases, applying knowledge of database management systems.
Established physical database parameters.
Coded Oracle and SQL database descriptions and specified identifiers of the Oracle database to the database management system, or directed others in coding database descriptions.
Calculated optimum values for Oracle and SQL database parameters, such as the amount of computer memory to be used by the database, following manuals and using a calculator.
Specified user access level for each segment of one or more data items, such as
insert, replace, retrieve, or delete data.
Specified which users can access databases and what data can be accessed by
users.
Tested and corrected errors and refined changes to database.
Entered codes to create production database.
Selected and entered codes of utility program to monitor database performance,
such as distribution of records and amount of available memory.
Directed programmers and analysts to make changes to database management
system.
Reviewed and corrected programs.
Conferred with coworkers to determine impact of database changes on other
systems and staff cost for making changes to database.
Modified database programs to increase processing performance, referred to as
performance tuning.
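The per-user access levels described above (insert, replace, retrieve, delete) map onto Oracle object privileges roughly like this; the user and table names are hypothetical:

```sql
-- Read-only access for a reporting account:
GRANT SELECT ON orders TO report_user;
-- Full DML for an ETL account:
GRANT SELECT, INSERT, UPDATE, DELETE ON orders TO etl_user;
-- Tighten an access level later:
REVOKE DELETE ON orders FROM etl_user;
```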
Employers

Working as an Oracle Database Administrator (DBA) in SKA Ltd. from 20** till
date.
Worked as a Jr. Oracle Database Administrator in PMP Ltd. from 20** to 20**.
Achievements

Won best developer award in SKA Pvt. Ltd.


President of MHT association in year 20**.
Best student of the year during graduation.
Won marathon competition during graduation.
Won cycling tournament during graduation.
Personal Details
DOB: 10/01/19**
Languages known: English, Marathi, Hindi, Tamil
Hobbies: Hiking, Painting, Reading

COMPUTER SKILLS

Programming Languages: C++, C, Java, and C#.
Web Programming: HTML, JavaScript, PHP, and XML.
Scripting Languages: VB Script, Perl, and Shell Script.
Source Control Tool: SVN
Testing and Tracking Tools: HP QuickTest Pro, JUnit, Selenium IDE, and Quality Center.
Database Languages: SQL, JDBC, PRO*C, PL/SQL, and Developer 2000.
Operating Systems: Windows, UNIX, Linux, and Macintosh.
WORK EXPERIENCE

ABC Inc., Any Town, NY, May 2009 – Present
Software Developer
Maintained and revised functional, technical, and screen design documentation.
Created XSD file format specification document.
Maintained and updated database design schema.
Created stored procedures using PL/SQL Developer.
Created test scripts for application modules.
Performed alpha testing of application.
XYZ Corp., Any Town, NY, Jan 1998 – Aug 1999
Application Developer
Created and maintained a unified, multi-tiered reporting system to replace ad-hoc individual reports.
Created and maintained essential business systems on the corporate intranet.
Developed many custom software solutions to meet various specific business needs.
Cultivated inter-departmental relationships as a “service department” for the company.
Trained new hires on department policies and procedures.
BCD Inc., Any Town, NY, Jan 1998 – Aug 1999
SQL Developer
Worked as a SQL developer to support business applications using Oracle SQL and PL/SQL.
Designed and created database tables, views, and stored procedures.
Built and maintained SQL scripts, indexes, reports, and queries for data analysis and extraction.
Developed new processes to facilitate import and normalization.
Worked with business users and application developers to identify business needs and provide solutions.
Wrote and performed test procedures.
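The table, view, and index work in the SQL developer role above could be sketched as follows; all object names are hypothetical:

```sql
CREATE TABLE customers (
  customer_id NUMBER PRIMARY KEY,
  name        VARCHAR2(100),
  region      VARCHAR2(30)
);

-- A view hides the base table and simplifies reporting queries:
CREATE OR REPLACE VIEW v_east_customers AS
  SELECT customer_id, name
    FROM customers
   WHERE region = 'EAST';

-- An index avoids full table scans on a common filter column:
CREATE INDEX customers_region_idx ON customers (region);
```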
OTHER: References available upon request.
EDUCATION

Master of Science in Computer Science, New York University, Dec 1998
Bachelor of Engineering in Electronics and Communication, New York University, Aug 1995
