Informatica Developer
Extensively worked on data extraction, transformation, and loading from various sources
such as Oracle, SQL Server, and flat files.
Responsible for all activities related to the development, implementation, administration and
support of ETL processes for large scale data warehouses using Informatica Power Center.
Strong experience in Data Warehousing and ETL using Informatica Power Center 8.6.
Experienced in data modeling using Erwin, including Star Schema and Snowflake modeling,
fact and dimension tables, and physical and logical modeling.
Strong skills in Data Analysis, Data Requirement Analysis and Data Mapping for ETL
processes.
Knowledgeable in Kimball and Inmon methodologies.
Hands-on experience in tuning mappings and in identifying and resolving performance
bottlenecks at various levels, including sources, targets, mappings, and sessions.
Extensive experience in ETL design, development and maintenance using Oracle SQL,
PL/SQL, SQL Loader, Informatica Power Center v 5.x/6.x/7.x/8.x.
Experience in testing the Business Intelligence applications developed in Qlikview.
Well versed in developing complex SQL queries, unions, and multiple-table joins, with
experience using views.
Experience in database programming in PL/SQL (Stored Procedures, Triggers and
Packages).
Well versed in UNIX shell scripting.
Experienced in creating effective test data and developing thorough unit test cases to
ensure successful data loads; used pager notifications to send alerts after successful
completion.
Excellent communication, documentation and presentation skills using tools like Visio and
PowerPoint.
TECHNICAL SKILLS:
Data warehousing Tools : Informatica Power Center 8.6/8.1, Data Stage
Databases : Oracle 10g/9i/8i/8.0/7.x, MS SQL Server 2005/2000/7.0/6.0, MS Access, MySQL,
Sybase
Programming GUI : SQL, PL/SQL, SQL Plus, Java, HTML, C and UNIX Shell Scripting
BI Tools : QlikView 8.x
Tools/Utilities : TOAD, Benthic Golden, PL/SQL Developer
Operating Systems : Windows XP/NT/2003, UNIX
Configuration Management Tool : Surround SCM, Visual Source Safe
EDUCATION:
Master of Science in Computer Science.
PROFESSIONAL EXPERIENCE:
Confidential
sanofi-aventis - NJ Oct '11 - till date
Informatica Developer
The USMM implementation project upgraded the existing sanofi-aventis 1.x-series MCO medical
reps Quest application to the latest 4.x .NET series of applications. In this project the database
was upgraded and an enterprise data warehouse was implemented for the MCO reps. Distributed
data arrives from heterogeneous sources such as SQL Server and Oracle, and in flat files from the clients.
Responsibilities:
Environment: Informatica Power Center 8.6.1, Oracle 10g/ 9i, MS-SQL Server, Toad, HP Quality
Center, Windows XP and MS Office Suite
Confidential, Aug’ 07- Sep’10
sanofi-aventis- NJ
Informatica Developer
The Sales Force Automation (SFA) system is a CRM solution that provides sales forces with a robust
set of customer relationship management capabilities promoting team selling, multi-channel
customer management, information sharing, field reporting, and analytics, all within a life
science-tailored mobile application that is easy to use. The purpose of this project is to maintain
a data warehouse that enables the home office to make corporate decisions. A decision support
system was built to compare and analyze their products against competitor products and sales
information at the territory, district, region, and area level.
Responsibilities:
Created mappings and sessions to implement technical enhancements for the data warehouse by
extracting data from sources like Oracle and delimited flat files.
Development of ETL using Informatica 8.6.
Applied slowly changing dimensions like Type 1 and 2 effectively to handle the delta Loads.
Prepared various mappings to load the data into different stages like Landing, Staging and
Target tables.
Used various transformations such as Source Qualifier, Expression, Aggregator, Joiner, Filter,
Lookup, and Update Strategy while designing and optimizing the mappings.
Developed Workflows using task developer, worklet designer, and workflow designer in
Workflow manager and monitored the results using workflow monitor.
Created various tasks like Session, Command, Timer and Event wait.
Modified several of the existing mappings based on the user requirements and maintained
existing mappings, sessions and workflows.
Tuned mapping performance by following Informatica best practices and applied several
methods to decrease workflow run times.
Prepared SQL Queries to validate the data in both source and target databases.
Worked on TOAD and Oracle SQL Developer to develop queries and create procedures and
packages in Oracle.
Worked extensively on PL/SQL as part of the process to develop several scripts to handle
different scenarios.
Created Test cases for the mappings developed and then created integration Testing
Document.
Prepared the error handling document to maintain the error handling process.
Automated the Informatica jobs using UNIX shell scripting.
Closely worked with the reporting team to ensure that correct data is presented in the reports.
Interaction with the offshore team on a daily basis on the development activities.
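The Type 2 slowly-changing-dimension delta handling described in these bullets can be sketched outside Informatica. The following is a minimal illustrative Python/SQLite version of a Type 2 load; the table and column names (customer_dim, cust_id, territory) are hypothetical, not the project's actual objects:

```python
import sqlite3

# Illustrative SCD Type 2 sketch; table and column names are hypothetical.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE customer_dim (
    cust_key INTEGER PRIMARY KEY AUTOINCREMENT,
    cust_id TEXT, territory TEXT,
    eff_from TEXT, eff_to TEXT, current_flag INTEGER)""")

def apply_delta(con, cust_id, territory, load_date):
    """Close out the current row if the attribute changed, then insert a new version."""
    row = con.execute(
        "SELECT cust_key, territory FROM customer_dim "
        "WHERE cust_id = ? AND current_flag = 1", (cust_id,)).fetchone()
    if row and row[1] == territory:
        return  # no change: nothing to do
    if row:  # expire the old version (Type 2: keep history)
        con.execute(
            "UPDATE customer_dim SET eff_to = ?, current_flag = 0 "
            "WHERE cust_key = ?", (load_date, row[0]))
    con.execute(
        "INSERT INTO customer_dim (cust_id, territory, eff_from, eff_to, current_flag) "
        "VALUES (?, ?, ?, '9999-12-31', 1)", (cust_id, territory, load_date))

apply_delta(con, "C001", "Northeast", "2011-01-01")
apply_delta(con, "C001", "Midwest", "2011-06-01")   # territory change -> new version
rows = con.execute(
    "SELECT territory, current_flag FROM customer_dim ORDER BY cust_key").fetchall()
print(rows)  # [('Northeast', 0), ('Midwest', 1)]
```

A Type 1 change would instead be a plain UPDATE of the current row, overwriting history.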
Environment: Informatica Power Center 8.1, Oracle 9i, MS-SQL Server, PL/SQL Developer,
Bourne shell, Windows XP, TOAD, MS Office and Delimited Flat files
Confidential, Dec’05-Jul’07
Chicago- IL
Data warehouse Developer
The American Medical Association (AMA) plays a key information management role by collecting,
maintaining, and disseminating primary source physician data, which supports the development
and implementation of AMA policy and a variety of data-driven products and services. This
repository of physician information is created, maintained, and customized for the DEA.
Responsibilities:
As a member of the warehouse design team, assisted in creating fact and dimension tables based
on specifications provided by managers.
Loaded operational data from Oracle, SQL Server, flat files, and Excel worksheets into various
data marts such as PMS and DEA.
Designed and created complex source to target mappings using various transformations
inclusive of but not limited to Aggregator, Look Up, Joiner, Source Qualifier, Expression,
Sequence Generator, and Router Transformations.
Implemented effective date range mapping (Slowly Changing dimension type2) methodology
for accessing the full history of accounts and transaction information.
Designed complex mappings involving constraint-based loading and target load order.
Used the Debugger to identify bugs in existing mappings by analyzing data flow and evaluating
transformations, and created mapplets that provide reusability in mappings.
Involved in enhancement and maintenance activities of the data warehouse, including
performance tuning and rewriting stored procedures for code enhancements.
Designed workflows with many sessions with decision, assignment task, event wait, and event
raise tasks, used Informatica scheduler to schedule jobs.
Environment: Informatica Power Center 6.2, Oracle, Business Objects 6.x, Windows 2000, SQL
Server 2000, Microsoft Excel, SQL * Plus
Confidential, Sep’04-Nov’05
AXA - NY
ETL Consultant
A single electronic solution provides employees of the AXA Pacific and AXA Assurance surety
companies with access to a centralized system. It also provides an intranet interface to all
AXA surety bond users across Canada.
Responsibilities:
Designed and developed the data transformations for source system data extraction; data
staging, movement and aggregation; information and analytics delivery; and data quality
handling, system testing, performance tuning.
Created Informatica mappings to build business rules for loading data, using transformations
like Source Qualifier, Aggregator, Expression, Joiner, Connected and Unconnected Lookup,
Filter, Sequence Generator, External Procedure, Router, and Update Strategy.
Stored reformatted data from relational, flat file, XML files using Informatica (ETL).
Worked on Dimensional modeling to design and develop STAR schemas using ER-win 4.0,
Identifying Fact and Dimension Tables.
Created various batch Scripts for scheduling various data cleansing scripts and loading
process.
Extensively worked on Mapping Variables, Mapping Parameters and Session Parameters.
Created post-session and pre-session shell scripts and mail-notifications
Created Data Breakpoints and Error Breakpoints for debugging the mappings using Debugger
Wizard.
Created Several Stored Procedures to update several tables and insert audit tables as part of
the process.
Extensively used Joins, Triggers, Stored Procedures and Functions in Interaction with
backend database using PL/SQL.
Written Unix Shell Scripts for getting data from various source systems to Data Warehouse.
Performed performance tuning by optimizing the sources, targets, mappings, and sessions.
Tested mappings and sessions using various test cases in the test plans.
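The star schemas referenced above center a fact table on surrogate keys into its dimensions. A minimal sketch with hypothetical tables (dim_region, fact_sales), using Python's sqlite3 purely for illustration:

```python
import sqlite3

# Minimal star-schema sketch: one fact table joined to one dimension.
# Table and column names are hypothetical, for illustration only.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_region (region_key INTEGER PRIMARY KEY, region_name TEXT);
CREATE TABLE fact_sales (region_key INTEGER, amount REAL);
INSERT INTO dim_region VALUES (1, 'East'), (2, 'West');
INSERT INTO fact_sales VALUES (1, 100.0), (1, 50.0), (2, 75.0);
""")
# Typical star-schema query: aggregate facts grouped by a dimension attribute.
result = con.execute("""
    SELECT d.region_name, SUM(f.amount)
    FROM fact_sales f JOIN dim_region d ON f.region_key = d.region_key
    GROUP BY d.region_name ORDER BY d.region_name
""").fetchall()
print(result)  # [('East', 150.0), ('West', 75.0)]
```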
PROFESSIONAL SUMMARY:
Eight plus (8+) years of IT experience in the Analysis, Design, Development, Testing and
Implementation of business application systems for Health care, Pharmaceutical, Financial,
Telecom and Manufacturing Sectors.
Strong experience in the Analysis, design, development, testing and Implementation of
Business Intelligence solutions using Data Warehouse/Data Mart Design, ETL, OLAP, BI,
Client/Server applications.
Strong data warehousing ETL experience using Informatica PowerCenter 9.1/8.6.1/8.5/8.1/7.1
client tools (Mapping Designer, Repository Manager, Workflow Manager/Monitor) and server
tools (Informatica Server, Repository Server Manager).
Expertise in Data Warehouse/Data mart, ODS, OLTP and OLAP implementations teamed with
project scope, Analysis, requirements gathering, data modeling, Effort Estimation, ETL
Design, development, System testing, Implementation and production support.
Extensive ETL testing experience using Informatica 9.1/8.6.1/8.5/8.1/7.1/6.2/5.1
(PowerCenter/PowerMart: Designer, Workflow Manager, Workflow Monitor, and Server Manager),
Teradata, and Business Objects.
Strong experience in Dimensional Modeling using Star and Snow Flake Schema, Identifying
Facts and Dimensions, Physical and logical data modeling using ERwin and ER-Studio.
Expertise in working with relational databases such as Oracle 11g/10g/9i/8x, SQL Server
2008/2005, DB2 8.0/7.0, UDB, MS Access and Teradata.
Strong experience in Extraction, Transformation and Loading (ETL) data from various sources
into Data Warehouses and Data Marts using Informatica Power Center (Repository Manager,
Designer, Workflow Manager, Workflow Monitor, Metadata Manger), Power Exchange, Power
Connect as ETL tool on Oracle, DB2 and SQL Server Databases.
Experience using the SAS ETL tool, the Talend ETL tool, and SAS Enterprise Data Integration
Server.
Extensive experience in developing Stored Procedures, Functions, Views and Triggers,
Complex SQL queries using SQL Server, TSQL and Oracle PL/SQL.
Experience in resolving on-going maintenance issues and bug fixes; monitoring Informatica
sessions as well as performance tuning of mappings and sessions.
Experience in all phases of data warehouse development, from requirements gathering through
code development, unit testing, and documentation.
Extensive experience in writing UNIX shell scripts and automation of the ETL processes using
UNIX shell scripting.
Proficient in the Integration of various data sources with multiple relational databases like
Oracle11g /Oracle10g/9i, MS SQL Server, DB2, Teradata, VSAM files and Flat Files into the
staging area, ODS, Data Warehouse and Data Mart.
Experience in using Automation Scheduling tools like Autosys and Control-M.
Worked extensively with slowly changing dimensions.
Hands-on experience across all stages of Software Development Life Cycle (SDLC) including
business requirement analysis, data mapping, build, unit testing, systems integration and user
acceptance testing.
Excellent interpersonal and communication skills; experienced in working with senior-level
managers, business people, and developers across multiple disciplines.
Education:
Friends University, Wichita, KS
Technical Skills:
Operating Systems: Windows 2008/2007/2005/NT/XP, UNIX, MS-DOS
ETL Tools: Informatica Power Center 9.1/8.6/8.5/8.1/7.1 (Designer, Workflow Manager, Workflow
Monitor, Repository manager and Informatica Server)
Databases: Oracle 11g/10g/9i/8i, MS SQL Server 2008/2005, DB2 v8.1, Teradata.
Data Modeling tools: Erwin, MS Visio
OLAP Tools: Cognos 8.0/8.1/8.2/8.4/7.0, Business Objects XI R2/6.x/5.x, OBIEE 10.1.3.4/10.1.3.3
Languages: SQL, PL/SQL, UNIX, Shell scripts, C++
Scheduling Tools: Autosys, Control-M
Testing Tools: QTP, WinRunner, LoadRunner, Quality Center, Test Director
Professional Experience:
Confidential, Denver, CO Sept 2011 to May 2012
Sr. ETL/Informatica Developer
Description: Qwest is a large telecommunications carrier. Qwest Communications provides long-
distance services and broadband data, as well as voice and video communications globally. This
project includes developing Data warehouse from different data feeds and other operational data
sources.
Built a central Database where data comes from different sources like oracle, SQL server and flat
files. Actively involved as an Analyst for preparing design documents and interacted with the data
modelers to understand the data model and design the ETL logic.
Responsibilities:
Performed logical and physical data modeling using Erwin for the data warehouse database in a
star schema.
Using Informatica PowerCenter Designer, analyzed and extracted source data from various
source systems (Oracle 10g, DB2, SQL Server, and flat files), incorporating business rules
through the objects and functions the tool supports.
Using Informatica PowerCenter created mappings and mapplets to transform the data
according to the business rules.
Used various transformations like Source Qualifier, Joiner, Lookup, SQL, Router, Filter,
Expression, and Update Strategy.
Implemented slowly changing dimensions (SCD) for some of the Tables as per user
requirement.
Developed stored procedures and used them in the Stored Procedure transformation for data
processing, and used data migration tools.
Documented Informatica mappings in Excel spread sheet.
Tuned the Informatica mappings for optimal load performance.
Used the Teradata utilities BTEQ, FastExport (FEXP), FastLoad (FLOAD), and MultiLoad (MLOAD)
to export and load data to/from flat files.
Created and Configured Workflows and Sessions to transport the data to target warehouse
Oracle tables using Informatica Workflow Manager.
Generated reports using OBIEE 10.1.3 for future business use.
This role carries primary responsibility for problem determination and resolution for each SAP
application system database server and application server.
Worked along with UNIX team for writing UNIX shell scripts to customize the server
scheduling jobs.
Constantly interacted with business users to discuss requirements.
Interacted with Data Modelers and Business Analysts to understand the requirements and the
impact of the ETL on the business.
Designed ETL specification documents for all the projects.
Created Tables, Keys (Unique and Primary) and Indexes in the SQL server.
Extracted data from Flat files, DB2, SQL and Oracle to build an Operation Data Source.
Applied business logic to load the data into Global Data Warehouse.
Extensively worked on Facts and Slowly Changing Dimension (SCD) tables.
Maintained source and target mappings, transformation logic and processes to reflect the
changing business environment over time.
Used various transformations like Filter, Router, Expression, Lookup (connected and
unconnected), Aggregator, Sequence Generator, Update Strategy, Joiner, Normalizer, Sorter
and Union to develop robust mappings in the Informatica Designer.
Extensively used the Add Currently Processed Flat File Name port to load the flat file name
and to load contract number coming from flat file name into Target.
Worked on complex Source Qualifier queries, Pre and Post SQL queries in the Target.
Worked on different tasks in Workflow Manager like Sessions, Events raise, Event wait,
Decision, E-mail, Command, Worklets, Assignment, Timer and Scheduling of the workflow.
Extensively used workflow variables, mapping parameters and mapping variables.
Created sessions, batches for incremental load into staging tables and scheduled them to run
daily.
Used shortcuts to reuse objects without creating multiple objects in the repository and inherit
changes made to the source automatically.
Implemented Informatica recommendations, methodologies and best practices.
Implemented performance tuning logic on Targets, Sources, Mappings and Sessions to
provide maximum efficiency and performance.
Involved in Unit, Integration, System, and Performance testing levels.
Written documentation to describe program development, logic, coding, testing, changes and
corrections.
Migrated the code into QA (Testing) and supported QA team and UAT (User).
Created detailed Unit Test Document with all possible Test cases/Scripts.
Conducted code reviews developed by my team mates before moving the code into QA.
Provided support to develop the entire warehouse architecture and plan the ETL process.
Modified existing mappings for enhancements of new business requirements.
Prepared migration document to move the mappings from development to testing and then to
production repositories.
Involved in production support.
Worked as a fully contributing team member under broad guidance, with independent planning
and execution responsibilities.
Analyzed the requirements and framed the business logic for the ETL process.
Extracted data from Oracle as one of the source databases.
Involved in JAD sessions for the requirements gathering and understanding.
Involved in the ETL design and its documentation.
Interpreted logical and physical data models for business users to determine common data
definitions and establish referential integrity of the system using ER-STUDIO.
Followed Star Schema to design dimension and fact tables.
Experienced in handling slowly changing dimensions.
Collected and linked metadata from diverse sources, including relational databases (Oracle),
XML, and flat files.
Responsible for the development, implementation and support of the databases.
Extensive experience with PL/SQL in designing, developing functions, procedures, triggers
and packages.
Developed mappings in Informatica to load the data including facts and dimensions from
various sources into the Data Warehouse, using different transformations like Source
Qualifier, JAVA, Expression, Lookup, Aggregate, Update Strategy and Joiner.
Developed reusable Mapplets and Transformations.
Used a data integrator tool to support batch and real-time integration, and worked on the
staging and integration layers.
Optimized the performance of the mappings through various tests on sources, targets, and
transformations.
Designed and developed Informatica mappings and workflows; identified and removed bottlenecks
to improve the performance of mappings and workflows.
Reviewed existing code and led efforts to tweak and tune the performance of existing
Informatica processes.
Scheduled the sessions to extract, transform, and load data into the warehouse database per
business requirements.
Scheduled the tasks using Autosys.
Loaded the flat files data using Informatica to the staging area.
Created SHELL SCRIPTS for generic use.
Created high level design documents, technical specifications, coding, unit testing and
resolved the defects using Quality Center 10.
Developed unit/assembly test cases and UNIX shell scripts to run along with
daily/weekly/monthly batches to reduce or eliminate manual testing effort.
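The "Currently Processed Flat File Name" technique mentioned above (loading the file name, and a contract number parsed from it, into the target) can be sketched as follows; the naming convention contract_<number>_<yyyymmdd>.csv and all field names are assumptions for illustration only:

```python
import csv
import io
import re

# Hypothetical file-naming convention: contract_<number>_<yyyymmdd>.csv.
# In PowerCenter the file name would arrive via the
# "Currently Processed Flat File Name" port; here we parse it directly.
def stage_rows(filename, handle):
    m = re.match(r"contract_(\d+)_\d{8}\.csv$", filename)
    contract_no = m.group(1) if m else None
    staged = []
    for row in csv.DictReader(handle):
        row["src_file"] = filename        # keep lineage, as the port would
        row["contract_no"] = contract_no  # derived from the file name
        staged.append(row)
    return staged

data = io.StringIO("cust_id,amount\nC1,100\nC2,200\n")
rows = stage_rows("contract_4711_20120115.csv", data)
print(rows[0]["contract_no"], len(rows))  # 4711 2
```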
Environment: Windows XP/NT, Informatica PowerCenter 9.1/8.6, UNIX, Teradata V-14, Oracle 11g,
Oracle Data Integrator, SQL, PL/SQL, SQL Developer, Erwin, Oracle Designer, MS Visio,
Autosys, Korn Shell, Quality Center 10.
Confidential, NJ Feb 2005 to Mar 2006
Role: Informatica Developer
Merrill Lynch is the wealth management division of Bank of America providing corporate finance
&investment banking services. The objective of the project was to build data Warehouse for
Customers Investment Deposit, funding accounts and Corporate Services. The data for Customers,
Accounts and Transactional related information were extracted from multiple sources, transformed
and loaded into the target database using ETL tool.
Responsibilities:
Environment: Informatica Power Center 8.5, Oracle 10g, SQL Server 2005, DB2,
SQL*Plus, SQL Loader, SQL Developer, Autosys, Flat files, UNIX, Windows 2000
Confidential, Seattle, WA Aug 2003 to Jan 2005
Role: ETL DEVELOPER
T-Mobile is one of the largest telecom companies in the USA. Joined the existing onshore BI
team as an ETL Developer and successfully designed and developed business solutions. The
project aims to fulfill T-Mobile's need for reporting to better understand market trends,
behavior, and future opportunities, and to improve their decision-making process. Coordinated
with the business and P&A teams to understand the system requirements, then analyzed and
designed ETL solutions to accomplish the same. Involved in various successful releases to
meet T-Mobile's reporting needs under the order-activation (OA) functional area.
Responsibilities:
Environment: Informatica Power Center 5.1.2/7.1, Erwin 4.5, Oracle 9i, Windows NT, Flat
files, SQL, Relational Tools, Clear Case, UNIX (HP-UX, Sun Solaris, AIX) and UNIX Shell
Scripts.
Worked with SQL, PL/SQL procedures and functions, stored procedures and packages within
the mappings.
Tuned Informatica mappings and sessions for optimum performance.
Assisted the other ETL developers in solving complex scenarios and coordinated with source
systems owners with day-to-day ETL progress monitoring.
The PDI project was built to expand the scope of the Information Factory. The project
acquires and integrates Member, Eligibility, Claims, Benefits, and Accumulations data from
Proclaim into the Information Factory. The effort included staging, integrating, and
positioning the data to make it available for downstream data consumers.
Created Data Maps / Extraction groups in Power Exchange Navigator for legacy IMS Parent
sources.
Staged data from the legacy Proclaim IMS system into Oracle 11g master tables.
Performed CDC capture registrations.
Assisted in building the ETL source-to-target specification documents by understanding the
business requirements.
Developed mappings that perform extraction, transformation, and load of source data into the
Derived Masters schema, using various PowerCenter transformations such as Source Qualifier,
Aggregator, Filter, Router, Sequence Generator, Lookup, Rank, Joiner, Expression, Stored
Procedure, SQL, Normalizer, and Update Strategy to implement the business logic.
Built reusable transformations and mapplets wherever redundancy was needed.
Performed performance tuning at both the mapping level and the database level to increase
data throughput.
Designed the Process Control Table that would maintain the status of all the CDC jobs and
thereby drive the load of Derived Master Tables.
Used Teradata utilities like BTEQ, FastLoad, FastExport, and MultiLoad for data conversion.
Created Post UNIX scripts to perform operations like gunzip, remove and touch files.
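The post-load gunzip/remove/touch housekeeping mentioned in the last bullet could look roughly like this; file names are illustrative, and Python stands in for the actual UNIX shell scripts:

```python
import gzip
import os
import tempfile

# Post-load housekeeping sketch: decompress a landed file, remove the archive,
# and touch a "done" marker, mirroring the gunzip/rm/touch steps described above.
workdir = tempfile.mkdtemp()
archive = os.path.join(workdir, "claims_20130101.dat.gz")
with gzip.open(archive, "wt") as fh:
    fh.write("member|claim|amount\n")          # stand-in for a landed feed

extracted = archive[:-3]                       # strip the .gz suffix
with gzip.open(archive, "rt") as src, open(extracted, "w") as dst:
    dst.write(src.read())                      # gunzip
os.remove(archive)                             # remove the compressed original
marker = os.path.join(workdir, "claims_20130101.done")
open(marker, "w").close()                      # touch the marker file

print(sorted(os.listdir(workdir)))  # ['claims_20130101.dat', 'claims_20130101.done']
```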
Candidate Info: 8 years in workforce, 1 year at this job; Computer Science
INFORMATICA ETL Developer
Worked on Informatica Power Center tool - Source Analyzer, Data warehousing designer,
Mapping & Mapplet Designer and Transformation Designer.
Extracted data from SAP system to Staging Area and loaded the data to the target database by
ETL process using Informatica Power Center.
Performed the performance and tuning at source, Target levels using Indexes, Hints and
Partitioning in DB2, ORACLE and Informatica.
Designed and developed various PL/SQL stored procedures to perform various calculations
related to fact measures.
Converted the PL/SQL Procedures to Informatica mappings and at the same time created
procedures in the database level for optimum performance of the mappings.
Investigated and fixed bugs that occurred in the production environment and provided on-call
support.
Performed Unit testing and maintained test logs and test cases for all the mappings.
Maintained warehouse metadata, naming standards and warehouse standards for future
application development.
Parsed high-level design specifications into simple ETL code, following mapping standards.
Candidate Info: 3 years in workforce, 1 year at this job; BS Computer Science
Veterans Administration, D.C. - Lead SQL/Informatica
ETL Developer
The five lines of business for VBA are Compensation and Pension, Loan Guaranty,
Education, Vocational Rehabilitation/Education, and Insurance Services, each with its
own data mart. The OLTP systems are VETSNET and BDN, and the various data marts are
CPMR, BIRLS, and VOR.
Worked as an ad hoc developer and created reports for ad hoc requests from individual
senators and congressmen, the press, the GAO, inspector general offices, and the VA secretary
using the EDW.
Worked on end-to-end analysis of the EDW to report on the questions/clarifications of ad hoc
requests with supporting data and findings.
Created high-performance SQL scripts to reduce the time taken to produce ongoing monthly and
weekly ad hoc reports.
Thoroughly tested the SQL scripts by validating the data and cross-comparing the data across
the five lines of business for ad hoc reports.
Shared the business/technical knowledge gained within the team in ad hoc meetings, peer code
reviews, and VOR/BIRLS code reviews.
Worked on defect fixes in VOR to produce business solutions using the ETL (Informatica) tool.
Offered various approaches in brainstorming sessions to decide the best approach for a fix.
Worked on converting some of the PL/SQL scripts into Informatica mappings.
Worked on building Informatica mappings for the HAIISS project - VHA Essence data mart.
Candidate Info: 5 years in workforce, 2 years at this job; MS Business Intelligence;
Electronics and Instrumentation
Informatica ETL Developer
Designed extensive ETL mappings with Informatica Designer
Worked with filter transformations and flat files to identify source and target bottlenecks
Worked with various transformations including router transformation, update strategy,
expression transformation, lookup transformation, sequence generator, aggregator
transformation and sorter transformation
Used Oracle to write SQL queries that create/alter/delete tables and to extract the necessary
data
Used UNIX to navigate around the system and check for specific files, the files’ content, change
permissions and see who the current users are
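The file checks described above (existence, content, permissions) can be sketched in Python as a stand-in for the interactive UNIX commands; the control-file name is hypothetical:

```python
import os
import stat
import tempfile

# Sketch of the file checks described above, done from Python rather than a
# shell: verify a file exists, inspect its content, and tighten permissions.
fd, path = tempfile.mkstemp(suffix=".ctl")
with os.fdopen(fd, "w") as fh:
    fh.write("LOAD COMPLETE\n")

assert os.path.exists(path)                     # ls-style existence check
with open(path) as fh:
    content = fh.read()                         # cat-style content check
os.chmod(path, stat.S_IRUSR | stat.S_IWUSR)     # chmod 600
mode = stat.S_IMODE(os.stat(path).st_mode)
print(content.strip(), oct(mode))  # LOAD COMPLETE 0o600
```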
Candidate Info: 7 years in workforce, 4 years at this job; BA Information Technology and
Informatics
Informatica ETL Developer
Automate PL/SQL scripts using Informatica within a framework
Candidate Info: 10 years in workforce, 6 months at this job; BS Computer Information Systems,
MBA Business Administration
Senior Informatica ETL Developer
Develop and maintain data marts on an enterprise data warehouse to support various
UHC defined dashboards such as Imperative for Quality (IQ) program.
Designated owner and accountable for major tasks and took responsibility for actions and
outcomes to ensure timely and cost-effective results for our team.
Coach new team members on technical tools such as Informatica Designer, Workflow Manager,
Workflow Monitor and UHC Data Models.
Analyze data and build reports using Informatica data profiling tool & Toad for Data Analyst tool
so that UHC members can make informed decisions.
Set and follow Informatica best practices, such as creating shared objects in a shared folder
for reusability and standard naming conventions for ETL objects; design complex Informatica
transformations, mapplets, mappings, reusable sessions, worklets, and workflows.
Evaluate business requirements to come up with Informatica mapping design that adheres to
Informatica standards.
Implement performance tuning on a variety of Informatica maps to improve their throughput.
Work with peers from various functional areas to define IT wide processes like code reviews,
unit test documentation and knowledge sharing sessions.
Collaborate and work with business analysts and data analysts to support their data
warehousing and data analysis needs.
Bellevue, WA
June 2012 to Present
Description:
T-Mobile USA is a national provider of wireless voice, messaging and data services capable of reaching
over 293 million Americans where they live, work and play. The project ventures to enhance the
performance and controls of the accounting processes associated with the pre-paid business by
enhancing or replacing some of the underlying systems, such as PRS (Prepaid Reporting System),
currently used to create the accounting entries.
This project is focused exclusively on improving the IT ecosystem within the Prepaid business, which
includes the subscriber types Prepaid, Flex Pay and Wal-Mart Family Mobile. Prepaid subscribers are in
scope as well as subscribers that are part of a hybrid account in Samson, and that in/out of scope
account type/sub type refers to Postpaid (Samson) accounts only.
Responsibilities:
• Involved in gathering and analyzing the requirements and preparing business rules.
• Designed and developed complex mappings using Lookup, Expression, Update Strategy, Sequence
Generator, Aggregator, Router, Stored Procedure, and other transformations to implement complex
logic while coding a mapping.
• Worked with Informatica power center Designer, Workflow Manager, Workflow Monitor and Repository
Manager.
• Developed and maintained ETL (Extract, Transformation and Loading) mappings to extract the data
from multiple source systems like Oracle, SQL server and Flat files and loaded into Oracle.
• Developed Informatica Workflows and sessions associated with the mappings using Workflow
Manager.
• Involved in creating new table structures and modifying existing tables to fit into the
existing data model.
• Extracted data from different databases like Oracle and external source systems like flat files using ETL
tool.
• Involved in debugging Informatica mappings, testing of Stored Procedures and Functions, Performance
and Unit testing of Informatica Sessions, Batches and Target Data.
• Developed Mapplets, Reusable Transformations, Source and Target definitions, mappings using
Informatica 9.1.0.
• Generated queries using SQL to check for consistency of the data in the tables and to update the tables
as per the Business requirements.
• Involved in Performance Tuning of mappings in Informatica.
• Good understanding of source to target data mapping and Business rules associated with the ETL
processes.
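Consistency checks of the kind described above typically compare source and target row counts and hunt for keys that never arrived. A minimal sketch with hypothetical subscriber tables, using sqlite3 for illustration:

```python
import sqlite3

# Consistency-check sketch: compare source and target row counts and look for
# orphaned keys after a load. Table names are hypothetical.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE src_subscriber (sub_id INTEGER, plan TEXT);
CREATE TABLE tgt_subscriber (sub_id INTEGER, plan TEXT);
INSERT INTO src_subscriber VALUES (1,'Prepaid'), (2,'FlexPay'), (3,'Prepaid');
INSERT INTO tgt_subscriber VALUES (1,'Prepaid'), (2,'FlexPay');
""")
src_count = con.execute("SELECT COUNT(*) FROM src_subscriber").fetchone()[0]
tgt_count = con.execute("SELECT COUNT(*) FROM tgt_subscriber").fetchone()[0]
# EXCEPT (MINUS in Oracle) finds source rows that never reached the target.
missing = con.execute("""
    SELECT sub_id FROM src_subscriber
    EXCEPT SELECT sub_id FROM tgt_subscriber
""").fetchall()
print(src_count, tgt_count, missing)  # 3 2 [(3,)]
```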
Environment: Informatica 9.1, Oracle 11g, SQL server 2008 R2, SQL, T-SQL, PL/SQL, Toad 10.6, SQL
Loader, Tidal Enterprise Scheduler 5.3.1, Accurev, Unix, Flat files.
Informatica Developer
ING US
-
West Chester, PA
January 2011 to May 2012
Description:
ING Group operations in the U.S. are performed through ING U.S., which provides large-scale
financial services to retail and institutional clients, including annuities, retirement plans,
mutual funds, life insurance, direct banking, etc. This project involved extracting data from
multiple sources, transforming it per the business logic, and loading it into the data
warehouse, which is ultimately used for reporting.
Responsibilities:
• Prepared technical design/specifications for data Extraction, Transformation and Loading.
• Worked on Informatica Utilities Source Analyzer, warehouse Designer, Mapping Designer, Mapplet
Designer and Transformation Developer.
• Analyzing the sources, transforming data, mapping the data and loading the data into targets using
Informatica Power Center Designer.
• Created reusable transformations to load data from operational data source to Data Warehouse and
involved in capacity planning and storage of data.
• Developed complex mappings such as Slowly Changing Dimensions Type II-Time stamping in the
Mapping Designer.
• Used various transformations like Stored Procedure, Connected and Unconnected lookups, Update
Strategy, Filter transformation, Joiner transformations to implement complex business logic.
• Used Informatica Workflow Manager to create workflows, database connections, sessions and batches
to run the mappings.
• Used Variables and Parameters in the mappings to pass the values between mappings and sessions.
• Created Stored Procedures, Functions, Packages and Triggers using PL/SQL.
• Implemented restart strategy and error handling techniques to recover failed sessions.
• Used Unix Shell Scripts to automate pre-session and post-session processes.
• Did performance tuning to improve Data Extraction, Data process and Load time.
• Wrote complex SQL Queries involving multiple tables with joins.
• Implemented best practices as per the standards while designing technical documents and developing
Informatica ETL process.
Environment: Informatica 8.6/9.1, Oracle 10g, SQL server 2005, SQL, T-SQL, PL/SQL, Toad, Erwin4.x,
Unix, Tortoise SVN, Flat files.
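A minimal sketch of the Slowly Changing Dimension Type II time-stamping pattern mentioned above: on a change, the current dimension row is end-dated and a new version is inserted. This is not the actual ING mapping; the dimension table, the `upsert_scd2` helper and all values are hypothetical, with SQLite standing in for the Oracle target.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""CREATE TABLE dim_customer (
    sk INTEGER PRIMARY KEY AUTOINCREMENT,
    cust_id INTEGER, city TEXT,
    eff_dt TEXT, end_dt TEXT)""")

def upsert_scd2(cust_id, city, as_of):
    """SCD Type II: expire the current version on change, insert a new one."""
    row = cur.execute(
        "SELECT sk, city FROM dim_customer WHERE cust_id=? AND end_dt IS NULL",
        (cust_id,)).fetchone()
    if row and row[1] == city:
        return                      # no change: nothing to do
    if row:                         # attribute changed: end-date current row
        cur.execute("UPDATE dim_customer SET end_dt=? WHERE sk=?", (as_of, row[0]))
    cur.execute("INSERT INTO dim_customer (cust_id, city, eff_dt, end_dt) "
                "VALUES (?, ?, ?, NULL)", (cust_id, city, as_of))

upsert_scd2(100, "Philadelphia", "2011-01-15")
upsert_scd2(100, "West Chester", "2011-06-01")   # address change -> new version
versions = cur.execute(
    "SELECT city, eff_dt, end_dt FROM dim_customer WHERE cust_id=100 ORDER BY sk"
).fetchall()
print(versions)
```

The open-ended row (`end_dt IS NULL`) is the current version; history is preserved in the end-dated rows.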
Informatica Developer
Bank Of America
Charlotte, NC
October 2009 to December 2010
Description:
Bank of America is one of the largest bank holding companies in the United States, with its headquarters located in Charlotte, NC. Bank of America operates in all the states of the U.S. and 40 other countries. It serves approximately 57 million consumer and small business relationships through 5,900 banking centers and 18,000 ATMs. This project was developed to maintain a catalogue of all the products offered online to customers. It involved extracting data from multiple source systems and loading it through an ETL process into staging tables and then into target tables.
Responsibilities:
• Involved in creating Detail design documentation to describe program development, logic, coding,
testing, changes and corrections.
• Extensively involved in writing ETL Specifications for Development and conversion projects.
• Created shortcuts for reusable source/target definitions, Reusable Transformations, mapplets in Shared
folder.
• Involved in requirement definition and analysis in support of Data Warehouse.
• Worked extensively on different types of transformations like Source Qualifier, Expression, Aggregator, Router, Filter, Update Strategy, Lookup, Sorter, Normalizer, Sequence Generator, etc.
• Worked with XSD and XML files generation through ETL process.
• Defined and worked with mapping parameters and variables.
• Designed and developed transformation rules (business rules) to generate consolidated (fact/summary)
data using Informatica ETL tool.
• Performed the performance evaluation of the ETL for full load cycle.
• Checked session and error logs to troubleshoot problems, and used the Debugger for complex mappings.
• Parameterized all variables and connections at all levels in UNIX.
• Created test cases for unit testing and functional testing.
• Coordinated with testing team to make testing team understand Business and transformation rules
being used throughout ETL process.
Environment: Informatica Power center 8.6.1, Oracle 10g, Windows XP, Unix Shell Scripts, SQL, PL/SQL,
Flat files.
Informatica Developer
FedEx
Memphis, TN
July 2008 to September 2009
Description:
FedEx is the world's leading delivery company for overnight packages, with its headquarters located in Memphis, TN. It is a global company, present in 220 countries and about 75,000 cities across the world. The main aim of the project is to capture the details of all FedEx orders and transactions. This project was developed for executive business management, financial and profitability analysis, sales force management, marketing campaigns and product analysis.
Responsibilities:
• Used Informatica ETL to load data from flat files, which includes fixed-length as well as delimited files
and SQL Server to the Data mart on Oracle database.
• Used reverse engineering in Erwin 4.x to understand the existing data model of the data warehouse.
• Worked with creating Dimensions and Fact tables for the data mart.
• Created Informatica mappings, sessions, workflows, etc., for loading fact and dimension tables for data
mart presentation layer.
• Have implemented SCD (Slowly Changing Dimensions) Type I and II for data load.
• Did performance tuning of Informatica components for daily and monthly incremental loading tables.
• Developed Mapplets, reusable transformations, source and target definitions and mappings using
Informatica 7.1.
• Developed mapping using parameters and variables.
• Created complex workflows, with multiple sessions, worklets with consecutive or concurrent sessions.
• Used Timer, Event Raise, Event Wait, Decisions, and Email tasks in Informatica Workflow Manager.
• Used Workflow Manager for creating, validating, testing and running sequential and concurrent batches.
• Implemented source- and target-based partitioning for existing workflows in production to improve performance and reduce run times.
• Analyzed workflow, session, event and error logs for trouble shooting Informatica ETL process.
• Worked with Informatica Debugger to debug the mappings in Informatica Designer.
• Involved in creating test plans, test cases to unit test Informatica mappings, sessions and workflows.
• Involved in migrating Informatica ETL application and Database objects through various environments
such as Development, Testing, UAT and Production environments.
• Documented and presented the production/support documents for the components developed when
handing-over the application to the production support team.
Environment: Informatica Power Center 8.1, Workflow Manager, Workflow Monitor, Erwin 4.0/3.5.2,
TOAD 8.6.1.0, PL/SQL, Flat files, XML, Oracle 10g/9i
ETL Developer
Janus Capital Group
Denver, CO
June 2007 to June 2008
Description:
Janus Capital Group is the 16th largest mutual fund company in the US, with more than 4 million mutual fund investors. For nearly four decades the Denver-based firm has taken a long-term, company-by-company investment approach to gain a differentiated view of the marketplace. In addition to growth, core and international equity funds, Janus manages balanced, alternative, fixed-income and money market funds. The project involved creation of a Data Warehouse: extracting and transforming data from various sources such as flat files, SQL Server and Oracle, and loading it into the target tables.
Responsibilities:
• Participated in requirement gathering meetings with business analysts and ETL architect to understand
Source Data, Data Warehouse data model, Technical Metadata, etc.
• Documented required technical metadata such as source and target definitions.
• Created technical design specifications for mappings, sessions, workflows, etc.
• Worked on different OLTP data sources such as Oracle, SQL Server and Flat files for data extraction.
• Created complex mappings in Informatica Power Center Designer using Aggregate, Expression, Filter,
Sequence Generator, Update Strategy, Rank, Sorter, Lookup, Joiner transformations etc.,
• Implemented Type II slowly changing dimensions using date-time stamping.
• Created reusable transformations and mapplets to reuse while creating ETL mappings.
• Configured and used the Debugger to troubleshoot the Informatica mappings.
• Used Informatica Workflow Manager, Workflow Monitor and pmcmd (the Informatica command line utility) to create, schedule and control workflows, tasks and sessions.
• Worked with DBA team to fix performance issues at Database.
• Wrote SQL, PL/SQL, stored procedures for implementing business rules and transformations.
• Worked with export and import utilities in Repository manager.
• Involved in upgrading from Informatica Power Center 6.1 to Informatica Power Center 7.1 which is used
as ETL tool for loading data into EDW from transactional databases.
Environment: Informatica Power center 7.1, Oracle 9i, SQL Server 2005, SQL, PL/SQL, TOAD, Windows
NT, Unix.
Informatica Developer
Sprint Mobile
-
Kansas City, MO
June 2006 to May 2007
Description:
Sprint Nextel is one of the largest telecommunication companies in the world. With around 53.7 million subscribers, Sprint Nextel operates the third largest wireless telecommunications network in the United States (based on total wireless customers). The company grew steadily through acquisitions and changed its name to United Telecommunications, at which time it provided local telephone service in many areas of the Midwest and the southern part of the USA.
Responsibilities:
• Involved in Relational and Dimensional Data Modeling Techniques to design ERWIN data models.
• Worked on Informatica power center tool - Source Analyzer, Warehouse Designer, Mapping Designer,
Mapplets and Transformation Developer
• Extraction, Transformation and Load was performed using Informatica Power Center to build Data
warehouse.
• Accomplished automated data extraction from various RDBMS via UNIX shell scripts, ETL processing
using Informatica and loading into Data Mart.
• Developed complex mappings in Informatica Power Center to load the data from various sources using
different transformations like Source Qualifier, Look up (connected and unconnected), Expression,
Aggregate, Update Strategy, Joiner, Filter and Router.
• Created and scheduled Sessions and Batches through the Informatica Server Manager.
• Designed and documented validation rules, error handling and test strategy of ETL process.
• Tuned Informatica mappings/sessions for better ETL performance by eliminating bottlenecks.
Environment: Informatica Power center 6.1/7.1, Oracle 9i, SQL Server 2000, SQL, PL/SQL, TOAD,
Windows NT, Unix.
ETL Developer
Eli Lilly and Company
Indianapolis, IN
October 2005 to May 2006
Description:
Eli Lilly and Company is a leading pharmaceutical company in Indianapolis. The project involved Data Warehouse development for the sales division, covering the sales data for the company's pharmaceutical and medical products. The Data Warehouse enables management to better leverage information collected within current operational systems to support its decision-making process.
Responsibilities:
• Involved in analyzing and development of the Data Warehouse.
• Worked on Informatica power center tool - Source Analyzer, Warehouse Designer, Mapping Designer,
Mapplets and Transformation Developer
• Created various mappings using Aggregate, Filter, Join, Expression, Lookup, Update Strategy and
Router.
• Extensively used ETL to load data from different databases and flat files to Oracle.
• Involved in the development of Informatica mappings and also tuned them for better performance.
• Created and scheduled Sessions and Batches through the Informatica Server Manager.
• Worked with sessions and batches using Server Manager to load data into the target database.
• Testing for Data Integrity and Consistency.
Environment: Informatica Power Center 6.1, Oracle 8i, SQL, Windows 98/2000, Shell scripts.
Additional Information
Technical Skills:
ETL Tools Informatica Power Center 9.x/8.x/7.x/6.x
Data Modeling Erwin 4.0/3.5, Toad, SQL Station, MS Visio, SQL Developer
DBMS Oracle […] IBM DB2, MS SQL Server 2008 […] Microsoft Access, Excel, ODBC
Programming Languages SQL, PL/SQL, T-SQL, Unix Shell scripting
Operating Systems […] UNIX, LINUX
PROFESSIONAL SUMMARY:
6+ years of IT experience in Analysis, Design, Development, Implementation, Testing and Support of Data Warehousing projects.
Proficiency in developing SQL with various relational databases like Oracle, SQL Server.
Have extensively worked on developing ETL programs supporting data extraction, transformation and loading.
Experience with dimensional modeling using star schema and snowflake models.
Understood the business rules completely, based on high-level document specifications.
Created UNIX shell scripts to run the Informatica workflows and control the ETL flow.
Developed OLAP applications using Cognos 8 BI (Framework Manager, Cognos Connection, Report Studio, Query Studio, and Analysis Studio) and extracted data from the enterprise data warehouse for reporting.
Extensively worked on Informatica performance tuning, addressing source-level, target-level and mapping-level bottlenecks.
Strong business understanding of verticals like Banking, Brokerage, Insurance, Mutual funds and
Pharmaceuticals.
Ability to meet deadlines and handle multiple tasks; decisive, with strong leadership qualities, and flexible. Team player, motivated, able to grasp things quickly, with analytical and problem-solving skills.
SOFTWARE KNOWLEDGE:
Description:
Humana Inc., headquartered in Louisville, Kentucky, is a leading health care company that offers a wide range of insurance products and health and wellness services. Humana provides Medicare Advantage plans and prescription drug coverage to more than 3.5 million members throughout the US.
The main objective of this project, the Shared Data Repository, is to capture new Vitality program customer data, policies, group policies, and HumanaOne and non-HumanaOne Medicare plans.
Data comes from various sources like SQL Server, Mainframe, etc., and is loaded into the EDW at different frequencies as per the requirement. The entire ETL process consists of source systems, a staging area, the Data Warehouse and the Data Mart.
Responsibilities:
Created shell scripts to fine-tune the ETL flow of the Informatica workflows.
Used Informatica file watch events to poll the FTP sites for the external mainframe files.
Provided production support to resolve ongoing issues and troubleshoot problems.
Performance tuning was done at the functional level and map level; used relational SQL wherever possible.
Involved in enhancements and maintenance activities of the data warehouse, including tuning and modifying existing mappings.
Effectively worked in Informatica version based environment and used deployment groups to migrate
the objects.
Used debugger in identifying bugs in existing mappings by analyzing data flow, evaluating
transformations.
Pre- and post-session assignment variables were used to pass variable values from one session to another.
Designed workflows with many sessions and with decision, assignment, event wait, and event raise tasks.
Reviewed and analyzed functional requirements and mapping documents; handled problem solving and troubleshooting.
Performed unit testing at various levels of the ETL and actively involved in team code reviews.
Identified problems in existing production data and developed one time scripts to correct them.
Fixed invalid mappings and troubleshot technical problems with the database.
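The file-watch step mentioned above (waiting for externally landed mainframe files before the load starts) can be sketched as a stand-alone poller. This is only an illustrative analogue of the Informatica file watch event, not the product feature itself; it polls a local temporary directory and treats a stable file size as "transfer complete".

```python
import os
import tempfile
import threading
import time

def wait_for_file(path, timeout=10.0, interval=0.1):
    """Poll until `path` exists and its size is stable, or time out."""
    deadline = time.monotonic() + timeout
    last_size = -1
    while time.monotonic() < deadline:
        if os.path.exists(path):
            size = os.path.getsize(path)
            if size == last_size:        # size unchanged -> transfer finished
                return True
            last_size = size
        time.sleep(interval)
    return False

# Usage: a timer thread "lands" the file while the main thread waits.
landing_dir = tempfile.mkdtemp()
target = os.path.join(landing_dir, "extract.dat")
threading.Timer(0.3, lambda: open(target, "w").close()).start()
arrived = wait_for_file(target)
print(arrived)
```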
Description:
Allstate is one of the fastest growing auto/property/life insurance companies. It serves its customers by offering a range of innovative products to individual and group customers at more than 600 locations through its company-owned offices.
The primary objective of this project is to capture customer, policy, claims, agent, product and financial data from multiple OLTP systems and flat files. Data was extracted, transformed and loaded into the data warehouse using Informatica PowerCenter, and various reports were generated on a daily, weekly, monthly and yearly basis. These reports give details of the various Allstate insurance products that are sold; they are used for identifying agents for rewards, awards and performance, and as risk analysis reports for business development managers.
Responsibilities:
Involved in all phases of the SDLC, from requirement gathering, design, development and testing to production support.
Created new mapping designs using various tools in Informatica Designer, such as Source Analyzer, Warehouse Designer, Mapping Designer and Transformation Developer.
Developed the mappings using the needed transformations in Informatica according to the technical specifications.
Created complex mappings that involved implementation of Business Logic to load data in to staging
area.
Developed mappings/sessions using Informatica Power Center 8.6 for data loading.
Performed data manipulations using various Informatica Transformations like Filter, Expression, Lookup
(Connected and Un-Connected), Aggregate, Update Strategy, Normalizer, Joiner, Router, Sorter and
Union.
Developed workflows using the Task Developer, Worklet Designer and Workflow Designer in Workflow Manager.
Extracted data from Oracle and SQL Server then used Teradata for data warehousing.
Participated in weekly status meetings, and conducted internal and external reviews as well as formal walkthroughs.
Description:
This position requires implementing data warehouse for Forecasting, Marketing, Sales performance reports.
The data is obtained from Relational tables and Flat files. I was involved in cleansing and transforming the
data in the staging area and then loading into Oracle data marts. This data marts/Data warehouse is an
integrated Data Mine that provides feed for extensive reporting.
Responsibilities:
Used Informatica Power Center for extraction, transformation and loading (ETL) of data from various sources.
Created mappings using Designer and extracted data from various sources, transformed data according
to the requirement.
Involved in extracting the data from the Flat Files and Relational databases into staging area.
Migrated Mappings, Sessions and Workflows from the Development to the Test and then to the UAT environment.
Developed Informatica Mappings and Reusable Transformations to facilitate timely Loading of Data of a
star schema.
Developed the Informatica Mappings by usage of Aggregator, SQL overrides usage in Lookups, source
filter usage in Source qualifiers, and data flow management into multiple targets using Router.
Created sessions, extracted data from various sources, and transformed the data according to the requirement.
Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner,
Router and Aggregator to create robust mappings in the Informatica Power Center Designer.
Imported various heterogeneous files using Informatica Power Center 8.x Source Analyzer.
Developed several reusable transformations and mapplets that were used in other mappings.
Environment:
Informatica 8.1.1 Power Center, Teradata, Oracle 11g, Windows NT.
Description:
Prudential Financial companies serve individual and institutional customers worldwide and include The
Prudential Insurance Company of America, one of the largest life insurance companies in the U.S. These
companies offer a variety of products and services, including mutual funds, annuities, real estate brokerage
franchises, relocation services, and more. The project involved the development and implementation of goals, policies, priorities and procedures relating to financial management, budget and accounting, and analysis of monthly actual results versus plan and forecast.
Responsibilities:
Involved in design, development and maintenance of database for Data warehouse project.
Involved in Business Users Meetings to understand their requirements.
Designed, developed and supported the Extraction, Transformation and Load (ETL) process for the data warehouse.
Developed various mappings using Mapping Designer and worked with Aggregator, Lookup, Filter,
Router, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter and Sequence Generator
transformations.
Created complex mappings which involved Slowly Changing Dimensions and implementation of Business Logic.
Worked extensively with the connected lookup Transformations using dynamic cache.
Created and scheduled sessions and jobs based on demand, to run on time, and to run only once.
Performed Unit testing, Integration testing and System testing of Informatica mappings
Coded Unix scripts to capture data from different relational systems into flat files, for use as source files.
Description:
The Oakwood Healthcare System serves 35 different communities in southeastern Michigan with over 40
primary and secondary care locations. Responsibilities include working with the clinical analytics team on
the measurement of provider performance, quality improvement initiatives, and various ad-hoc requests. The
reports are created, distributed and published using various Cognos BI tools like ReportNet, Impromptu,
Power Play, IWR, and UpFront to the end-users. The application had OLAP features like Drill Down analysis,
Multidimensional analysis, Prompts, Exception Highlighting and User Privileges.
Responsibilities:
Developed Layout, Pages, Object Containers and Packages using Report Studio.
Created Multidimensional Cubes using PowerPlay and published on the UpFront Portal using PowerPlay
Enterprise Server.
Developed PowerPlay cubes, used multiple queries, calculated measures, and customized cube content.
Fine-tuned the Cubes and checked the database space issue and cube growth periodically.
Responsible for the creation of new User Groups and User Classes using Access Manager.
Environment: Cognos BI (Framework Manager, Cognos Connection, Report Studio, Query Studio), Oracle 11g, SQL Server 2005.
Description:
Aventis is a pharmaceutical company which provides new and improved biotech drugs for various diseases and their symptoms. The objective of the project is to extract data stored in different databases and load it into the Oracle system, which is the staging area; the business logic is then applied to transform the tables in the required way. The data warehouse is fed by marketing data, sample data, market (competitor) data, prescription data and others.
Responsibilities:
Extensively used ETL to load data from flat files, XML and Oracle to Oracle 8i.
Developed data Mappings between source systems and warehouse components using Mapping
Designer
Worked extensively on different types of transformations like Source Qualifier, Expression, Filter, Aggregator, Rank, Update Strategy, Lookup, Stored Procedure, Sequence Generator, Joiner and XML.
Setup folders, groups, users, and permissions and performed Repository administration using
Repository Manager.
Involved in the performance tuning of the Informatica mappings, stored procedures and SQL queries.
Involved in the performance tuning of the database and Informatica; improved performance by identifying and resolving bottlenecks.
Wrote PL/SQL Packages and Stored procedures to implement business rules and validations.
Environment: Informatica 7.1.3, ORACLE 10g, UNIX, Windows NT 4.0, UNIX Shell Programming, PL/SQL,
TOAD Quest Software
Description:
Core project focused on designing and implementing scalable solutions that would support the company's continued dramatic growth, undergirded by the corporate data warehouse.
Responsibilities:
Used Oracle Warehouse Builder to implement changes to the operational data store, as well as to create data marts.
Involved in the data analysis for source and target systems; good understanding of Data Warehousing concepts.
Involved in supporting and maintaining Oracle Import, Export and SQL*Loader jobs.
Professional Summary:
Over 7 years of programming experience as an Oracle PL/SQL Developer in Analysis,
Design and Implementation of Business Applications using the Oracle Relational Database
Management System (RDBMS).
Involved in all phases of the SDLC (Software Development Life Cycle) from analysis,
design, development, testing, implementation and maintenance with timely delivery against
aggressive deadlines.
Experience with Data flow diagrams, Data dictionary, Database normalization theory
techniques, Entity relation modeling and design techniques.
Expertise in client-server application development using Oracle 11g/10g/9i/8i, PL/SQL, SQL*Plus, TOAD and SQL*Loader.
Effectively made use of Table Functions, Indexes, Table Partitioning, Collections,
Analytical functions, Materialized Views, Query Re-Write and Transportable table
spaces.
Strong experience in Data warehouse concepts, ETL.
Good knowledge on logical and physical Data Modeling using normalizing Techniques.
Created Tables, Views, Constraints, Index (B Tree, Bitmap and Function Based).
Developed Complex database objects like Stored Procedures, Functions, Packages and
Triggers using SQL and PL/SQL.
Developed materialized views for data replication in distributed environments.
Excellent technical and analytical skills with clear understanding of design goals of ER
modeling for OLTP and dimension modeling for OLAP.
Experience in Oracle supplied packages, Dynamic SQL, Records and PL/SQL Tables.
Loaded Data into Oracle Tables using SQL Loader.
Partitioned large Tables using range partition technique.
Experience with Oracle Supplied Packages such as DBMS_SQL,DBMS_JOB and UTL_FILE.
Created Packages and Procedures to automatically drop table indexes and create indexes
for the tables.
Worked extensively on Ref Cursor, External Tables and Collections.
Expertise in Dynamic SQL, Collections and Exception handling.
Experience in SQL performance tuning using Cost-Based Optimization (CBO).
Good knowledge of key Oracle performance related features such as Query Optimizer,
Execution Plans and Indexes.
Experience with Performance Tuning for Oracle RDBMS using Explain Plan and HINTS.
Experience in ETL techniques and Analysis and Reporting including hands on experience with
the Reporting tools such as Cognos.
Created Shell Scripts for invoking SQL scripts and scheduled them using crontab.
Excellent communication, interpersonal, analytical skills and strong ability to perform as part of
a team.
Technical Skills:
Databases: Oracle 7.x/8i/9i/10g/11g (SQL, PL/SQL, Stored Procedures, Triggers), MS SQL
SERVER 2000/2005/2008, DB2/UDB, Teradata, SAP Tables and MS Access.
ETL Tools: Informatica (PowerCenter 5.1/6.2/7.1/8.6.1/9.1.0, PowerMart 5.1, Power Connect/Power Exchange for SAP R/3, Mainframe, and Oracle Change Data Capture (CDC)), AB Initio 1.8 and SQL*Loader.
Reporting Tools: Business Objects Developer Suite 5.1/BO XIR2, Cognos Suite, Cognos Report
Net 1.1MR2, Crystal Reports, Oracle Reports 2.5
Operating Systems: UNIX(Sun Solaris, LINUX, HP UNIX, AIX), Windows NT/98/95/2000 &
Windows XP.
Data Modeling: Erwin 3.5.2,4.0
Languages/Utilities: SQL, PL/SQL, Unix shell scripts, Java, XML, C and Cobol.
Other Tools: AutoSys, Control-M, PVCS, WinCVS, Informatica Data Quality, B2B Data Transformation, Informatica PowerExchange, Informatica 9.1.0 Developer/Analyst, TPump, FastLoad, BTEQ
Education Qualifications:
Bachelor’s in Computer Science
Professional Experience:
Project Name: Confidential
Client: Confidential, NJ
Role: Oracle PL/SQL Developer
Duration: Sep’ 11 – Present
Description:
The PIAS (Property Insurance Application System) used by Personal lines department which
involves in providing homeowners insurance application. This Product does not enable users to
capture Dwelling applications beyond creation of Reference number. In this Release the product is
enabled to capture Dwelling new business applications through PIAS.
Responsibilities:
Coordinated with the front end design team to provide them with the necessary stored
procedures and packages and the necessary insight into the data.
Worked on SQL*Loader to load data from flat files obtained from various facilities every day.
Created and modified several UNIX shell scripts according to the changing needs of the project and client requirements.
Wrote Unix shell scripts to process the files on a daily basis: renaming the file, extracting the date from the file, unzipping the file and removing junk characters from the file before loading it into the base tables.
Involved in the continuous enhancements and fixing of production problems.
Generated server side PL/SQL scripts for data manipulation and validation and materialized
views for remote instances.
Developed PL/SQL triggers and master tables for automatic creation of primary keys.
Created PL/SQL stored procedures, functions and packages for moving the data from the staging area to the data mart.
Created scripts to create new tables, views, queries for new enhancement in the application
using TOAD.
Created indexes on the tables for faster retrieval of the data to enhance
database performance.
Involved in data loading using PL/SQL and SQL*Loader calling UNIX scripts to download and
manipulate files.
Performed SQL and PL/SQL tuning and application tuning using tools such as EXPLAIN PLAN, SQL*TRACE, TKPROF and AUTOTRACE.
Extensively involved in using hints to direct the optimizer to choose an optimum query
execution plan.
Used Bulk Collections for better performance and easy retrieval of data, by reducing
context switching between SQL and PL/SQL engines.
Created PL/SQL scripts to extract the data from the operational database into simple flat text
files using UTL_FILE package.
Created database objects like tables, views, materialized views, procedures and packages using Oracle tools like TOAD, PL/SQL Developer and SQL*Plus.
Partitioned the fact tables and materialized views to enhance the performance.
Extensively used bulk collection in PL/SQL objects to improve performance.
Created records, tables, collections (nested tables and arrays) for improving Query
performance by reducing context switching.
Used Pragma Autonomous Transaction to avoid the mutating table problem in database triggers.
Extensively used the advanced features of PL/SQL like Records, Tables, Object
types and Dynamic SQL.
Handled errors using Exception Handling extensively for the ease of debugging and
displaying the error messages in the application.
Environment: Oracle 11g, SQL * Plus, TOAD, SQL*Loader, SQL Developer, Shell Scripts, UNIX,
Windows XP
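The UTL_FILE-style extract described above (dumping query results from operational tables into simple flat text files) can be sketched with an in-memory database. The table, columns and delimiter are hypothetical, and Python's csv module stands in for the UTL_FILE package used in the actual project.

```python
import csv
import io
import sqlite3

# Hypothetical operational table; SQLite stands in for the Oracle source.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE policy (policy_no TEXT, holder TEXT, premium REAL)")
cur.executemany("INSERT INTO policy VALUES (?, ?, ?)",
                [("P-100", "Smith", 850.0), ("P-101", "Jones", 1200.5)])

buf = io.StringIO()                    # stands in for the OS file handle
writer = csv.writer(buf, delimiter="|")
writer.writerow(["POLICY_NO", "HOLDER", "PREMIUM"])   # header record
for row in cur.execute(
        "SELECT policy_no, holder, premium FROM policy ORDER BY policy_no"):
    writer.writerow(row)               # one delimited record per source row

flat_file = buf.getvalue()
print(flat_file)
```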
Project Name: Confidential
Client: Confidential, DE
Role: Oracle PL/SQL Developer
Duration: Jun’ 07 to Aug’ 11
Description:
Bank of America is a leading financial institution in the USA. Customer information systems hold customer, account, address, offer, service and statement related details in IMS and DB2 databases. The CIS systems pass customer information to ATM, call center, online banking and mobile banking systems in online and batch modes.
Responsibilities:
Built complex queries using SQL and wrote stored procedures using PL/SQL for various APIs like Java and .Net, and for databases like Oracle and Access.
Developed and modified a number of Forms for various modules; also responsible for following up on bugs reported by various users and suggesting possible patches to be applied.
Wrote Shell Scripts for Data loading and DDL Scripts.
Worked in Production Support Environment as well as QA/TEST environments for projects,
work orders, maintenance requests, bug fixes, enhancements, data changes, etc.
Used Oracle JDeveloper to support JAVA, JSP and HTML codes used in modules.
Wrote conversion scripts using SQL, PL/SQL, stored procedures,
functions and packages to migrate data from SQL server database to Oracle database.
Performed database administration of all database objects, including tables, clusters, indexes, views, sequences, packages and procedures.
Implemented Oracle 11g and upgraded the existing database from Oracle 9i to Oracle 11g.
Involved in Logical & Physical Database Layout Design.
Set-up and Design of Backup and Recovery Strategy for various databases.
Performance tuning of Oracle Databases and User applications.
Used SQL*Loader as an ETL tool to load data into the staging tables.
Used DTS Packages as ETL tool for migrating Data from SQL Server 2000 to Oracle 10g.
Provided user training and production support.
Improved the performance of the application by rewriting the SQL queries.
Wrote packages to fetch complex data from different tables in remote databases using joins,
sub queries and database links.
Environment: VB 6, Oracle 9i/10g/11g SQL, PL/SQL, Forms 9i, Reports 9i, SQL*Loader, SQL
Navigator, Crystal Reports, TOAD.
Project Name: Confidential, Mumbai, India (Contractor to Accenture)
Client: Confidential
Role: Oracle Developer/Analyst
Duration: Feb’ 04 to Mar’ 06
Description:
This system keeps track of day-to-day bank operations like deposits, withdrawals, demand drafts,
different types of loans (mortgage business) for the customers as well as the employees.
Responsibilities:
Environment: Oracle 9i, 10g, SQL*Plus, PL/SQL, Erwin 4.1, Oracle Designer 2000, Windows 2000,
Toad.
Oracle PLSQL Developer Resume - Sample 1
Aayush Tandon
Career Objective
To obtain a position where I can utilize my technical skills for the enhancement of
Oracle database systems.
Career Summary
Extensive use of Unix shell scripts, cron jobs and AutoSys to automate processes.
Used PL/SQL to create packages, functions and procedures.
Used PL/SQL and SQL*Loader to create ETL packages for flat file loading and
error capturing into log tables.
Involved in testing the application for Oracle 9i to 10g upgrade.
Tuned SQL queries that took a long time to process, using Explain Plan and hints to reduce response times.
Involved in all phases of database development, from needs assessment to
QA/QC, design, and support.
Trained new recruits.
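The flat-file load with error capture into log tables mentioned above can be sketched as follows. This is a hedged illustration only: the original work used PL/SQL and SQL*Loader, whereas here the file layout, table names and validation rule are invented, and SQLite stands in for Oracle.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE sales (item TEXT, qty INTEGER)")
cur.execute("CREATE TABLE load_errors (line_no INTEGER, raw TEXT, reason TEXT)")

# Stand-in for the incoming flat file (two delimited fields per record).
flat_file = ["widget,10", "gadget,notanumber", "gizmo,5", "malformed line"]

for line_no, raw in enumerate(flat_file, start=1):
    try:
        item, qty = raw.split(",")          # fixed two-field delimited layout
        cur.execute("INSERT INTO sales VALUES (?, ?)", (item, int(qty)))
    except ValueError as exc:               # bad field count or bad number
        cur.execute("INSERT INTO load_errors VALUES (?, ?, ?)",
                    (line_no, raw, str(exc)))

loaded = cur.execute("SELECT COUNT(*) FROM sales").fetchone()[0]
errors = cur.execute("SELECT line_no FROM load_errors ORDER BY line_no").fetchall()
print(loaded, errors)
```

Good records load; bad records land in the log table with the line number and rejection reason, mirroring SQL*Loader's bad-file behavior.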
Technical Experience
Achievements
Built a Hotel Management System using VB.Net with Postgres as the database.
Built a Java chat server using JUnit as the testing tool for the product.
Received many appreciations and awards for excellent work.
Employer
Date Of Birth: xx/xx/19xx
Languages Known: Hindi, English
Address: JKSJKDS
Oracle PLSQL Developer Resume - Sample 2
QMC
Career Objective
Technical Skills
Skill Sets
Confident
Detail oriented
Time management skills
Analytical skills
Excellent communication and interpersonal skills
Key Responsibilities Handled
Working as an Oracle Database Administrator (DBA) in SKA Ltd. from 20** till
date.
Worked as a Jr. Oracle Database Administrator in PMP Ltd. from 20** to 20**.
Achievements