
Thimmarayudu Gangavaram
Mobile: +91 8007779596
E-Mail: thimmarayudugangavaram@gmail.com

CAREER ABSTRACT
Overall 10 years of IT experience leading teams that drive data warehousing applications through design, development, and maintenance, including 3.5 years of experience working with the Snowflake cloud data warehouse, Azure Blob Storage, and AWS S3.

 Implemented solutions using Snowflake.

 Good experience in designing data pipeline solutions using Azure and Snowflake.

 Experience using Snowflake cloning and Time Travel; worked with Streams & Tasks.

 Strong understanding of the principles of data warehousing using fact tables and dimension tables.

 Knowledge of data loading (full, initial, and delta uploads), scheduling, and monitoring.

 Directly responsible for extraction, cleansing, transformation, and loading of data from multiple feeds and sources into the data warehouse.

 Extensive experience in all data warehouse/data mart testing phases (unit testing, integration testing, system testing, UAT, stress testing, break-the-box testing) and in creating the corresponding documents: unit test plans, test cases, test scripts, and test schedules.

 Experience with Snowflake multi-cluster warehouses and migrating Oracle objects into the Snowflake environment.

 Experience with Snowflake virtual warehouses, building Snowpipe, and extensive experience with data sharing.

 In-depth knowledge of Snowflake database, schema, and table structures.

 Excellent knowledge and experience in configuring connections to various sources and creating source-to-target mappings, edit rules and validation, transformations, and business rules, ensuring adherence to locally defined standards for all developed components.
 In-depth knowledge of Snowflake performance activities and loading of structured and semi-structured files into Snowflake.
 Experience with Snowflake SQL: all DDL and DML commands, joins, and aggregate and analytical functions.
 Experience working across different areas of RDBMS, including data loading through SQL*Loader.
 Experience writing complex SQL scripts using statistical aggregate functions and analytical functions to support ETL in the Snowflake cloud data warehouse.
 Involved in performance tuning of complex SQL using techniques such as collecting statistics, creating temporary tables, and adding indexes.
 Created views per business recommendations, with all required filters, to help downstream applications capture data and generate reports.
 Experienced in onsite client interaction, with a good understanding of the onsite/offshore model.
 Experience in leading and managing teams; handled multiple roles, including Tech Lead and Developer.
 Good analytical, strong interpersonal, and excellent communication and managerial skills.
 Extensive experience with the analysis, design, development, customization, and implementation of software applications.
 Handled large and complex data sets such as JSON, ORC, Parquet, and CSV files from sources like AWS S3.
 Hands-on experience in bulk loading and unloading data in Snowflake tables using the COPY command (see the sketch after this list).
 Proficient in SQL performance tuning; worked on databases such as Oracle.
 Good experience with ETL Informatica mappings, sessions, and workflows.
 Knowledge of Power BI.
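
As a brief illustration of the COPY-based loading and unloading referenced above, here is a minimal sketch in Snowflake SQL; the stage, table, and file format names are hypothetical, not taken from any specific project:

    -- Hypothetical file format for the CSV feeds.
    CREATE OR REPLACE FILE FORMAT csv_fmt
      TYPE = 'CSV' FIELD_OPTIONALLY_ENCLOSED_BY = '"' SKIP_HEADER = 1;

    -- PUT uploads local files to the table's internal stage (run via SnowSQL).
    PUT file:///tmp/sales*.csv @%sales;

    -- COPY bulk loads the staged files into the target table.
    COPY INTO sales
      FROM @%sales
      FILE_FORMAT = (FORMAT_NAME = csv_fmt)
      ON_ERROR = 'CONTINUE';

    -- COPY INTO <location> unloads table data back to a (hypothetical) named stage.
    COPY INTO @export_stage/sales/
      FROM sales
      FILE_FORMAT = (FORMAT_NAME = csv_fmt);
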
PROFICIENCY MATRIX

Worked as a Snowflake Developer at Mann-Hummel Pvt Ltd from May 2021 to 28th Oct 2022.
Worked as a Business Analyst at Kelly Services Pte Ltd, Singapore, from Feb 2019 to Feb 2020.
Worked as a Project Lead at LTI InfoTech Ltd, Chennai, from Oct 2018 to Feb 2019.
Worked as an Associate Consultant at Atos Global IT Solutions and Services Pvt Ltd, Pune, from July 2011 to Oct 2018.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Education Details

MBA from SV University, 2004

Databases : Snowflake, Oracle


ETL Tools : Informatica
Reporting tools : Power BI

CAREER CONTOUR
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Project #1
Project Title : Mann-Hummel
Client : Mann-Hummel
Environment : Snowflake, Azure Blob Storage, Oracle Database
Role : Snowflake Developer.
Duration : From May 2021 to Oct 2022

Responsibilities:

 Involved in requirement analysis and preparation of functional design documents.

 Created table DDLs in the Snowflake development database.
 Used COPY statements to ingest data from stages into tables.
 Implemented Snowpipe for near-real-time data ingestion.
 Cloned production data for code modifications and testing.
 Worked with multiple data sources.
 Created data shares out of Snowflake with the testing team.
 Worked with streams for change data capture and implemented SCDs (slowly changing dimensions); see the sketch below.
 Implemented solutions using Snowflake's data sharing, cloning, and Time Travel.
 Involved in unit testing and integration testing.
 Established and ensured adoption of best practices and development standards.
 Communicated with peers and supervisors routinely; documented work, meetings, and decisions.
 Bulk loaded data from the external stage (AWS S3) into Snowflake using the COPY command.
 Loaded data into Snowflake tables from the internal stage and from files on the local machine.
 Used COPY, LIST, PUT, and GET commands to validate internal and external stage files.
 Imported and exported data between the internal stage (Snowflake) and the external stage (S3 bucket).
 Wrote complex Snowflake SQL scripts in the Snowflake cloud data warehouse for business analysis and reporting.
 Responsible for task distribution among the team.
 Performed troubleshooting, analysis, and resolution of critical issues.
 Created databases according to business requirements.
 Used COPY to bulk load data.
 Created Snowpipe for continuous data loading.
 Worked on data validation between the Oracle and Snowflake databases.
 Created internal and external stages and transformed data during load.
 Shared sample data by granting access to the customer for User Acceptance Testing (UAT).
 Involved in data cleansing: detecting and correcting or removing inaccurate data or records from a database.
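
A minimal sketch of the Snowpipe-plus-stream pattern referenced above, in Snowflake SQL; the stage, table, and pipe names are hypothetical, and the cloud notification setup (Azure Event Grid or S3 event notifications) is omitted:

    -- Hypothetical raw landing table and continuous-ingestion pipe.
    CREATE OR REPLACE TABLE raw_orders (v VARIANT);

    CREATE OR REPLACE PIPE orders_pipe AUTO_INGEST = TRUE AS
      COPY INTO raw_orders
      FROM @ext_orders_stage
      FILE_FORMAT = (TYPE = 'JSON');

    -- A stream records changes on the raw table for change data capture.
    CREATE OR REPLACE STREAM raw_orders_stream ON TABLE raw_orders;

    -- Simplified type-1 SCD merge driven by the stream (dim_orders assumed to exist).
    MERGE INTO dim_orders d
    USING (SELECT v:order_id::NUMBER AS order_id, v:status::STRING AS status
           FROM raw_orders_stream) s
    ON d.order_id = s.order_id
    WHEN MATCHED THEN UPDATE SET d.status = s.status
    WHEN NOT MATCHED THEN INSERT (order_id, status) VALUES (s.order_id, s.status);
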
Project #2
Project Title : DBS Bank
Client : DBS Bank
Environment : Snowflake, Azure Blob Storage, file systems
Role : Snowflake Developer.
Duration : From Feb 2019 to Feb 2020

Responsibilities:

 Understood the business functionality of the system.

 Involved in migrating objects from SQL to Snowflake.
 Created Snowpipe for continuous data loads and used COPY for bulk data loads.
 Created internal and external stages and transformed data during load.
 Cloned production data for code modifications and testing.
 Shared sample data by granting access to the customer for UAT.
 Retrieved historical data using Time Travel, and used it to recover missed data (see the sketch below).
 Heavily involved in testing Snowflake to determine the best possible way to use cloud resources.
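
A minimal sketch of the Time Travel and cloning features mentioned above; table and database names are hypothetical, and <query_id> is a placeholder for the statement that caused the data loss:

    -- Query a table as it existed one hour ago (Time Travel).
    SELECT * FROM transactions AT (OFFSET => -60*60);

    -- Recover rows deleted by a specific statement.
    INSERT INTO transactions
      SELECT * FROM transactions BEFORE (STATEMENT => '<query_id>')
      MINUS
      SELECT * FROM transactions;

    -- Zero-copy clone of production data for testing and code changes.
    CREATE OR REPLACE DATABASE dev_db CLONE prod_db;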

Project #3
Project Title : LTI Infotech
Environment : Snowflake
Role : ETL Developer (SQL and PL/SQL)
Duration : From Oct 2018 to Feb 2019

Responsibilities:

 Understood the business functionality of the system.

 Involved in migrating objects from SQL to Snowflake.
 Created Snowpipe for continuous data loads and used COPY for bulk data loads.
 Created internal and external stages and transformed data during load.
 Cloned production data for code modifications and testing.
 Shared sample data by granting access to the customer for UAT.
 Retrieved historical data using Time Travel, and used it to recover missed data.
 Heavily involved in testing Snowflake to determine the best possible way to use cloud resources.

Project #4
Project Title : Nokia Siemens Networks
Environment : Oracle database
Role : ETL Developer (SQL and PL/SQL)
Duration : From Nov 2016 to Oct 2018

Responsibilities:

 Analyzed areas for service improvement and implemented fixes.
 Developed SQL queries based on customer requirements using various joins.
 Performed all SDLC phases to complete ETL development work (requirement gathering, analysis, design, unit testing, deployment).
 Created views, materialized views, indexes, and hints (see the sketch below).
 Executed automated scripts and updated the logs; involved in executing the scripts with respect to functionality.
 Committed developed scripts to a Git repository.
 Reported issues and clarifications, and raised bugs for product issues.
 Attended daily stand-up meetings and gave updates.
 Involved in daily regression runs.
 Managed the team in knowledge transfer and mentoring activities.
 Provided weekly and monthly reports on project activities and performance.
 Prepared solution approach documents, design documents, and a Requirement Traceability Matrix for new SOWs, and shared them with the customer and other teams.
 Worked on effort estimation for change requests.
 Performed data analysis and issue resolution as part of production support.
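
A minimal sketch of the kind of Oracle objects listed above (materialized view, index, hint); the table and column names are hypothetical:

    -- Hypothetical materialized view refreshed on demand.
    CREATE MATERIALIZED VIEW mv_daily_usage
      BUILD IMMEDIATE
      REFRESH COMPLETE ON DEMAND
    AS
      SELECT TRUNC(call_time) AS call_day, COUNT(*) AS calls
      FROM   cdr_usage
      GROUP  BY TRUNC(call_time);

    -- Supporting index on the base table.
    CREATE INDEX idx_cdr_call_time ON cdr_usage (call_time);

    -- Optimizer hint forcing a full scan where it is known to be faster.
    SELECT /*+ FULL(u) */ COUNT(*)
    FROM cdr_usage u
    WHERE call_time >= SYSDATE - 1;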

Project #5
Project Title : KAS Bank (Netherland)
Environment : Oracle database
Role : ETL Developer (SQL and PL/SQL)
Duration : From Jan 2014 to Oct 2016

Responsibilities:

 Involved in the full development cycle of planning, analysis, design, development, testing, and implementation.
 Created and modified SQL*Plus, PL/SQL, and SQL*Loader scripts for data conversions.
 Developed and modified triggers, packages, functions, and stored procedures for data conversions, and PL/SQL procedures to create database objects dynamically based on user inputs.
 Wrote SQL, PL/SQL, and SQL*Plus programs to retrieve data using cursors and exception handling (see the sketch below). Executed automated scripts and updated the logs; involved in executing the scripts with respect to functionality.
 Committed developed scripts to a Git repository.
 Reported issues and clarifications, and raised bugs for product issues.
 Attended daily stand-up meetings and gave updates.
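
A minimal sketch of the cursor-and-exception-handling style described above, in PL/SQL; the table and column names are hypothetical:

    -- Hypothetical conversion loop: insert each legacy row, falling back to an
    -- update when the key already exists.
    DECLARE
      CURSOR c_acct IS
        SELECT account_id, balance FROM legacy_accounts;
    BEGIN
      FOR r IN c_acct LOOP
        BEGIN
          INSERT INTO accounts (account_id, balance)
          VALUES (r.account_id, r.balance);
        EXCEPTION
          WHEN DUP_VAL_ON_INDEX THEN
            UPDATE accounts
            SET    balance = r.balance
            WHERE  account_id = r.account_id;
        END;
      END LOOP;
      COMMIT;
    END;
    /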

Project #6
Project Title : Vodafone India
Environment : Oracle database.
Role : ETL Developer (SQL and PL/SQL)
Duration : From July 2011 to Dec 2013

Responsibilities:

 Involved in the full development cycle of planning, analysis, design, development, testing, and implementation.
 Performed all SDLC phases to complete ETL development work (requirement gathering, analysis, design, unit testing, deployment).
 Created views, materialized views, indexes, and hints.
 Executed automated scripts and updated the logs; involved in executing the scripts with respect to functionality.
 Created and modified SQL*Plus, PL/SQL, and SQL*Loader scripts for data conversions.
 Developed and modified triggers, packages, functions, and stored procedures for data conversions, and PL/SQL procedures to create database objects dynamically based on user inputs (see the sketch below).
 Wrote SQL, PL/SQL, and SQL*Plus programs to retrieve data using cursors and exception handling.
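
A minimal sketch of dynamic object creation from user input, as described above; the procedure and table names are hypothetical, and real code would validate the input first:

    -- Hypothetical procedure that creates a staging table named from user input.
    CREATE OR REPLACE PROCEDURE create_stage_table (p_suffix IN VARCHAR2) AS
    BEGIN
      -- EXECUTE IMMEDIATE runs DDL built at runtime; p_suffix should be
      -- validated against a whitelist before use to avoid SQL injection.
      EXECUTE IMMEDIATE
        'CREATE TABLE stg_load_' || p_suffix ||
        ' (id NUMBER, payload VARCHAR2(4000), loaded_at DATE DEFAULT SYSDATE)';
    END;
    /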
