Azure Data Engineer
Mobile: +91-9652225208
PROFESSIONAL SUMMARY
● 5+ years of total experience in the IT industry.
● 4+ years of experience in Microsoft Azure cloud technologies.
● 2 years of experience in Mainframe development.
● A self-starter with a positive attitude, a willingness to learn new concepts and technologies, and a readiness to take on challenges.
● Excellent technical, interpersonal, and management skills.
● Experienced in Azure Data Factory.
● Very strong experience in ETL design.
● Exposure to Azure Cloud computing technologies.
● Hands-on experience in Azure Services – Azure Data Lake Store (ADLS), Azure Data Lake Analytics
(ADLA), Azure Data Factory (ADF).
● Prepared project documentation such as setup documents, test scripts, and functional specification documents.
● Hands-on experience with Azure Data Factory and its core concepts: datasets, pipelines and activities, scheduling and execution.
● Excellent knowledge of ADF building blocks – Integration Runtime, Linked Services, Datasets, Pipelines, and Activities.
● Designed and developed data ingestion pipelines from on-premises sources into the different layers of ADLS using Azure Data Factory (ADF V2).
● Experience with integration of data from multiple data sources
● Good knowledge of PolyBase external tables in SQL DW.
● Designed Azure Logic Apps to send pipeline success/failure alert emails, file-unavailability notifications, etc.
● Knowledge of data extraction from on-premises sources and delta-extraction methods from source systems to ADLS.
● Extensively worked on the Copy Data activity.
● Worked on the Get Metadata, Lookup, Stored Procedure, ForEach, If Condition, and Execute Pipeline activities.
● Orchestrated data integration pipelines in ADF using activities such as Get Metadata, Lookup, ForEach, Wait, Execute Pipeline, Set Variable, Filter, Until, etc.
● Implemented a dynamic pipeline that extracts multiple files into multiple targets with a single parameterized pipeline.
● Managed data recovery for Azure Data Factory pipelines.
● Monitored and managed Azure Data Factory.
● Experience in SQL – joins, correlated queries, sub-queries, etc.
● Good knowledge of stored procedures, functions, triggers, views, etc.
● Involved in system configuration, functional testing of the application, integrity checks of the modules, and managing business users.
● Possess sound knowledge of business processes and data flow.
● Able to work within an aggressive and demanding project schedule and environment.
● Experience working with extended teams to investigate and resolve critical system problems identified during implementation.
● Experience in code reviews, integration and end-user support.
● Automated execution of ADF pipelines using triggers; a minimal run-and-monitor sketch follows this summary.
● Experience in production support.
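The bullets above describe automating and monitoring ADF pipeline runs. Below is a minimal sketch of doing the same from the Azure Python SDK (azure-identity and azure-mgmt-datafactory); the subscription, resource group, factory, pipeline, and parameter names are illustrative placeholders, not values from these projects.

```python
import time

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

# Placeholder identifiers; real values would come from configuration.
SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "rg-dataplatform"
FACTORY_NAME = "adf-ingestion"
PIPELINE_NAME = "pl_copy_oracle_to_adls"

credential = DefaultAzureCredential()
client = DataFactoryManagementClient(credential, SUBSCRIPTION_ID)

# Kick off a pipeline run, passing runtime parameters.
run = client.pipelines.create_run(
    RESOURCE_GROUP, FACTORY_NAME, PIPELINE_NAME,
    parameters={"load_date": "2021-07-01"},  # hypothetical parameter
)

# Poll the run until it leaves the Queued/InProgress states.
while True:
    status = client.pipeline_runs.get(RESOURCE_GROUP, FACTORY_NAME, run.run_id)
    if status.status not in ("Queued", "InProgress"):
        break
    time.sleep(30)

print(f"Pipeline finished with status: {status.status}")
```

In the projects below, the same failure visibility was handled inside ADF itself with alerts and Logic Apps; an external script like this is just one way to drive and observe runs programmatically.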
PROFESSIONAL QUALIFICATION
● B.Tech (Mechanical Engineering) from JNTUH in 2016.
TECHNICAL PROFICIENCY
CAREER PROFILE
● Working with Cognizant as an Azure Data Engineer from July 2020 till date.
● Worked with Zensar Technologies as a Mainframe Developer and Production Support Engineer from April 2018 to January 2020.
Project #3
Client: Cognizant
Role: Azure Data Engineer
Technologies & Tools: Azure Data Factory V2, Logic Apps, Key Vault, Oracle, Files
Duration: July 2021 till date
Project Description:
This project builds an integrated data warehouse for Cognizant. Azure Data Factory is used as the ETL tool: data is extracted from the Oracle source into the data lake and Azure Synapse Analytics. This data warehouse serves as the base for all reporting requirements.
The project captures all campaign details for Cognizant and manages different types of ads on Twitter, Facebook, Instagram, and Admovils. We created dashboards to show the maximum impressions and clicks for each social media app.
● Created pipelines to extract data from on-premises source systems to Azure cloud Data Lake Storage; extensively worked on Copy activities and implemented copy behaviors such as flatten hierarchy, preserve hierarchy, and merge hierarchy. Implemented error handling through the Copy activity.
● Exposure to Azure Data Factory activities such as Lookup, Stored Procedure, If Condition, ForEach, Set Variable, Append Variable, Get Metadata, Filter, and Wait.
● Created a dynamic pipeline to handle multiple sources extracting to multiple targets; extensively used Azure Key Vault to configure the connections in linked services.
● Configured and implemented Azure Data Factory triggers and scheduled the pipelines; monitored the scheduled pipelines and configured alerts to be notified of pipeline failures.
● Implemented delta-logic extractions for various sources with the help of a control table; implemented the data frameworks to handle deadlocks, recovery, and logging of pipeline data.
● Reviewed individual work on ingesting data into Azure Data Lake and provided feedback based on the reference architecture, naming conventions, guidelines, and best practices.
● Developed Spark (Python) notebooks to transform and partition the data and organize files in ADLS; a minimal PySpark sketch follows this list.
● Built end-to-end logging frameworks for Data Factory pipelines.
● Extracted data from different sources, such as flat files and Oracle, to load into a SQL database.
● Involved in the preparation and execution of unit, integration, and end-to-end test cases.
● Used COPY to bulk load the data.
● Created internal and external stages and transformed data during load.
● Used the FLATTEN table function to produce a lateral view of VARIANT, OBJECT, and ARRAY columns.
● Identified areas for modification in existing programs and subsequently developed these modifications.
● Used temporary and transient tables on different datasets.
● Cloned production data for code modifications and testing.
● Shared sample data with the customer for UAT by granting access.
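As referenced in the Spark notebook bullet above, here is a minimal PySpark sketch of the transform-and-partition pattern. The storage account, container, and column names are illustrative assumptions (the metric columns follow this project's impressions/clicks description), and it presumes a cluster already authenticated to ADLS.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("campaign-transform").getOrCreate()

# Illustrative ADLS Gen2 paths; account and container names are placeholders.
raw_path = "abfss://raw@mydatalake.dfs.core.windows.net/campaigns/"
curated_path = "abfss://curated@mydatalake.dfs.core.windows.net/campaigns/"

df = spark.read.option("header", "true").csv(raw_path)

# Example transforms: cast the metric columns and derive a date partition key
# (event_ts is a hypothetical source timestamp column).
df = (df.withColumn("impressions", F.col("impressions").cast("long"))
        .withColumn("clicks", F.col("clicks").cast("long"))
        .withColumn("event_date", F.to_date("event_ts")))

# Write partitioned Parquet so downstream reads can prune by date and platform.
(df.write
   .mode("overwrite")
   .partitionBy("event_date", "platform")
   .parquet(curated_path))
```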
Project #2
Client: Cognizant
Role: Azure Data Engineer
Technologies & Tools: Azure Data Factory V2, Key Vault, SQL, Oracle, Files
Duration: July 2020 to July 2021
Project Description:
This project builds an integrated data warehouse on top of the existing servers. Azure Data Factory is used as the ETL tool: data is extracted from the Oracle source into the data lake and the SQL DW database. This data warehouse serves as the base for all reporting requirements.
The project captures all retail activity for Cognizant and manages different types of messages, such as sales orders, advance ship notices, and invoices. Different reports are generated for end users, such as product-wise statistical information, order-wise reports, and client-wise statistics reports.
● Created pipelines to extract data from on-premises source systems to Azure cloud Data Lake Storage; extensively worked on Copy activities and implemented copy behaviors such as flatten hierarchy, preserve hierarchy, and merge hierarchy. Implemented error handling through the Copy activity.
● Exposure to Azure Data Factory activities such as Lookup, Stored Procedure, If Condition, ForEach, Set Variable, Append Variable, Get Metadata, Filter, and Wait.
● Created a dynamic pipeline to handle multiple sources extracting to multiple targets; extensively used Azure Key Vault to configure the connections in linked services.
● Configured and implemented Azure Data Factory triggers and scheduled the pipelines; monitored the scheduled pipelines and configured alerts to be notified of pipeline failures.
● Implemented delta-logic extractions for various sources with the help of a control table (a sketch of this watermark pattern follows this list); implemented the data frameworks to handle deadlocks, recovery, and logging of pipeline data.
● Reviewed individual work on ingesting data into Azure Data Lake and provided feedback based on the reference architecture, naming conventions, guidelines, and best practices.
● Developed Spark (Python) notebooks to transform and partition the data and organize files in ADLS.
● Built end-to-end logging frameworks for Data Factory pipelines.
● Extracted data from different sources, such as flat files and Oracle, to load into a SQL database.
● Involved in the preparation and execution of unit, integration, and end-to-end test cases.
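As referenced in the delta-extraction bullet above, here is a minimal sketch of the control-table (watermark) pattern. The DSNs, table, and column names are illustrative assumptions; in ADF this logic would typically live in Lookup and Copy activities rather than a standalone script.

```python
import pyodbc

# Placeholder connections; real connection strings came from Azure Key Vault.
ctl = pyodbc.connect("DSN=ControlDB")  # database holding the control table
src = pyodbc.connect("DSN=SourceDB")   # source system to extract from

table = "SALES_ORDERS"  # hypothetical source table name

# 1. Read the last successful watermark for this table.
last_wm = ctl.execute(
    "SELECT last_watermark FROM etl_control WHERE table_name = ?", table
).fetchone()[0]

# 2. Extract only the rows changed since that watermark (the delta).
rows = src.execute(
    f"SELECT * FROM {table} WHERE last_modified > ?", last_wm
).fetchall()

# ... load `rows` into the target here (e.g. a bulk insert) ...

# 3. Advance the watermark only after a successful load, so a failed run
#    can be recovered by re-running against the old watermark.
new_wm = max(r.last_modified for r in rows) if rows else last_wm
ctl.execute(
    "UPDATE etl_control SET last_watermark = ? WHERE table_name = ?",
    new_wm, table,
)
ctl.commit()
```

Advancing the watermark only after a successful load is what makes the recovery and re-run behavior described in these bullets possible.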
Project #1