
AJIT KUMAR

Contact No: +16152828586  E-mail: ajitk@emids.com

EXECUTIVE SUMMARY

 Microsoft certified professional with over 15 years of extensive experience in Information Technology, with special emphasis on design, development, and administration of ETL, Data Integration, Data Validation, Data Analysis, Data Migration, Data Management and Governance, Technology Architecture, and Application Architecture.
 Experience in all phases of the Data Warehouse life cycle, involving data analysis, design, development, and testing using ETL, data modeling, and analytical processing and reporting tools.
 Strong hands-on experience in developing ETL and data validation solutions using Python and pandas (see the sketch following this list).
 Extensive experience in providing technical solutions for business problems and scenarios.
 Strong hands-on experience with ETL tools – Azure Data Factory, SSIS, Talend, and MuleSoft – across the full SDLC, with strong emphasis on development and architecture along with project management.
 Strong knowledge of various database flavors such as Azure Managed Instance, Oracle, MSSQL, MySQL, and MS Access.
 Strong analytical and conceptual skills in writing SQL queries. Good understanding of creating tables, tablespaces, databases, and indexes; worked with cubes and metadata.
 Proficient in interacting with business users. Pioneered load strategies for loading staging areas and data marts. Dealt with SCD Type 1/Type 2/Type 3 loads.
 Strong understanding of the principles of Data Warehousing using Fact Tables, Dimension Tables, Star Schema modeling, and Snowflake Schema modeling.
 Strong hands-on experience in developing reports and dashboards using SSRS, Tableau, Spotfire, Power BI, Salesforce Report Builder, and Salesforce Einstein Analytics.
 Experience in automation of the complete ETL process using enterprise scheduling, monitoring, and reporting tools such as Autosys, Splunk, and Control-M.
 Extensive work in ETL processes consisting of data transformation/migration, data sourcing, mapping, conversion, loading, and validation against various source systems.
 Extensive experience in database backend work, including development and tuning of database objects like stored procedures, views, functions, triggers, and cursors, and application development in MS SQL Server and Oracle environments, identifying and tuning report and ETL performance drawbacks.
 Acquired knowledge of data mining and data modeling concepts. Created predictive GLM (Generalized Linear Model), rpart, and ARIMA (time series analysis) models using R and ALGORITHMIA; also acquired knowledge of data mining concepts (scorecard models, trend analysis, forecasting).
 Worked on both Agile and Waterfall models.
 Excellent team player with a proven track record of working in teams of various sizes and performing cross-functional roles.
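
A minimal sketch of the kind of pandas-based data validation referred to above; the file names, key columns, and checks are illustrative placeholders, not actual project artifacts.

import pandas as pd

# Illustrative placeholders: the extracts and rules below are hypothetical.
source = pd.read_csv("source_extract.csv")
target = pd.read_csv("warehouse_extract.csv")

issues = []

# Row-count reconciliation between source and loaded target.
if len(source) != len(target):
    issues.append(f"Row count mismatch: source={len(source)}, target={len(target)}")

# Null checks on assumed key columns.
for col in ["patient_id", "admission_date"]:
    nulls = int(target[col].isna().sum())
    if nulls:
        issues.append(f"{col}: {nulls} null values")

# Duplicate check on the assumed business key.
dupes = int(target.duplicated(subset=["patient_id"]).sum())
if dupes:
    issues.append(f"patient_id: {dupes} duplicate rows")

print("\n".join(issues) if issues else "All validation checks passed")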

Key Skill Set:


Data Integration Tools: Azure Data Factory, SSIS, Talend Open Studio, MuleSoft
Cloud Platforms: Azure, AWS, Salesforce, S3 Buckets
Databases: Azure MI, Azure SQL, Oracle, MSSQL, SQLite, MySQL, MS Access
Data Modeling Tools: Erwin, Toad Data Modeler, Visio, Informatica Power Designer
Reporting/Visualization Tools: Tableau, Spotfire, Power BI, SSRS, Salesforce Einstein Analytics
Version Control: GitHub, GitLab, TFS, Subversion, SVS
Languages: PL/SQL, T-SQL, MDX, Java, Python, Apex Classes
Statistical Tools: R, ALGORITHMIA
Web Technologies: HTML, XML
Operating Systems: UNIX, Windows, MS-DOS
Scheduling Tools: Autosys, Splunk, Control-M
Other Tools: SSAS, SCOM, ClearQuest, JIRA, Top Team, HPQC

Certifications:
70-448: Microsoft SQL Server 2008, Business Intelligence Development and Maintenance
70-452: PRO: Designing a Business Intelligence Infrastructure Using Microsoft SQL Server 2008

Employment History:

Sl. No  Organization Name                    Designation               From       To
1       Emids Technologies Ltd               Architect                 Nov 2016   Till date
2       Happiest Minds Technology Pvt. Ltd   Technical Lead            May 2015   Nov 2016
3       Symphony Teleca (now HARMAN)         Lead Engineer             Feb 2014   May 2015
4       Accenture Services Pvt. Ltd          Senior Software Engineer  Nov 2009   Feb 2014
5       Decision Studio Software Pvt Ltd     Consultant                Aug 2007   Nov 2009

Project Details:

Project: Compassus Data Engineering


Domain: Health Care
Client: Compassus, Brentwood, TN
Duration: October 2020 – Till date
Location of Work: Brentwood TN, USA
Role: Data Architect

Tools and Technologies: Azure Data Factory, Azure Managed Instance, SQL Server, SSIS, Power BI, Salesforce, Workday

Description:
Enterprise Warehouse: Compassus delivers outcome-based value through high-quality, patient-centric care, expanding access to qualified individuals while putting compliance at the forefront and providing health care system cost savings. It provides post-acute care services in hospice, home health, palliative care, and home infusion therapy.
This program is intended to build a comprehensive enterprise warehouse and data marts to cater to the data needs of reporting products built in Power BI.

Roles and responsibilities:


 Carried out a design assessment and proposed a new EDW architecture to the client.
 Designed and developed scalable data architectures.
 Worked on enterprise warehouse, data mart, and interface integration design and build.
 Laid the foundation of a parameterized framework using Azure Data Factory (see the sketch following this list).
 Identified and fixed performance bottlenecks (Managed Instance) in coordination with Microsoft.
 Worked closely with business and IT stakeholders to analyze requirements and document the needs.
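
The parameterized framework above was built in Azure Data Factory itself; the snippet below is only a minimal sketch of how such a parameterized pipeline can be driven per source from Python with the azure-mgmt-datafactory SDK. The resource names, pipeline name, and parameter names are illustrative placeholders, not the actual framework.

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

# All names below are hypothetical placeholders.
SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "rg-edw"
FACTORY_NAME = "adf-edw"

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# One generic pipeline, parameterized per source/target table pair.
sources = [
    {"source_table": "dbo.Patients", "target_table": "stg.Patients"},
    {"source_table": "dbo.Encounters", "target_table": "stg.Encounters"},
]

for params in sources:
    run = client.pipelines.create_run(
        RESOURCE_GROUP,
        FACTORY_NAME,
        "pl_generic_copy",  # hypothetical parameterized pipeline
        parameters=params,
    )
    print(f"Started run {run.run_id} for {params['source_table']}")
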
Project: Executive Dashboard & MIPS Reporting
Domain: Health Care
Client: Cotiviti, Atlanta, GA
Duration: December 2019 – September 2020
Location of Work: Hartford CT, USA
Role: Data Architect

Tools and Technologies: SQL Server, SSIS, Tableau, MicroStrategy

Description:
Executive Dashboard: The program provides end-to-end prospective and retrospective risk adjustment services and support for health plans, combining technology, analytics, and deep subject-matter expertise to ensure risk-associated revenue is optimized while maintaining appropriate compliance. ETL processes using SSIS and SQL fetch data from various clients and source systems and store it in an integrated warehouse. Multiple MicroStrategy dashboards built on top of the data marts provide better insight into the data.
MIPS Reporting: The program provides a reporting platform for the MIPS, PIQ, and RPI systems in the HRP domain. This also involves migrating Mediconnect data to the HRP data mart. Several Tableau dashboards are built on top of the HRP data mart.

Roles and responsibilities:


 Laid the foundation of a parameterized framework using SSIS and SQL Server. This enables data for multiple clients to be loaded in parallel, and new clients can be added with no or minimal changes to the existing workflow (see the sketch following this list).

 Was involved in Analysis, Design and Implementation for the requirements raised.

 Involved in redesign and performance optimization of ETL (both SSIS and SQL queries).

 Involved in understanding business processes and coordinated with business analysts to get
specific requirements to build Data Marts.

 Created jobs to load data into various stages of the data warehouse and data marts using SSIS and SQL Server objects like procedures, views, functions, cursors, and triggers.

 Created multiple Tableau dashboards on top of the HRP data mart.

 Held daily status meetings and conducted internal and external reviews.
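
The framework above was implemented in SSIS and SQL Server; the Python sketch below only illustrates the metadata-driven, per-client parameterization idea. The control table, stored procedure, and connection names are hypothetical.

import concurrent.futures
import pyodbc

WAREHOUSE_CONN = "DSN=Warehouse"  # hypothetical connection

def load_client(client):
    """Run the generic load procedure for a single client (placeholder logic)."""
    client_id, target_schema = client
    with pyodbc.connect(WAREHOUSE_CONN) as conn:
        # The same stored procedure serves every client; only the parameters change.
        conn.execute("EXEC etl.usp_LoadClient @ClientId = ?, @TargetSchema = ?", client_id, target_schema)
        conn.commit()
    return client_id

# Hypothetical control table: one row per active client.
with pyodbc.connect(WAREHOUSE_CONN) as conn:
    clients = conn.execute(
        "SELECT client_id, target_schema FROM etl.ClientConfig WHERE is_active = 1"
    ).fetchall()

# New clients only need a new row in etl.ClientConfig; loads run in parallel.
with concurrent.futures.ThreadPoolExecutor(max_workers=4) as pool:
    for client_id in pool.map(load_client, clients):
        print(f"Client {client_id} loaded")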

Project: Connected Patient Engagement


Domain: Health Care
Client: Emids
Duration: December 2018 – November 2019
Location of Work: Bangalore, India
Role: Data Architect

Tools and Technologies: Python, Salesforce Health Cloud, Apex Classes, Triggers, Process Builder, HL7 & EDI integration, Einstein Analytics, ALGORITHMIA

Description: An in-house Emids initiative to build a platform on top of Salesforce Health Cloud to provide a “Connected Patient Engagement” solution. This platform can be plugged into a healthcare facility and used to provide various key features that help the facility reduce readmission costs and improve health outcomes.
Roles and responsibilities:
 Carried out assessment in the capacity of an associate architect, covering problem solution, project scope, and requirements gathering.
 Documented, prepared, and suggested a technical solution for the defined scope and project requirements as per the approved and finalized solution.
 Laid the foundation of both frameworks by developing the underlying code required to wrap around the custom objects, datasets, and data integration framework.
 Was tasked with running a proof-of-concept implementation at the client site.
 In addition to the above, played a vital role in developing the tools required for this implementation:
     Python script to process EDI/HL7 files – Wrote a Python script to parse EDI and HL7 files (see the sketch following this list).
     Time series scripts – Scripts developed in SAQL (Einstein Analytics) that help predict or forecast future values of a series based on the history of that series. These are custom scripts developed specifically for the client.
     Risk stratification algorithm – Script developed using ALGORITHMIA; the result set is loaded to Health Cloud objects via a custom Apex class and further used for provider and payer dashboards.
     Data migration with verification scripts – Migration scripts developed to load data from the parsed CSV files to the Health Cloud data model.
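
A minimal sketch of the kind of HL7 v2 parsing the Python script above performed; the sample message, segment handling, and extracted fields are illustrative only.

# Minimal HL7 v2 parsing sketch (illustrative sample message, not project data).
sample = (
    "MSH|^~\\&|SENDING_APP|SENDING_FAC|RECEIVING_APP|RECEIVING_FAC|202001011200||ADT^A01|MSG00001|P|2.3\r"
    "PID|1||12345^^^HOSP^MR||DOE^JOHN||19700101|M\r"
)

def parse_hl7(message: str) -> dict:
    """Split an HL7 v2 message into segments and pipe-delimited fields keyed by segment name."""
    segments = {}
    for raw in filter(None, message.split("\r")):
        fields = raw.split("|")
        segments.setdefault(fields[0], []).append(fields)
    return segments

msg = parse_hl7(sample)
pid = msg["PID"][0]
print("Patient ID:", pid[3].split("^")[0])        # 12345
print("Patient name:", pid[5].replace("^", " "))  # DOE JOHN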

Project: ACS Data Migration


Domain: Clinical Health Care
Client: IQVIA (QuintilesIMS), North Carolina, USA
Duration: November 2016 – November 2018
Location of Work: Bangalore, India
Role: Associate Architect

Tools and Technologies: AWS, S3 Buckets, Talend, Oracle, JSON, XML files, flat files, SQL, and Unix shell scripting.

Description: IQVIA is implementing the Infosario Registry Platform for ACS, where all previous registries will be made live on the new integrated platform.

Roles and responsibilities:


 Carried out assessment in the capacity of a senior technical lead, covering problem solution, project scope, and requirements gathering.
 Documented, prepared, and suggested the problem solution, project scope, and requirements to the client.
 Documented, prepared, and suggested a technical solution for the defined scope and project requirements as per the approved and finalized solution.
 Developed the canonical data model in consultation with senior doctors and healthcare specialists.
 Laid the foundation of both frameworks by developing the underlying code required to wrap around the data model and ETL framework.
 Was tasked with running a proof-of-concept implementation at the client site.
o In addition to the above, played a vital role in developing the tools required for this implementation:
     DevOps scripts – Scripts developed in Ansible and Docker that reduce the effort required for setting up environments. These are custom scripts developed specifically for the client.
     Data access layer code generator – Tool to generate accurate code in the data access layer, which eliminates common user errors.
     Data migration with verification scripts – Migration scripts developed to load data from the customer system to the canonical data model.
     Data masking script – Tool to mask healthcare PHI data (see the sketch following this list).
     Metadata-driven ETL – Tool to create ETL solutions with minimal configuration changes and development effort.
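
A minimal sketch of the PHI-masking idea behind the data masking script mentioned above; the column list, salt handling, and hashing approach are illustrative assumptions, not the actual tool.

import hashlib
import pandas as pd

# Hypothetical PHI columns to mask; the real tool was driven by its own configuration.
PHI_COLUMNS = ["patient_name", "ssn", "phone", "email"]
SALT = "replace-with-secret-salt"  # placeholder

def mask_value(value):
    """Deterministically mask a PHI value so joins still line up after masking."""
    if pd.isna(value):
        return value
    return hashlib.sha256((SALT + str(value)).encode("utf-8")).hexdigest()[:16]

df = pd.read_csv("patients.csv")  # illustrative input file
for col in PHI_COLUMNS:
    if col in df.columns:
        df[col] = df[col].map(mask_value)

df.to_csv("patients_masked.csv", index=False)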

Project: TMF DWH and Reporting


Domain: Payroll, Accounting, Capital Markets
Client: TMF Group, Amsterdam, Netherlands
Duration: May 2015 – November 2016
Location of Work: Bangalore, India
Role: Technical Lead

Tools and Technologies: Python, SQL Server, Tableau

Description: TMF Group’s business processes involve multiple systems and technologies. The current IT landscape comprises HR systems, business systems like Viewpoint and Microsoft Dynamics AX, CRM systems, and other tools, which allow it to conduct its business smoothly. However, the presence of multiple systems and its global footprint pose a unique challenge of building an integrated system that can provide management a global and cross-functional view of the business. The Data Warehouse system was designed and developed as a repository that consolidates data from all the systems to present a global and cross-functional picture of the business.

Roles and responsibilities:


 Identified and documented data sources and transformation rules required to populate and maintain the data warehouse. Interacted with business users for requirement understanding.

 Involved in understanding business processes and coordinated with business analysts to get specific requirements to build data marts. Performed performance tuning and optimization of ETL and reporting.

 Was involved in Analysis, Design, Coding, Unit Testing and Implementation for the requirements
raised.

 Laid the foundation of the ETL framework using Python and SQL Server (see the sketch following this list).

 Created loaders to load data into various stages of the data warehouse and data marts using Python and SQL Server.

 Created interactive dashboards using Tableau as per requirements.

 Performed Unit and Integration testing and validated the test cases by comparing the actual results
with expected results.
 Managed UAT and production deployment activities.

 Held daily status meetings and conducted internal and external reviews as well as formal walkthroughs among various teams, documenting the proceedings.
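
A minimal sketch of a Python loader that moves a source extract into a SQL Server staging table, in the spirit of the framework above; the connection string, file, table names, and truncate-and-reload strategy are hypothetical.

import pandas as pd
import pyodbc

# Hypothetical connection string.
CONN_STR = "DRIVER={ODBC Driver 17 for SQL Server};SERVER=warehouse;DATABASE=EDW;Trusted_Connection=yes"

def load_to_staging(csv_path, table):
    """Truncate-and-reload a CSV extract into a SQL Server staging table (sketch only)."""
    df = pd.read_csv(csv_path)
    columns = ", ".join(df.columns)
    placeholders = ", ".join("?" for _ in df.columns)
    sql = f"INSERT INTO {table} ({columns}) VALUES ({placeholders})"
    with pyodbc.connect(CONN_STR) as conn:
        cur = conn.cursor()
        cur.fast_executemany = True              # speeds up bulk inserts in pyodbc
        cur.execute(f"TRUNCATE TABLE {table}")   # full reload of the staging table
        cur.executemany(sql, list(df.itertuples(index=False, name=None)))
        conn.commit()
    return len(df)

rows = load_to_staging("viewpoint_extract.csv", "stg.ViewpointEntities")  # hypothetical names
print(f"Loaded {rows} rows into staging")
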
Project: ImpactRx
Domain: Healthcare
Client: ImpactRx, Horsham, PA, USA
Duration: February 2014 – May 2015
Location of Work: Bangalore, India
Role: Lead Engineer

Tools and Technologies: SQLite, SQL Server, SSIS, SSRS

Description: ImpactRx, Inc. provides consultative and analytically based promotional effectiveness solutions to the healthcare industry. The company provides insightful solutions to pharmaceutical marketing and sales decision makers. It offers Brand Impact, a solution suite that offers brand teams unprecedented insight into the factors that influence physician brand choices within a complete competitive context; and POP 2.0, an event-triggered tool enabling the addition of custom questions to its longitudinal iPhone- and iPad-connected physician research model. The company also provides Sales Impact, a solution suite that provides sales management invaluable insights into the effectiveness of sales representatives.

Roles and responsibilities:


 Prepared TDs from FDs and guided junior resources in building the jobs.
 Interviewed junior and mid-level candidates to build the ETL team.
 Held daily status meetings and conducted internal and external reviews as well as formal walkthroughs among various teams, documenting the proceedings.

 Identified and documented data sources and transformation rules required to populate and
maintain data warehouse.

 Involved in Analysis, Design, Coding, Unit Testing and Implementation for the requirements
raised.

 Created SSIS packages to load data from SQLite to SQL Server staging, the data warehouse, and data marts.

 Created various SSRS reports and worked on subscriptions and linked reports with drill-down and drill-through functionality.

 Created SQL server objects like procedures, views, functions, cursors, triggers.

 Worked on Performance tuning and Optimization of ETL and Reporting.

Project: SAP Item Manager Upgrade


Domain: Energy
Client: Shell Neo, The Hague, Netherlands
Duration: October 2013 – February 2014
Location of Work: Bangalore, India
Role: Senior Software Engineer

Tools and Technologies: Oracle, SQL Server, SSIS


Description: The objective of the SAP Item Manager upgrade project is to build a reusable ETL solution for multiple operating units for master data management (MDM). Source data is fetched from SAP systems. The MDM database is deployed on multiple SQL Server instances, one per operating unit. This MDM data, along with transactional data for different work packages, is used to populate the data warehouse.

Roles and responsibilities:

 Involved in ETL framework design and implementation using SSIS and SQL server.

 As the project was in its initial stage, the data model was still being finalized; got the chance to help the team build the data model using Informatica Power Designer.
 Involved in understanding business processes and coordinated with business analysts to get
specific requirements to build Data warehouse.

 Created reusable template packages to load data to various stages.

 Documented data anomalies and reviewed them so that they could be addressed in job enhancements.

Project: TGP reporting simplification


Domain: Finance
Client: Accenture, Bangalore, India
Duration: February 2012 – September 2013
Location of Work: Bangalore, India
Role: Senior Software Engineer

Tools and Technologies: SQL Server, SSIS, SSRS


Description: The objective of the TGP ODS project is to bring all global TGP reporting onto a common platform using MSBI technology (SSIS, SSRS), enabling faster, more efficient, on-demand reporting for the global TGP organization. The purpose of the project is to achieve reporting simplification by:
1. Providing cross-functional reporting to higher management.
2. Automating management reporting to eliminate manual and redundant reporting, allowing management and business support functions to better focus on analytics and root cause analysis rather than report creation.
3. Providing a central repository of integrated enterprise data to streamline the delivery and presentation of a consistent set of management metrics for easy access to timely and reliable management information.

Roles and responsibilities:


 Involved in every phase of the system development life cycle, including research and analysis, gathering user and business requirements from various data sources.

 Designed and developed the SSIS parallel jobs for extracting, cleansing, transforming, integrating
and loading data.

 Tuned SSIS transformations and jobs to enhance their performance.

 Developed cross functional reporting using SSRS.

 Implemented role-based data security based on AD group users and access lists.
 Worked on documentation such as functional specifications, technical specifications, source-to-target mappings, review checklists, unit test plans, product system support documents, and a learnings document out of this project.

 Scheduled the job using Autosys.

 Worked on ClearQuest tool for defect fixes and tracking.

Project: Henkel SAM
Domain: FMCG
Client: Henkel, Dusseldorf, Germany
Duration: November 2009 – February 2012
Location of Work: Bangalore, India
Role: Software Engineer

Tools and Technologies: Oracle, SQL Server, SSIS, SSRS, SSAS, MDX, SCOM

Description: The BI-New project involves development as well as support of customised data marts for Sales, Procurement, Production, and Costing, along with bug fixes and enhancements of the existing solutions. The project is built on SQL Server 2008 and .NET 3.5, with SSIS as the ETL tool, SSAS as the analytics tool, and .NET, SSRS, and Excel as the reporting tools. The source data is fetched from SAP systems mounted on an Oracle database. The project scope also includes monitoring of hardware and infrastructure with the help of the Microsoft SCOM tool.

Roles and responsibilities:


 Performed development, migration, support, live fixes, quality control changes and troubleshooting work.
 Involved in ETL framework design and development.

 Designed SSIS ETL jobs to extract data from heterogeneous source systems, transform it, and finally load it into the data marts.

 Created SSAS cubes and implemented role-based security.

 Wrote complex MDX queries to feed data to SSRS and Excel report templates.

 Designed an end-to-end solution for the BI portal for the problem and request management system.

 Worked on SCOM tool optimization to track SSIS jobs and monitor server performance.

Project: Sales and Portfolio Data Warehouse and Analytics Portal


Domain: Finance
Client: SKS Microfinance, Hyderabad, India
Duration: February 2009 – November 2009
Location of Work: Hyderabad, India
Role: Consultant

Tools and Technologies: Python, MS Access, SQL Server, R, BIRT

Description: SKS Microfinance Pvt. Ltd. is India’s largest microfinance organization providing micro
credit to over 5 million customers across 1400 branches. To achieve its high growth targets while
keeping the risk and losses under control, the company needed to constantly monitor its sales and
portfolio performance.
The company needed to improve both the quantity of its sourcing (to book more and better loans) and
needed to constantly monitor and track the performance of its loan portfolio (to minimize credit
losses).

Roles and responsibilities:


 Involved in every phase of the system development life cycle, including research and analysis, gathering user and business requirements from the client.

 Designed and developed an ETL solution using Python and T-SQL.

 Automated the ETL solution with detailed audit trails in the data warehouse.

 Implemented role-based data security in OLAP and BIRT reports.

 Created dashboards and reports using Mondrian online analytical processing (OLAP) and created ad-hoc reports using SSRS Report Designer.

 Interacted with virtual users and IT team to identify key dimensions and measures for business
performance.

 Created SQL Server database procedures and functions to customize the application and implement business logic.

 Created an rpart-based delinquency model based on customer demographics to track the performance of running loans (see the sketch following this list).
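
The delinquency model above was built with rpart in R; the sketch below only illustrates the equivalent decision-tree idea in Python with scikit-learn, on hypothetical demographic data.

import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

# Hypothetical borrower demographics with a delinquency flag (illustrative only).
loans = pd.DataFrame({
    "age":            [22, 45, 31, 52, 27, 38, 60, 29],
    "monthly_income": [8000, 25000, 15000, 30000, 9000, 18000, 22000, 12000],
    "dependents":     [3, 1, 2, 0, 4, 2, 1, 3],
    "delinquent":     [1, 0, 0, 0, 1, 0, 0, 1],
})

features = ["age", "monthly_income", "dependents"]
tree = DecisionTreeClassifier(max_depth=3, random_state=0)
tree.fit(loans[features], loans["delinquent"])

# Inspect the learned rules, analogous to printing an rpart tree in R.
print(export_text(tree, feature_names=features))
print("Predicted delinquency flags:", tree.predict(loans[features]).tolist())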

Project: TTD Data Warehouse


Domain: Finance
Client: India Bulls Credit Services Ltd., Gurgaon, India
Duration: May 2008 – January 2009
Location of Work: Bangalore, India
Role: Consultant

Tools and Technologies: Python, R, MySQL, JPivot/Mondrian

Description: At India Bulls, the data warehousing initiative was driven by the need to measure sales effectiveness (TTD analysis) and credit performance (portfolio analysis) for the consumer finance portfolio. The data was sourced from their Oracle legacy systems into the SQL Server data warehouse. The data warehouse was designed as a star schema with two fact tables containing the aggregated data, along with dimension tables containing metadata like geography, time, organizational unit, product, demographic data, etc. This ensured consistency across the data warehouse while doing analysis.

Roles and responsibilities:


 Created data loading scripts using Python and MySQL.
 Automated the ETL process with a detailed audit trail design.
 Developed OLAP cubes using the JPivot/Mondrian OLAP server.
 Worked on Through-The-Door (TTD) data mining: the customer/loan acquisition process can be viewed as an “Acquisition Funnel” made up of multiple stages, with the number of qualifying leads decreasing at each stage and the booked loans appearing as the output of the funnel. For this purpose, we created a GLM data model (see the sketch following this list).
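
The funnel model above was a GLM built in R; the sketch below only illustrates fitting a comparable binomial (logistic) GLM in Python with statsmodels, on hypothetical funnel data.

import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical funnel data: one row per lead, with whether the loan was finally booked.
leads = pd.DataFrame({
    "income":        [20000, 45000, 30000, 60000, 25000, 50000, 35000, 55000],
    "stage_reached": [2, 4, 3, 4, 1, 4, 4, 2],   # how far the lead progressed in the funnel
    "booked":        [0, 1, 0, 1, 0, 0, 1, 0],   # 1 = loan booked (funnel output)
})

# Binomial GLM (logistic link): probability that a lead converts into a booked loan.
model = smf.glm("booked ~ income + stage_reached", data=leads, family=sm.families.Binomial()).fit()

print(model.summary())
print("Predicted booking probabilities:", model.predict(leads).round(2).tolist())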

Project: VAS Reporting
Domain: Telecom
Client: OnMobile, Bangalore, India
Duration: August 2007 – May 2008
Location of Work: Bangalore, India
Role: Consultant

Tools and Technologies: Python, MySQL, R

Description: OnMobile is one of India’s largest telecom value-added service providers. They collect a huge amount of data, with throughput of approximately a terabyte per annum. At OnMobile I played a key role in the design, implementation, and management of a data warehouse solution for analytics. The data warehouse was designed along the lines of dimensional modelling and bus architecture. There were three fact tables that contained three levels of call data records (CDRs), along with metadata in dimension tables. It also included audit trails for auditing. The data was used for OLAP reporting, statistical analysis, and recommendation engines built for targeting customers with customized content.

Roles and responsibilities:


 Worked on development and enhancement of the ETL loader using Python and MySQL.
 Monitored daily loading of VAS data for various telecom operators.
 Generated daily bases for customized content and application-to-content conversion promotion messages.
 Produced monthly reporting.
