Ajit Kumar Profile
EXECUTIVE SUMMARY
Microsoft-certified professional with over 15 years of extensive experience in Information Technology, with special emphasis on the design, development, and administration of ETL, data integration, data validation, data analysis, data migration, data management and governance, technology architecture, and application architecture.
Experience in all phases of the data warehouse life cycle, covering data analysis, design, development, and testing using ETL, data modeling, and analytical processing and reporting tools.
Strong hands-on experience in developing ETL and data validation solutions using Python and pandas (a minimal validation sketch follows this summary).
Extensive experience in providing technical solutions for business problems and scenarios.
Strong hands-on experience with ETL tools (Azure Data Factory, SSIS, Talend, MuleSoft) across the full SDLC, with a strong emphasis on development and architecture along with project management.
Strong knowledge of various databases, including Azure SQL Managed Instance, Oracle, MS SQL Server, MySQL, and MS Access.
Strong analytical and conceptual skills in writing SQL queries. Good understanding of creating tables, tablespaces, databases, and indexes; worked with cubes and metadata.
Proficient in interacting with business users. Pioneered load strategies for loading the staging area and data marts. Handled SCD Type 1/Type 2/Type 3 loads (an SCD Type 2 sketch follows this summary).
Strong understanding of data warehouse principles, including fact tables, dimension tables, star schema modeling, and snowflake schema modeling.
Strong hands-on experience in developing reports and dashboards using SSRS, Tableau, Spotfire, Power BI, Salesforce Report Builder, and Salesforce Einstein Analytics.
Experience in automating the complete ETL process using enterprise scheduling, monitoring, and reporting tools such as Autosys, Splunk, and Control-M.
Extensive work on ETL processes covering data transformation and migration, data sourcing, mapping, conversion, loading, and validation against various source systems.
Extensive experience in database back-end work: development and tuning of database objects such as stored procedures, views, functions, triggers, and cursors, plus application development in MS SQL Server and Oracle environments, including identifying and resolving report and ETL performance bottlenecks.
Acquired knowledge of data mining and data modeling concepts. Created predictive GLM (generalized linear model), rpart, and ARIMA (time series analysis) models using R and Algorithmia (a Python forecasting sketch follows this summary); also acquired knowledge of data mining concepts such as scorecard models, trend analysis, and forecasting.
Worked on both Agile and waterfall models.
Excellent team player with a proven track record of working in teams of various sizes and performing cross-functional roles.
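A minimal sketch of the pandas-based data validation mentioned above; the file names and column names (patient_id, admit_date) are illustrative assumptions, not from any specific engagement:

    import pandas as pd

    # Hypothetical source and warehouse extracts (file/column names assumed).
    source = pd.read_csv("source_extract.csv")
    target = pd.read_csv("warehouse_extract.csv")

    # Row-count reconciliation between source and target.
    print(f"source rows={len(source)}, target rows={len(target)}")

    # Nulls in mandatory fields and duplicate business keys in the target.
    print(target[["patient_id", "admit_date"]].isna().sum())
    dupes = target[target.duplicated(["patient_id", "admit_date"], keep=False)]
    print(f"{len(dupes)} rows share a duplicated business key")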
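A rough illustration of the SCD Type 2 load pattern in pandas; the key, attribute, and housekeeping column names (is_current, valid_from, valid_to) are assumptions for the sketch, not a specific implementation:

    import pandas as pd

    LOAD_DATE = "2024-01-01"  # illustrative load date

    def scd2_load(dim, incoming, key, attrs):
        # Compare incoming rows against the current dimension versions.
        current = dim[dim["is_current"]]
        cmp = incoming.merge(current[[key] + attrs], on=key, how="left",
                             suffixes=("", "_old"), indicator=True)
        old = cmp[[a + "_old" for a in attrs]].rename(columns=lambda c: c[:-4])
        changed = cmp.loc[(cmp["_merge"] == "both")
                          & cmp[attrs].ne(old).any(axis=1), key]
        brand_new = cmp.loc[cmp["_merge"] == "left_only", key]
        # Type 2: expire the superseded version, then append the new one.
        expire = dim[key].isin(changed) & dim["is_current"]
        dim.loc[expire, ["is_current", "valid_to"]] = [False, LOAD_DATE]
        fresh = incoming[incoming[key].isin(set(changed) | set(brand_new))].copy()
        fresh["valid_from"], fresh["valid_to"], fresh["is_current"] = LOAD_DATE, None, True
        return pd.concat([dim, fresh], ignore_index=True)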
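The GLM/rpart/ARIMA models above were built in R; as a purely illustrative Python equivalent of the ARIMA piece, using statsmodels with made-up numbers and an arbitrary (p, d, q) order:

    import pandas as pd
    from statsmodels.tsa.arima.model import ARIMA

    # Illustrative monthly series; in practice this came from warehouse extracts.
    series = pd.Series([112, 118, 132, 129, 121, 135, 148, 148, 136, 119, 104, 118],
                       index=pd.date_range("2023-01-01", periods=12, freq="MS"))

    model = ARIMA(series, order=(1, 1, 1))   # (p, d, q) chosen only for the sketch
    fitted = model.fit()
    print(fitted.forecast(steps=3))          # three-step-ahead forecast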
Employment History:
Project Details:
Tools and Technologies: Azure Data Factory, Azure SQL Managed Instance, SQL Server, SSIS, Power BI, Salesforce, Workday
Description:
Enterprise Warehouse: Compassus delivers outcome-based value through high-quality, patient-centric care, expanding access to qualified individuals while putting compliance at the forefront and providing health care system cost savings. It provides post-acute care services in hospice, home health, palliative care, and home infusion therapy.
This program is intended to build a comprehensive enterprise warehouse and data marts to cater to the data needs of reporting products built in Power BI.
Description:
Executive Dashboard: The program provides end-to-end prospective and retrospective risk adjustment services and support for health plans, combining technology, analytics, and deep subject-matter expertise to ensure risk-associated revenue is optimized while maintaining appropriate compliance. ETL processes built with SSIS and SQL fetch data from various client and source systems and store it in an integrated warehouse. Multiple MicroStrategy dashboards built on top of the data marts provide better insight into the data.
MIPS Reporting: The program provides a reporting platform for the MIPS, PIQ, and RPI systems in the HRP domain. It also involves migrating Mediconnect data to the HRP data mart. Several Tableau dashboards are built on top of the HRP data mart.
Was involved in Analysis, Design and Implementation for the requirements raised.
Involved in redesign and performance optimization of ETL (both SSIS and SQL queries).
Involved in understanding business processes and coordinated with business analysts to get
specific requirements to build Data Marts.
Created jobs to load data into the various stages of the data warehouse and data marts using SSIS and SQL Server objects such as procedures, views, functions, cursors, and triggers.
Held daily status meetings and conducted internal and external reviews.
Tools and Technologies: Python, Salesforce Health Cloud, Apex classes, triggers, Process Builder, HL7 & EDI integration, Einstein Analytics, Algorithmia
Description: An in-house Emids initiative to build a platform on top of Salesforce Health Cloud providing a “Connected Patient Engagement” solution. The platform can be plugged into a healthcare facility and used to provide key features that help the facility reduce readmission costs and improve health outcomes.
Roles and responsibilities:
Carried out assessments in the capacity of an associate architect, covering problem solutions, project scope, and requirements gathering.
Documented, prepared, and proposed technical solutions for the defined scope and project requirements, in line with the approved and finalized solution.
Laid the foundation of the framework by developing the underlying code required to wrap around the custom objects, datasets, and data integration framework.
Was tasked with running a proof-of-concept implementation at client site.
In addition to the above, played a vital role in developing the tools required for this implementation:
Python script to process EDI/HL7 files: Wrote a Python script to parse EDI and HL7 files (a minimal parser sketch follows this list).
Timeseries scripts: Custom scripts developed in SAQL (Einstein Analytics), specific to the client, that help predict or forecast future values of a series based on its history.
Risk stratification algorithm: Script developed using Algorithmia; the result set is loaded into Health Cloud objects via a custom Apex class and is further used for provider and payer dashboards.
Data migration with verification scripts: Migration scripts developed to load data from the parsed CSV files into the Health Cloud data model (a reconciliation sketch follows this list).
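A minimal sketch of the kind of HL7 v2 parsing the script performed (the EDI side is analogous); the input file name is an assumption, and field positions follow HL7 v2 conventions:

    # Minimal HL7 v2 segment parser: split a message into pipe-delimited fields.
    def parse_hl7_message(text: str) -> dict:
        segments = {}
        for line in text.replace("\r", "\n").splitlines():
            if line.strip():
                fields = line.split("|")
                # Group repeating segments (e.g. multiple OBX) under one key.
                segments.setdefault(fields[0], []).append(fields)
        return segments

    with open("adt_feed.hl7") as f:          # assumed input file
        msg = parse_hl7_message(f.read())

    # PID-3 (patient identifier list) is field index 3 of the PID segment.
    print("patient id:", msg["PID"][0][3])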
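And a sketch of the verification half of the migration scripts, showing only the reconciliation idea; the file names and the external_patient_id key are assumptions, since the real load went through the Salesforce APIs:

    import pandas as pd

    # Source: parsed CSVs; target: an export of the loaded Health Cloud records.
    src = pd.read_csv("parsed_patients.csv")
    tgt = pd.read_csv("healthcloud_export.csv")

    # 1) Row-count reconciliation.
    print(f"source rows={len(src)}, target rows={len(tgt)}")

    # 2) Key-level diff: records that failed to load or appeared unexpectedly.
    diff = src.merge(tgt, on="external_patient_id", how="outer",
                     indicator=True, suffixes=("_src", "_tgt"))
    missing = diff[diff["_merge"] == "left_only"]
    extra = diff[diff["_merge"] == "right_only"]
    print(f"missing in target: {len(missing)}, unexpected in target: {len(extra)}")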
Tools and Technologies: AWS, S3, Talend, Oracle, JSON, XML files, flat files, SQL, and Unix shell scripting.
Description: IQVIA is implementing the Infosario Registry Platform for ACS, where all previous registries will be made live on the new integrated platform.
Description: TMF Group’s business processes involve multiple systems and technologies. The current IT landscape comprises HR systems, business systems such as Viewpoint and Microsoft Dynamics AX, CRM systems, and other tools that allow the company to conduct its business smoothly. However, the presence of multiple systems and the company’s global footprint pose a unique challenge: providing management with a global, cross-functional view of the business through an integrated system. The data warehouse system was designed and developed as a repository that consolidates data from all the systems to present a global, cross-functional picture of the business.
Involved in understanding business processes and coordinated with business analysts to gather specific requirements for building data marts.
Performed performance tuning and optimization of ETL and reporting.
Was involved in Analysis, Design, Coding, Unit Testing and Implementation for the requirements
raised.
Laid the foundation of an ETL framework using Python and SQL Server (a minimal loader sketch follows these points).
Created loaders to populate the various stages of the data warehouse and data marts using Python and SQL Server.
Performed Unit and Integration testing and validated the test cases by comparing the actual results
with expected results.
Managed UAT and production deployment activities.
Held daily status meetings, conducted internal and external reviews as well as formal walkthroughs among various teams, and documented the proceedings.
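A minimal sketch of the Python-to-SQL-Server loading pattern underpinning the framework; the connection string, file, column, and table names are assumptions for illustration:

    import pandas as pd
    from sqlalchemy import create_engine

    # Assumed ODBC connection string; adjust server/database/driver as needed.
    engine = create_engine(
        "mssql+pyodbc://user:pass@server/dwh?driver=ODBC+Driver+17+for+SQL+Server")

    # Extract: read a source extract (a flat file here for illustration).
    df = pd.read_csv("viewpoint_extract.csv", parse_dates=["posting_date"])

    # Transform: light cleansing before staging.
    df["entity_code"] = df["entity_code"].str.strip().str.upper()
    df = df.dropna(subset=["entity_code", "posting_date"])

    # Load: append into the staging table; downstream SQL moves it onward.
    df.to_sql("stg_transactions", engine, schema="stg",
              if_exists="append", index=False)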
Project: ImpactRx
Domain: Healthcare
Client: ImpactRx, Horsham, PA, USA
Duration: February 2014 – May 2015
Location of Work: Bangalore, India
Role: Lead Engineer
Identified and documented data sources and transformation rules required to populate and
maintain data warehouse.
Involved in Analysis, Design, Coding, Unit Testing and Implementation for the requirements
raised.
Created SSIS packages to load data from SQLite into SQL Server staging, the data warehouse, and the data marts.
Created various SSRS reports and worked on subscriptions and linked reports with drill-down and drill-through functionality.
Created SQL Server objects such as procedures, views, functions, cursors, and triggers.
Involved in ETL framework design and implementation using SSIS and SQL server.
As the project was in its initial stage, the data model was still being finalized; I got the chance to help the team build the data model using Informatica Power Designer.
Involved in understanding business processes and coordinated with business analysts to get
specific requirements to build Data warehouse.
Documented data anomalies and reviewed them so they could be addressed in job enhancements.
Designed and developed SSIS parallel jobs for extracting, cleansing, transforming, integrating, and loading data.
Implemented role-based data security based on AD group users and access lists.
Worked on documentation such as functional specifications, technical specifications, source-to-target mappings, review checklists, unit test plans, the product system support document, and the learning document from this project.
Project: Henkel SAM
Domain: FMCG
Client: Henkel, Dusseldorf, Germany
Duration: November 2009 – February 2012
Location of Work: Bangalore, India
Role: Software Engineer
Tools and Technologies: Oracle, SQL Server, SSIS, SSRS, SSAS, MDX, SCOM
Description: The BI-New project involves development and support of customised data marts for Sales, Procurement, Production, and Costing, together with bug fixes and enhancements of existing solutions. The project is platformed on SQL Server 2008 and .NET 3.5, with SSIS as the ETL tool, SSAS as the analytics tool, and .NET, SSRS, and Excel as the reporting tools. The source data is fetched from SAP systems mounted on an Oracle database. The project scope also includes monitoring of hardware and infrastructure with the help of the Microsoft SCOM tool.
Designed SSIS ETL jobs to extract data from heterogeneous source systems, transform it, and finally load it into the data marts.
Wrote complex MDX queries to feed data to SSRS and Excel report templates.
Designed an end-to-end solution for the BI portal’s problem and request management system.
Worked on SCOM tool optimization to track SSIS jobs and monitor server performance.
Description: SKS Microfinance Pvt. Ltd. is India’s largest microfinance organization, providing microcredit to over 5 million customers across 1,400 branches. To achieve its high growth targets while keeping risk and losses under control, the company needed to constantly monitor its sales and portfolio performance.
The company needed both to improve the quantity of its sourcing (to book more and better loans) and to constantly monitor and track the performance of its loan portfolio (to minimize credit losses).
Automated ETL solution with detailed audit trails in the data warehouse.
Created dashboards and reports using Mondrian online analytical processing (OLAP), and created ad-hoc reports using SSRS Report Designer.
Interacted with virtual users and the IT team to identify key dimensions and measures for business performance.
Created SQL Server procedures and functions to customize the application and implement business logic.
Created an rpart-based delinquency model based on customer demographics to track the performance of running loans (a rough sketch follows below).
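The delinquency model itself was built with R’s rpart; below is a rough scikit-learn analogue of a CART-style tree, with file and feature names assumed, purely for illustration:

    import pandas as pd
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    # Assumed demographic features and a delinquency flag from the warehouse.
    loans = pd.read_csv("loan_book.csv")
    X = loans[["age", "household_income", "dependents", "branch_region_code"]]
    y = loans["is_delinquent"]

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=42, stratify=y)

    # CART-style decision tree, the same family of model that rpart fits.
    tree = DecisionTreeClassifier(max_depth=4, min_samples_leaf=50)
    tree.fit(X_train, y_train)
    print("holdout accuracy:", tree.score(X_test, y_test))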
Description: At Indiabulls, data warehousing was driven by the need to measure sales effectiveness (TTD analysis) and credit performance (portfolio analysis) for the consumer finance portfolio. The data was sourced from Oracle legacy systems into the SQL Server data warehouse. The warehouse was designed as a star schema with two fact tables containing the aggregated data, along with dimension tables containing metadata such as geography, time, organizational unit, product, and demographic data. This ensured consistency across the data warehouse during analysis.
Project: VAS Reporting
Domain: Telecom
Client: OnMobile, Bangalore, India
Duration: August 2007 – May 2008
Location of Work: Bangalore, India
Role: Consultant
Description: OnMobile is one of India’s largest telecom value-added service providers, collecting a huge amount of data with a throughput of approximately a terabyte per annum. At OnMobile I played a key role in the design, implementation, and management of a data warehouse solution for analytics. The data warehouse was designed along the lines of dimensional modelling and bus architecture. Three fact tables contained three levels of call data records (CDRs), with metadata in dimension tables and audit trails for auditing. The data was used for OLAP reporting, statistical analysis, and recommendation engines built to target customers with customized content.