Akhil Uppalapati
Data Analyst
TECHNICAL SKILLS:
• Database Management: MS SQL Server, MySQL, IBM DB2, SQL Server Management Studio (SSMS), Oracle Enterprise Manager (OEM), MySQL Workbench, Oracle Financials
• Programming Languages: Python, JavaScript, PySpark, SQL, R
• Data Analytics & Visualization: Tableau, Power BI, Qlik Sense, Looker, OBIEE, Oracle Business Intelligence
• Machine Learning & AI: AWS SageMaker, Apache Spark ALS, Python ML libraries
• Data Modeling & BI: Advanced data modeling techniques, Business Intelligence tools, ETL design
• Cloud & Data Engineering: AWS (S3, EMR, Redshift, Glue), Azure Stream Analytics
• Scripting & Automation: Python, Bash, Alteryx, Apache Spark
• Data Warehousing: Azure SQL, ETL pipelines, Data Marts
• Data Quality & Security: Transparent Data Encryption, Amazon Rekognition, Confluence, SharePoint
PROFESSIONAL EXPERIENCE:
Client: Total Bank Solutions, Hackensack, NJ Dec 2023 to Present
Role: Senior Data Analyst
Roles & Responsibilities:
• Developed complex SQL queries and dynamic Qlik Sense dashboards for financial reporting, enhancing
decision-making for various business units.
• Managed data integration with AWS S3 and Redshift, ensuring efficient and secure financial data storage.
• Led the development of AI/ML models for predictive analytics in financial forecasting, utilizing AWS
SageMaker and Python.
• Defined key financial performance metrics, driving business insights for strategic decision-making.
• Improved financial data reporting by automating analytics processes using Alteryx, reducing processing time
by 30%.
• Developed machine learning models using Python and PySpark for predictive analytics in financial forecasting.
• Created complex data models using JavaScript and Python for interactive financial dashboards.
• Implemented advanced BI solutions using various query languages for comprehensive financial reporting.
• Designed and maintained sophisticated data models for financial risk assessment.
• Performed ad-hoc analyses and answered analytical questions to drive strategic business decisions, using
predictive models built with AWS SageMaker.
• Analyzed data sets using intermediate and advanced Excel techniques, improving financial data processing
and reporting efficiency.
• Led documentation of use cases for financial systems, ensuring clarity and accuracy in data management.
• Led multi-system data integration efforts, leveraging AWS S3 and Redshift to ensure seamless data flow
across systems.
• Implemented ETL pipelines in Alteryx for efficient data extraction, transformation, and loading to
support financial reporting and analytics.
• Employed AWS Redshift for robust data warehousing solutions, enabling scalable analytics and storage
capabilities.
• Integrated Amazon SageMaker to develop and deploy machine learning models predicting financial
trends and behaviors (see the SageMaker sketch below).
• Configured Apache Spark ALS for advanced analytics, enhancing predictive modeling and data processing
efficiency (see the ALS sketch below).
• Automated data processes using Alteryx, streamlining data preparation and integration to support analytics
projects.
• Maintained comprehensive documentation for all data processes and systems using Confluence, enhancing
project transparency.
• Developed and managed SharePoint sites for project collaboration and document management, ensuring
team alignment.
• Used R for advanced statistical analysis, supporting complex data studies and analytics.
• Optimized data queries using SQL to improve performance and response times in financial reporting tasks.
• Analyzed data trends and generated financial reports using Excel, providing actionable insights to
management.
• Created and maintained AWS cloud environments, ensuring scalable and secure data operations.
• Utilized Amazon Rekognition to enhance data analysis capabilities, applying image and text recognition
technologies.
• Employed advanced analytics techniques using SQL and Excel to drive business decisions and operational
improvements.
• Configured and managed data integrations using RESTful APIs, enhancing system connectivity and data
exchange.
• Documented all technical processes and systems comprehensively, ensuring accuracy and adherence to
compliance standards.
• Led training sessions on data tools and analytical techniques, enhancing team capabilities and knowledge.
• Designed financial models and forecasts using Amazon SageMaker and Apache Spark, informing strategic
planning.
• Performed data validation and quality assurance using SQL and Excel, ensuring data integrity across
platforms.
• Collaborated with IT teams to resolve system issues and optimize data workflows, enhancing overall
efficiency.
• Advanced the implementation of machine learning projects using R and AWS tools, contributing to predictive
analytics initiatives.
Environment: SQL, Tableau, Python, PySpark, JavaScript, AWS S3, AWS Redshift, Amazon SageMaker, Apache Spark
ALS, Alteryx, Confluence, SharePoint, R, Excel, Amazon Rekognition, RESTful APIs, AWS Glue, and Azure Stream
Analytics.
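Illustrative sketch of the SageMaker forecasting workflow referenced above: a minimal training-job launch using the SageMaker Python SDK and its built-in XGBoost container. The role ARN, bucket names, and S3 paths are hypothetical placeholders, not values from the engagement.

import sagemaker
from sagemaker import image_uris
from sagemaker.estimator import Estimator
from sagemaker.inputs import TrainingInput

session = sagemaker.Session()
role = "arn:aws:iam::123456789012:role/SageMakerExecutionRole"  # hypothetical role ARN

# Resolve the built-in XGBoost container image for the current region
image = image_uris.retrieve("xgboost", session.boto_region_name, version="1.7-1")

estimator = Estimator(
    image_uri=image,
    role=role,
    instance_count=1,
    instance_type="ml.m5.xlarge",
    output_path="s3://example-bucket/models/",  # hypothetical bucket
    sagemaker_session=session,
)
estimator.set_hyperparameters(objective="reg:squarederror", num_round=100)

# Train on a CSV dataset staged in S3 (hypothetical path)
estimator.fit({"train": TrainingInput("s3://example-bucket/forecast/train/", content_type="text/csv")})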
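A comparable sketch of the Apache Spark ALS modeling noted above, assuming a hypothetical interactions dataset; the client_id, product_id, and usage_score columns and the S3 path are illustrative stand-ins.

from pyspark.sql import SparkSession
from pyspark.ml.recommendation import ALS
from pyspark.ml.evaluation import RegressionEvaluator

spark = SparkSession.builder.appName("als-sketch").getOrCreate()

# Hypothetical interactions data: client_id, product_id, usage_score
ratings = spark.read.parquet("s3://example-bucket/usage/")

train, test = ratings.randomSplit([0.8, 0.2], seed=42)

als = ALS(
    userCol="client_id",
    itemCol="product_id",
    ratingCol="usage_score",
    coldStartStrategy="drop",  # drop NaN predictions for unseen users/items
    rank=10,
    regParam=0.1,
)
model = als.fit(train)

# Evaluate held-out predictions with RMSE
rmse = RegressionEvaluator(
    metricName="rmse", labelCol="usage_score", predictionCol="prediction"
).evaluate(model.transform(test))
print(f"RMSE: {rmse:.3f}")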
Client: Great American Insurance Company, Cincinnati, OH Oct 2021 to Dec 2023
Role: Data Analyst
Roles & Responsibilities:
• Managed the integration of complex data pipelines for reporting key sales and finance metrics, enhancing
fraud detection and policy risk assessment.
• Developed AI/ML models for predictive analytics using PySpark and Python, focusing on policy pricing and
claims predictions.
• Utilized Python and machine learning algorithms for claims fraud detection and risk assessment (see the
fraud-model sketch below).
• Implemented PySpark for large-scale data processing and analysis of policy data.
• Developed JavaScript-based interactive visualizations for insurance analytics dashboards.
• Created complex data models for underwriting risk assessment and policy pricing.
• Conducted data automation projects using AWS Redshift and Alteryx, effectively streamlining data processes
across multiple departments.
• Managed server accounts and Active Directory (AD) to ensure secure data access across departments.
• Orchestrated multi-system data integration using AWS Redshift and S3 to consolidate and analyze claims and
policy data.
• Developed ETL workflows using AWS Glue and Alteryx, improving data consistency and quality across
departments.
• Applied advanced analytics techniques using the R language, enhancing forecasting accuracy and strategic
decision-making in risk management.
• Utilized Amazon SageMaker to develop predictive models, improving underwriting accuracy and claim
predictions.
• Leveraged AWS Cloud services to manage large data sets securely, facilitating scalable solutions for data
storage and analysis.
• Implemented interactive dashboards and reports using Power BI, providing critical insights into financial
trends and policy performance.
• Managed SharePoint to effectively document and share project results and data analysis findings with
stakeholders.
• Utilized Confluence for comprehensive project documentation, enhancing communication and collaboration
across project teams.
• Employed AWS S3 for robust data storage solutions, ensuring the scalability and security of insurance data.
• Developed machine learning models using Apache Spark ALS, significantly improving predictive analytics
capabilities in fraud detection.
• Orchestrated data integration and ETL processes using AWS Glue, ensuring data consistency and accuracy.
• Configured and maintained data security measures using Amazon Rekognition, safeguarding sensitive customer
and policyholder information (see the Rekognition sketch below).
• Conducted statistical analyses and developed predictive models using MATLAB, supporting actuarial teams
in rate setting and risk assessment.
• Utilized Qlik Sense for dynamic data visualization, enabling real-time interaction with claims and customer
data.
• Applied Looker for business intelligence reporting, enhancing operational efficiency and strategic insights in
claims processing.
• Employed Matplotlib in Python environments to chart detailed statistical analyses, enhancing data
visualization for internal audits.
• Managed Azure Stream Analytics for real-time data processing, supporting instant analytics and operational
dashboards.
• Enhanced data-driven decision-making using Azure SQL, optimizing data queries and integrations for faster
access and analysis.
• Developed and maintained comprehensive documentation and reporting using Microsoft PowerPoint,
ensuring clarity in stakeholder presentations.
• Implemented data encryption and security measures using Transparent Data Encryption, ensuring
compliance with regulatory requirements.
• Managed user access and database security configurations using MySQL Workbench and Oracle Enterprise
Manager, enhancing system security.
• Developed and optimized data workflows and processes using Alteryx, significantly improving efficiency in
data preparation and analysis.
Environment: SQL, Excel, JavaScript, AWS Redshift, Python, Machine Learning, Alteryx, R, Amazon SageMaker, AWS
Cloud services, Power BI, SharePoint, Confluence, AWS S3, Apache Spark ALS, AWS Glue, Amazon Rekognition, MATLAB,
Qlik Sense, Looker, Matplotlib, Azure Stream Analytics, Azure SQL, Microsoft PowerPoint, Transparent Data
Encryption, MySQL Workbench, and Oracle Enterprise Manager.
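Illustrative sketch of the PySpark fraud-detection modeling referenced above: a minimal logistic-regression pipeline. The feature columns, the is_fraud label, and the S3 path are hypothetical stand-ins.

from pyspark.sql import SparkSession
from pyspark.ml import Pipeline
from pyspark.ml.classification import LogisticRegression
from pyspark.ml.evaluation import BinaryClassificationEvaluator
from pyspark.ml.feature import VectorAssembler

spark = SparkSession.builder.appName("fraud-sketch").getOrCreate()

# Hypothetical claims data with a binary is_fraud label
claims = spark.read.parquet("s3://example-bucket/claims/")

features = ["claim_amount", "days_to_report", "prior_claims"]  # hypothetical columns
assembler = VectorAssembler(inputCols=features, outputCol="features")
lr = LogisticRegression(featuresCol="features", labelCol="is_fraud")

train, test = claims.randomSplit([0.8, 0.2], seed=7)
model = Pipeline(stages=[assembler, lr]).fit(train)

# Score the held-out split by area under the ROC curve
auc = BinaryClassificationEvaluator(
    labelCol="is_fraud", metricName="areaUnderROC"
).evaluate(model.transform(test))
print(f"AUC: {auc:.3f}")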
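And a minimal boto3 sketch of the Amazon Rekognition screening mentioned above, flagging sensitive content in claim images; the bucket, object key, and confidence threshold are hypothetical.

import boto3

rek = boto3.client("rekognition", region_name="us-east-1")

# Screen a claim image stored in S3 for sensitive or unsafe content
resp = rek.detect_moderation_labels(
    Image={"S3Object": {"Bucket": "example-claims-bucket", "Name": "claims/claim-123.jpg"}},
    MinConfidence=80,
)
for label in resp["ModerationLabels"]:
    print(f"{label['Name']}: {label['Confidence']:.1f}%")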
Client: Lorventech Solutions India Pvt Ltd, Chennai, India Jun 2016 to May 2017
Role: Database Analyst
Roles & Responsibilities:
• Installed, configured, and maintained database systems including MS SQL Server, MySQL, and IBM DB2,
ensuring optimal performance.
• Managed user access and ensured data integrity and security using Transparent Data Encryption and Oracle
Data Pump.
• Utilized SQL Server Management Studio and Oracle Enterprise Manager for effective database administration
and operations.
• Created initial dashboards and reports using Tableau, providing visual insights into operational data and
trends.
• Developed Python scripts for automated database management and reporting (see the reporting sketch below).
• Implemented basic machine learning algorithms for data quality assessment.
• Created data models for efficient database structure and optimization.
• Applied BI tools for database performance monitoring and analysis.
• Employed Microsoft Excel for data manipulation and reporting, supporting business operations and decision-
making processes.
• Developed and maintained data documentation using Microsoft Word and PowerPoint, facilitating
knowledge sharing and project continuity.
• Implemented MySQL Workbench for database design and query optimization, enhancing system efficiency
and user satisfaction.
• Conducted data analysis and visualization tasks using Power BI, aiding strategic planning and
performance monitoring.
• Managed projects and team collaboration using SharePoint, improving project delivery timelines and
stakeholder communication.
• Utilized Looker for interactive data visualization and business intelligence, enhancing data-driven decision-
making in the organization.
• Applied AWS S3 for data storage solutions, ensuring data security and scalability for enterprise applications.
• Leveraged MATLAB for advanced mathematical modeling, supporting operational research and optimization
projects.
• Employed R for statistical analysis and data modeling, providing insights into customer behavior and business
trends.
• Utilized Apache Spark for processing large datasets, enhancing the speed and efficiency of data analysis
workflows.
• Orchestrated comprehensive data security measures, maintaining high standards of data privacy and
compliance.
Environment: MS SQL Server, Python, MySQL, IBM DB2, Transparent Data Encryption, Oracle Data Pump, SQL Server
Management Studio, Oracle Enterprise Manager, Tableau, Microsoft Excel, Microsoft Word, PowerPoint, MySQL
Workbench, Power BI, SharePoint, Looker, AWS S3, MATLAB, R, Apache Spark.
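Illustrative sketch of the automated reporting scripts mentioned above: a daily table-size report pulled from MySQL's information_schema. The connection details are placeholders, and the PyMySQL driver is an assumed choice.

import csv
import pymysql  # assumes the PyMySQL driver; any DB-API driver works similarly

# Hypothetical connection details for the reporting database
conn = pymysql.connect(host="db.example.com", user="report_user",
                       password="change-me", database="ops")
try:
    with conn.cursor() as cur:
        # Approximate row counts per table from MySQL's catalog
        cur.execute(
            "SELECT table_name, table_rows FROM information_schema.tables "
            "WHERE table_schema = %s",
            ("ops",),
        )
        rows = cur.fetchall()
finally:
    conn.close()

# Write the snapshot to a CSV for distribution
with open("daily_table_report.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["table_name", "row_count"])
    writer.writerows(rows)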
Education:
• Lovely Professional University, Punjab, India, 2016.