
Karan Chakravarthy

Email: karan.chakravarthy1@gmail.com PH: +15732006451


Sr. Python Developer with AI/ML
PROFESSIONAL SUMMARY
9 years of FAANG+ experience across the complete software development lifecycle, including analysis, design, development, integration, deployment, testing, and documentation of enterprise backend, web-based, and Service-Oriented Architecture (SOA) applications using Python, SQL, and machine learning models.
 Experience with hands on development in Python, Django, Angular13, C++, ASP.NET MVC, Java, J2EE, Servlets, JSP, Struts,
Hibernate, PHP, Laravel.
 Experience in working with environments using Agile (SCRUM) and Test-Driven development methodologies.
 Worked primarily with Ruby on Rails and MySQL in a UNIX environment, using the Rails MVC framework with complex model relationships, controllers, views, and helpers.
 Experience in gathering requirements and in developing, implementing, and testing application architectures including e-commerce, business-to-business (B2B), and distributed applications.
 Experience in developing single-page applications using Angular13, Bootstrap, HTML5, and CSS3.
 Experience in developing dynamic web pages using HTML, CSS, AJAX, REACTJS, JSON and JavaScript for user interface using JSP
and Servlets.
 Experience with Log4j for application logging; wrote JUnit test cases for unit testing and used Pytest and unittest to maintain the accuracy of program code.
 Proficient in developing applications using PyCharm, Eclipse, PHP Storm, and NetBeans.
 Experience working with databases such as MySQL, MongoDB, and Redis, with strong SQL and PL/SQL skills.
 Developed the code for front end using JQuery, JavaScript, AJAX, HTML, XML and JSON and with ReactJS, ReduxJS.
 Experienced in building data models using machine learning techniques for classification, regression, clustering, and association mining. Experience with JIRA as a ticketing tool.
 Experience in writing subqueries, stored procedures, triggers, cursors, and functions on MySQL and PostgreSQL databases. Experience with NoSQL platforms such as MongoDB, Cassandra, and CouchDB, as well as Oracle.

TECHNICAL STACK:
 Operating Systems: Windows, Mac OS, Unix/Linux
 Languages: Python, Swift, SQL, C++, Java, J2EE, PHP, Bash, Golang, JavaScript
 Frameworks/Libraries: Django, Flask, TensorFlow, PyTorch, NumPy, Pandas, Scikit-Learn, Requests, Matplotlib, NLTK,
Statsmodels, Scipy, SQL Alchemy, CherryPy, Docker, SOAP
 Web Technologies: HTML5, CSS3, AJAX, JSON, jQuery, Bootstrap, Angular13, ReactJS, Ruby on Rails, ASP.NET MVC, Struts, Hibernate, Laravel, XML, RESTful Web Services
 AI/ML Tooling: Gradio, Hugging Face, Streamlit, Llama/Phi-3-Vision models, Argilla, MLflow
 Databases: MySQL, PostgreSQL, SQLite, MongoDB, Oracle, Cassandra, CouchDB, Redis, MS SQL, HDFS
 Cloud/Big Data: GCP, BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Functions, Dataprep, AWS, Hadoop, Spark, Kafka,
Kubernetes, Docker, Presto, Hive, Spark-SQL, GCS, Cloud Logging, IAM, Data Studio
 DevOps/CI/CD: Git, Jenkins, Pytest, Selenium, JUnit, Cucumber, Robot Framework, ALLURE Reporting, Tableau, Data Studio,
JIRA, Confluence, Terraform, PCF, TFS, Azure Data Warehouse, ADF, AAS, DAX, Docker-registry, Bitbucket, Ansible, Maven, MS
Build, SVN, Review Board
 Testing: Selenium WebDriver, Selenium GRID, Nosetest, PyUnit, Robot framework, Selenium RC, QTP
 Tools: Putty, SQL Developer, Toad, Matlab, R, AutoIt, Auto Hotkey, Active Records, PyQt, VIM
 Methodologies: Agile (SCRUM), TDD, SDLC, data analytics, data wrangling
 NLP: NLTK, Open NLP, StanfordNLP
 IDEs: PyCharm, Jupyter Notebook, Sublime Text, Eclipse, NetBeans, PHP Storm

PROFESSIONAL EXPERIENCE

Sr. Python Developer with AI/ML


American Express, Phoenix, Arizona Apr 2024 to Present
American Express is a globally integrated payments company, providing customers with access to products, insights, and experiences
that enrich lives and build business success.
Responsibilities:
 Participated in the integration and optimization of AI-driven services for advanced data processing, focusing on information
extraction and analysis across various data formats including IDs, documents, statements, and emails.
 Contributed to the development and implementation of data redaction and reconstruction solutions, utilizing regular
expressions and OCR technologies to ensure accurate handling and compliance.
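A regex-driven redaction pass like the one described above can be sketched as follows; the patterns and placeholder labels here are illustrative assumptions, not the production rules:

```python
import re

# Hypothetical PII patterns; a real pipeline would cover many more formats.
PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "CARD": re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def redact(text: str) -> str:
    """Replace each PII match with a labeled placeholder."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label} REDACTED]", text)
    return text
```

OCR output would be fed through the same function after text extraction, so the redaction logic stays format-agnostic.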
 Utilized advanced SQL techniques to perform complex data aggregation, facilitating the analysis of customer spending
patterns which supported strategic marketing initiatives.
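The spend-aggregation idea above can be illustrated with an in-memory example; the table and column names are invented for the sketch, not the production schema:

```python
import sqlite3

# In-memory stand-in for the transactions table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE txns (customer TEXT, category TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO txns VALUES (?, ?, ?)",
    [("a", "travel", 120.0), ("a", "dining", 45.0), ("b", "travel", 300.0)],
)

# Aggregate spend per customer and category, ranked by total.
rows = conn.execute(
    """
    SELECT customer, category, SUM(amount) AS total
    FROM txns
    GROUP BY customer, category
    ORDER BY total DESC
    """
).fetchall()
```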
 Developed interactive financial dashboards using Svelte, enhancing user experience and responsiveness. Leveraged Svelte’s
reactivity and component architecture to ensure real-time updates and seamless UI transitions, aiding executives in
strategic decision-making.
 Managed AWS S3 buckets for scalable and secure data handling, ensuring efficient data storage and operations.
 Developed interactive AI prototypes using Gradio and Streamlit, enabling stakeholders to visualize and interact with
machine learning models, enhancing understanding and feedback.
 Implemented several serverless applications using AWS Lambda to automate financial processes, resulting in a 30%
reduction in operational costs.
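A serverless function of the kind mentioned above typically looks like the following minimal sketch; the payload fields and response shape are assumptions for illustration:

```python
import json

def lambda_handler(event, context):
    """Hypothetical AWS Lambda entry point: validates a payment event and
    returns an API-Gateway-style response."""
    try:
        body = json.loads(event.get("body") or "{}")
        amount = float(body["amount"])
    except (KeyError, ValueError, TypeError):
        return {"statusCode": 400, "body": json.dumps({"error": "invalid payload"})}
    # Real code would hand the validated payment to a downstream queue here.
    return {"statusCode": 200, "body": json.dumps({"processed": amount})}
```

Because the handler is a plain function, it can be unit-tested locally before deployment.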
 Implemented advanced 3D reconstruction techniques and novel view synthesis to enhance fraud detection systems by
creating detailed visualizations of transaction environments, significantly reducing false positives in high-risk transaction
monitoring.
 Engineered robust data lake solutions using Amazon S3, optimizing data storage and retrieval processes that enhanced the
performance of big data analytics, significantly reducing query response times and costs.
 Integrated Huggingface's transformers with American Express's customer service chatbots to leverage state-of-the-art NLP
models, significantly improving response accuracy and customer satisfaction.
 Implemented Generative AI techniques to automate decision-making processes, enhancing the capabilities of fraud
detection systems by integrating RAG for dynamic data retrieval and real-time insights.
 Led the development of a chatbot that leverages GenAI to provide personalized customer service, utilizing RAG to improve
response accuracy and relevance based on historical transaction data.
 Utilized a diverse tech stack, employing Python, regex, OCR, CI/CD, Docker, GitHub, AWS, JSON, and machine learning
technologies to meet project requirements.
 Deployed Mosaic AI Gateway to facilitate seamless integration of AI-driven financial services, ensuring robust API security
and efficient data flow.
 Leveraged Claude for enhancing real-time AI-driven analytics, contributing to the development of predictive models that
significantly improved decision-making processes across business units at American Express.
 Implemented EDI solutions to automate and streamline financial transactions, resulting in a 30% reduction in processing
time for payments and reconciliation processes.
 Contributed to the development of customer service chatbots using Amazon Lex at American Express, integrating NLU to enhance the chatbots' ability to understand and respond to customer queries with high accuracy, which led to a 30% improvement in customer service response times and significantly increased user satisfaction.
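The intent-recognition idea behind such a chatbot can be sketched with a toy keyword matcher; Amazon Lex does this with trained models, so the rules and intent names below are purely illustrative:

```python
# Toy intent recognizer: map an utterance to the intent whose keyword
# set overlaps it most, falling back when nothing matches.
INTENTS = {
    "check_balance": {"balance", "available", "funds"},
    "dispute_charge": {"dispute", "fraud", "unauthorized"},
}

def recognize_intent(utterance: str) -> str:
    tokens = set(utterance.lower().split())
    best = max(INTENTS, key=lambda i: len(INTENTS[i] & tokens))
    return best if INTENTS[best] & tokens else "fallback"
```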
 Leveraged AWS Fargate to deploy containerized applications without the need to manage servers or clusters, enhancing
operational efficiency in payment processing systems.
 Utilized AWS SageMaker to design, build, and deploy machine learning models, streamlining the workflow from model
training to deployment.
 Collaborated with the data science team to refine NLU models, focusing on improving intent recognition and entity
extraction, which enabled more personalized and context-aware interactions in customer service applications.
 Utilized AWS Textract to automate the extraction of text and data from scanned documents, significantly streamlining the
processing of financial statements and reducing manual entry errors.
 Utilized Google Vertex AI to streamline the deployment of machine learning models directly from Jupyter notebooks to
production, significantly reducing model latency and improving prediction accuracy.
 Implemented AWS Kendra to enhance the search capabilities within the company's internal document repositories,
enabling more efficient retrieval of financial regulations and compliance documentation.
 Utilized lex for lexical analysis and yacc for parsing to automate the extraction and interpretation of structured data from
numerous sources, enhancing the efficiency of data processing tasks.
 Utilized MATLAB for complex financial data analysis and visualization to support decision-making processes. Developed
custom functions to automate the extraction, transformation, and loading of large datasets, significantly reducing
processing time and improving accuracy.
 Implemented comprehensive API security measures including OAuth and JWT, to protect sensitive financial data and
prevent unauthorized access.
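The token-based part of such API security can be sketched with a minimal JWT-style signer using only the standard library; the secret and claims are placeholders, and production keys would come from a secrets manager:

```python
import base64
import hashlib
import hmac
import json

SECRET = b"demo-secret"  # assumption: real keys are never hard-coded

def b64url(data: bytes) -> bytes:
    return base64.urlsafe_b64encode(data).rstrip(b"=")

def sign(payload: dict) -> str:
    """Minimal JWT-style HS256 token: header.payload.signature."""
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = b64url(json.dumps(payload).encode())
    msg = header + b"." + body
    sig = b64url(hmac.new(SECRET, msg, hashlib.sha256).digest())
    return (msg + b"." + sig).decode()

def verify(token: str) -> bool:
    """Recompute the signature and compare in constant time."""
    header, body, sig = token.encode().split(b".")
    expected = b64url(hmac.new(SECRET, header + b"." + body, hashlib.sha256).digest())
    return hmac.compare_digest(sig, expected)
```

Constant-time comparison (`hmac.compare_digest`) avoids timing side channels when rejecting forged tokens.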
 Implemented VectorDB to enhance the efficiency of search functionalities within the financial document retrieval system,
significantly reducing query response times.
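At its core, the vector search described above ranks documents by embedding similarity; this brute-force cosine sketch shows the idea, while a real vector database adds approximate indexing (e.g., HNSW) for scale:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

def top_k(query, docs, k=2):
    """Return the k document ids most similar to the query embedding.
    `docs` maps an id to its embedding vector."""
    return sorted(docs, key=lambda d: cosine(query, docs[d]), reverse=True)[:k]
```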
 Applied LLAMA models for advanced text analytics, enhancing natural language understanding (NLU) capabilities across
various customer interaction points, boosting response effectiveness.
 Incorporated ChatGPT technology to revamp the AI-driven chatbot system, providing real-time, context-aware support to
enhance user interaction and operational efficiency.
 Engineered robust payment processing systems using advanced Java features, enhancing transaction handling and fraud
detection capabilities. Integrated Java with secure APIs and databases, ensuring high reliability and compliance with
financial industry standards.
Environment: Python, regex, OCR, CI/CD, Docker, GitHub, FastAPI, Flask, Jenkins, AWS (S3, EC2, Lambda, RDS), JSON, TensorFlow, PyTorch, Scikit-Learn, Keras, NLP, Rally, JIRA, Confluence, PostgreSQL, MongoDB, Elasticsearch, Kafka, Hadoop, Spark, Kubernetes, Redis, Angular13, React, Node.js, iONA, Power BI, Windows, Linux, VSCode, PyCharm, Jupyter, Scala, DynamoDB, Azure OpenAI Studio, TypeScript, AWS Fargate, Qlik
Sr. Python Developer with AI/ML
Bank Of America, Dallas, Texas Jan 2023 to Mar 2024
Bank of America is a leading multinational financial institution providing comprehensive banking, investing, and financial services
with a focus on innovation and customer experience.
Responsibilities:
 Utilized DynamoDB for risk assessment applications, designing data models that facilitated fast access and analysis of financial
data to drive decision-making in high-stakes environments
 Designed and implemented end-to-end machine learning solutions, from data collection to model deployment and monitoring,
using tools like TensorFlow, PyTorch, and Apache Airflow.
 Developed and executed large-scale data processing workflows, leveraging distributed computing frameworks and cloud-based
solutions such as GCP's BigQuery, Dataflow, and Dataproc.
 Implemented multi-layered security measures on Azure, utilizing Terraform for infrastructure as code initiatives, significantly
reducing deployment errors and security vulnerabilities.
 Authored YAML configurations to streamline CI/CD pipelines, improving deployment cycles and consistency across development
environments.
 Integrated Claude with existing business infrastructures to streamline workflow automation and data processing, ensuring
seamless interoperability with legacy systems at Bank of America.
 Implemented advanced NLU techniques to analyze customer feedback and inquiries, streamlining the process of extracting
meaningful insights from unstructured data, which directly contributed to strategic business decisions and enhanced customer
relationship management.
 Deployed Amazon OpenSearch to enhance search capabilities across vast datasets of financial records, improving search
performance and accuracy, which directly supported risk assessment and fraud detection initiatives.
 Utilized Neural Radiance Fields (NeRF) to generate 3D models from 2D images of banking facilities, improving virtual customer
interaction and engagement through more realistic and interactive online banking experiences.
 Integrated Google Vertex AI to enhance predictive analytics capabilities, automating credit risk assessments and personalizing
customer service interactions with advanced AI insights.
 Leveraged Mosaic AI Gateway to enhance API orchestration, improving the performance of real-time financial transaction
processing systems.
 Orchestrated containerized applications using Amazon EKS to streamline deployment processes and improve scalability and
reliability of banking services, resulting in enhanced customer experience and operational efficiency.
 Employed MATLAB's robust toolset for developing and testing quantitative risk models. Integrated MATLAB with other
technologies to perform back-testing and scenario analysis, helping the bank mitigate potential risks in investment strategies.
 Led the integration of EDI systems enabling seamless data exchange between banking platforms and external financial
institutions, enhancing operational efficiency and compliance with financial standards.
 Spearheaded a project using Generative AI to synthesize financial reports and predictive models, significantly reducing manual
data processing time and increasing report accuracy.
 Employed RAG techniques to augment data retrieval processes, enhancing the analytical capabilities of AI models used in risk
assessment and management.
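The RAG pattern referenced above reduces to retrieve-then-assemble; this skeleton uses token-overlap scoring as a stand-in for embedding similarity, with invented document text, and omits the actual LLM call:

```python
# Minimal retrieval-augmented-generation skeleton: retrieve relevant
# context, then assemble the prompt that would be sent to the model.
DOCS = [
    "Exposure limits are reviewed quarterly by the risk committee.",
    "Liquidity ratios must stay above the regulatory floor.",
]

def retrieve(query: str, k: int = 1):
    """Placeholder scoring: count shared tokens with the query."""
    score = lambda d: len(set(query.lower().split()) & set(d.lower().split()))
    return sorted(DOCS, key=score, reverse=True)[:k]

def build_prompt(query: str) -> str:
    context = "\n".join(retrieve(query))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"
```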
 Designed and deployed several financial advisory bots using Amazon Lex, which used NLU to understand complex customer
queries about banking products and services, providing automated, yet personalized, advice and support.
 Utilized VectorDB to manage high-dimensional vector data for real-time fraud detection systems, improving detection rates and
reducing false positives.
 Deployed Llama/Phi-3-Vision models to enhance document processing capabilities, automatically extracting and analyzing data
from customer-submitted documents to streamline loan approvals.
 Implemented customer-facing financial tools using Svelte, which significantly improved the load time and interactivity of the
bank's online banking portal. This initiative led to an increased customer satisfaction score due to the enhanced user experience
and performance.
 Leveraged SageMaker's Jupyter notebook instance for developing and experimenting with models using direct access to high-
level TensorFlow and PyTorch libraries, enhancing predictive analytics capabilities.
 Proficient in utilizing monitoring tools such as AWS CloudWatch, Grafana, and Prometheus to oversee large-scale deployments,
ensuring high availability and performance through system health checks and real-time alerting.
 Experienced in tuning ElasticSearch for relevancy, using techniques like Retrieval-Augmented Generation (RAG) and RRF
practices to enhance search results from multiple indices.
 Applied cutting-edge 3D visualization techniques to create interactive models of financial data, providing stakeholders with
intuitive and engaging tools for financial planning and risk assessment.
 Engineered a real-time transaction monitoring system using AWS Lambda, effectively detecting and responding to fraudulent
activities by processing transactions as they occur.
 Implemented Gen AI techniques at Bank of America for financial forecasting and anomaly detection, significantly improving the
accuracy and speed of financial risk assessments.
 Integrated AWS Textract in the loan processing workflow to automatically extract data from mortgage applications, improving
processing speed by 40% and reducing dependencies on manual data entry.
 Leveraged AWS Kendra to build a cognitive search platform that provides employees with quick, contextually relevant answers
to customer queries based on a vast repository of banking guidelines and case studies.
 Utilized Generative AI and deep learning frameworks to analyze unstructured data, enhancing the bank's ability to extract
meaningful insights from large datasets.
 Deployed and managed scalable databases using Amazon Aurora, ensuring high availability and performance. Utilized AWS
services such as RDS, S3, and Lambda for data storage, processing, and automation.
 Implemented Fargate to automate microservices deployment, ensuring scalable and reliable banking services with minimal
manual intervention.
 Developed complex financial models and simulation tools using advanced Java. These tools were integral to risk assessment and
portfolio management, providing the bank with enhanced predictive capabilities and strategic insights.
Environment: Python, Swift, SQL, C++, Java, JavaScript, J2EE, PHP, Django, Flask, TensorFlow, PyTorch, Apache Airflow, Angular13, ReactJS, Ruby on Rails, ASP.NET MVC, Struts, iONA, Laravel, NumPy, Pandas, Scikit-Learn, Requests, Matplotlib, NLTK, Statsmodels, Scipy, Google Cloud Platform (GCP), AWS, BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Functions, Dataprep, Git, Jenkins, Pytest, Selenium, JUnit, Cucumber, Robot Framework, ALLURE Reporting, Tableau, Data Studio, JIRA, Confluence, PyCharm, TypeScript, AWS Fargate, Eclipse, NetBeans, PHP Storm, Hadoop, Spark, Kafka, MySQL, PostgreSQL, SQLite, MongoDB, Oracle, CUDA, TensorRT, Cassandra, CouchDB, Redis, Qlik, Kubernetes, Docker, Putty, SQL Developer, Toad, MATLAB, R, Windows, Mac OS, Unix/Linux, Aurora, Scala, DynamoDB, FastAPI
Python Developer with AI/ML
Apple, Hyderabad, India Aug 2021 to Jul 2022

Apple is a global technology leader known for its innovative products and services, including iPhone, iPad, Mac, and Apple Music.

Responsibilities:
 Participated in requirement analysis and collaborated with the architecture team to design and model AI/ML solutions.
 Developed robust, fault-tolerant machine learning applications using Python, enhancing Siri's performance and user
engagement.
 Enhanced customer experience by integrating DynamoDB with mobile applications to store and retrieve user data seamlessly,
contributing to the robust performance of iOS apps
 Created scalable and highly available RESTful APIs using Python’s Flask and Django frameworks, ensuring efficient
communication between different services.
 Applied Medallion Architecture principles to manage diverse datasets from multiple Apple platforms, enhancing data
harmonization and supporting robust analytics frameworks.
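The Medallion (bronze → silver → gold) refinement described above can be illustrated with plain Python records; in production this ran on governed data-lake tables, so the fields and values below are invented:

```python
# Bronze: raw, untyped ingested records.
bronze = [
    {"user": "U1", "event": "play", "ms": "5000"},
    {"user": "u1", "event": "play", "ms": "7000"},
    {"user": "U2", "event": None,   "ms": "100"},
]

# Silver: cleaned and typed, with invalid rows dropped.
silver = [
    {"user": r["user"].lower(), "event": r["event"], "ms": int(r["ms"])}
    for r in bronze if r["event"]
]

# Gold: business-level aggregate (total play time per user).
gold = {}
for r in silver:
    gold[r["user"]] = gold.get(r["user"], 0) + r["ms"]
```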
 Led an emotion detection project utilizing Swift and Python, contributing to Siri's enhanced ability to recognize and respond to
user emotions.
 Managed relational databases with Amazon RDS to support high-volume, high-velocity, mission-critical applications, ensuring
optimal performance, disaster recovery, and high availability across user-facing applications.
 Developed and optimized 3D Gaussian Splatting (3DGS) algorithms to support the design and testing of new product prototypes, facilitating rapid visualization and adjustments that accelerated the product development lifecycle.
 Integrated VectorDB to support image and video retrieval systems, enhancing the accuracy and speed of media content
management within internal applications.
 Developed a Lambda-based backend for a mobile app that facilitated seamless data integration and synchronization across user
devices, enhancing user experience and system reliability.
 Improved Siri's performance by 30%, achieving a 1-second response time and increasing user engagement by 15%.
 Applied advanced SQL queries to analyze user engagement across various Apple services, contributing to significant
enhancements in personalized user experiences.
 Utilized Gen AI algorithms to enhance the capabilities of Siri, enabling more natural, context-aware responses through improved natural language understanding (NLU).
 Implemented gRPC to facilitate high-performance, low-latency communication between microservices, optimizing the data
synchronization process across global retail and online systems.
 Developed and maintained scalable API solutions using REST, GraphQL, and gRPC focusing on high availability, fault tolerance,
and optimal load distribution.
 Applied Huggingface and Streamlit in the development of advanced NLP models to improve Siri's language understanding
capabilities, enabling more natural and accurate user interactions.
 Engineered secure and scalable cloud solutions on Azure, employing Bicep and Terraform to automate and manage
infrastructure, supporting high-volume Apple services.
 Pioneered the use of Neural Radiance Fields (NeRF) in the product design process at Apple, creating lifelike 3D models from
sparse photographic data which enhanced the accuracy of pre-production design reviews and customer previews.
 Leveraged OpenAI technologies to enhance user interaction within Apple’s customer service platforms, integrating generative AI
capabilities that improved automated responses.
 Engineered a custom scripting language for an internal tool, employing lex and yacc to design the grammar and manage syntax
parsing, significantly improving the tool's adaptability and user interaction.
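The lex/yacc pipeline above pairs a tokenizer with a grammar-driven parser; this miniature Python analogue shows the same two stages for a toy expression grammar (the real tool's grammar was far larger):

```python
import re

# "lex" stage: regex-driven tokenizer.
TOKEN = re.compile(r"\s*(?:(\d+)|(.))")

def tokenize(src):
    for num, op in TOKEN.findall(src):
        yield ("NUM", int(num)) if num else ("OP", op)

# "yacc" stage: recursive-descent parser/evaluator for + and *.
def evaluate(src):
    tokens = list(tokenize(src)) + [("END", None)]
    pos = 0
    def factor():
        nonlocal pos
        kind, val = tokens[pos]
        pos += 1
        return val
    def term():
        nonlocal pos
        value = factor()
        while tokens[pos] == ("OP", "*"):
            pos += 1
            value *= factor()
        return value
    def expr():
        nonlocal pos
        value = term()
        while tokens[pos] == ("OP", "+"):
            pos += 1
            value += term()
        return value
    return expr()
```

Operator precedence falls out of the grammar structure: `term` binds `*` more tightly than `expr` binds `+`.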
 Leveraged AWS data services such as Redshift for data warehousing, Glue for ETL processes, and Kinesis for real-time data
streaming and analytics.
 Integrated Generative AI into product development pipelines to simulate user interactions with new software releases, using
RAG to fetch relevant user feedback and incorporate it into the development lifecycle.
 Developed a proprietary solution for automating content creation across digital platforms at Apple, utilizing RAG to ensure the
relevance and factual accuracy of generated content.
 Integrated AWS BedRock solutions to manage infrastructure and application deployments, focusing on automating and securing
DevOps pipelines.
 Implemented a highly efficient inventory management system for Apple’s retail operations using advanced Java, focusing on
multithreading and network programming to handle high-volume data processing and real-time inventory updates across global
outlets.
 Applied MATLAB in the development of algorithms essential for analyzing hardware performance metrics. Created simulation
models to predict product behavior under various operational conditions, aiding in the engineering decision-making process and
product improvements.
 Employed FastAPI in a project aimed at improving user experience on digital platforms by developing APIs that enabled faster
data retrieval for millions of users worldwide, ensuring robust performance during peak traffic.
 Integrated Scala into the development of data-intensive applications for analyzing user interaction data across Apple media
platforms, helping enhance customer engagement through insights-driven personalization.

Environment: Python, Swift, SQL, C++, Django, Flask, TensorFlow, PyTorch, Apache Airflow, NumPy, Pandas, Scikit-Learn, Requests,
Google Cloud Platform (GCP) including BigQuery, iONA, Dataproc, Pub/Sub, Cloud Functions, Git, Jenkins, Pytest, Selenium, Tableau,
Data Studio, Talend, JIRA, Confluence, PyCharm, Jupyter Notebook, Sublime Text, Apache Kafka, Kubernetes, Docker, Scala, Azure
OpenAI Studio, FastAPI

Full Stack Python Developer


Dell, Hyderabad, India Mar 2020 to Jul 2021
Dell is a global technology company renowned for its innovative solutions in computing, storage, and IT infrastructure, providing
comprehensive services to businesses and consumers worldwide.
Responsibilities:
 Developed resilient, fault-tolerant, and highly available web applications using Python, Django, Golang, and GCP.
 Designed, developed, and maintained microservices-based Python applications hosted in Comcast infrastructure, providing services to end customers.
 Developed scalable and highly available RESTful web services using Python’s Requests library.
 Employed Jupyter Notebooks as a teaching tool to demonstrate complex concepts to peers and junior team members,
enhancing team skills and understanding of data science methodologies.
 Integrated Golang services with various databases, including SQL (PostgreSQL, MySQL) and NoSQL (MongoDB, Redis).
Implemented data access layers and ORM (Object-Relational Mapping) solutions.
 Managed Kubeflow pipelines, establishing standards for machine learning workflows and ensuring efficient model deployment.
 Prepared test scripts for different applications using QTP and Selenium.
 Integrated FastAPI to streamline software deployment processes, creating APIs that interfaced with CI/CD pipelines to automate
tasks and reduce deployment times by 30%.
 Automated internal web applications using Selenium 2 WebDriver.
 Implemented predictive maintenance capabilities within the EMS, using machine learning to foresee potential system failures
and reduce downtime.
 Developed a VectorDB-based solution for matching customer queries with product specifications, increasing the relevancy and
precision of search results on e-commerce platforms.
 Administered Azure-based projects using Terraform to ensure efficient, error-free deployments and operations, aligning with
Dell's global IT infrastructure expansion strategies.
 Utilized C# for developing internal tools and applications that supported various business processes, increasing productivity and
system reliability.
 Configured Kong AI Gateway to handle thousands of API calls daily, incorporating advanced rate limiting and authentication
mechanisms to bolster security.
 Used gRPC APIs to enable real-time data exchange between hardware monitoring systems and central analytics platforms,
improving the response times and reliability of system health checks.
 Implemented continuous integration and deployment pipelines using AWS BedRock as part of a broader cloud transformation
initiative, which increased deployment frequency and reduced rollback rates due to errors.
 Developed and tested features in an Agile environment using Ruby on Rails, HTML5, CSS, JavaScript, and Bootstrap; developed REST APIs to perform CRUD operations on Google Cloud.
 Led a project at Dell that integrated Gen AI into product design processes, automating the creation of design variations and
significantly reducing time-to-market for new products.
 Utilized AWS Lambda to orchestrate data pipelines for processing customer feedback across multiple channels, enabling faster
insights into customer satisfaction metrics.
 Responsible for full-stack web development with adherence to the PEP8 standard.
 Performed unit and integration tests using the unittest, nose, and Pytest frameworks.
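A Pytest-style test module from such a suite looks like the following sketch; the function under test and its name are hypothetical:

```python
def normalize_sku(raw: str) -> str:
    """Example unit under test: canonicalize a product SKU string."""
    return raw.strip().upper().replace(" ", "-")

# Pytest discovers test_* functions automatically; plain asserts suffice.
def test_normalize_sku_strips_and_uppercases():
    assert normalize_sku("  dell xps ") == "DELL-XPS"

def test_normalize_sku_idempotent():
    assert normalize_sku(normalize_sku("a b")) == "A-B"
```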
 Developed high-performance backend services and microservices using Golang, focusing on efficiency and scalability. Utilized
Go's concurrency features to handle high-throughput and low-latency requirements.
 Wrote object-oriented code with Ruby on Rails in an Agile SCRUM environment; implemented AJAX calls to perform GET/POST/PUT/DELETE (CRUD) operations against REST APIs.
 Experience with GCP's big data stack, including Dataflow, Pub/Sub, GCS, Cloud Functions, Stackdriver, Cloud Logging, IAM, and Data Studio.
 Leveraged AWS Lambda for automating image processing workflows in the cloud, significantly speeding up time to market for new product images on the e-commerce platform.
 Knowledge of ETL processes using GCP tools such as Dataprep, Dataproc (PySpark), and BigQuery for files from Ab Initio and Google Sheets.
 Utilized Databricks for developing and deploying machine learning models aimed at predicting hardware failure, enhancing
predictive maintenance capabilities and reducing system downtime by 15%.
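The predictive-maintenance idea above can be illustrated with a rule-based stand-in: flag sensor readings far from the historical mean. The production models in Databricks were learned rather than rule-based, so this is only a sketch of the concept:

```python
import statistics

def anomalies(readings, z=3.0):
    """Flag readings more than z standard deviations from the mean."""
    mean = statistics.mean(readings)
    stdev = statistics.pstdev(readings)
    if stdev == 0:
        return []  # constant signal: nothing to flag
    return [x for x in readings if abs(x - mean) / stdev > z]
```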

Environment: Python 3.7, Flask, Angular.js, C++, Bootstrap, GCP, BigQuery, Cucumber, Bash, Pytest, unittest, PyQt, XML, Shell Scripting, MySQL, GitHub, JIRA, HTML, XHTML, CSS, AJAX, JavaScript, Jenkins, Linux, Django, Comcast infrastructure, SQL, TDD, Agile methodologies, QTP, Selenium, CI/CD pipelines, Bitbucket, Ruby on Rails, HTML5, REST APIs, Camunda, PEP8, Dataproc, Dataflow, Pub/Sub, GCS, Cloud Functions, Stackdriver, Cloud Logging, IAM, Golang, Data Studio, Dataprep, PySpark, Terraform, Kubernetes, Presto, Hive, Spark-SQL, Apache Airflow, GCP Composer, Bash operator, Hadoop operators, PCF, TFS, Azure Data Warehouse, ADF, AAS, DAX, Cloud Dataflow, Scala, Databricks, FastAPI

Python Developer with AI/ML


Amazon Alexa Data Services, Hyderabad, India Aug 2017 to Mar 2020
Amazon Alexa Data Services is a division of Amazon focused on enhancing Alexa's capabilities through data management, machine
learning, and AI technologies.
Responsibilities:
 Generated Python Django Forms to record data of online users.
 Created MySQL back-end for data entry from Flask.
 Developed monitoring and notification tools using Python.
 Utilized modular frameworks like MEF to design and implement software solutions that support Amazon’s e-commerce
platform, improving modularity and easing the integration of new features.
 Worked closely with the legal team to navigate the complexities of e-commerce regulations, focusing on consumer data
protection laws and corporate governance.
 Played a key role in enhancing Amazon’s EMS to support its cloud infrastructure, ensuring seamless scalability and robust
performance under peak loads.
 Implemented a CI/CD pipeline using Azure DevOps (VSTS, TFS) in both cloud and on-premises with GIT, MS Build, Docker, Maven
along with Jenkins plugins.
 Led a team in the integration of 3D reconstruction technologies into Amazon’s warehouse optimization systems, enabling
precise volume measurement and spatial placement of products which improved storage efficiency and retrieval times.
 Implemented Amazon MWAA to automate and manage complex data workflows, enabling more efficient data processing and
integration tasks that supported scalable machine learning model training environments.
 Developed robust EDI transaction systems to support e-commerce operations, ensuring accurate and timely data exchange
between Amazon and its extensive network of suppliers and logistics partners.
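An EDI message of the kind exchanged above has a simple segment/element structure; this minimal parser sketch assumes X12-style separators ("~" between segments, "*" between elements), and the sample content is invented:

```python
def parse_edi(message: str):
    """Split an X12-style EDI message into a list of segments,
    each a list of element strings."""
    segments = [s for s in message.strip().split("~") if s]
    return [seg.split("*") for seg in segments]

sample = "ISA*00*SENDER*RECEIVER~ST*850*0001~SE*2*0001~"
parsed = parse_edi(sample)
```

Real interchanges declare their separators in the ISA header, so a production parser reads them from the message rather than hard-coding them.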
 Spearheaded an initiative at Amazon to use Gen AI for content recommendation systems, boosting user engagement by
accurately predicting and aligning content with user preferences.
 Integrated Gradio into AWS to provide customers with customizable interfaces for interacting with deployed AI models,
enhancing user engagement and adoption.
 Utilized MLflow within AWS to manage the lifecycle of machine learning models across Amazon’s vast e-commerce platform,
ensuring optimal performance during high-traffic events like Black Friday.
 Designed and deployed Amazon’s Azure environments using Terraform and Azure Bicep, optimizing cloud resource utilization
and security configurations.
 Implemented Kong AI Gateway to manage and secure microservices communications across Amazon’s e-commerce platform,
enhancing throughput and reducing latency.
 Developed and maintained robust, secure, and scalable multi-tier applications using AWS Bedrock, optimizing performance and
reliability across critical business operations.
 Applied AWS Textract for automating the extraction of technical specifications from hardware product documents, aiding in
faster aggregation and analysis of product data across multiple teams.
 Enhanced internal knowledge management systems using AWS Kendra, enabling engineers and product teams to quickly find
technical solutions and documentation, thus accelerating the R&D lifecycle.
 Set up Selenium Grid to run automation scripts across different browsers.
 Developed API tests for RESTful web services.
 Employed VectorDB for personalized recommendation engines, enabling more accurate and timely product suggestions based
on user behavior analysis.
 Leveraged Databricks to build scalable data pipelines for e-commerce analytics, integrating with AWS services to enhance data
ingestion and transformation processes, leading to a 20% improvement in data processing speed.
 Architected and developed microservices in Golang, leveraging Docker for containerization and Kubernetes for orchestration.
Ensured services were loosely coupled and independently deployable.
 Integrated TypeScript into AWS Lambda functions to process and handle data with high accuracy and reliability, supporting
large-scale distributed systems within Amazon’s AWS infrastructure.
 Utilized AWS analytics services such as Amazon EMR for big data processing and SageMaker for building, training, and
deploying machine learning models at scale.
 Conducted A/B testing on machine learning models using SageMaker's model management and deployment services, ensuring
optimal model performance and accuracy before full-scale implementation.
 Used Jenkins pipelines to drive all microservice builds out to the Docker registry and deploy them to Kubernetes; created
and managed pods using Kubernetes.
 Used Scala to build scalable back-end services that support e-commerce operations, enhancing the capability of Amazon's
recommendation engine by processing millions of transactions in real time.
 Implemented and fine-tuned Large Language Models (LLMs) using OpenAI’s frameworks to enhance chatbot capabilities,
delivering personalized user experiences.

Environment: Python, Django, Flask, CherryPy, MySQL, JavaScript, HTML, CSS, jQuery, AngularJS, Bootstrap, JSON, Azure DevOps
(VSTS, TFS), Git, Golang, Docker, Maven, TypeScript, AWS Fargate, Selenium, Nosetest, PyUnit, Robot Framework, Selenium
WebDriver, Selenium Grid, REST APIs, Kubernetes, Docker registry, Pods, Scala, SVN, Databricks, PL/SQL, Oracle, GCS, Google
Pub/Sub, AWS, Cloud Dataflow, MongoDB, Cassandra, CouchDB, ReactJS, pytest, WebSockets.

Jr. Python Developer


Bucker, Hyderabad, India Aug 2016 to Jun 2017

Responsibilities:
 Involved in the development of the front end of the application using Python 2.7, HTML5, CSS3, AJAX, JSON and jQuery.
 Designed and maintained databases using Python and developed a Python-based RESTful API (web service) using Flask,
SQLAlchemy and PostgreSQL.
 Involved in the Complete Software development life cycle (SDLC) to develop the application.
 Involved in UI refactoring that involves experience in Ajax.
 Worked on backend of the application, mainly using Active Records.
 Analyzed the code completely and have reduced the code redundancy to the optimal level.
 Developed programs to automate the testing of RAID controller firmware and utilities using Python 2.7, Java, Bash scripts,
the Windows command line, AutoIt, and AutoHotkey.
 Worked in an Agile/Scrum environment with stories and sprints in a Python-based setting, along with data
analytics, data wrangling and Excel data extracts. Involved in the development of ORM queries.
 Involved in the development of web services using SOAP for sending and receiving data from external interfaces in XML
format. Used a test-driven development (TDD) approach for developing services required for the application.
 Exported test case scripts, modified the Selenium scripts, and executed them in a Selenium RC environment.
 Worked on JavaScript MVC frameworks such as AngularJS.
 Used Python scripts to update content in the database and manipulate files.
 Led the training and deployment of LLMs for various applications, including automated content generation and sentiment
analysis.
 Developed and implemented feature engineering techniques to improve model performance and accuracy.
 Leveraged Python expertise to develop machine learning and data science solutions, utilizing libraries such as Pandas, NumPy,
and scikit-learn for data preprocessing, feature engineering, and model evaluation.
Environment: Python 2.6/2.7, Django 1.5, Java, Bash, HTML5, CSS3, AJAX, JSON, jQuery, Bootstrap, Pyramid, Flask, SQLAlchemy,
MySQL, MS SQL, PostgreSQL, RESTful, SOAP, JavaScript, Selenium, Git, Active Records, Windows command line, AutoIt, AutoHotkey,
Agile Methodologies, Scrum, data analytics, data wrangling, Excel data extracts, ORM queries, TDD, Selenium RC, AngularJS

EDUCATION
Southern Illinois University Carbondale, Illinois | Master of Science, Major in Computer Science, 2023
Scient Institute of Technology, Hyderabad | Bachelor of Technology, Major in Computer Science, 2016
