Akshay Ramanujam Ranganathan
- Results-driven full-stack machine learning engineer with 4 years of experience and a master's degree in computer science from the University of Texas at Dallas.
- Skilled in developing and implementing innovative machine learning models to solve complex real-world problems. Additionally, experienced in data engineering with a solid understanding of database design and ETL processes.
- Possess a strong ability to work with various types of data and create compelling visualizations to communicate insights effectively.
- Expertise in full-stack application development, including front-end and back-end development using modern technologies.
- Demonstrated ability to collaborate with cross-functional teams and deliver projects on time.
Work
06/2022 – 08/2022
Data Engineer Intern
Prodapt Inc., Irving, Texas, United States
- Utilized data-driven approaches to enhance operational efficiency for a leading telecommunication company.
- Streamlined KPI reporting by optimizing SQL queries, making reports faster to generate and more accurate.
- Developed a Python script to automate the monitoring and maintenance of Kubeflow data pipelines on GCP, saving ~5 hours of manual work every week and mitigating the risk of data loss (a minimal sketch of this kind of check follows this list).
- Improved accuracy of field technician work hour predictions by 5% by leveraging Google's Vertex AI, resulting in enhanced scheduling and resource management.
- Increased data accessibility by utilizing Dataflow to ingest data from multiple sources into Google's serverless data warehouse, BigQuery, reducing data ingestion time by ~13% and facilitating efficient storage and analysis of large datasets.
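The monitoring script from this internship is not reproduced here; the snippet below is only a minimal, illustrative sketch of a health check over Kubeflow Pipelines runs using the kfp SDK. The endpoint placeholder and the alerting step are assumptions, and attribute names vary between kfp SDK versions.

```python
# Illustrative sketch only: a periodic health check for Kubeflow Pipelines runs.
# Assumes a v1-style kfp client; attribute names differ across SDK versions.
import kfp

KFP_HOST = "https://<your-kfp-endpoint>"  # placeholder, not a real endpoint


def find_failed_runs(client: kfp.Client, page_size: int = 50):
    """Return recent runs whose status indicates failure."""
    response = client.list_runs(page_size=page_size, sort_by="created_at desc")
    failed = []
    for run in response.runs or []:
        if (run.status or "").lower() in {"failed", "error"}:
            failed.append((run.id, run.name, run.status))
    return failed


if __name__ == "__main__":
    client = kfp.Client(host=KFP_HOST)
    for run_id, name, status in find_failed_runs(client):
        # The real script might page an on-call channel or retrigger the run here.
        print(f"Run {name} ({run_id}) is in state {status}")
```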
07/2017 – 07/2021
Machine Learning Engineer
Whirldata Labs Pvt. Ltd., Chennai, Tamil Nadu, India
- Trained a convolutional neural network on 5,000 low-resolution plastic images captured from a recycling/sorting facility's video feed using Python, OpenCV, and TensorFlow (a minimal sketch of this style of classifier follows this list). Developed a basic annotation tool to label the images and used the model to pick up plastic with a uArm robotic arm. Analyzed the model's performance on lower-resolution images, providing insights into the image resolution required for classification tasks and the capabilities of neural networks for real-world computer vision. Published a white paper on these findings, currently available on the company's website. The model identified the materials with 97.7% accuracy, and the solution increased the facility's overall efficiency by 25%.
- Utilized Python, OpenCV, and TensorFlow to train a convolutional neural network that achieved 60% accuracy in classifying 10 different respiratory diseases from chest X-rays as part of a Kaggle competition, demonstrating the potential of artificial intelligence in healthcare.
- Developed Safetee, an AI-based surveillance system for an office lobby, using object recognition and tracking algorithms with Python, OpenCV, and TensorFlow. Classified events such as theft and tailgating, and maintained a database of event logs from the video. Implemented a web-based interface with Flask, using Google Charts, to visualize data streamed from the database.
- Developed and implemented a boot-failure recovery mechanism on an NVIDIA Jetson TX2 using Bash scripting, ensuring high fault tolerance. The Jetson TX2 was successfully integrated into a remote sensing satellite launched into orbit by India's Polar Satellite Launch Vehicle, operated by the Indian Space Research Organization, in December 2019.
- Created a Python script that uses a pose estimation model, TensorFlow, and OpenCV to count the number of valid push-ups performed by a person, as part of a marketing campaign for the company.
- Developed and trained a robust Named Entity Recognition (NER) model using the conditional random field (CRF) technique, yielding 94% test accuracy. Integrated the model into the client's product, enabling automated software updates by extracting entities from the OS system registry and resulting in a 17% improvement in customer satisfaction ratings.
- Improved the performance of a customer support chatbot by fine-tuning the Bidirectional Attention Flow (BiDAF) model, a state-of-the-art model at the time, producing more accurate and precise answers to users' queries and reducing the number of chats escalated to human agents by ~8%.
- Led a team of 4 in developing a simulation and validation model for high-frequency trading of crude oil futures, enabling the identification of profitable entry points. Designed and implemented a data pipeline on AWS to acquire financial data and news articles from multiple sources. Designed and built a visualization platform using Angular, AWS, and Plotly, allowing traders to generate personalized plots, including plots that reinforce the rationale behind recommended entry points.
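None of the production models above are included in this resume; as a rough illustration of the kind of TensorFlow image classifier described in the first bullet, the sketch below trains a small CNN. The input size, class count, and directory layout are assumptions, not details of the actual recycling-facility model.

```python
# Minimal sketch of a small CNN image classifier in TensorFlow/Keras.
# Input shape, number of classes, and the data directory are illustrative assumptions.
import tensorflow as tf

IMG_SIZE = (96, 96)   # low-resolution frames, as in the recycling-facility feed
NUM_CLASSES = 4       # e.g. plastic types; the real class set is not shown here

train_ds = tf.keras.utils.image_dataset_from_directory(
    "data/train", image_size=IMG_SIZE, batch_size=32)

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 255, input_shape=IMG_SIZE + (3,)),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, epochs=10)
```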
Technical Projects
Pharmacy Accessibility
As part of my Big Data Management & Analytics course, I spearheaded a project to develop a Tableau dashboard that presented crucial pharmacy data alongside demographic information. Built a linear regression model that accurately predicted the number of pharmacies for counties without data, allowing us to identify under-served and over-served counties. Proposed new pharmacy locations based on the number of pharmacies serving every 10k people, aimed at improving access to healthcare in those areas.
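The course project's code and exact features are not shown above; the following is a minimal sketch of this kind of county-level regression, with hypothetical file and column names.

```python
# Illustrative sketch: predict pharmacy counts per county from demographic features.
# The file name and columns ("population", "median_income", "pharmacy_count") are hypothetical.
import pandas as pd
from sklearn.linear_model import LinearRegression

counties = pd.read_csv("county_data.csv")
known = counties.dropna(subset=["pharmacy_count"])
unknown = counties[counties["pharmacy_count"].isna()]

features = ["population", "median_income"]
model = LinearRegression().fit(known[features], known["pharmacy_count"])

# Fill in counties with no reported data, then flag under-served counties
# (fewer than 1 pharmacy per 10k residents, as a rough threshold).
unknown = unknown.assign(pharmacy_count=model.predict(unknown[features]))
all_counties = pd.concat([known, unknown])
all_counties["per_10k"] = all_counties["pharmacy_count"] / (all_counties["population"] / 10_000)
under_served = all_counties[all_counties["per_10k"] < 1]
```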
Cyberminer
Designed and developed a basic search engine using Node and MongoDB as part of a team project for the Object Oriented Analysis & Design course. The project aimed to improve our understanding of object-oriented principles in software development, and to utilize UML as a tool for the design and analysis phase of the software development lifecycle. I gained valuable experience in applying OOP principles to real-world software development challenges and further developed my skills in UML modelling.
Art Auction System
As part of my coursework in Database Design, I collaborated with a team to design and develop an online art auction database system. The project involved creating the database using MySQL and developing the backend services using Node. Our team successfully implemented a system that allowed artists and sellers to auction their paintings while providing buyers with the ability to bid on paintings they were interested in. This project gave me the opportunity to enhance my skills in database design and development and to strengthen my ability to collaborate effectively as part of a team.
Predicting Hacks Before They Happen
During a hackathon, I developed a predictive model to address the increasing occurrence of hacking in digital payments. The model utilizes 15 anonymized features to predict the likelihood of a hack, providing the cybersecurity team with valuable insights to prevent and protect against potential threats. Through this project, I gained practical experience in cybersecurity and data analytics, while also demonstrating my ability to innovate and problem-solve in a time-sensitive environment.
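The hackathon model itself is not reproduced here; the sketch below shows one plausible shape for a classifier over 15 anonymized features. The file name, column names, and the choice of gradient boosting are assumptions.

```python
# Illustrative sketch: binary "hack likelihood" classifier on 15 anonymized features.
# The dataset, column names, and label are hypothetical; the actual hackathon model is not shown.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

df = pd.read_csv("transactions.csv")                  # hypothetical dataset
X = df[[f"feature_{i}" for i in range(1, 16)]]        # 15 anonymized features
y = df["is_hack"]                                     # hypothetical binary label

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42)

clf = GradientBoostingClassifier().fit(X_train, y_train)
scores = clf.predict_proba(X_test)[:, 1]              # hack probability per transaction
print("ROC AUC:", roc_auc_score(y_test, scores))
```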
Wind Farm Layout Optimization
Participated in the Shell.ai hackathon with a friend and achieved a top 10% rank among 1,530 teams by proposing a Python-based solution for wind farm layout optimization. Utilizing a genetic algorithm, we determined the optimal layout for 50 turbines, resulting in a mean Annual Energy Production (AEP) score of 532.04931 GWh. This experience provided me with valuable insights into using data-driven solutions to address complex problems in the energy sector.
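The competition code is not included here; this is a heavily simplified sketch of the genetic-algorithm idea (random layouts, selection, mutation). The field size and the fitness function are stand-ins that merely spread turbines apart, not the competition's AEP model.

```python
# Simplified sketch of a genetic algorithm for turbine placement.
# The fitness below is a placeholder that rewards well-spread layouts;
# the real objective in the competition was an Annual Energy Production model.
import numpy as np

rng = np.random.default_rng(0)
N_TURBINES, FIELD, POP, GENERATIONS = 50, 4000.0, 40, 200


def random_layout():
    return rng.uniform(0, FIELD, size=(N_TURBINES, 2))


def fitness(layout):
    # Placeholder objective: maximize the smallest pairwise turbine distance.
    d = np.linalg.norm(layout[:, None, :] - layout[None, :, :], axis=-1)
    return d[np.triu_indices(N_TURBINES, k=1)].min()


def mutate(layout, scale=50.0):
    child = layout + rng.normal(0, scale, size=layout.shape)
    return np.clip(child, 0, FIELD)


population = [random_layout() for _ in range(POP)]
for _ in range(GENERATIONS):
    scored = sorted(population, key=fitness, reverse=True)
    parents = scored[: POP // 2]              # keep the fitter half
    children = [mutate(p) for p in parents]   # mutate parents to refill the pool
    population = parents + children

best = max(population, key=fitness)
```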
Education
08/2021 – 05/2023
M.S. in Computer Science
The University of Texas at Dallas, Richardson, Texas, United States
GPA: 3.706
Activities:
- Member of the Fintech UTD group
I was part of the Comet Visor Project, where I created a script to run the efficient frontier model on stocks of a particular sector, which fed the recommendation engine. Also developed a script to gather headlines from various news sources and aggregate their sentiment scores, which generated one of the recommendation engine's many indicators.
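The Comet Visor script itself is not shown here; the snippet below is a minimal sketch of the efficient-frontier idea via random portfolio sampling. The price file, the 252-day annualization, and the Sharpe-style ranking (risk-free rate omitted) are assumptions.

```python
# Illustrative sketch of an efficient-frontier calculation via random portfolios.
# The price file is hypothetical; the actual Comet Visor script is not shown.
import numpy as np
import pandas as pd

prices = pd.read_csv("sector_prices.csv", index_col=0)     # hypothetical daily closes
returns = prices.pct_change().dropna()
mean_ret, cov = returns.mean() * 252, returns.cov() * 252  # annualized estimates

results = []
for _ in range(10_000):
    w = np.random.dirichlet(np.ones(len(mean_ret)))        # random long-only weights
    port_ret = float(w @ mean_ret)
    port_vol = float(np.sqrt(w @ cov.values @ w))
    results.append((port_vol, port_ret, w))

# Efficient portfolios offer the best return for a given volatility; a simple
# proxy here is to rank by return/volatility (risk-free rate omitted).
best_vol, best_ret, best_w = max(results, key=lambda r: r[1] / r[0])
```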
08/2013 – 06/2017
B.Tech. in Information Technology
Anna University, Chennai, Tamil Nadu, India
Thesis: Gravitational and Cuckoo Search – Hybrid algorithm for Task Scheduling Optimization
Proposed a hybrid algorithm that combines the advantages of the Gravitational Search and Cuckoo Search algorithms to improve job scheduling efficiency in a cloud computing environment. By framing job scheduling as an optimization problem, the algorithm aims to explore the search space and perform local search effectively, leading to an optimal solution (a heavily simplified illustrative sketch appears at the end of this section).
Advisor: Dr. M Anbu
GPA: 8.47
Honors and Awards:
- Ranked 40th out of 5,587 candidates who graduated with a Bachelor of Technology degree in Information Technology.
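The thesis implementation is not included in this resume; the sketch below only gestures at the idea behind it: a population of candidate task-to-VM assignments, a gravitational-style pull toward the best candidate, and cuckoo-style heavy-tailed jumps. Task lengths, VM speeds, and the makespan objective are illustrative assumptions.

```python
# Highly simplified sketch of a hybrid Cuckoo/Gravitational-style search for
# task scheduling: tasks are assigned to VMs via a random-key encoding and the
# objective is makespan. This is an illustration, not the thesis implementation.
import numpy as np

rng = np.random.default_rng(1)
N_TASKS, N_VMS, POP, ITERS = 30, 5, 25, 300
task_len = rng.uniform(10, 100, N_TASKS)      # hypothetical task lengths
vm_speed = rng.uniform(1, 3, N_VMS)           # hypothetical VM speeds


def decode(x):
    # Map a continuous vector to a VM index per task (random-key encoding).
    return (np.floor(np.abs(x)) % N_VMS).astype(int)


def makespan(x):
    assign = decode(x)
    finish = np.zeros(N_VMS)
    for t, vm in enumerate(assign):
        finish[vm] += task_len[t] / vm_speed[vm]
    return finish.max()


pop = rng.uniform(0, N_VMS, size=(POP, N_TASKS))
for _ in range(ITERS):
    costs = np.array([makespan(x) for x in pop])
    best = pop[costs.argmin()]
    # "Gravitational" step: pull every candidate slightly toward the best one.
    pop = pop + 0.1 * (best - pop)
    # "Cuckoo" step: try heavy-tailed random jumps for a few candidates.
    for i in rng.choice(POP, size=POP // 5, replace=False):
        trial = pop[i] + rng.standard_cauchy(N_TASKS)
        if makespan(trial) < makespan(pop[i]):
            pop[i] = trial

best_schedule = decode(min(pop, key=makespan))
```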