Other

Google Image Matching Challenge: 3D Scene Reconstruction

I reconstructed accurate 3D maps by implementing three local feature-matching methods: LoFTR, DISK, and KeyNetAffNetHardNet. The competition's goal was to reconstruct 3D scenes from multiple views in unstructured image collections found on the internet, with applications in photography, cultural heritage preservation, and various Google services. The project deepened my expertise in Structure-from-Motion techniques and 3D modeling.
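At the core of all three methods is matching local feature descriptors between image pairs before triangulation. A minimal sketch of one common matching rule, mutual nearest neighbors with cross-check (a simplified stand-in for what matchers like DISK's descriptor stage produce; the function name and NumPy-only setup are my illustration, not the competition code):

```python
import numpy as np

def mutual_nn_matches(desc_a: np.ndarray, desc_b: np.ndarray) -> list[tuple[int, int]]:
    """Return index pairs (i, j) where desc_a[i] and desc_b[j] are
    mutual nearest neighbours under Euclidean distance."""
    # Pairwise squared distances between the two descriptor sets.
    d2 = ((desc_a[:, None, :] - desc_b[None, :, :]) ** 2).sum(-1)
    nn_ab = d2.argmin(axis=1)   # best match in B for each descriptor in A
    nn_ba = d2.argmin(axis=0)   # best match in A for each descriptor in B
    # Keep only pairs that agree in both directions (cross-check).
    return [(i, j) for i, j in enumerate(nn_ab) if nn_ba[j] == i]
```

The surviving matches are then typically filtered with RANSAC and fed to an SfM pipeline such as COLMAP for camera pose estimation and triangulation.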

FitGen - Personalized Exercise Planner

FitGen uses genetic algorithms and video-based pose analysis to generate custom exercise plans from an individual's body weight, height, exercise goals, and heart rate. By refining workout routines with real-time feedback on form, it helps users reach their fitness goals efficiently while minimizing the risk of injury.
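The genetic-algorithm part can be sketched as evolving a weekly plan toward a user-specific target. This is a toy illustration of the technique, not FitGen's actual fitness function: the calorie target, burn rate, and GA parameters below are all assumed numbers.

```python
import random

# Hypothetical objective: how close a 7-day plan's estimated calorie
# burn comes to a user-specific target (all numbers illustrative).
TARGET_CALORIES = 2500
CAL_PER_MIN = 8  # rough burn rate for a given body weight

def fitness(plan):
    burned = sum(plan) * CAL_PER_MIN
    return -abs(TARGET_CALORIES - burned)  # higher is better

def evolve(pop_size=30, generations=100, seed=0):
    rng = random.Random(seed)
    # Each individual: minutes of exercise for each of 7 days (0-60).
    pop = [[rng.randint(0, 60) for _ in range(7)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randint(1, 6)             # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:              # mutation: re-roll one day
                child[rng.randrange(7)] = rng.randint(0, 60)
            children.append(child)
        pop = parents + children                # elitism: parents survive
    return max(pop, key=fitness)
```

A real system would replace the calorie proxy with a multi-objective fitness combining goals, heart-rate zones, and pose-derived form scores.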

Stock Trading using RL

This project utilizes Reinforcement Learning (RL) techniques to develop a stock trading system. By training RL agents to make informed trading decisions based on historical data, it aims to optimize portfolio management and enhance returns in the stock market, offering a data-driven approach to trading strategies.
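A minimal sketch of the core idea, tabular Q-learning on a toy price series (the two-state momentum encoding and reward scheme here are illustrative assumptions, not the project's actual environment, which would use richer state features and a deep RL agent):

```python
import numpy as np

# Toy tabular Q-learning trader.
# State: direction of the last price move (0 = down, 1 = up).
# Action: 0 = stay out of the market, 1 = hold one share.
def train_q(prices, episodes=200, alpha=0.1, gamma=0.95, eps=0.1, seed=0):
    rng = np.random.default_rng(seed)
    q = np.zeros((2, 2))  # Q-values for (state, action)
    for _ in range(episodes):
        for t in range(1, len(prices) - 1):
            state = int(prices[t] > prices[t - 1])
            # Epsilon-greedy action selection.
            action = int(rng.integers(2)) if rng.random() < eps else int(q[state].argmax())
            # Reward: next price change if holding, else zero.
            reward = (prices[t + 1] - prices[t]) if action == 1 else 0.0
            next_state = int(prices[t + 1] > prices[t])
            # Standard Q-learning update.
            q[state, action] += alpha * (reward + gamma * q[next_state].max() - q[state, action])
    return q
```

On a steadily rising series the agent learns that holding after an up-move beats staying out, i.e. `q[1, 1] > q[1, 0]`.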

Covivac Bot

This Telegram bot offers one-click checks of vaccine appointment slot availability and sends instant alerts when slots open up, letting users book vaccination appointments with minimal effort.
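The core of such a bot is filtering an appointment API's response down to bookable sessions. A minimal sketch of that step, assuming a CoWIN-style JSON payload (the field names and alert format below are my illustration, not necessarily the bot's exact API):

```python
# Filter sessions down to those worth alerting about.
def open_slots(sessions: list[dict], max_min_age: int = 18) -> list[str]:
    """Return human-readable alerts for sessions with capacity left
    that accept the user's age group."""
    alerts = []
    for s in sessions:
        if s.get("available_capacity", 0) > 0 and s.get("min_age_limit", 99) <= max_min_age:
            alerts.append(
                f'{s["name"]}: {s["available_capacity"]} doses of '
                f'{s["vaccine"]} on {s["date"]}'
            )
    return alerts
```

The bot would poll the API on a timer and push each alert string to subscribed chats via the Telegram Bot API's `sendMessage` method.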

PyTorch-Powered Machine Learning Models

This project is dedicated to creating a comprehensive repository of machine learning models implemented using PyTorch. It provides a valuable resource for both beginners and experts, offering well-structured, PyTorch-based implementations of various machine learning algorithms and neural networks. Users can explore, experiment, and build upon these models for a wide range of applications, from computer vision to natural language processing.
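A sketch of the kind of implementation such a repository contains, here a minimal fully connected classifier (the class name and layer sizes are illustrative, not a specific model from the repo):

```python
import torch
from torch import nn

class MLPClassifier(nn.Module):
    """Minimal fully connected classifier (illustrative example)."""
    def __init__(self, in_features: int, hidden: int, n_classes: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_features, hidden),
            nn.ReLU(),
            nn.Linear(hidden, n_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Returns unnormalized class logits; pair with CrossEntropyLoss.
        return self.net(x)

model = MLPClassifier(in_features=4, hidden=16, n_classes=3)
logits = model(torch.randn(8, 4))  # batch of 8 samples -> (8, 3) logits
```

Each model in the repository follows this pattern: a self-contained `nn.Module` subclass that can be trained with a standard optimizer and loss from `torch.nn`.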