Research
Social navigation of the humanoid robot Digit (Perception and RL)
As part of my capstone project, I developed an end-to-end navigation pipeline that maps latent features directly to bipedal robot actions. The latent features are derived from collision regions, emotion-related discomfort zones, and human-human interactions. I built an actor-critic reinforcement learning framework that trains with full-body bipedal robot dynamics, mitigating the model discrepancies introduced by reduced-order models (ROMs).
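A minimal, illustrative sketch of the actor-critic structure follows, assuming a 32-dimensional latent feature vector, a 2-dimensional continuous action, and a Gaussian policy in PyTorch; all names and dimensions are placeholders rather than the actual capstone code.

```python
import torch
import torch.nn as nn

class ActorCritic(nn.Module):
    """Minimal actor-critic mapping a latent social-navigation feature
    vector to a continuous action (illustrative dimensions only)."""

    def __init__(self, latent_dim=32, action_dim=2, hidden=128):
        super().__init__()
        self.shared = nn.Sequential(nn.Linear(latent_dim, hidden), nn.ReLU())
        self.mu = nn.Linear(hidden, action_dim)           # action mean
        self.log_std = nn.Parameter(torch.zeros(action_dim))
        self.value = nn.Linear(hidden, 1)                 # state value

    def forward(self, latent):
        h = self.shared(latent)
        dist = torch.distributions.Normal(self.mu(h), self.log_std.exp())
        return dist, self.value(h)

# One policy-gradient step on a dummy batch; real training would use
# rollouts collected with the full-body Digit dynamics.
model = ActorCritic()
opt = torch.optim.Adam(model.parameters(), lr=3e-4)
latent = torch.randn(64, 32)
actions = torch.randn(64, 2)
returns = torch.randn(64, 1)

dist, value = model(latent)
advantage = (returns - value).detach()
policy_loss = -(dist.log_prob(actions).sum(-1, keepdim=True) * advantage).mean()
value_loss = (returns - value).pow(2).mean()
opt.zero_grad()
(policy_loss + 0.5 * value_loss).backward()
opt.step()
```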

System integration of the humanoid robot Digit
As part of the LIDAR Lab, I worked on system integration of the humanoid robot Digit, helping it navigate through environments with humans while avoiding collisions. My work involved using a Vicon motion-capture system to detect humans in the vicinity and a zonotope-based model-predictive controller to plan Digit's trajectory.
Work published in T-ASE (IEEE Transactions on Automation Science and Engineering).

Foundation models to understand failure recovery in robots
As part of the Deep Learning for Robotics coursework with Dr. Danfei Xu, my team and I explored VLMs to create open-world scene-graph representations and used an LLM to identify robot failures given an event summary of the task performed by the robot. We evaluated our methodology on both AI2-THOR simulator data and real-life teleoperation data collected on the Stretch robot.

Turtlebot3 Burger
As part of the Robotics Research course (7785) at Georgia Tech, I developed algorithms using Python 3, ROS 2, and Gazebo to help the Turtlebot3 robot perform tasks such as following an object, avoiding obstacles, navigating to waypoints, path planning with Navstack, and maneuvering an obstacle course.
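As one illustration, here is a minimal rclpy sketch of the obstacle-avoidance behavior, assuming the standard TurtleBot3 `scan` and `cmd_vel` topics; the node name, cone width, distance threshold, and speeds are illustrative choices rather than the course solution.

```python
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import LaserScan
from geometry_msgs.msg import Twist

class ObstacleAvoider(Node):
    """Drive forward; turn in place when the LiDAR sees something close ahead."""

    def __init__(self):
        super().__init__('obstacle_avoider')
        self.pub = self.create_publisher(Twist, 'cmd_vel', 10)
        self.create_subscription(LaserScan, 'scan', self.on_scan, 10)

    def on_scan(self, scan: LaserScan):
        # Narrow cone straight ahead (index 0 is the front on a TurtleBot3).
        front = [r for r in list(scan.ranges[:15]) + list(scan.ranges[-15:]) if r > 0.0]
        cmd = Twist()
        if front and min(front) < 0.5:   # obstacle closer than 0.5 m
            cmd.angular.z = 0.5          # turn away
        else:
            cmd.linear.x = 0.15          # cruise forward
        self.pub.publish(cmd)

def main():
    rclpy.init()
    rclpy.spin(ObstacleAvoider())
    rclpy.shutdown()

if __name__ == '__main__':
    main()
```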

Biomechanics and machine learning
I was selected as a MITACS Globalink Research Scholar and had the privilege of conducting research at the Human Performance Lab in New Brunswick, Canada. Under the supervision of Dr. Victoria Chester, I researched and developed various ML and DL models, including CatBoost, SVMs with RBF kernels, KNN, LSTMs, and CNNs, for the binary classification problem of separating the Autism and Control groups using multi-segment foot data.
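A minimal scikit-learn sketch of one of these classifiers (an RBF-kernel SVM with feature scaling and cross-validation); the feature matrix below is random placeholder data standing in for the lab's multi-segment foot features.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Placeholder gait features: rows = trials, columns = multi-segment foot
# kinematics features (the real study used Human Performance Lab data).
X = np.random.randn(120, 24)
y = np.random.randint(0, 2, size=120)   # 0 = Control, 1 = Autism

clf = make_pipeline(StandardScaler(), SVC(kernel='rbf', C=1.0, gamma='scale'))
scores = cross_val_score(clf, X, y, cv=5)
print("5-fold accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))
```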

Drone simulation and path planning
I worked under the supervision of Dr. Deepak Mishra at the Indian Institute of Space Science and Technology, where I modeled a quadcopter UAV in Gazebo integrated with a Hokuyo LiDAR and an ArduPilot controller. I created a custom environment with obstacles and used the RRT* algorithm to navigate the drone from start to goal.
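A compact 2-D RRT* sketch with circular obstacles, to illustrate the planner; the actual project planned in the Gazebo world with LiDAR-derived obstacles, and the step size, search radius, and obstacle list below are made-up values.

```python
import math, random

# Step size, rewiring radius, iteration budget, and obstacles are illustrative.
STEP, RADIUS, ITERS = 1.0, 2.5, 2000
OBSTACLES = [(5.0, 5.0, 1.5), (8.0, 3.0, 1.0)]   # (x, y, radius) circles

def collision_free(p, q):
    # Sample along the segment p -> q and reject points inside any obstacle.
    for t in (i / 10.0 for i in range(11)):
        x, y = p[0] + t * (q[0] - p[0]), p[1] + t * (q[1] - p[1])
        if any(math.hypot(x - ox, y - oy) <= r for ox, oy, r in OBSTACLES):
            return False
    return True

def rrt_star(start, goal, xmax=10.0, ymax=10.0):
    nodes, parent, cost = [start], {0: None}, {0: 0.0}
    for _ in range(ITERS):
        sample = goal if random.random() < 0.1 else (random.uniform(0, xmax),
                                                     random.uniform(0, ymax))
        near_i = min(range(len(nodes)), key=lambda i: math.dist(nodes[i], sample))
        d = math.dist(nodes[near_i], sample)
        if d == 0.0:
            continue
        s = min(STEP, d) / d
        new = (nodes[near_i][0] + s * (sample[0] - nodes[near_i][0]),
               nodes[near_i][1] + s * (sample[1] - nodes[near_i][1]))
        if not collision_free(nodes[near_i], new):
            continue
        # Choose the cheapest collision-free parent among nearby nodes.
        near = [i for i in range(len(nodes))
                if math.dist(nodes[i], new) < RADIUS and collision_free(nodes[i], new)]
        best = min(near, key=lambda i: cost[i] + math.dist(nodes[i], new))
        nodes.append(new)
        k = len(nodes) - 1
        parent[k], cost[k] = best, cost[best] + math.dist(nodes[best], new)
        # Rewire neighbours through the new node when that lowers their cost
        # (descendant costs are not propagated in this sketch).
        for i in near:
            c = cost[k] + math.dist(new, nodes[i])
            if c < cost[i]:
                parent[i], cost[i] = k, c
        if math.dist(new, goal) < STEP and collision_free(new, goal):
            return nodes, parent, k
    return nodes, parent, None

nodes, parent, goal_i = rrt_star((0.0, 0.0), (9.0, 9.0))
print("goal reached" if goal_i is not None else "no path within budget")
```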

Myoelectric pattern recognition
I worked under the guidance of Prof. P. A. Karthick, in collaboration with IIT Madras, researching and developing convolutional neural networks (CNNs) for classifying hand sEMG signals for use in prostheses for amputees. I used the NinaPro database, applied signal-processing techniques in MATLAB, and fed the processed signals into the networks, achieving 70% accuracy on a 10-class problem covering hand movements such as thumb flexion and extension.
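A small PyTorch sketch of a 1-D CNN of the kind used for windowed sEMG classification; the channel count, window length, and layer sizes are assumptions, not the exact NinaPro configuration from the project.

```python
import torch
import torch.nn as nn

class EMGNet(nn.Module):
    """Illustrative 1-D CNN: 10 sEMG channels, 200-sample windows, 10 classes."""

    def __init__(self, channels=10, classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(channels, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.classifier = nn.Linear(64, classes)

    def forward(self, x):                 # x: (batch, channels, time)
        return self.classifier(self.features(x).squeeze(-1))

model = EMGNet()
dummy = torch.randn(8, 10, 200)           # a batch of sEMG windows
print(model(dummy).shape)                  # -> torch.Size([8, 10])
```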

KUKA robot vision
I worked under the guidance of Prof. Dr. S. K. Pal. I built colour-detection software using OpenCV, deployed on a 6-DOF robotic arm at the Centre of Excellence, IIT Kharagpur. I also worked with TCUP Edge (an IoT platform) to interface with the sensors of the FSW robot and built machine learning models to control force and torque parameters.
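A minimal OpenCV sketch of HSV colour detection of the kind described above; the HSV range and camera index are illustrative, and the deployed system's camera calibration and arm-frame transforms are omitted.

```python
import cv2
import numpy as np

def find_colour(frame_bgr, lower_hsv, upper_hsv):
    """Return the pixel centre of the largest blob inside the HSV range."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(lower_hsv), np.array(upper_hsv))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
    return (x + w // 2, y + h // 2)

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
if ok:
    centre = find_colour(frame, (100, 120, 70), (130, 255, 255))   # blue-ish range
    print("target centre:", centre)
cap.release()
```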

Personal Projects
Reformulating the dice game Farkle as an RL problem
Built a single-agent Farkle environment integrated with Stable-Baselines3, enabling accessible RL research. Added adjustable rewards and hyperparameters, and a rule-based heuristic for baseline comparisons. Tested A2C and PPO, highlighting the challenges of stochastic, long-horizon settings.
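A heavily simplified sketch of the integration pattern via the Gymnasium API: here only single 1s and 5s score, and the agent's only choices are to bank or re-roll everything. `MiniFarkleEnv` and its reward values are hypothetical; the actual environment implements the full Farkle rule set.

```python
import numpy as np
import gymnasium as gym
from gymnasium import spaces
from stable_baselines3 import PPO

class MiniFarkleEnv(gym.Env):
    """Simplified single-turn Farkle: action 0 banks, action 1 re-rolls all dice."""

    def __init__(self):
        self.observation_space = spaces.Box(low=0.0, high=1.0, shape=(7,), dtype=np.float32)
        self.action_space = spaces.Discrete(2)

    def _roll(self):
        self.dice = self.np_random.integers(1, 7, size=6)
        return 100 * np.sum(self.dice == 1) + 50 * np.sum(self.dice == 5)

    def _obs(self):
        turn = min(self.turn_score / 1000.0, 1.0)
        return np.concatenate([self.dice / 6.0, [turn]]).astype(np.float32)

    def reset(self, seed=None, options=None):
        super().reset(seed=seed)
        self.turn_score = self._roll()
        return self._obs(), {}

    def step(self, action):
        if action == 0:                       # bank the accumulated points
            return self._obs(), float(self.turn_score), True, False, {}
        gained = self._roll()                 # push your luck and re-roll
        if gained == 0:                       # farkle: lose the whole turn
            self.turn_score = 0
            return self._obs(), 0.0, True, False, {}
        self.turn_score += gained
        return self._obs(), 0.0, False, False, {}

model = PPO("MlpPolicy", MiniFarkleEnv(), verbose=0)
model.learn(total_timesteps=10_000)
```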

User study on social navigation strategies for robots
Analyzed user preferences for robot trajectories around humans displaying four different emotions, namely Happy, Angry, Neutral, and Sad. Teleoperated the humanoid robot Digit and the Unitree Go2 quadruped to follow paths at differing distances around a human displaying a particular emotion, and recorded videos of the process. Created a questionnaire and analyzed responses based on Godspeed metrics covering comfort, desired changes in the robot's path, and overall preferences.

Behavior cloning for table-top manipulation
Used the Robomimic dataset, which contains robot states and actions, to model the underlying action distribution. Trained an MLP model and an RNN model to predict robot actions in a supervised setting.
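A minimal PyTorch sketch of the MLP variant: supervised regression from states to actions with an MSE loss. The state/action dimensions and the randomly generated "demonstrations" are placeholders, not the Robomimic data.

```python
import torch
import torch.nn as nn

# Placeholder demonstrations standing in for Robomimic state-action pairs.
states = torch.randn(1024, 19)     # demonstration states
actions = torch.randn(1024, 7)     # corresponding demonstrated actions

policy = nn.Sequential(nn.Linear(19, 256), nn.ReLU(),
                       nn.Linear(256, 256), nn.ReLU(),
                       nn.Linear(256, 7))
opt = torch.optim.Adam(policy.parameters(), lr=1e-3)

for epoch in range(50):
    pred = policy(states)
    loss = nn.functional.mse_loss(pred, actions)   # behavior-cloning objective
    opt.zero_grad()
    loss.backward()
    opt.step()
print("final BC loss:", loss.item())
```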

Learning dynamics model and planning for robot manipulation
Used the PyBullet simulator to model a pusher in a table-top environment; the task is to make a Franka Emika Panda arm push a red disk into a designated green circle on the table. Trained an MLP model to predict the future position of the pusher given its current position and trajectory, and implemented a random-shooting algorithm to plan trajectories for the pusher.
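A minimal sketch of random-shooting planning: sample candidate action sequences, roll each through the learned dynamics model, and execute the first action of the cheapest sequence. The toy `dynamics` function stands in for the trained MLP, and the state dimensions and cost are illustrative.

```python
import numpy as np

def dynamics(state, action):
    """Placeholder for the learned dynamics model (trained MLP in the project)."""
    return state + 0.05 * action

def random_shooting(state, goal, horizon=10, n_samples=500):
    best_cost, best_plan = np.inf, None
    for _ in range(n_samples):
        plan = np.random.uniform(-1.0, 1.0, size=(horizon, 2))
        s = state.copy()
        for a in plan:                     # roll out the candidate sequence
            s = dynamics(s, a)
        cost = np.linalg.norm(s - goal)    # distance of final state to target
        if cost < best_cost:
            best_cost, best_plan = cost, plan
    return best_plan[0]                    # MPC-style: execute the first action only

action = random_shooting(np.array([0.2, 0.1]), np.array([0.6, 0.4]))
print("next pusher action:", action)
```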

Line Follower robot
We built a line-follower robot from scratch for the Following '2020 competition conducted by RMI, the Robotics and Machine Intelligence club of NIT Trichy.

Profanity filter
We collected tweets from Twitter and posts from Reddit, including examples with and without obscene language, and trained an SVM classifier to predict whether a new sentence is obscene. The goal was to help protect children from harmful content on the internet.
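A compact scikit-learn sketch of this pipeline (TF-IDF features into a linear SVM); the four-sentence corpus is a placeholder for the scraped Twitter/Reddit data.

```python
from sklearn.pipeline import make_pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC

# Tiny illustrative corpus; the real dataset was scraped tweets and Reddit posts.
texts = ["have a wonderful day", "you are a lovely person",
         "this is offensive garbage", "what a vile insult"]
labels = [0, 0, 1, 1]                      # 1 = obscene/offensive

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())
clf.fit(texts, labels)
print(clf.predict(["what a lovely insult"]))
```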

Sentiment Analysis
We analysed the 2020 farmer protests in Haryana, India, by collecting tweets and examining their nature. NLP pre-processing and vectorization techniques were used to label the tweets as positive, negative, or neutral, giving an overall picture of public sentiment. Results were depicted as bar graphs and charts for visualization.

Plagiarism checker
In this project, a plagiarism detector is built using various NLP techniques. A machine learning model takes 'n' documents as input and outputs a plagiarism percentage for every pair, calculated using cosine similarity. The entire project was developed in a Jupyter environment.
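A minimal sketch of the pairwise cosine-similarity comparison using scikit-learn's TF-IDF vectorizer; the three inline documents are placeholders for the user-supplied files.

```python
from itertools import combinations
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Illustrative documents; the project read n user-supplied files instead.
docs = {"doc1.txt": "the quick brown fox jumps over the lazy dog",
        "doc2.txt": "a quick brown fox jumped over a lazy dog",
        "doc3.txt": "completely unrelated text about robot perception"}

tfidf = TfidfVectorizer().fit_transform(docs.values())
sims = cosine_similarity(tfidf)            # pairwise similarity matrix

names = list(docs)
for i, j in combinations(range(len(names)), 2):
    print(f"{names[i]} vs {names[j]}: {sims[i, j] * 100:.1f}% similar")
```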

About Me
I'm a second-year Master's student in Robotics at Georgia Tech, expected to graduate in May 2025, with a 3.9 GPA (as of Fall 2024). I graduated from the National Institute of Technology, Tiruchirappalli, India, with a Bachelor's degree in Mechanical Engineering and a minor in Computer Science (GPA: 8.93). I have a keen interest in robot perception, computer vision, robot locomotion, and AI applications in robotics, and I am working on related research projects with faculty at Georgia Tech.
My Resume
Get in Touch
I'm currently looking for full-time roles in robot perception/locomotion, AI, and ML starting May 2025, based in the USA. In my free time I play badminton and chess, and I'm a huge fan of the esports world and follow it intently. Please feel free to drop me a message if my work resonates with you or if you share similar interests and would like to discuss!
abirath.raju@gmail.com