Abhijit Chunduru

Master’s Student

University of Massachusetts Amherst

Hi, I’m Abhijit!

I am a first-year Master’s student in Computer Science at UMass Amherst, with a focus on ML systems and trustworthy AI. I have hands-on experience building scalable federated learning systems, including Flotilla, a modular and resilient federated learning framework developed during my time at the Indian Institute of Science, and FedProj, an algorithm addressing catastrophic forgetting under non-IID data distributions. Working on these systems made me deeply interested in privacy-preserving deployment, heterogeneous infrastructure, and making federated systems production-ready using Trusted Execution Environments and Confidential Containers.

I am currently learning CUDA and large-scale distributed training across multi-GPU setups. My broader interests span Generative Modeling, Reinforcement Learning, Robotics, and AI Alignment.

Interests

  • Federated Learning and Secure ML
  • Diffusion Models and World Models
  • RL, Robotics, and AI Alignment

Skills

Proficient

  • Python: Primary language across research projects, federated systems, and ML experiments
  • PyTorch: Built CNNs, Transformers, diffusion pipelines, and custom dataloaders for FL frameworks
  • Kubernetes: Deployed Confidential Containers with runtime attestation across multi-cloud infrastructure
  • Docker: Used heavily across research deployments and personal projects

Intermediate

  • AMD SEV-SNP and SGX: Architected TEE-based federated learning deployments with cryptographic remote attestation
  • AWS: Scaled Flotilla to 1,024 clients and orchestrated large-scale FL experiments
  • Hugging Face: Fine-tuned BERT, RoBERTa, and clinical language models with LoRA and prompt-based methods
  • C++ and Bash: Systems scripting and automation for heterogeneous cluster deployments

Familiar

  • ROS: Explored through graduate robotics coursework at UMass Amherst
  • SQL: Used for data management and experiment logging across research projects
  • scikit-learn and Pandas: Data preprocessing, baselines, and statistical analysis of federated learning results
  • RLHF and ReAct: Applied in LLM reasoning and agentic framework experiments

Education

Master of Science in Computer Science
University of Massachusetts Amherst

Graduate student focusing on Federated Learning, Generative Modeling, and Trustworthy AI.

Relevant Courses:

  • Robotics
  • Advanced Generative AI
  • Advanced NLP
  • Advanced ML
  • Distributed Systems

Bachelor of Technology in Computer Science

Undergraduate degree in Computer Science with focus on machine learning, distributed systems, and software engineering.

Experience

Associate Software Engineer - Security Research Team

  • Architected confidential computing solutions using Multi-Party Computation, Differential Privacy, and AMD SEV-SNP TEEs with cryptographic remote attestation for healthcare federated learning deployments.
  • Deployed Confidential Containers (Peer Pods) on Kubernetes with runtime attestation, enabling privacy-preserving ML across multi-cloud infrastructure while reducing setup time by 60% and attack surface by 85%.

Federated Learning Research Intern
Indian Institute of Science

  • Co-developed Flotilla, an in-house FL framework surpassing Flower, FedML, and OpenFL in scalability and reliability.
  • Executed 90+ experiments on heterogeneous clusters (46 Raspberry Pis, 12 Nvidia Jetsons) benchmarking IID and non-IID performance.
  • Scaled Flotilla to 1,024 AWS clients, running 55% faster than Flower by eliminating communication bottlenecks.
  • Designed CPU and memory visualizations contributing to a publication in Elsevier JPDC.

NLP Research Intern

  • Engineered a memory module for ClinicalBERT, DistilBERT, RoBERTa, and Clinical-Longformer sentiment analysis, achieving 90% accuracy.
  • Transitioned centralized models to federated learning, improving accuracy by 2% and reducing training time by 30% with LoRA, P-Tuning, and Prompt Tuning.