Research

Papers, preprints, and ongoing projects. The through-line is sparse recovery applied wherever it fits — optimizers, attention, KV cache, interpretability, privacy.

publications & preprints
2024
SPriFed-OMP: A Differentially Private Federated Learning Algorithm for Sparse Recovery

Ajinkya Kiran Mulay et al.

TMLR 2024 · arXiv 2402.19016

Federated learning via orthogonal matching pursuit (OMP) under differential-privacy constraints. Reaches near-SOTA accuracy while selecting only 8–10% of the dense parameter count, with formal convergence guarantees.
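The DP and federated machinery are the paper's contribution; the sparse-recovery core is standard orthogonal matching pursuit. A minimal centralized OMP sketch for context (dimensions and the sensing matrix below are illustrative, not from the paper):

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal Matching Pursuit: recover a k-sparse x with y ~= A @ x.

    Greedily picks the column of A most correlated with the residual,
    then re-fits all selected coefficients by least squares.
    """
    m, n = A.shape
    support = []
    residual = y.copy()
    x = np.zeros(n)
    for _ in range(k):
        # Column most correlated with the current residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # Re-fit coefficients jointly on the selected support.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        x = np.zeros(n)
        x[support] = coef
        residual = y - A @ x
    return x

# Usage: recover a 3-sparse signal from 40 random measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[[5, 17, 60]] = [1.5, -2.0, 0.7]
x_hat = omp(A, A @ x_true, k=3)
```

The joint least-squares re-fit at each step is what distinguishes OMP from plain matching pursuit, and it is the step that dominates per-iteration cost.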

2022
LOCKS: User Differentially Private and Federated Optimal Client Sampling

arXiv 2212.13071

Optimal client sampling strategy for differentially private federated networks.

2022
Private Hypothesis Testing for Social Sciences

ICML Workshop on Theory of Differential Privacy 2022

active projects
ongoing
Polar Decomposition Optimizers for Transformer Training

Halley Deg-5 T=3 optimizer outperforms GramMuon T=5 on WikiText-103 and enwik8. Running experiments on H100 and RTX PRO 6000 Blackwell GPUs. Targeting an ICML-tier venue.
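The Deg-5 T=3 scheme and the GramMuon baseline are this project's own; both belong to the family of polynomial/rational iterations that push a gradient matrix toward its orthogonal polar factor before applying the update. As a reference point, here is the classical cubic Halley iteration for the polar factor (the project's degree-5 variant is not reproduced here):

```python
import numpy as np

def polar_factor_halley(A, iters=12):
    """Orthogonal polar factor U of A (A = U @ H) via the cubic Halley
    iteration  X <- X (3I + X^T X)(I + 3 X^T X)^{-1}.

    Each singular value s of X is driven toward 1 by the scalar map
    g(s) = s (3 + s^2) / (1 + 3 s^2), so the iteration keeps A's
    singular vectors while orthogonalizing the matrix.
    """
    n = A.shape[1]
    I = np.eye(n)
    X = A / np.linalg.norm(A)  # scale so all singular values are <= 1
    for _ in range(iters):
        G = X.T @ X
        X = X @ (3 * I + G) @ np.linalg.inv(I + 3 * G)
    return X

# Usage: orthogonalize a random tall matrix.
rng = np.random.default_rng(1)
A = rng.standard_normal((6, 4))
U = polar_factor_halley(A)
```

Muon-style optimizers use such iterations because they need only matrix multiplies (plus, for Halley, a small solve), which map well to GPU tensor cores; a pure-polynomial degree-5 update avoids even the inverse.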

ongoing
Sparse Pursuit Transformers: A Unified Framework

OMP and FoBa applied uniformly across FFN activations, attention mechanisms, and optimizer orthogonalization. FoBa-Gate as a SwiGLU replacement (p = 0.012, Cohen's d = 2.16). OMP-attention with a ridge-regularized backward pass (3× lower validation perplexity than softmax attention).
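FoBa-Gate and OMP-attention are project-specific; the shared FoBa primitive is Zhang's adaptive forward-backward greedy algorithm. A minimal least-squares FoBa sketch (the thresholds `eps` and `nu` and the test problem are illustrative, not the project's settings):

```python
import numpy as np

def foba(A, y, eps=1e-4, nu=0.5, max_feats=10):
    """Forward-backward greedy selection (FoBa) for y ~= A @ x.

    Forward: add the feature giving the largest drop in squared error.
    Backward: drop any feature whose removal costs less than nu times
    the most recent forward gain, undoing premature selections.
    """
    n = A.shape[1]

    def fit_err(S):
        # Least-squares fit on support S; returns (coefficients, squared error).
        if not S:
            return np.zeros(0), float(y @ y)
        coef, *_ = np.linalg.lstsq(A[:, S], y, rcond=None)
        r = y - A[:, S] @ coef
        return coef, float(r @ r)

    S = []
    _, err = fit_err(S)
    while len(S) < max_feats:
        # Forward step: best single feature to add.
        best_j, best_err = None, err
        for j in range(n):
            if j in S:
                continue
            _, e = fit_err(S + [j])
            if e < best_err:
                best_j, best_err = j, e
        if best_j is None or err - best_err < eps:
            break
        S.append(best_j)
        gain, err = err - best_err, best_err
        # Backward steps: prune features that barely matter.
        while len(S) > 1:
            drops = [(fit_err([t for t in S if t != j])[1], j) for j in S]
            e_min, j_min = min(drops)
            if e_min - err >= nu * gain:
                break
            S.remove(j_min)
            err = e_min
    coef, _ = fit_err(S)
    x = np.zeros(n)
    x[S] = coef
    return x

# Usage: noiseless 3-sparse recovery.
rng = np.random.default_rng(2)
A = rng.standard_normal((60, 30))
x_true = np.zeros(30)
x_true[[2, 11, 25]] = [1.0, -1.5, 2.0]
x_hat = foba(A, A @ x_true)
```

The backward pass is what separates FoBa from plain forward selection: a greedy early pick that later becomes redundant is deleted rather than kept, which is the property a gating module would exploit.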

ongoing
PowerClouds / SuperPower

Statistical power analysis via metric learning.