Dipankar Sarkar

Independent Researcher

About Me

Computer scientist and entrepreneur specializing in machine learning and decentralized systems. My research focuses on federated learning, privacy-preserving AI, and blockchain infrastructure, with particular emphasis on making advanced ML techniques practical and accessible.

I’ve pioneered techniques for handling data imbalance in federated learning (Fed-Focal Loss) and optimizing communication efficiency (CatFedAvg), contributing to privacy-preserving distributed machine learning. My recent work extends into Web3, developing protocols for decentralized physical infrastructure networks (DePIN), MEV mitigation in Ethereum, and multi-rollup composability.

With 25+ patents and multiple peer-reviewed publications, I bridge theoretical advances with real-world implementations. As the author of an Nginx implementation cookbook, I've contributed to widely deployed infrastructure. Across roles at companies like Hike and Nykaa, and now as CTO of Cryptuon, I've architected systems serving hundreds of millions of users while advancing the state of the art in ML and blockchain technology.

My work sits at the intersection of privacy, scalability, and fairness—building AI and Web3 systems that are simultaneously more capable, more equitable, and more respectful of individual rights.

Download CV
Interests
  • Machine Learning
  • Federated Learning
  • Privacy-Preserving AI
  • Deep Learning
  • Blockchain Technology
  • Decentralized Systems
  • DePIN (Decentralized Physical Infrastructure)
  • Cryptography
Education
  • MSc Computer Science, Arizona State University
  • Graduate Certificate, Strategic Studies, The Takshashila Institute
  • BTech Computer Science, Indian Institute of Technology

📚 My Research

My research lies at the intersection of machine learning, distributed systems, and decentralized technologies, with a focus on developing practical solutions that bridge theoretical advances with real-world applications.

Machine Learning & Privacy-Preserving AI

In federated learning, I’ve pioneered techniques for handling imbalanced data classification (Fed-Focal Loss) and optimizing communication efficiency (CatFedAvg), making distributed machine learning more practical for real-world deployments. My work addresses fundamental challenges in privacy-preserving AI: how to train powerful models across distributed data without compromising individual privacy or model performance. Current research explores federated learning for Web3 applications, deepfake detection networks, and copyright-compliant generative AI systems.
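For readers unfamiliar with the underlying idea, here is a minimal sketch of the general pattern this line of work builds on: a standard focal loss (which down-weights easy examples so minority classes are not drowned out) used as the client objective inside a FedAvg-style aggregation round. The function names, hyperparameters, and the unweighted averaging step are illustrative assumptions of this sketch, not the published Fed-Focal Loss or CatFedAvg algorithms, which refine this basic loop.

```python
# Illustrative sketch only: standard focal loss as the client objective in a
# FedAvg-style round. Names and hyperparameters are assumptions, not the
# published Fed-Focal Loss / CatFedAvg algorithms.
import copy

import torch
import torch.nn.functional as F


def focal_loss(logits, targets, gamma=2.0, alpha=0.25):
    """Focal loss (Lin et al., 2017): scales cross-entropy by (1 - p_t)^gamma,
    so well-classified (easy) examples contribute less and rare classes more."""
    ce = F.cross_entropy(logits, targets, reduction="none")
    p_t = torch.exp(-ce)  # model's probability for the true class
    return (alpha * (1.0 - p_t) ** gamma * ce).mean()


def federated_round(global_model, client_loaders, lr=0.01, local_steps=1):
    """One FedAvg-style round: each client trains locally on its own
    (possibly imbalanced) data with the focal loss, then the server takes an
    unweighted average of the resulting client weights."""
    client_states = []
    for loader in client_loaders:
        local = copy.deepcopy(global_model)
        opt = torch.optim.SGD(local.parameters(), lr=lr)
        for _ in range(local_steps):
            for x, y in loader:
                opt.zero_grad()
                loss = focal_loss(local(x), y)
                loss.backward()
                opt.step()
        client_states.append(local.state_dict())
    # Server aggregation: simple element-wise mean of the client weights.
    averaged = {
        key: torch.stack([state[key].float() for state in client_states]).mean(dim=0)
        for key in client_states[0]
    }
    global_model.load_state_dict(averaged)
    return global_model
```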

Blockchain & Decentralized Systems

My Web3 research focuses on infrastructure-level innovations: developing protocols for Decentralized Physical Infrastructure Networks (DePIN), designing mechanisms for fair value distribution and MEV mitigation in Ethereum, and solving atomic composability challenges in multi-rollup environments. Through projects like Tesseract, FairFlow, and generalized DePIN frameworks, I work to make decentralized systems more scalable, fair, and practical.

With 25+ patents and publications in leading venues, I actively collaborate on projects that advance both ML and blockchain technology while maintaining a focus on real-world applicability and ethical deployment.

Featured Publications
Patents
Recent & Upcoming Talks
Recent Posts

Tackling Data Imbalance in Federated Learning

How Fed-Focal Loss addresses one of the most challenging problems in distributed machine learning: handling imbalanced data across federated clients.