Research
My research interests lie at the intersection of networked systems and data-intensive computing.
Auxo: Heterogeneity-Mitigating Federated Learning via Scalable Client Clustering
Jiachen Liu,
Fan Lai,
Yinwei Dai,
Aditya Akella,
Harsha Madhyastha,
Mosharaf Chowdhury
SoCC, 2023
/ Github
/ Paper
We propose Auxo, a scalable FL system that enables the server to decompose a large-scale FL task into cohorts with smaller intra-cohort heterogeneity.
ModelKeeper: Accelerating DNN Training via Automated Training Warmup
Fan Lai,
Yinwei Dai,
Harsha Madhyastha,
Mosharaf Chowdhury
NSDI, 2023 (Acceptance Rate: 18.38%)
/ Github
/ Paper
/ Talk
We introduce ModelKeeper, a cluster-scale model service framework that accelerates DNN training by reducing the computation needed to achieve the same model performance via automated model transformation.
FedScale: Benchmarking Model and System Performance of Federated Learning
Fan Lai,
Yinwei Dai,
Sanjay Singapuram,
Jiachen Liu,
Xiangfeng Zhu,
Harsha Madhyastha,
Mosharaf Chowdhury
ICML, 2022 (Acceptance Rate: 21.94%)
/ Website
/ Github
Short Version: Best Paper Award @ ResilientFL at SOSP, 2021
/ Paper
/ Talk
We present FedScale, a diverse set of challenging and realistic benchmark datasets to facilitate scalable, comprehensive, and reproducible federated learning (FL) research.
Conference Reviewer: NeurIPS (Datasets and Benchmarks) 2022, 2023
Journal Reviewer: Transactions on Mobile Computing 2022
Artifact Evaluation Committee: SIGCOMM 2022, MLSys 2023
My name in Chinese:
If you would like to chat with me, please send me an email. :)