A DEEP LEARNING LIBRARY FOR X-RISK OPTIMIZATION
An open-source library that translates theory into real-world applications
LATEST NEWS
- [2022-07] LibAUC 1.2.0 is released! Please visit our github for more details!
- [2022-06] We gave a tutorial on Deep AUC Maximization at CVPR 2022!
- [2022-06] 7 papers about optimization for ML/AI accepted to ICML 2022!
- [2022-02] Three papers were accepted by top AI conferences and journals!
- [2021-11] Our Deep AUC Maximization algorithms achieved 1st place in the OGB Graph Property Prediction challenge!
- [2021-09] LibAUC V1.1.6 has been released! It now supports multi-label training!
- [2021-08] LibAUC V1.1.5 has been released! Check out all new features!
- [2021-07] Our paper "Robust Deep AUC Optimization" was accepted by ICCV 2021!
- [2021-07] Dr. Yang was invited to give a talk at Google! Check out the latest slides!
- [2021-06] We have released the code for AUPRC optimization in LibAUC V1.1.3!
- [2021-06] Our library helped win 1st place in the MIT AI Cures open challenge for COVID-19!
- [2021-05] LibAUC V1.1.0 has been released! Check out all new features!
- [2021-05] Our paper "Deep Federated AUC Optimization" was accepted by ICML 2021!
- Follow us to keep up to date with all the latest progress!
KEY FEATURES & CAPABILITIES
Easy Installation
LibAUC is easy to install, and its losses and optimizers can be inserted into an existing training pipeline built with deep learning frameworks such as PyTorch (see the sketch after this feature list).
Broad Applications
Users can train different neural network architectures (e.g., linear models, MLPs, CNNs, GNNs, transformers) suited to their data types.
Efficient Algorithms
Stochastic algorithms with provable convergence guarantees that support learning from millions of data points without requiring a large batch size.
Hands-on Tutorials
Hands-on tutorials are provided for optimizing a variety of measures and objectives belonging to the family of X-risks.
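To make the drop-in pattern mentioned under Easy Installation concrete, here is a minimal sketch of a LibAUC training loop in PyTorch. The imports (libauc.losses.AUCMLoss, libauc.optimizers.PESG) exist in the library, but constructor arguments and device handling have changed across releases, so the argument names shown below (loss_fn, lr, margin) are assumptions; the official tutorials for the version you install are the authoritative reference.

# Minimal sketch, not an official quick-start: replace a standard PyTorch
# loss/optimizer pair with LibAUC's AUC-margin loss and the PESG optimizer.
# Install first:  pip install libauc
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset
from libauc.losses import AUCMLoss        # AUC-margin surrogate loss
from libauc.optimizers import PESG        # stochastic optimizer paired with AUCMLoss

# Toy imbalanced binary data standing in for your own dataset.
X = torch.randn(512, 16)
y = (torch.rand(512) < 0.1).float()       # ~10% positives
loader = DataLoader(TensorDataset(X, y), batch_size=64, shuffle=True)

model = nn.Sequential(nn.Linear(16, 1), nn.Sigmoid())   # scores in [0, 1]
loss_fn = AUCMLoss()
# ASSUMPTION: argument names, and whether PESG takes model.parameters() or the
# model itself, differ between LibAUC releases; check the docs for your version.
optimizer = PESG(model.parameters(), loss_fn=loss_fn, lr=0.1, margin=1.0)

for epoch in range(2):
    for data, targets in loader:
        preds = model(data)
        loss = loss_fn(preds, targets.view(-1, 1))
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

The rest of an existing pipeline (data loading, model definition, logging) stays untouched; only the loss/optimizer pair changes when switching to an X-risk objective.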
WHAT IS X-RISK?
Traditional risk functions, such as the cross-entropy loss, are limited in modeling a wide range of problems and tasks, e.g., imbalanced data, ranking problems, and self-supervised learning. X-risk refers to a family of compositional measures/losses in which each data point is compared with a set of data points, explicitly or implicitly, to define a risk function. It covers a family of widely used measures/losses, including but not limited to four interconnected categories: areas under the curves (e.g., AUROC, AUPRC/AP, partial AUC), ranking measures and objectives (e.g., NDCG, listwise losses), performance at the top (e.g., precision/recall at the top positions), and contrastive objectives (e.g., global contrastive losses).
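In schematic form (the notation below is illustrative, following the compositional structure described above, and is not a formal definition from the library), an X-risk compares each anchor point against a comparison set through a pairwise term, aggregates those comparisons, and then applies an outer function:

% Schematic compositional form of an X-risk (illustrative notation only)
\[
  \min_{\mathbf{w}} \; \frac{1}{n} \sum_{i=1}^{n} f_i\big( g(\mathbf{w}; z_i, \mathcal{S}_i) \big),
  \qquad
  g(\mathbf{w}; z_i, \mathcal{S}_i) = \frac{1}{|\mathcal{S}_i|} \sum_{z_j \in \mathcal{S}_i} \ell(\mathbf{w}; z_i, z_j),
\]

where $z_i$ is an anchor data point, $\mathcal{S}_i$ is the set of points it is compared with, $\ell$ is a pairwise comparison term, and $f_i$ is an outer function that turns the aggregated comparisons into the measure of interest. Optimizing such nested averages is what distinguishes X-risks from standard empirical risk minimization.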

QUICK FACTS
The achievements we have made so far.
3+
Challenge-winning solutions (e.g., Stanford CheXpert, MIT AI Cures, OGB Graph Property Prediction).
4+
Collaborations and deployments with multiple industry partners, e.g., Google, Uber, and Tencent.
17+
Scientific publications at top-tier AI conferences (such as ICML, NeurIPS, ICLR).
35000+
Downloaded more than 35,000 times by researchers around the world.
APPLICATIONS
Explore LibAUC through challenging applications.
CITATIONS
If LibAUC is helpful in your work, please cite our papers (BibTeX entries below) and acknowledge our library. For any questions, please reach out to Zhuoning Yuan and Prof. Tianbao Yang.
@inproceedings{yuan2023libauc,
  title     = {LibAUC: A Deep Learning Library for X-risk Optimization},
  author    = {Zhuoning Yuan and Dixian Zhu and Zi-Hao Qiu and Gang Li and Xuanhui Wang and Tianbao Yang},
  booktitle = {29th SIGKDD Conference on Knowledge Discovery and Data Mining},
  year      = {2023}
}
@article{yang2022algorithmic,
  title   = {Algorithmic Foundation of Deep X-risk Optimization},
  author  = {Yang, Tianbao},
  journal = {arXiv preprint arXiv:2206.00439},
  year    = {2022}
}