🏃♀️📖 1 Paper accepted to TMLR 2025! 🏃♀️📖
We are ecstatic to announce that the lab has one paper accepted to TMLR 2025! Congratulations to our student, Beyza Kalkanlı, for her exceptional and groundbreaking work:
- ⭐ Dependency-aware Maximum Likelihood Estimation for Active Learning
Beyza proposed dependency-aware maximum likelihood estimation (DMLE), which corrects MLE within an active learning framework by accounting for sample dependencies that are typically neglected under the independent and identically distributed (i.i.d.) assumption in sequential active learning cycles. She models the selection-induced dependence across cycles, where each selected sample depends on the parameter estimates from previous cycles, and incorporates this dependence into the likelihood maximized in the next cycle. Her approach outperformed conventional MLE across multiple benchmark datasets, reaching higher performance in earlier active learning cycles. DMLE is agnostic to the acquisition function and acts as a general wrapper around any active learning strategy, so be sure to wrap your acquisition function! A minimal sketch of the dependence it addresses is shown below.
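For readers unfamiliar with the setup, here is a minimal NumPy sketch of the sequential dependence that DMLE accounts for. The toy dataset, logistic model, and uncertainty-based acquisition rule are all assumptions chosen for illustration; the loop fits the conventional (i.i.d.) MLE at each cycle, which is exactly the estimator the paper corrects. The DMLE correction itself is detailed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_logistic(X, y, lr=0.1, steps=500):
    """Plain i.i.d. MLE for logistic regression via gradient ascent."""
    theta = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(X @ theta)))
        theta += lr * X.T @ (y - p) / len(y)
    return theta

# Hypothetical toy pool of unlabelled data with a hidden labelling rule.
X_pool = rng.normal(size=(200, 2))
y_pool = (X_pool @ np.array([1.5, -2.0]) + 0.3 * rng.normal(size=200) > 0).astype(float)

labelled = list(rng.choice(200, size=5, replace=False))  # small seed set
theta_hat = fit_logistic(X_pool[labelled], y_pool[labelled])

for cycle in range(10):
    # Selection depends on the previous cycle's estimate theta_hat, which
    # itself depends on the previously selected samples -- this chain of
    # dependence is what breaks the i.i.d. assumption.
    p = 1.0 / (1.0 + np.exp(-(X_pool @ theta_hat)))
    uncertainty = -np.abs(p - 0.5)           # higher = closer to the boundary
    uncertainty[labelled] = -np.inf          # never re-select labelled points
    labelled.append(int(np.argmax(uncertainty)))

    # Conventional MLE refit on the selected samples; DMLE would instead use
    # a likelihood that accounts for how each point came to be selected.
    theta_hat = fit_logistic(X_pool[labelled], y_pool[labelled])

print("estimate after 10 cycles:", theta_hat)
```

In this loop the labelled set is not i.i.d.: the estimate at cycle k is a function of every previously selected sample, and the next selection is a function of that estimate. DMLE folds this chain of dependence into the likelihood rather than ignoring it.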