Theory of Disagreement-Based Active Learning

Balcan, M.-F. and Hanneke, S. (2012). Robust Interactive Learning. In Proceedings of the 25th Annual Conference on Learning Theory (COLT).

[pdf] [ps] [arXiv]

Hanneke, S. (2014). Theory of Active Learning.

[pdf]

[ps] This is an overview of some of the recent advances in active learning theory, with a particular focus on label complexity guarantees for disagreement-based methods. The current version (v1.1) was updated on September 22, 2014. Some recent significant advances in active learning are not yet covered in the survey: [ZC14], [WHE-Y15], [HY15]. An abbreviated version of this survey was published in the Foundations and Trends in Machine Learning series, Volume 7, Issues 2-3, 2014.
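As context for the survey's subject, here is a minimal sketch of the disagreement-based strategy it analyzes, in the style of the CAL algorithm of Cohn, Atlas, and Ladner: maintain the version space of hypotheses consistent with all labels observed so far, and request a label only for points on which surviving hypotheses disagree. The threshold hypothesis class, the finite grid, and all function names below are illustrative assumptions for this sketch, not code from the survey.

import random

# Hypothesis class (illustrative assumption): one-dimensional thresholds
# h_t(x) = 1[x >= t], kept as an explicit finite grid so the version space
# can be stored and filtered directly.
THRESHOLDS = [i / 100 for i in range(101)]

def predict(t, x):
    """Label assigned to point x by the threshold hypothesis at t."""
    return 1 if x >= t else 0

def cal_active_learner(stream, oracle):
    """CAL-style disagreement-based active learning (sketch).

    stream: iterable of unlabeled points in [0, 1].
    oracle: callable returning the true label of a queried point.
    Returns the surviving version space and the number of label queries used.
    """
    version_space = list(THRESHOLDS)
    queries = 0
    for x in stream:
        predictions = {predict(t, x) for t in version_space}
        if len(predictions) > 1:
            # x lies in the region of disagreement: pay for its label
            y = oracle(x)
            queries += 1
            version_space = [t for t in version_space if predict(t, x) == y]
        # otherwise every surviving hypothesis agrees on x,
        # so its label is inferred for free
    return version_space, queries

if __name__ == "__main__":
    random.seed(0)
    target = 0.37  # hypothetical true threshold (realizable case)
    points = [random.random() for _ in range(1000)]
    vs, n = cal_active_learner(points, lambda x: predict(target, x))
    print(f"label queries: {n} of {len(points)}; version space size: {len(vs)}")

For thresholds the region of disagreement collapses quickly, so only a small fraction of the 1,000 points is ever labeled; the survey's label complexity guarantees quantify savings of this kind for general hypothesis classes, chiefly via the disagreement coefficient.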

Bio: Before arriving at TTIC, I was an independent scientist from 2012 to 2018, apart from a short one-term stint as a visiting professor at Princeton University in 2018. Prior to that, from 2009 to 2012, I was a visiting assistant professor in the Department of Statistics at Carnegie Mellon University, also affiliated with the Machine Learning Department. I received my Ph.D. in 2009 from Carnegie Mellon University's Machine Learning Department, where I was advised by Eric Xing and Larry Wasserman. My doctoral thesis was on the theoretical foundations of active learning. From 2002 to 2005, I studied computer science at the University of Illinois at Urbana-Champaign (UIUC), where I worked with Professor Dan Roth and students from the Cognitive Computation Group on semi-supervised learning. Before that, I studied computer science at Webster University in St. Louis, MO, where I experimented a bit with neural networks and classical AI.

Research interests: My general research interest is in systems that can improve their performance with experience, a theme known as machine learning. I focus on the statistical analysis of machine learning. The key questions I would like to answer are: "What can we learn from empirical observation/experiment?" and "How much observation/experiment is necessary and sufficient to learn it?" This overarching theme spans several academic disciplines, including statistical learning theory, artificial intelligence, statistical inference, algorithmic and statistical information theory, probability theory, philosophy of science, and epistemology.

Hanneke, S., Kanade, V., and Yang, L. (2015). Learning with a Drifting Target Concept. In Proceedings of the 26th International Conference on Algorithmic Learning Theory (ALT).

[pdf]

[ps] [arXiv] See also this note on a sample complexity result for efficient agnostic learning that is implicit in the drifting target concept paper above:

[pdf]

Yang, L., Hanneke, S., and Carbonell, J. (2010). Bayesian Active Learning Using Arbitrary Binary Valued Queries. In Proceedings of the 21st International Conference on Algorithmic Learning Theory (ALT).

[pdf]

[ps] Also available is a version stated in the language of information theory:

[pdf]

[ps]

Teaching:

Spring 2018: ORF 525, Statistical Learning and Nonparametric Estimation.
Spring 2012: 36-752, Advanced Probability Overview.
Fall 2011: 36-755, Advanced Statistical Theory I.
Spring 2011: 36-752, Advanced Probability Overview.
Fall 2010, Mini 1: 36-781, Advanced Statistical Methods I: Active Learning.
Fall 2010, Mini 2: 36-782, Advanced Statistical Methods II: Advanced Topics in Machine Learning Theory.
Spring 2010: 36-754, Advanced Probability II: Stochastic Processes.

Please contact me if you require a paper copy of any of these documents.
