Full metadata record
DC Field                     Value                                Language
dc.contributor.author        Saar-Tsechansky, Maytal              -
dc.contributor.author        Provost, Foster                      -
dc.date.accessioned          2005-11-29T20:52:33Z                 -
dc.date.available            2005-11-29T20:52:33Z                 -
dc.date.issued               2001                                 -
dc.identifier.uri            http://hdl.handle.net/2451/14165     -
dc.description.abstract      In many cost-sensitive environments, class probability estimates are used by decision makers to evaluate the expected utility of a set of alternatives. Supervised learning can be used to build class probability estimates; however, it is often very costly to obtain training data with class labels. Active sampling acquires data incrementally, at each phase identifying especially useful additional data for labeling, and can be used to economize on the examples needed for learning. We outline the critical features of an active sampling approach and present an active sampling method for estimating class probabilities and ranking. BOOTSTRAP-LV identifies particularly informative new data for learning based on the variance in probability estimates and on a data item's informative value for the rest of the input space (an illustrative sketch of the variance-based selection step follows this record). We show empirically that the method reduces the number of data items that must be obtained and labeled, across a wide variety of domains. We investigate the contribution of the components of the algorithm and show that each provides valuable information to help identify informative examples. We also compare BOOTSTRAP-LV with UNCERTAINTY SAMPLING, an existing active sampling method designed to maximize classification accuracy. The results show that BOOTSTRAP-LV uses fewer examples to achieve a given level of class probability estimation accuracy and provide insights into the behavior of the algorithms. Finally, to further our understanding of the contributions made by the elements of BOOTSTRAP-LV, we experiment with a new active sampling algorithm drawing from both UNCERTAINTY SAMPLING and BOOTSTRAP-LV and show that it is significantly more competitive with BOOTSTRAP-LV than UNCERTAINTY SAMPLING is. The analysis suggests more general implications for improving existing active sampling algorithms for classification.     en
dc.format.extent             5412318 bytes                        -
dc.format.mimetype           application/pdf                      -
dc.language                  English                              EN
dc.language.iso              en_US                                -
dc.publisher                 Stern School of Business, New York University     en
dc.relation.ispartofseries   IS-01-03                             -
dc.subject                   active learning                      en
dc.subject                   class probability estimation         en
dc.subject                   cost-sensitive learning              en
dc.title                     Active Sampling for Class Probability Estimation and Ranking     en
dc.type                      Working Paper                        en
dc.description.series        Information Systems Working Papers Series     EN
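
The abstract above centers BOOTSTRAP-LV on the variance of class probability estimates across bootstrap-trained models. Below is a minimal sketch of that selection idea, assuming scikit-learn's DecisionTreeClassifier as the base learner and a simple variance-proportional weighting; the function names, batch-selection rule, and hyperparameters are illustrative assumptions, not the paper's exact procedure.

```python
# Illustrative sketch of bootstrap-variance-based active sampling for class
# probability estimation, in the spirit of BOOTSTRAP-LV as summarized in the
# abstract. The base learner and weighting scheme are assumptions.
import numpy as np
from sklearn.tree import DecisionTreeClassifier  # assumed base learner


def bootstrap_variance_scores(X_labeled, y_labeled, X_pool,
                              n_bootstraps=10, seed=0):
    """Score each unlabeled pool example by the variance of its estimated
    positive-class probability across models trained on bootstrap resamples
    of the labeled data (a proxy for "local variance")."""
    rng = np.random.default_rng(seed)
    n = len(X_labeled)
    probs = np.empty((n_bootstraps, len(X_pool)))
    for b in range(n_bootstraps):
        idx = rng.integers(0, n, size=n)  # bootstrap resample of labeled set
        model = DecisionTreeClassifier(random_state=seed + b)
        model.fit(X_labeled[idx], y_labeled[idx])
        p = model.predict_proba(X_pool)
        # Probability of class 1; handle a degenerate single-class resample.
        probs[b] = p[:, 1] if p.shape[1] == 2 else float(model.classes_[0])
    return probs.var(axis=0)  # per-example variance of the estimates


def select_batch(scores, batch_size, rng):
    """Sample a batch with probability proportional to the variance scores,
    so high-variance regions are favored while the rest of the input space
    keeps some chance of being sampled (an illustrative stand-in for the
    paper's weighting of an item's value for the rest of the input space)."""
    weights = scores + 1e-12              # avoid an all-zero distribution
    p = weights / weights.sum()
    return rng.choice(len(scores), size=batch_size, replace=False, p=p)
```

A typical active-learning loop would alternate these two steps: score the pool, weight-sample a batch, obtain its labels, add it to the labeled set, and retrain. Weight-sampling rather than greedily taking the top-k is one way to spread acquisitions across the input space, which is the intuition the abstract attributes to BOOTSTRAP-LV.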
Appears in Collections: IOMS: Information Systems Working Papers

Files in This Item:
File             Description    Size       Format
IS-01-03.pdf                    5.29 MB    Adobe PDF

