Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Melville, Prem | - |
dc.contributor.author | Saar-Tsechansky, Maytal | - |
dc.contributor.author | Provost, Foster | - |
dc.contributor.author | Mooney, Raymond | - |
dc.date.accessioned | 2008-12-02T18:01:52Z | - |
dc.date.available | 2008-12-02T18:01:52Z | - |
dc.date.issued | 2004-11 | - |
dc.identifier.citation | Proceedings of the 4th IEEE International Conference on Data Mining | en |
dc.identifier.uri | http://hdl.handle.net/2451/27801 | - |
dc.description.abstract | Many induction problems include missing data that can be acquired at a cost. For building accurate predictive models, acquiring complete information for all instances is often expensive or unnecessary, while acquiring information for a random subset of instances may not be the most effective approach. Active feature-value acquisition tries to reduce the cost of achieving a desired model accuracy by identifying instances for which obtaining complete information is most informative. We present an approach in which instances are selected for acquisition based on the current model's accuracy and its confidence in the prediction. Experimental results demonstrate that our approach can induce accurate models using substantially fewer feature-value acquisitions than alternative policies. | en |
dc.description.sponsorship | NYU, Stern School of Business, IOMS Department, Center for Digital Economy Research | en |
dc.format.extent | 60224 bytes | - |
dc.format.mimetype | application/pdf | - |
dc.language.iso | en_US | en |
dc.publisher | Proceedings of the 4th IEEE International Conference on Data Mining | en |
dc.relation.ispartofseries | CeDER-PP-2004-06 | en |
dc.title | Active Feature-Value Acquisition for Classifier Induction | en |
dc.type | Article | en |
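The abstract describes selecting instances for feature-value acquisition based on the current model's accuracy and its prediction confidence. A minimal sketch of that selection idea, assuming a batch setting where predictions and known labels are available for incomplete instances (all names here, such as `select_for_acquisition`, are illustrative, not the paper's actual API):

```python
def select_for_acquisition(predictions, labels, batch_size):
    """Rank incomplete training instances for feature-value acquisition.

    predictions: list of (predicted_label, confidence) pairs from the
                 current model, one per incomplete instance
    labels:      the known class labels of the same instances
    Misclassified instances are ranked first (the model is demonstrably
    wrong there); correctly classified instances follow, ordered by
    ascending confidence, so the least certain come earlier.
    """
    def score(i):
        pred, conf = predictions[i]
        correct = (pred == labels[i])
        # Incorrect predictions sort before correct ones (False < True);
        # among correct ones, lower confidence ranks earlier.
        return (correct, conf if correct else -conf)

    order = sorted(range(len(predictions)), key=score)
    return order[:batch_size]

# Toy usage: instance 1 is misclassified, instance 2 is correct but
# uncertain, instance 0 is correct and confident.
preds = [("pos", 0.9), ("neg", 0.8), ("pos", 0.55)]
labels = ["pos", "pos", "pos"]
print(select_for_acquisition(preds, labels, 2))  # → [1, 2]
```

This is only one plausible reading of "accuracy and confidence"-based selection; the paper itself should be consulted for the exact policy and how acquisitions feed back into retraining.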
Appears in Collections: CeDER Published Papers
Files in This Item:
File | Description | Size | Format |
---|---|---|---|
CPP-06-04.pdf | | 58.81 kB | Adobe PDF |