Full metadata record
DC Field | Value | Language
dc.contributor.author | Perlich, Claudia | -
dc.contributor.author | Provost, Foster | -
dc.description.abstract | Due to interest in social and economic networks, relational modeling is attracting increasing attention. The field of relational data mining/learning, which traditionally was dominated by logic-based approaches, has recently been extended by adapting learning methods such as naive Bayes, Bayesian networks, and decision trees to relational tasks. One aspect inherent to all methods of model induction from relational data is the construction of features through the aggregation of sets. The theoretical part of this work (1) presents an ontology of relational concepts of increasing complexity, (2) derives classes of aggregation operators that are needed to learn these concepts, and (3) classifies relational domains based on relational schema characteristics such as cardinality. We then present a new class of aggregation functions, ones that are particularly well suited for relational classification and class probability estimation. The empirical part of this paper demonstrates, on real domains, the performance effects of different aggregation methods on different relational concepts. The results suggest that more complex aggregation methods can significantly increase generalization performance and that, in particular, task-specific aggregation can simplify relational prediction tasks into well-understood propositional learning problems. | en
dc.format.extent | 3373502 bytes | -
dc.publisher | Stern School of Business, New York University | en
dc.subject | Relational Learning | en
dc.subject | Feature Invention | en
dc.title | Aggregation-Based Feature Invention and Relational Concept Classes | en
dc.type | Working Paper | en
dc.description.series | Information Systems Working Papers Series | en
Appears in Collections: IOMS: Information Systems Working Papers

Files in This Item:
File | Size | Format
IS-03-03.pdf | 3.29 MB | Adobe PDF

Items in FDA are protected by copyright, with all rights reserved, unless otherwise indicated.