Multi criteria wrapper improvements to naive bayes learning

Authors

Giráldez Betrón, Juan Ignacio

Publisher

Springer

Abstract

Feature subset selection using a wrapper means searching for an optimal set of attributes while treating the machine learning algorithm as a black box. The Naive Bayes Classifier is based on the assumption that attribute values are independent given the class value; consequently, its effectiveness may decrease when the attributes are interdependent. We present FBL, a wrapper that uses information about dependencies to guide the search for the optimal subset of features, using the Naive Bayes Classifier as the black-box machine learning algorithm. Experimental results show that FBL allows the Naive Bayes Classifier to achieve higher accuracy, and that FBL performs better than other classical filters and wrappers.
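The wrapper idea described in the abstract can be illustrated with a minimal sketch: a greedy forward search over feature subsets, scored by the accuracy of a small categorical Naive Bayes classifier used as a black box. This is a generic illustration of wrapper-based selection, not the FBL algorithm itself; all function names and the toy evaluation on training accuracy are assumptions for the example.

```python
from collections import Counter

def nb_train(X, y, feats):
    """Fit a categorical Naive Bayes using only the feature indices in feats."""
    classes = Counter(y)
    # counts[c][f][v] = occurrences of value v of feature f among class-c rows
    counts = {c: {f: Counter() for f in feats} for c in classes}
    for row, c in zip(X, y):
        for f in feats:
            counts[c][f][row[f]] += 1
    return classes, counts

def nb_predict(model, row, feats):
    """Pick the class maximizing P(c) * prod_f P(row[f] | c), Laplace-smoothed."""
    classes, counts = model
    n = sum(classes.values())
    best, best_p = None, -1.0
    for c, nc in classes.items():
        p = nc / n
        for f in feats:
            # smooth over the number of distinct values seen for feature f
            k = len({v for cc in counts for v in counts[cc][f]})
            p *= (counts[c][f][row[f]] + 1) / (nc + k)
        if p > best_p:
            best, best_p = c, p
    return best

def accuracy(X, y, feats):
    """Black-box objective: resubstitution accuracy of NB on the subset."""
    model = nb_train(X, y, feats)
    return sum(nb_predict(model, r, feats) == c for r, c in zip(X, y)) / len(y)

def wrapper_select(X, y):
    """Greedy forward wrapper search guided only by the classifier's accuracy."""
    remaining = set(range(len(X[0])))
    selected, best = [], 0.0
    improved = True
    while improved and remaining:
        improved = False
        for f in sorted(remaining):
            acc = accuracy(X, y, selected + [f])
            if acc > best:
                best, pick = acc, f
                improved = True
        if improved:
            selected.append(pick)
            remaining.remove(pick)
    return selected, best
```

On a toy dataset where the first attribute determines the class and the second is noise, the search keeps only the informative attribute; FBL differs from this sketch in that it additionally uses attribute-dependency information to guide the search.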

Bibliographic reference

Cortizo, J. C., & Giráldez, J. I. (2006). Multi criteria wrapper improvements to naive bayes learning. In E. Corchado, H. Yin, V. Botti & C. Fyfe (Eds.), International Conference on Intelligent Data Engineering and Automated Learning (IDEAL 2006) (pp. 419-427). Berlin: Springer.
