E2H Distance-Weighted Minimum Reference Set for Numerical and Categorical Mixture Data and a Bayesian Swap Feature Selection Algorithm

Yuto Omae, Masaya Mori

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

Generally, when developing classification models using supervised learning methods (e.g., support vector machines, neural networks, and decision trees), feature selection is an essential pre-processing step for reducing computational cost and improving generalization performance. In this regard, the minimum reference set (MRS), a feature selection algorithm, can be used. The original MRS considers a feature subset effective if all samples can be correctly classified by the 1-nearest-neighbor rule using only a small set of reference samples. However, the original MRS is applicable only to numerical features and cannot account for the distances between different classes. Therefore, herein, we propose a novel feature subset evaluation algorithm, referred to as the “E2H distance-weighted MRS,” which can be applied to a mixture of numerical and categorical features and incorporates the distances between different classes into the evaluation. Moreover, a Bayesian swap feature selection algorithm, which is used to identify an effective feature subset, is also proposed. The effectiveness of the proposed methods is verified through experiments on artificially generated data comprising a mixture of numerical and categorical features.
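
The sketch below is only an illustration of the kind of subset evaluation the abstract describes, not the paper's exact E2H-weighted MRS: it assumes the mixed-data distance is a weighted combination of Euclidean distance on numerical features and Hamming distance on categorical features, and it scores a candidate subset by a leave-one-out 1-nearest-neighbor check. The names mixed_distance, evaluate_subset, and the weight alpha are assumptions made for this example.

# Illustrative sketch (assumed formulation, not the paper's exact algorithm):
# score a candidate feature subset on mixed numerical/categorical data using a
# weighted Euclidean + Hamming distance and a leave-one-out 1-NN check.
import numpy as np

def mixed_distance(x_num, y_num, x_cat, y_cat, alpha=0.5):
    # Weighted combination of Euclidean (numerical) and Hamming (categorical) distances.
    d_euclid = np.linalg.norm(x_num - y_num)
    d_hamming = np.mean(x_cat != y_cat) if x_cat.size else 0.0
    return alpha * d_euclid + (1.0 - alpha) * d_hamming

def evaluate_subset(X_num, X_cat, y, num_idx, cat_idx, alpha=0.5):
    # Leave-one-out 1-nearest-neighbor accuracy on the selected columns.
    n = len(y)
    correct = 0
    for i in range(n):
        best_dist, best_label = np.inf, None
        for j in range(n):
            if i == j:
                continue
            d = mixed_distance(X_num[i, num_idx], X_num[j, num_idx],
                               X_cat[i, cat_idx], X_cat[j, cat_idx], alpha)
            if d < best_dist:
                best_dist, best_label = d, y[j]
        correct += int(best_label == y[i])
    return correct / n

# Toy usage: 3 numerical and 2 categorical features, binary labels.
rng = np.random.default_rng(0)
X_num = rng.normal(size=(20, 3))
X_cat = rng.integers(0, 3, size=(20, 2))
y = rng.integers(0, 2, size=20)
print(evaluate_subset(X_num, X_cat, y, num_idx=[0, 2], cat_idx=[1]))

A swap-style search, as in the paper's Bayesian swap feature selection, would repeatedly exchange features in and out of the candidate subset and keep the exchange if a score of this kind improves; the Bayesian-optimization machinery itself is not shown here.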

Original language: English
Pages (from-to): 109-127
Number of pages: 19
Journal: Machine Learning and Knowledge Extraction
Volume: 5
Issue number: 1
DOIs
Publication status: Published - Mar 2023

Keywords

  • Bayesian optimization
  • classification
  • feature subset selection
  • machine learning
  • minimum reference set
