Title

PGGP: Prototype Generation via Genetic Programming

Author

Hugo Jair Escalante

Mario Graff

Alicia Morales-Reyes

Access level

Open Access

Summary or description

Prototype generation (PG) methods aim to find a subset of instances, taken from a large training data set, such that classification performance (commonly, with a 1NN classifier) when using the prototypes is equal to or better than that obtained with the original training set. Several PG methods have been proposed so far; most of them take a small subset of training instances as initial prototypes and modify them in an attempt to maximize classification performance on the whole training set. Although some of these methods have obtained acceptable results, training instances may be under-exploited, because most of the time they are used only to guide the search process. This paper introduces a PG method based on genetic programming in which many training samples are combined through arithmetic operators to build highly effective prototypes. The genetic program aims to generate prototypes that maximize an estimate of the generalization performance of a 1NN classifier. Experimental results are reported on benchmark data sets commonly used to assess PG methods. Several aspects of the genetic program are evaluated and compared with a number of alternative PG methods. The empirical assessment shows the effectiveness of the proposed approach, which outperforms most state-of-the-art PG techniques on both small and large data sets. Better results were obtained for data sets with only numeric attributes, although the performance of the proposed technique on mixed data was also very competitive.
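
To illustrate the general idea, the Python sketch below evolves per-class prototypes for a 1NN classifier with a simple evolutionary loop. It is only a minimal illustration under strong assumptions: the paper's arithmetic GP expression trees are replaced by the mean of a few selected training instances per class, and the paper's generalization estimate is approximated by 1NN accuracy on a held-out validation split. All function names (one_nn_accuracy, random_individual, decode, mutate, evolve) are hypothetical and are not taken from the authors' implementation.

import numpy as np

rng = np.random.default_rng(0)

def one_nn_accuracy(prototypes, proto_labels, X, y):
    # Classify each instance in X by its nearest prototype (1NN) and
    # return the resulting accuracy.
    d = np.linalg.norm(X[:, None, :] - prototypes[None, :, :], axis=2)
    pred = proto_labels[np.argmin(d, axis=1)]
    return np.mean(pred == y)

def random_individual(X, y, classes, k=3):
    # One candidate solution: for each class, the indices of k training
    # instances whose arithmetic mean forms that class's prototype.
    return {c: rng.choice(np.flatnonzero(y == c), size=k, replace=False)
            for c in classes}

def decode(individual, X):
    # Build the prototype set by averaging the selected instances per class
    # (a stand-in for the arithmetic combinations evolved by the paper's GP).
    classes = sorted(individual)
    protos = np.vstack([X[individual[c]].mean(axis=0) for c in classes])
    return protos, np.array(classes)

def mutate(individual, X, y):
    # Replace one selected instance of a random class with another
    # instance of the same class.
    child = {c: idx.copy() for c, idx in individual.items()}
    c = rng.choice(list(child))
    pool = np.flatnonzero(y == c)
    child[c][rng.integers(len(child[c]))] = rng.choice(pool)
    return child

def evolve(X_tr, y_tr, X_val, y_val, pop_size=30, generations=50):
    # Simple (mu + lambda)-style loop: keep the better half, mutate it to
    # refill the population, and score individuals by 1NN accuracy on a
    # validation split (a proxy for the paper's generalization estimate).
    classes = np.unique(y_tr)
    pop = [random_individual(X_tr, y_tr, classes) for _ in range(pop_size)]

    def fitness(ind):
        protos, labels = decode(ind, X_tr)
        return one_nn_accuracy(protos, labels, X_val, y_val)

    for _ in range(generations):
        elite = sorted(pop, key=fitness, reverse=True)[: pop_size // 2]
        pop = elite + [mutate(elite[rng.integers(len(elite))], X_tr, y_tr)
                       for _ in elite]
    return decode(max(pop, key=fitness), X_tr)

# Toy usage on two synthetic Gaussian blobs.
X = np.vstack([rng.normal(loc=m, size=(100, 2)) for m in (0.0, 3.0)])
y = np.repeat([0, 1], 100)
order = rng.permutation(len(y))
tr, val = order[:150], order[150:]
protos, labels = evolve(X[tr], y[tr], X[val], y[val])
print("prototypes:\n", protos, "\nlabels:", labels)

In the actual method, each prototype is the output of an evolved arithmetic expression over many training instances rather than a plain per-class mean; the sketch only conveys the search-and-evaluate structure.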

Publisher

Applied Soft Computing - Elsevier

Publish date

2016

Publication type

Article

Format

application/pdf

Language

English

Audience

Researchers

Source repository

Repositorio Institucional de INFOTEC

Downloads

390
