What is it about?

Ensemble learning methods are widely used in machine learning because of their strong predictive performance. As genetic programming (GP)-based symbolic regression methods have matured, many papers have adopted a popular ensemble learning method, random forests, as a baseline competitor. Instead of treating the two as competitors, an alternative is to use symbolic regression as an enhancement technique for random forests: the smooth functions fitted by GP-based symbolic regression are complementary to the piecewise-constant nature of decision trees, and smooth variation is common in regression problems. In this article, we propose an ensemble model built from symbolic-regression-based decision trees to exploit this complementarity. We further design a guided mutation operator to speed up the search on high-dimensional problems, a multi-fidelity evaluation strategy to reduce computational cost, and an ensemble selection mechanism to improve predictive performance. Experimental results on a regression benchmark of 120 datasets show that the proposed ensemble model outperforms 25 existing symbolic regression and ensemble learning methods. Moreover, the proposed method provides notable insights on an XGBoost hyperparameter performance prediction task, an important application area of ensemble learning.
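To make the core idea concrete, here is a minimal sketch (not the authors' implementation) of combining smooth constructed features with an ensemble of decision trees: each tree is trained on a bootstrap sample whose inputs are augmented with smooth, symbolic-style features, and the trees' predictions are averaged. The hand-picked features below stand in for GP-evolved expressions, which in the actual method would be searched automatically.

```python
# Sketch only: smooth constructed features feeding a bagged ensemble of
# decision trees. The features here are hypothetical stand-ins for
# GP-evolved symbolic expressions.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(300, 2))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2  # a smooth regression target

def construct_features(X):
    # Hand-picked smooth features; in the paper's method these would be
    # evolved by genetic programming rather than fixed by hand.
    return np.column_stack([X, np.sin(X[:, 0]), X[:, 1] ** 2, X[:, 0] * X[:, 1]])

Z = construct_features(X)
trees = []
for _ in range(10):
    idx = rng.integers(0, len(X), len(X))  # bootstrap sample, as in bagging
    trees.append(DecisionTreeRegressor(max_depth=5).fit(Z[idx], y[idx]))

def predict(X_new):
    # Average the trees' predictions on the constructed feature space.
    Z_new = construct_features(X_new)
    return np.mean([t.predict(Z_new) for t in trees], axis=0)

print(float(np.mean((predict(X) - y) ** 2)))  # training mean squared error
```

Because the constructed features already capture the smooth structure of the target, the piecewise-constant trees only need to model a near-linear surface, which is the complementarity the summary describes.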

Read the Original

This page is a summary of: Genetic Programming-based Evolutionary Feature Construction for Heterogeneous Ensemble Learning [Hot off the Press], July 2023, ACM (Association for Computing Machinery),
DOI: 10.1145/3583133.3595831.