What is it about?
Feature selection is an important step in machine learning. We used the information imbalance method, which selects relevant features based on the similarity between the target space and the feature space. We applied this method to the structural features of glassy systems to extract descriptors that are relevant for the dynamics.
Featured Image
Photo by Aakash Dhage on Unsplash
Why is it important?
This work is important because, unlike methods that rely on a trained model, the information imbalance approach uses the similarity of spaces, measured through Euclidean distances and neighbour ranks, to extract features. Feature selection is an important step in machine learning because it: a) reduces the dimension of the training data, which can speed up learning; b) reduces overfitting; c) tells us which features (and combinations of features) are close to the dynamical space. Applying this to glassy systems gave us interesting insights. We selected features in both a supervised and an unsupervised manner, and our supervised feature selection reproduced earlier results while uncovering much that is new.
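For readers who want a concrete picture, here is a minimal sketch of how an information imbalance Delta(A -> B) between two representations of the same points can be estimated from Euclidean distances and neighbour ranks (a small value means that space A predicts the neighbourhoods of space B well). The NumPy implementation, function name, and toy data below are our own illustrative assumptions, not code from the paper.

import numpy as np

def information_imbalance(X_a, X_b):
    # Estimate Delta(A -> B): how well distances in space A predict
    # neighbourhoods in space B. Values near 0 mean A is informative
    # about B; values near 1 mean A carries no information about B.
    n = len(X_a)
    # pairwise Euclidean distance matrices in the two spaces
    d_a = np.linalg.norm(X_a[:, None, :] - X_a[None, :, :], axis=-1)
    d_b = np.linalg.norm(X_b[:, None, :] - X_b[None, :, :], axis=-1)
    np.fill_diagonal(d_a, np.inf)   # a point is not its own neighbour
    np.fill_diagonal(d_b, np.inf)
    # rank (1 = nearest) of every point j as a neighbour of i in space B
    ranks_b = d_b.argsort(axis=1).argsort(axis=1) + 1
    # for each point, the rank in B of its nearest neighbour in A
    nn_in_a = d_a.argmin(axis=1)
    conditional_rank = ranks_b[np.arange(n), nn_in_a]
    return 2.0 / n * conditional_rank.mean()

# Toy example: the "target" space is built mostly from the first feature,
# so that feature alone should give a small imbalance towards the target.
rng = np.random.default_rng(0)
features = rng.normal(size=(300, 3))
target = (features @ np.array([1.0, 0.1, 0.0]))[:, None]
print(information_imbalance(features[:, :1], target))   # small (informative)
print(information_imbalance(features[:, 2:], target))   # near 1 (uninformative)

In this spirit, a feature (or combination of features) is judged relevant when the imbalance from the feature space towards the target (dynamical) space is small.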
Read the Original
This page is a summary of: Selecting relevant structural features for glassy dynamics by information imbalance, The Journal of Chemical Physics, November 2024, American Institute of Physics,
DOI: 10.1063/5.0235084.