What is it about?

Feature selection is an important step in machine learning. We used the information imbalance method, which selects relevant features based on the similarity between a target space and a feature space. We applied this method to the structural features of glassy systems to extract relevant descriptors of their dynamics.

Why is it important?

This work is important because, unlike other methods that rely on a specific model, the information imbalance method uses the similarity of spaces, measured through Euclidean distances and neighbor ranks, to extract features. Feature selection matters in ML because it: a) reduces the dimension of the training data, which can speed up learning; b) reduces overfitting; c) tells us which features (and combinations of features) lie closest to the dynamical space. Applying it to glassy systems gave us interesting insights. We selected features in both a supervised and an unsupervised manner, and our supervised feature selection successfully reproduced earlier results while also uncovering much that is new. A rough sketch of the underlying idea is given after this paragraph.
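To make the rank-based comparison of spaces more concrete, here is a minimal sketch of an information-imbalance calculation between two feature spaces using Euclidean distances and neighbor ranks. This is only an illustration of the general idea, not the authors' actual implementation; the function name `information_imbalance` and the toy data are assumptions made for this example.

```python
# Minimal NumPy sketch of the information-imbalance idea: for each point,
# take its nearest neighbor in space A and ask how highly that same point
# ranks among its neighbors in space B. Low values mean A predicts B well.
import numpy as np

def information_imbalance(X_A, X_B):
    """Estimate Delta(A -> B) from two coordinate arrays of shape (N, d)."""
    N = X_A.shape[0]
    # Pairwise Euclidean distances in each space.
    d_A = np.linalg.norm(X_A[:, None, :] - X_A[None, :, :], axis=-1)
    d_B = np.linalg.norm(X_B[:, None, :] - X_B[None, :, :], axis=-1)
    np.fill_diagonal(d_A, np.inf)  # exclude self-distances
    np.fill_diagonal(d_B, np.inf)
    # Nearest neighbor of each point in space A (rank-1 neighbor).
    nn_A = np.argmin(d_A, axis=1)
    # Rank of every point in space B (1 = nearest neighbor).
    ranks_B = d_B.argsort(axis=1).argsort(axis=1) + 1
    # Rank in B of the rank-1 neighbor from A, averaged and normalized:
    # ~0 means A's neighborhoods predict B's, ~1 means no shared information.
    r_B_given_A = ranks_B[np.arange(N), nn_A]
    return 2.0 / N * r_B_given_A.mean()

# Toy usage: compare a full 3D feature space with its first coordinate only.
rng = np.random.default_rng(0)
X_full = rng.normal(size=(200, 3))
print(information_imbalance(X_full[:, :1], X_full))  # 1D -> 3D: higher imbalance
print(information_imbalance(X_full, X_full[:, :1]))  # 3D -> 1D: lower imbalance
```

In a feature-selection setting, the same comparison would be made between candidate structural descriptors and the target (dynamical) space, ranking feature sets by how small an imbalance they leave toward the target.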

Perspectives

Working on this method was interesting and rewarding. We were able to build a more intuitive understanding of the method and gain a better understanding of glassy systems. It was a pleasure to work with my co-authors, from whom I learned many new things.

Anand Sharma
Indian Institute of Science Education and Research Pune

Read the Original

This page is a summary of: Selecting relevant structural features for glassy dynamics by information imbalance, The Journal of Chemical Physics, November 2024, American Institute of Physics,
DOI: 10.1063/5.0235084.
You can read the full text via the DOI link above.
