Group-wise Feature Selection for Supervised Learning
- Submitted by:
- Qi Xiao
- Last updated:
- 15 May 2022 - 1:31am
- Document Type:
- Poster
- Presenters:
- Qi Xiao
- Paper Code:
- MLSP-2.4
Feature selection has been explored in two settings: global feature selection and instance-wise feature selection. Global feature selection picks a single feature selector for the entire dataset, while instance-wise feature selection allows a different feature selector for each data instance. We propose group-wise feature selection, a new setting that sits between global and instance-wise feature selection. In group-wise feature selection, we constrain the number of possible feature selectors to a finite number K, which allows different instances to use different feature selectors while regularizing how many distinct selectors are used, enabling a flexible trade-off between expressiveness and model complexity. We propose two techniques to solve the problem: the first applies K-Means clustering to an instance-wise feature selection algorithm; the second uses a mixture-of-experts model with Gumbel-Softmax to learn group membership and feature selectors simultaneously. We evaluate both techniques and show promising results on synthetic and real datasets.
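To make the second technique concrete, the sketch below shows one way a mixture-of-experts-style gate with Gumbel-Softmax could assign each instance to one of K groups and apply that group's learnable feature mask before a shared predictor. This is a minimal illustration under our own assumptions, not the authors' implementation: the module structure, network sizes, and the sigmoid relaxation of the per-group masks are all hypothetical.

```python
# Minimal sketch (not the poster's code) of group-wise feature selection with a
# Gumbel-Softmax gate. A gating network picks one of K groups per instance; each
# group owns a learnable feature-selection mask; a shared predictor consumes the
# masked features. All names and hyperparameters here are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GroupWiseFeatureSelector(nn.Module):
    def __init__(self, num_features, num_groups, hidden_dim=32):
        super().__init__()
        # Gating network: maps an instance to logits over the K groups.
        self.gate = nn.Sequential(
            nn.Linear(num_features, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, num_groups),
        )
        # One learnable feature-selection logit vector per group (K x F).
        self.mask_logits = nn.Parameter(torch.zeros(num_groups, num_features))
        # Shared predictor applied to the masked features.
        self.predictor = nn.Sequential(
            nn.Linear(num_features, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, x, tau=1.0):
        # Differentiable (straight-through) one-hot group assignment per instance.
        group = F.gumbel_softmax(self.gate(x), tau=tau, hard=True)   # (B, K)
        # Select the assigned group's mask and relax it to [0, 1] per feature.
        mask = torch.sigmoid(group @ self.mask_logits)               # (B, F)
        # Predict from the masked features.
        return self.predictor(x * mask), group, mask


# Usage: 200 instances, 10 features, K = 3 candidate feature selectors.
model = GroupWiseFeatureSelector(num_features=10, num_groups=3)
x = torch.randn(200, 10)
y_hat, group, mask = model(x)
print(y_hat.shape, group.shape, mask.shape)  # (200, 1), (200, 3), (200, 10)
```

Because the Gumbel-Softmax gate is differentiable, the group memberships and the per-group masks can be trained jointly with an ordinary supervised loss; a sparsity penalty on the masks would be a natural addition but is omitted here for brevity.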