Name | Description | Size | Format
---|---|---|---
 | | 3.95 MB | Adobe PDF
Abstract
This study presents a novel approach for kernel selection based on Kullback–Leibler
divergence in variational autoencoders using features generated by the convolutional encoder. The
proposed methodology focuses on identifying the most relevant subset of latent variables to reduce
the model’s parameters. Each latent variable is sampled from the distribution associated with a
single kernel of the last encoder’s convolutional layer, resulting in an individual distribution for each
kernel. Relevant features are selected from the sampled latent variables to perform kernel selection,
which filters out uninformative features and, consequently, unnecessary kernels. Both the proposed
filter method and sequential feature selection (a standard wrapper method) were examined for
feature selection. In particular, the filter method evaluates the Kullback–Leibler divergence between
all kernels’ distributions and hypothesizes that similar kernels can be discarded as they do not
convey relevant information. This hypothesis was confirmed through the experiments performed on
four standard datasets, where it was observed that the number of kernels can be reduced without
meaningfully affecting the performance. This analysis was based on the model’s accuracy when the
selected kernels fed a probabilistic classifier, and on the feature-based similarity index, used to appraise
the quality of the reconstructed images when the variational autoencoder uses only the selected kernels.
Therefore, the proposed methodology guides the reduction of the number of parameters of the model,
making it suitable for developing applications for resource-constrained devices.
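The filter method described above can be sketched in a few lines. The snippet below is a minimal illustration, not the paper's exact procedure: it assumes each kernel's latent variable follows a diagonal Gaussian parameterized by a mean and log-variance vector, computes the closed-form symmetric Kullback–Leibler divergence between every pair of kernel distributions, and greedily discards a kernel when its divergence from an already-kept kernel falls below a threshold (i.e., the two kernels convey similar information). The function names and the threshold rule are illustrative assumptions.

```python
import numpy as np

def gaussian_kl(mu_p, logvar_p, mu_q, logvar_q):
    """Closed-form KL(N(mu_p, var_p) || N(mu_q, var_q)) for diagonal Gaussians."""
    var_p, var_q = np.exp(logvar_p), np.exp(logvar_q)
    return 0.5 * np.sum(logvar_q - logvar_p + (var_p + (mu_p - mu_q) ** 2) / var_q - 1.0)

def select_kernels(mu, logvar, threshold):
    """Greedy filter: keep a kernel only if its symmetric KL divergence to every
    previously kept kernel exceeds `threshold`; near-duplicate kernels are dropped.

    mu, logvar: arrays of shape (n_kernels, latent_dim) with each kernel's
    Gaussian parameters (illustrative input format, not the paper's).
    """
    kept = []
    for k in range(mu.shape[0]):
        if all(gaussian_kl(mu[k], logvar[k], mu[j], logvar[j])
               + gaussian_kl(mu[j], logvar[j], mu[k], logvar[k]) > threshold
               for j in kept):
            kept.append(k)
    return kept
```

For example, two kernels with identical distributions have zero divergence, so the second is discarded, while a kernel with a well-separated mean survives the filter.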
Keywords
Convolutional neural network; Feature selection; Latent variables; Probabilistic classifier; Variational autoencoder
Citation
Mendonça, F.; Mostafa, S.S.; Morgado-Dias, F.; Ravelo-García, A.G. On the Use of Kullback–Leibler Divergence for Kernel Selection and Interpretation in Variational Autoencoders for Feature Creation. Information 2023, 14, 571. https://doi.org/10.3390/info14100571
Publisher
MDPI