Universitat Politècnica de Catalunya. Doctorat en Teoria del Senyal i Comunicacions
Universitat Politècnica de Catalunya. Departament de Teoria del Senyal i Comunicacions
2024-03
The expressiveness of neural networks depends strongly on the nature of their activation functions, yet these are usually assumed predefined and kept fixed during training. From a signal processing perspective, in this paper we present the Expressive Neural Network (ENN), a novel model in which the non-linear activation functions are modeled using the Discrete Cosine Transform (DCT) and adapted via backpropagation during training. This parametrization keeps the number of trainable parameters low, is well suited to gradient-based schemes, and adapts to different learning tasks. This is the first non-linear model for activation functions grounded in a signal processing perspective, providing high flexibility and expressiveness to the network. We also provide insights into the explainability of the network at convergence by recovering the concept of a bump, that is, the response of each activation function in the output space. Finally, through exhaustive experiments we show that the model adapts to both classification and regression tasks. ENN outperforms state-of-the-art benchmarks, with accuracy gains above 40% in some scenarios.
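The core idea of the abstract can be illustrated with a minimal sketch: an activation function expressed as a truncated cosine (DCT-style) series whose coefficients are the trainable parameters. This is an assumption-laden illustration, not the paper's exact parametrization; the function name, input normalization, and basis indexing below are all illustrative choices.

```python
import numpy as np

def dct_activation(x, coeffs, x_max=2.0):
    """Sketch of a DCT-parameterized activation (hypothetical helper).

    Inputs are clipped to [-x_max, x_max] and mapped to [0, 1] before
    evaluating the cosine basis; in ENN the coefficients `coeffs` would
    be updated by backpropagation along with the network weights.
    """
    t = (np.clip(x, -x_max, x_max) + x_max) / (2.0 * x_max)  # map to [0, 1]
    k = np.arange(len(coeffs))
    basis = np.cos(np.pi * np.outer(t, k))  # shape (len(x), K)
    return basis @ coeffs                   # learned linear combination

# Example: K = 8 coefficients define one non-linearity; only K numbers
# per activation need to be trained, keeping the parameter count low.
rng = np.random.default_rng(0)
coeffs = rng.normal(scale=0.1, size=8)
y = dct_activation(np.linspace(-2, 2, 5), coeffs)
print(y.shape)  # (5,)
```

Because the activation is a small set of DCT coefficients, its learned shape can be plotted directly after training, which is what makes the "bump" interpretation at convergence possible.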
This work is part of the project IRENE (PID2020-115323RB-C31), funded by MCIN/AEI/10.13039/501100011033.
Peer Reviewed
Postprint (published version)
Article
English
Àrees temàtiques de la UPC::Enginyeria de la telecomunicació::Processament del senyal; Signal processing; Neural networks (Computer science); Machine learning; Adaptive activation functions; Discrete cosine transform; Explainable machine learning; Tractament del senyal; Xarxes neuronals (Informàtica); Aprenentatge automàtic
Institute of Electrical and Electronics Engineers (IEEE)
https://ieeexplore.ieee.org/document/10418453
http://creativecommons.org/licenses/by-nc-nd/4.0/
Open Access
Attribution-NonCommercial-NoDerivatives 4.0 International