ENN: a neural network with DCT adaptive activation functions

Other authors

Universitat Politècnica de Catalunya. Doctorat en Teoria del Senyal i Comunicacions

Universitat Politècnica de Catalunya. Departament de Teoria del Senyal i Comunicacions

Publication date

2024-03

Abstract

The expressiveness of a neural network depends heavily on the nature of its activation functions, although these are usually assumed to be predefined and fixed during the training stage. From a signal processing perspective, in this paper we present the Expressive Neural Network (ENN), a novel model in which the non-linear activation functions are modeled using the Discrete Cosine Transform (DCT) and adapted via backpropagation during training. This parametrization keeps the number of trainable parameters low, is appropriate for gradient-based schemes, and adapts to different learning tasks. It is the first non-linear model for activation functions that relies on a signal processing perspective, providing high flexibility and expressiveness to the network. We contribute insights into the explainability of the network at convergence by recovering the concept of the bump, that is, the response of each activation function in the output space. Finally, through exhaustive experiments we show that the model can adapt to both classification and regression tasks. ENN outperforms state-of-the-art benchmarks, with accuracy gains above 40% in some scenarios.
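The idea described in the abstract can be illustrated with a minimal sketch: an activation function expressed as a truncated cosine (DCT) series whose coefficients are the trainable parameters. The interval `[-L, L]`, the number of coefficients `K`, the ReLU target, and the closed-form least-squares fit below are all illustrative assumptions, not the paper's exact formulation (which adapts the coefficients by backpropagation).

```python
import numpy as np

def dct_activation(x, c, L=4.0):
    """Evaluate phi(x) = sum_k c_k * cos(pi * k * (x + L) / (2L)).

    The coefficients c play the role of the trainable parameters of the
    activation function; inputs are clipped to the modeling interval.
    """
    x = np.clip(np.asarray(x, dtype=float), -L, L)
    k = np.arange(len(c))
    basis = np.cos(np.pi * np.outer(x + L, k) / (2.0 * L))  # shape (N, K)
    return basis @ c

# In the paper the coefficients are adapted by backpropagation; here a
# closed-form least-squares fit to a ReLU target stands in for training,
# just to show that a few DCT coefficients can represent a useful shape.
L, K = 4.0, 16
xs = np.linspace(-L, L, 401)
target = np.maximum(xs, 0.0)  # example target non-linearity (ReLU)
B = np.cos(np.pi * np.outer(xs + L, np.arange(K)) / (2.0 * L))
coeffs, *_ = np.linalg.lstsq(B, target, rcond=None)

max_err = np.max(np.abs(dct_activation(xs, coeffs, L) - target))
```

Because the activation is linear in its coefficients, the gradient of a loss with respect to them is simply the cosine basis evaluated at the input, which is what makes the parametrization convenient for gradient-based training.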


This work is part of the project IRENE (PID2020-115323RB-C31), funded by MCIN/AEI/10.13039/501100011033.


Peer Reviewed


Postprint (published version)

Document type

Article

Language

English

Published by

Institute of Electrical and Electronics Engineers (IEEE)

Related documents

https://ieeexplore.ieee.org/document/10418453

Recommended citation

This citation was generated automatically.

Rights

http://creativecommons.org/licenses/by-nc-nd/4.0/

Open Access

Attribution-NonCommercial-NoDerivatives 4.0 International

This item appears in the following collection(s)

E-prints [73012]