Heterogeneous Multilayer Generalized Operational Perceptron

Publication: Journal article › Research › peer review

  • Dat Thanh Tran, Tampere University of Technology
  • Serkan Kiranyaz, Qatar University, Doha
  • Moncef Gabbouj, Tampere University
  • Alexandros Iosifidis

The traditional multilayer perceptron (MLP), built on the McCulloch-Pitts neuron model, is inherently limited to a single type of neuronal activity: a linear weighted sum followed by a nonlinear thresholding step. Previously, the generalized operational perceptron (GOP) was proposed to extend the conventional perceptron model by defining a diverse set of neuronal activities that imitate a generalized model of biological neurons. Together with GOP, a progressive operational perceptron (POP) algorithm was proposed to optimize a predefined template of multiple homogeneous layers in a layerwise manner. In this paper, we propose an efficient algorithm to learn a compact, fully heterogeneous multilayer network that allows each individual neuron, regardless of its layer, to have distinct characteristics. Based on the complexity of the problem, the proposed algorithm operates progressively at the neuron level, searching for a compact topology not only in terms of depth but also width, i.e., the number of neurons in each layer. The proposed algorithm is shown to outperform related learning methods in extensive experiments on several classification problems.
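The neuronal activities mentioned in the abstract can be illustrated with a minimal sketch of a single GOP neuron: instead of the fixed multiply-sum-threshold pipeline of a McCulloch-Pitts neuron, a GOP neuron selects one operator from each of three libraries (nodal, pooling, activation). The operator sets below are illustrative choices for this sketch, not the paper's exact lists:

```python
import numpy as np

# Illustrative operator libraries (assumed examples, not the paper's exact sets)
NODAL = {
    "multiplication": lambda x, w: x * w,
    "sinusoid":       lambda x, w: np.sin(x * w),
    "exponential":    lambda x, w: np.exp(x * w) - 1.0,
}
POOL = {
    "summation": lambda z: np.sum(z),
    "maximum":   lambda z: np.max(z),
    "median":    lambda z: np.median(z),
}
ACTIVATION = {
    "sigmoid": lambda v: 1.0 / (1.0 + np.exp(-v)),
    "tanh":    np.tanh,
}

def gop_neuron(x, w, nodal="multiplication", pool="summation", act="sigmoid"):
    """Output of one GOP neuron for input vector x and weight vector w."""
    z = NODAL[nodal](x, w)     # element-wise nodal operator
    v = POOL[pool](z)          # pooling operator collapses to a scalar
    return ACTIVATION[act](v)  # activation operator

x = np.array([0.5, -0.2, 0.1])
w = np.array([0.4,  0.3, -0.8])
# With multiplication + summation + sigmoid, the GOP neuron reduces
# exactly to a classical perceptron with a sigmoid activation:
print(gop_neuron(x, w))
# A heterogeneous network lets each neuron pick a different triple:
print(gop_neuron(x, w, nodal="sinusoid", pool="median", act="tanh"))
```

The classical perceptron is thus recovered as one point in the operator search space; the proposed algorithm searches over such per-neuron operator choices while growing the topology.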

Original language: English
Article number: 8727718
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 31
Issue: 3
Pages (from-to): 710-724
Number of pages: 15
ISSN: 2162-237X
DOI:
Status: Published - March 2020

