
Hebbian Learning Algorithms for Training Convolutional Neural Networks


Academic year: 2022


(1)

Hebbian Learning Algorithms for

Training Convolutional Neural Networks

Gabriele Lagani

Computer Science PhD, University of Pisa

(2)

Outline

● SGD vs Hebbian learning

● Hebbian learning variants

● Training CNNs with Hebbian + WTA approach on image classification tasks (CIFAR-10 dataset)

● Comparison with CNNs trained with SGD

● Results and conclusions

(3)


SGD vs Hebbian Learning


● SGD training requires forward and backward pass


(5)


SGD vs Hebbian Learning


● Hebbian learning rule: Δw = η y(x, w) x

● Only a single, local forward pass is needed

● Advantage: layer-wise parallelizable
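The locality of the rule can be sketched as follows; this is a minimal illustrative example (not the exact rule used in the experiments), assuming a linear neuron y = w · x and the plain Hebbian step Δw = η y x:

```python
import numpy as np

def hebbian_update(w, x, eta=0.01):
    """One plain Hebbian step: dw = eta * y * x.

    Everything needed (input x, output y) is available in the local
    forward pass of this layer alone; no backward pass is required,
    which is why layers can be trained in parallel.
    """
    y = float(np.dot(w, x))        # local post-synaptic activity
    return w + eta * y * x         # strengthen weights toward active inputs

w = np.array([0.5, -0.2, 0.1])
x = np.array([1.0, 0.0, 1.0])
w_new = hebbian_update(w, x)
```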

(6)

Hebbian Learning Variants

● Weight decay: Δw = η y(x, w) x − γ(x, w)

● Taking γ(x, w) = η y(x, w) w yields Δw = η y(x, w) (x − w)
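With this choice of decay term the update simplifies to Δw = η y (x − w), i.e. the weight vector is pulled toward the input in proportion to the neuron's activity. A small numeric sketch, with η and y chosen for illustration:

```python
import numpy as np

def hebbian_decay_update(w, x, y, eta=0.01):
    # Hebbian term minus weight decay gamma = eta * y * w:
    # dw = eta*y*x - eta*y*w = eta * y * (x - w)
    return w + eta * y * (x - w)

w = np.array([0.0, 1.0])
x = np.array([1.0, 0.0])
# with eta * y = 0.5, the weights move halfway toward the input
w_new = hebbian_decay_update(w, x, y=1.0, eta=0.5)
```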

(7)


Lateral Interaction


● Competitive learning

○ Winner-Takes-All (WTA)

○ Self-Organizing Maps (SOM)
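A minimal sketch of the WTA case, assuming each row of a weight matrix is one neuron and the winner is the neuron with the largest response; only the winner's weights are updated, moving toward the input:

```python
import numpy as np

def wta_step(W, x, eta=0.1):
    """One Winner-Takes-All competitive step (illustrative sketch).

    Lateral competition is simulated by picking the neuron whose
    weights best match the input; only that neuron learns, with
    dw = eta * (x - w_winner).
    """
    winner = int(np.argmax(W @ x))        # competition: strongest response wins
    W = W.copy()
    W[winner] += eta * (x - W[winner])    # winner's weights move toward x
    return W, winner

W = np.array([[1.0, 0.0],
              [0.0, 1.0]])
x = np.array([0.0, 2.0])
W_new, winner = wta_step(W, x)
```

Over many inputs, the neurons' weight vectors converge toward cluster centers of the input distribution, which is what makes WTA useful for unsupervised feature extraction.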

(8)

Convolutional Layers

● Sparse connectivity

● Shared weights

● Translation invariance

Updates are aggregated by averaging across spatial locations, in order to maintain shared weights
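The aggregation step can be sketched as follows; a simplified example assuming the input has already been unfolded into the flattened patches seen by one shared kernel:

```python
import numpy as np

def conv_hebbian_update(kernel, patches, eta=0.01):
    """Hebbian step for a shared convolutional kernel (sketch).

    `patches` has shape (n_locations, kernel_size): one row per spatial
    position the kernel is applied to. A Hebbian delta is computed at
    every location, and the deltas are averaged so that all locations
    keep sharing one and the same kernel.
    """
    ys = patches @ kernel                   # local response at each location
    deltas = eta * ys[:, None] * patches    # per-location Hebbian updates
    return kernel + deltas.mean(axis=0)     # aggregate by averaging

kernel = np.array([0.2, -0.1])
patches = np.array([[1.0, 0.0],
                    [0.0, 1.0],
                    [1.0, 1.0]])
kernel_new = conv_hebbian_update(kernel, patches)
```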

(9)


Final Classification Layer

● Supervised Hebbian learning with teacher neuron
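A minimal sketch of the teacher-neuron idea: during training the post-synaptic activity of the classification layer is replaced by the one-hot target supplied by the teacher, so each class neuron's weights are strengthened toward inputs of its own class. The exact rule may differ from the one used in the experiments:

```python
import numpy as np

def teacher_hebbian_update(W, x, target, eta=0.1):
    """Supervised Hebbian step with a teacher signal (sketch).

    The one-hot `target` plays the role of the output activity, so the
    Hebbian product strengthens only the correct class neuron's weights.
    """
    return W + eta * np.outer(target, x)

W = np.zeros((2, 3))                  # 2 classes, 3 input features
x = np.array([1.0, 0.5, 0.0])
target = np.array([0.0, 1.0])         # teacher clamps class 1 active
W_new = teacher_hebbian_update(W, x, target)
```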


(10)

Experimental Setup

● Hebbian + WTA approach applied to deep CNNs

● Extension of Hebbian rules to convolutional layers with shared kernels: update aggregation

● Teacher neuron for supervised Hebbian learning

● Hybrid network architectures (Hebbian + SGD layers)

(11)


Network Architecture


(12)

Different Configurations

(13)


Classifiers on top of Deep Layers


(14)

Classifiers on Deep Layers Trained with SGD

Considerations on Hebbian classifier:

● Pros: good on high-level features, fast training (1-2 epochs)

(15)


Classifiers on Hebbian Deep Layers

(16)

Layer 1 Kernels

(17)


Hybrid Networks

(18)

Hybrid Networks: Bottom Hebb. - Top SGD

(19)


Hybrid Networks: Bottom SGD - Top Hebb.

(20)

Hybrid Networks: SGD - Hebb. - SGD

(21)


Hybrid Networks: SGD - Hebb. - SGD


(22)

Conclusions

● Pros of Hebbian + WTA:

○ Effective for low-level feature extraction

○ Effective for training higher network layers, including a classifier on top of high-level features

○ Takes fewer epochs than SGD (2 vs 10)

→ useful for transfer learning

● Cons of Hebbian + WTA:

○ Not effective for training intermediate network layers

○ Not effective for training a classifier on top of low-level features

(23)

Future Works

● Explore other Hebbian learning variants

○ Hebbian PCA

■ Can achieve distributed coding at intermediate layers

○ Contrastive Hebbian Learning (CHL)

■ Free phase + clamped phase

■ Update step: Δw_ij = η (x̄_i x̄_j − x̂_i x̂_j) (bar: clamped phase, hat: free phase)

■ Equivalent to Gradient Descent
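The contrastive structure of the CHL update can be sketched for a single weight vector; an illustrative example (scalar outputs, hypothetical values), not the full layered formulation of Xie & Seung:

```python
import numpy as np

def chl_update(w, x, y_free, y_clamped, eta=0.05):
    """One Contrastive Hebbian Learning step (illustrative sketch).

    Hebbian term from the clamped phase (output held at the target)
    minus an anti-Hebbian term from the free phase (network runs freely):
    dw = eta * (y_clamped - y_free) * x.
    When the free phase already matches the target, the update vanishes.
    """
    return w + eta * (y_clamped - y_free) * x

w = np.array([0.3, 0.3])
x = np.array([1.0, 1.0])
w_new = chl_update(w, x, y_free=0.6, y_clamped=1.0)
```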

(24)

Future Works

● Switch to Spiking Neural Networks (SNN)

○ Spike-Timing-Dependent Plasticity (STDP)

○ Higher biological plausibility

○ Low power consumption

■ Good for neuromorphic hardware implementation

■ Ideal for applications on constrained devices
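The STDP mechanism mentioned above can be sketched with the standard pair-based window; the constants here are hypothetical, chosen only for illustration:

```python
import math

def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Pair-based STDP weight change (illustrative sketch).

    dt = t_post - t_pre in ms. If the pre-synaptic spike precedes the
    post-synaptic spike (dt > 0) the synapse is potentiated; otherwise
    it is depressed. The magnitude decays exponentially with |dt|.
    """
    if dt > 0:
        return a_plus * math.exp(-dt / tau)
    return -a_minus * math.exp(dt / tau)
```

Because the update depends only on locally available spike times, STDP maps naturally onto neuromorphic hardware.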

(25)


References

● G. Amato, F. Carrara, F. Falchi, C. Gennaro, G. Lagani; Hebbian Learning Meets Deep Convolutional Neural Networks (2019)
http://www.nmis.isti.cnr.it/falchi/Draft/2019-ICIAP-HLMSD.pdf

● S. Haykin; Neural Networks and Learning Machines (2009)

● W. Gerstner, W. Kistler; Spiking Neuron Models (2002)

● X. Xie, S. H. Seung; Equivalence of Backpropagation and Contrastive Hebbian Learning in a Layered Network (2003)

