A Neural Network Architecture for the Categorization of Temporal Information

Science Direct Working Paper No S1574-034X(04)70162-3

58 Pages Posted: 5 Mar 2018

Adriaan G. Tijsseling (Contact Author)

affiliation not provided to SSRN

Luc Berthouze

affiliation not provided to SSRN

Date Written: August 2002

Abstract

In this paper we propose a neural network architecture that is capable of continuously learning multiple, possibly overlapping, arbitrary input sequences relatively quickly, autonomously, and online. The architecture was constructed according to design principles derived from neuroscience and from existing work on recurrent network models. The network uses sigmoid-pulse-generating spiking neurons together with a Hebbian learning rule with synaptic noise. Combined with coincidence detection and an internal feedback mechanism, this leads to a learning process driven by dynamic adjustment of the learning rate. This gives the network the ability not only to correct incorrectly recalled parts of a sequence but also to reinforce and stabilize the recall of previously acquired sequences. The performance of the network is tested on a set of overlapping sequences from an existing problem domain, and the relative contribution of each design principle is analyzed.
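
The abstract describes a Hebbian rule with synaptic noise whose learning rate is adjusted dynamically by an internal feedback signal. As a rough illustration of that general idea only, a minimal Python sketch might look like the following; the paper's exact update rule, noise model, and feedback signal are not given here, so the function name, constants, and the scalar recall-error signal are all assumptions.

```python
import numpy as np

# Illustrative sketch only: a generic Hebbian update with additive synaptic
# noise and a learning rate modulated by a recall-error signal. All names
# and constants below are assumptions, not the rule used in the paper.

rng = np.random.default_rng(0)

def hebbian_update(W, pre, post, recall_error, base_lr=0.05, noise_std=0.01):
    """One Hebbian weight update with synaptic noise.

    W            : (n_post, n_pre) weight matrix
    pre, post    : pre- and post-synaptic activity vectors
    recall_error : scalar in [0, 1]; larger error -> larger learning rate
    """
    lr = base_lr * (1.0 + recall_error)          # dynamic learning-rate adjustment
    dW = lr * np.outer(post, pre)                # Hebbian co-activity term
    dW += rng.normal(0.0, noise_std, W.shape)    # synaptic noise
    return np.clip(W + dW, -1.0, 1.0)            # keep weights bounded (assumed)

# Example: update a small random network after a partially incorrect recall.
W = rng.normal(0.0, 0.1, (4, 3))
pre = np.array([1.0, 0.0, 1.0])
post = np.array([0.0, 1.0, 1.0, 0.0])
W = hebbian_update(W, pre, post, recall_error=0.4)
```

The only point carried over from the abstract is that the effective learning rate grows when recall is incorrect and relaxes toward a baseline when recall is stable; the noise distribution, weight bounds, and network size are arbitrary choices for the sketch.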

Keywords: temporal information, recurrent neural network, autonomous Hebbian learning

Suggested Citation

Tijsseling, Adriaan G. and Berthouze, Luc, A Neural Network Architecture for the Categorization of Temporal Information (August 2002). Science Direct Working Paper No S1574-034X(04)70162-3, Available at SSRN: https://ssrn.com/abstract=3125426
