
A General Framework for Self-Organizing Structure Processing Neural Networks

Hammer, B., Micheli, A., and Sperduti, A. (2003) A General Framework for Self-Organizing Structure Processing Neural Networks. Technical Report, Dipartimento di Informatica, Università di Pisa, Pisa, IT.

Postscript (GZip), Published Version (196 KB)
Available under License: Creative Commons Attribution No Derivatives.

    Abstract

    Self-organization constitutes an important paradigm in machine learning with successful applications, e.g., in data and web mining. However, most approaches so far have been proposed for data contained in a fixed and finite-dimensional vector space. In this article we focus on extensions to more general data structures such as sequences and trees. Various extensions of the standard self-organizing map (SOM) to sequences or tree structures have been proposed in the literature: the temporal Kohonen map, the recursive SOM, and the SOM for structured data (SOMSD), for example. These methods enhance the standard SOM by recursive connections. We define a general recursive dynamic which enables the recursive processing of complex data structures based on recursively computed internal representations of the respective context. The above mechanisms of SOMs for structures are special cases of the proposed general dynamic; furthermore, the dynamic also covers the supervised case of recurrent and recursive networks. The general framework offers a uniform notation for training mechanisms such as Hebbian learning and for the transfer of alternatives such as vector quantization and the neural gas algorithm to structure-processing networks. The formal definition of the recursive dynamic for structure-processing unsupervised networks allows the transfer of theoretical issues from the SOM literature to the structured case. One can formulate general cost functions corresponding to vector quantization, neural gas, and a modification of SOM for the case of structures. These cost functions can be compared to Hebbian learning, which can be interpreted as an approximate stochastic gradient descent. As an alternative, we derive the exact gradients of the general cost functions, which can be computed in several ways analogous to well-known learning algorithms for supervised recurrent and recursive networks. We then discuss general design issues that should be taken into account when choosing internal representations and that limit the applicability of specific choices in concrete situations: we investigate the necessary cardinality of the set of internal representations, the tolerance with respect to noise, and the necessity of global processing. Finally, we discuss topology preservation of structure-processing self-organizing maps and derive an explicit metric for a specific SOMSD setting.
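
    The following is a minimal sketch of the recursive dynamic described above, instantiated for sequences with a SOMSD-style choice of internal representation (the context of an element is the map location of the previous winner). All names and parameters (alpha, beta, the 1-D lattice, the zero vector for the empty context) are illustrative assumptions, not the authors' notation:

        import numpy as np

        rng = np.random.default_rng(0)

        n_neurons, input_dim = 10, 3
        weights  = rng.normal(size=(n_neurons, input_dim))   # w_i: input codebook vectors
        contexts = rng.normal(size=(n_neurons, 2))            # c_i: context codebook vectors
        lattice  = np.array([[float(i), 0.0] for i in range(n_neurons)])  # neuron locations on a 1-D map

        alpha, beta = 1.0, 0.5  # weighting of the input term vs. the recursive context term

        def winners_for_sequence(seq):
            """Compute the winner for each sequence element via the recursive
            similarity d_i = alpha*||x - w_i||^2 + beta*||r - c_i||^2, where r
            is the internal representation (here: the map location) of the
            previous winner."""
            r = np.zeros(2)  # representation of the empty context
            winners = []
            for x in seq:
                d = (alpha * np.sum((weights - x) ** 2, axis=1)
                     + beta * np.sum((contexts - r) ** 2, axis=1))
                i = int(np.argmin(d))
                winners.append(i)
                r = lattice[i]  # SOMSD-style: context = location of the current winner
            return winners

        print(winners_for_sequence(rng.normal(size=(5, input_dim))))

    In this view, the models covered by the framework differ mainly in the choice of the internal representation r, e.g. leaky-integrated or full activation vectors instead of winner locations, and tree processing combines the representations of all subtrees; Hebbian training then adapts both the weight and the context vectors of the winner and its neighbors.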

    Item Type: Book
    Uncontrolled Keywords: Self-organizing map, Kohonen map, Recurrent and Recursive SOM, Recurrent neural networks, Structured data
    Subjects: Area01 - Scienze matematiche e informatiche > INF/01 - Informatica
    Divisions: Dipartimenti (until 2012) > DIPARTIMENTO DI INFORMATICA
    Depositing User: dott.ssa Sandra Faita
    Date Deposited: 17 Dec 2014 16:34
    Last Modified: 17 Dec 2014 16:34
    URI: http://eprints.adm.unipi.it/id/eprint/2092
