Neural Networks Theory II (NNTII)

CHAIR: BARTLOMIEJ BELICZYNSKI


Time: Monday 21st March, 14h20-16h00

Paper ID   Title
   
NNTII-1 Speeding up backpropagation with Multiplicative Batch Update Step
NNTII-2 Generating Sequential Triangle Strips by Using Hopfield Nets
NNTII-3 The Linear Approximation Method to the Modified Hopfield Neural Network Parameters Analysis
NNTII-4 The Analytical Analysis of Hopfield Neuron Parameters by the Application of Special Trans Function Theory
NNTII-5 Time-Oriented Hierarchical Method for Computation of Minor Components


NNTII-1
 
Title: Speeding up backpropagation with Multiplicative Batch Update Step
Author(s): P. Cruz
Abstract: Multiplicative updating steps for the backpropagation algorithm have been presented by several authors, and the statistical field of Stochastic Approximation is closely related to such schemes. We show that for functions of one variable, different values of the update factors u and d can produce very different results: fast convergence at the cost of a poor solution, slow convergence with a better solution, or a fast move towards a solution without ever converging. To speed up backpropagation in a simple manner, we propose a batch step adaptation technique for the online backpropagation algorithm, based on theoretical results for these simple cases.
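
As a rough illustration of the kind of multiplicative step adaptation described above, the sketch below applies sign-based gradient descent with an increase factor u and a decrease factor d to a one-variable function. The factor values, the quadratic loss and the exact update rule are illustrative assumptions, not the algorithm proposed in the paper.

    # Minimal sketch of a multiplicative step-size update on a one-variable
    # function. The factors u and d and the loss are assumptions made for
    # illustration; they are not the paper's exact rule or values.
    def train(grad, x0, step0=0.1, u=1.2, d=0.5, iters=100):
        x, step, prev_g = x0, step0, 0.0
        for _ in range(iters):
            g = grad(x)
            if g * prev_g > 0:       # same gradient sign as before: grow the step
                step *= u
            elif g * prev_g < 0:     # sign flip: the step overshot, so shrink it
                step *= d
            if g > 0:                # move against the sign of the gradient
                x -= step
            elif g < 0:
                x += step
            prev_g = g
        return x

    # Example: minimise f(x) = (x - 3)**2, whose gradient is 2*(x - 3).
    print(train(lambda x: 2.0 * (x - 3.0), x0=-5.0))   # ends close to 3.0

Varying u and d in this toy setting reproduces the trade-offs mentioned in the abstract: aggressive growth moves quickly but can oscillate around the minimum, while aggressive shrinking recovers only slowly after an overshoot.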


NNTII-2
 
Title: Generating Sequential Triangle Strips by Using Hopfield Nets
Author(s): Jiri Sima
Abstract: The important task of generating the minimum number of sequential triangle strips (tristrips) for a given triangulated surface model is motivated by applications in computer graphics. This hard combinatorial optimization problem is reduced to the minimum energy problem in Hopfield nets by a linear-size construction. The Hopfield network powered by simulated annealing (i.e. a Boltzmann machine), which is implemented in the program HTGEN, can be used for computing semi-optimal stripifications. Practical experiments confirm that one can obtain much better results with HTGEN than with the leading stripification program FTSG, although the running time of simulated annealing grows rapidly near the global optimum.
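
The reduction of stripification to an energy minimisation problem is specific to the paper, but the generic search engine it relies on can be sketched: a binary Hopfield net whose units are updated stochastically under a decreasing temperature, so that low-energy states are favoured. The weight matrix, bias vector and cooling schedule below are toy assumptions; the linear-size construction used by HTGEN to encode tristrips is not reproduced here.

    import numpy as np

    # Sketch of minimum-energy search in a Hopfield net by simulated annealing
    # (Boltzmann machine style). Assumes W is symmetric with a zero diagonal.
    def anneal(W, b, T0=2.0, cooling=0.99, sweeps=500, seed=0):
        rng = np.random.default_rng(seed)
        n = len(b)
        s = rng.integers(0, 2, size=n).astype(float)   # binary states in {0, 1}
        T = T0
        for _ in range(sweeps):
            for i in rng.permutation(n):
                # energy change caused by flipping unit i
                delta_e = -(W[i] @ s + b[i]) * (1 - 2 * s[i])
                # Metropolis acceptance at temperature T
                if delta_e <= 0 or rng.random() < np.exp(-delta_e / T):
                    s[i] = 1 - s[i]
            T *= cooling                               # cool down gradually
        return s, -0.5 * s @ W @ s - b @ s             # final state and energy

    # Example with a tiny random symmetric weight matrix.
    rng = np.random.default_rng(1)
    W = rng.normal(size=(8, 8)); W = (W + W.T) / 2; np.fill_diagonal(W, 0.0)
    b = rng.normal(size=8)
    print(anneal(W, b))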


NNTII-3
 
Title: The Linear Approximation Method to the Modified Hopfield Neural Network Parameters Analysis
Author(s): S. I. Bauk,
S. M. Perovich,
A. Lompar
Abstract: The dynamics of a Hopfield network are usually described by a system of linear differential equations. Our idea is to modify the Hopfield network so that its behavior can be described by a system of exponential equations. Furthermore, a linear approximation method for this system of exponential equations, based on the Special Trans Function Theory (STFT), is discussed.
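
Since the abstract centres on linearly approximating exponential equations, a minimal numerical sketch of that idea is given below for a single equation of the assumed form x = a*exp(-b*x); the actual system studied in the paper and the STFT machinery itself are not reproduced.

    import math

    # Compare a first-order (linear) approximation of an exponential equation
    # with a numerical root. The equation x = a*exp(-b*x) is an assumed
    # stand-in for the paper's system of exponential equations.
    def linear_approx_root(a, b):
        # exp(-b*x) ~ 1 - b*x around x = 0, which gives x ~ a / (1 + a*b)
        return a / (1.0 + a * b)

    def bisection_root(a, b, tol=1e-12):
        f = lambda x: x - a * math.exp(-b * x)   # f(0) < 0 and f(a) > 0
        lo, hi = 0.0, a
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            if f(mid) < 0:
                lo = mid
            else:
                hi = mid
        return 0.5 * (lo + hi)

    a, b = 0.8, 0.5
    print(linear_approx_root(a, b), bisection_root(a, b))
    # The linearised root is close to the numerical one whenever b*x stays
    # small, which is the regime in which such an approximation is useful.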


NNTII-4
 
Title: The Analytical Analysis of Hopfield Neuron Parameters by the Application of Special Trans Function Theory
Author(s): S. M. Perovich,
S. I. Bauk,
N. Konjevic
Abstract: The subject of the theoretical analysis presented in this paper is a modification of the Hopfield neuron electronic model in which the capacitor is replaced with an inversely polarized diode. The parameters of the modified neuron are analyzed analytically by application of the Special Trans Function Theory (STFT). The obtained results are presented numerically and graphically.
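
For readers unfamiliar with the circuit element involved, the short sketch below evaluates the ideal (Shockley) diode current under reverse bias. The parameter values are illustrative assumptions and do not correspond to the paper's modified Hopfield neuron model.

    import math

    # Ideal (Shockley) diode characteristic evaluated under reverse bias.
    # I_S and V_T are assumed, illustrative values.
    I_S = 1e-12      # saturation current [A]
    V_T = 0.025      # thermal voltage [V], roughly room temperature

    def diode_current(v):
        # i = I_S * (exp(v / V_T) - 1); v < 0 corresponds to reverse bias
        return I_S * (math.exp(v / V_T) - 1.0)

    for v in (-0.05, -0.2, -0.5, -1.0):
        print(f"v = {v:+.2f} V  ->  i = {diode_current(v):.3e} A")
    # Under reverse bias the conduction current saturates near -I_S; the
    # exponential term is the kind of nonlinearity that typically gives rise
    # to transcendental equations of the sort STFT is designed to handle.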


NNTII-5
 
Title: Time-Oriented Hierarchical Method for Computation of Minor Components
Author(s): M. Jankovic,
H. Ogawa
Abstract: This paper proposes a general method that transforms known neural network minor subspace analysis (MSA) algorithms into minor component analysis (MCA) algorithms. The method uses two distinct time scales. A given MSA algorithm is responsible, on the faster time scale, for the “behavior” of all output neurons; at this scale the minor subspace is obtained. On the slower time scale, the output neurons compete to fulfill their “own interests”; at this scale, the basis vectors of the minor subspace are rotated toward the minor eigenvectors. In effect, a time-oriented hierarchical method is proposed. A simplified mathematical analysis as well as simulation results are presented.
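
The learning rules themselves are not given in the abstract, so the sketch below only illustrates the underlying two-stage idea in plain linear algebra: a fast stage that finds a basis of the minor subspace, followed by a slow stage that rotates this basis onto the individual minor eigenvectors. All data and parameter choices are assumptions for illustration; this is not the neural algorithm proposed in the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy data with a clearly separated spectrum (assumed, for illustration).
    X = rng.normal(size=(1000, 6)) @ np.diag([3.0, 2.5, 2.0, 1.5, 0.5, 0.3])
    C = np.cov(X, rowvar=False)

    n, k = C.shape[0], 2              # extract the 2 minor components
    shift = np.trace(C)               # upper bound on the eigenvalues of C

    # "Fast" stage: orthogonal iteration on (shift*I - C) converges to the span
    # of the k eigenvectors of C with the smallest eigenvalues (minor subspace).
    W = rng.normal(size=(n, k))
    for _ in range(200):
        W, _ = np.linalg.qr((shift * np.eye(n) - C) @ W)

    # "Slow" stage: diagonalise the projected matrix and rotate the basis so
    # that each column aligns with one minor eigenvector instead of a mixture.
    vals, U = np.linalg.eigh(W.T @ C @ W)
    V = W @ U                         # columns approximate the minor eigenvectors

    print("estimated minor eigenvalues:", vals)
    print("exact minor eigenvalues:    ", np.linalg.eigvalsh(C)[:k])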