Neuron-Adaptive Higher Order Neural Network Models for Automated Financial Data Modeling
Dr. Ming Zhang, Associate Professor, Department of Physics, Computer Science & Engineering, Christopher Newport University, 1 University Place, Newport News, VA 23606, USA
Published in IEEE Transactions on Neural Networks, Vol. 13, No. 1, January 2002
Problems • Real-world financial data is often non-linear and high-frequency, contains multi-polynomial components, and is discontinuous (piecewise continuous). • Classical neural network models cannot automatically determine the optimum model and the appropriate order for financial data approximation.
• PHONN Simulator (1994-1996) - Polynomial Higher Order Neural Network financial data simulator - A$105,000, supported by Fujitsu, Japan • THONN Simulator (1996-1998) - Trigonometric polynomial Higher Order Neural Network financial data simulator - A$10,000, supported by the Australian Research Council • PT-HONN Simulator (1999-2000) - Polynomial and Trigonometric polynomial Higher Order Neural Network financial data simulator - US$46,000, supported by the USA National Research Council
PT-HONN MODEL • The network architecture of PT-HONN combines the characteristics of both PHONN and THONN. • It is a multi-layer network consisting of an input layer with input units, an output layer with output units, and two hidden layers of intermediate processing units.
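The combination described above can be pictured as one output that sums weighted polynomial terms (the PHONN part) and weighted trigonometric-polynomial terms (the THONN part). The sketch below is an illustrative assumption about that structure, not the paper's exact architecture; the function name, the two-input form, and the specific basis terms x1^i * x2^j and sin^i(x1) * cos^j(x2) are hypothetical.

```python
import numpy as np

def pt_honn_forward(x1, x2, w_poly, w_trig, order=4):
    """Illustrative PT-HONN-style forward pass (assumed form):
    a weighted sum of polynomial terms x1^i * x2^j (PHONN side)
    plus trigonometric-polynomial terms sin^i(x1) * cos^j(x2)
    (THONN side), up to the given order."""
    out = 0.0
    for i in range(order):
        for j in range(order):
            out += w_poly[i, j] * x1**i * x2**j            # polynomial term
            out += w_trig[i, j] * np.sin(x1)**i * np.cos(x2)**j  # trig term
    return out
```

With all trigonometric weights set to zero this reduces to a plain polynomial model, which is one way to see that PT-HONN subsumes PHONN as a special case.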
NAHONN • The network architecture of NAHONN is a multi-layer feed-forward network that consists of an input layer with input units, an output layer with output units, and one hidden layer of intermediate processing units. • There is no activation function in the input layer, and the output neurons are summing units (linear activation). • The activation function for the hidden-layer processing units is a Neuron-Adaptive Activation Function (NAAF).
NAAF • The activation function for the hidden-layer processing units is a Neuron-Adaptive Activation Function (NAAF), defined as [equation omitted], where a1, b1, a2, b2, a3, and b3 are real variables that are adjusted (along with the weights) during training.
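The NAAF formula itself did not survive extraction, so the sketch below is only an illustrative guess at its shape: a sum of sigmoid, sine, and Gaussian-like terms whose scales and frequencies (a1..a3, b1..b3) are free parameters learned alongside the weights. The specific three terms are an assumption; what matters is that the activation function has trainable parameters per neuron.

```python
import numpy as np

def naaf(x, a1, b1, a2, b2, a3, b3):
    """Hypothetical neuron-adaptive activation: each of the six
    parameters is trained together with the network weights, so
    every hidden neuron can adapt its own activation shape."""
    return (a1 / (1.0 + np.exp(-b1 * x))   # sigmoid term
            + a2 * np.sin(b2 * x)           # oscillatory term
            + a3 * np.exp(-b3 * x**2))      # localized bump term
```

Because the parameters enter smoothly, gradients with respect to a1..b3 exist everywhere, so the same gradient-descent update used for the weights can adjust the activation shape during training.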
NAHONN Group • A Neuron-Adaptive Feed-forward Higher Order Neural Network Group (NAFNG) is a neural network group in which each element is a neuron-adaptive feed-forward higher order neural network (Fi). We have: NAFNG = {F1, F2, F3, …, Fi, …, Fn}
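One way to read the group idea, given the piecewise-continuity results quoted below, is that each member Fi models one continuous piece of the target function and the group dispatches each input to the member responsible for its region. The sketch below is a toy illustration of that dispatch; the region predicates and the stand-in models are hypothetical, not the paper's construction.

```python
def group_predict(x, members):
    """members: list of (in_region, model) pairs, where each model
    plays the role of one network F_i in the group and handles one
    continuous piece of a piecewise-continuous target function."""
    for in_region, model in members:
        if in_region(x):
            return model(x)
    raise ValueError("x lies outside every member's region")

# Toy piecewise target: two simple functions stand in for trained
# networks F1 and F2, split at the discontinuity x = 0.
members = [(lambda x: x < 0,  lambda x: -x),
           (lambda x: x >= 0, lambda x: x**2)]
```

Each member only ever sees inputs from its own piece, so the discontinuity at the boundary never has to be fit by a single continuous model.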
Hornik, K. (1991): “Whenever the activation function is continuous, bounded and non-constant, then for an arbitrary compact subset X ⊂ R^n, standard multi-layer feed-forward networks can approximate any continuous function on X arbitrarily well with respect to uniform distance, provided that sufficiently many hidden units are available”
Leshno, M. (1993): “A standard multi-layer feed-forward network with a locally bounded activation function can approximate any continuous function to any degree of accuracy if and only if the network’s activation function is not a polynomial”
Zhang, Ming (1995): “Consider a neural network piecewise function group, in which each member is a standard multi-layer feed-forward neural network, and which has a locally bounded, piecewise continuous (rather than polynomial) activation function and threshold. Each such group can approximate any kind of piecewise continuous function, and to any degree of accuracy”
Feature of NAHONN • A neuron-adaptive feed-forward neural network group with adaptive neurons can approximate any kind of piecewise continuous function.
Conclusion • We proved that a single NAHONN can approximate any piecewise continuous function to any desired accuracy. • The experimental results show that NAHONN can handle high-frequency data, model multi-polynomial data, and simulate discontinuous data, and that it is capable of approximating any kind of piecewise continuous function to any degree of accuracy.