GSPT-AS-based Neural Network Design Presenter: Kuan-Hung Chen Adviser: Tzi-Dar Chiueh October 13, 2003
Outline • Motivation • GSPT-AS LMS Algorithm • Power Amplifier Model • Predistorter Architecture • Simulation Results and Complexity Analysis • Conclusions
Motivation • Initial simulation results show that the GSPT-based neural network cannot converge. • The reason is that the magnitudes of all weights settle to approximately the same order when only the sign of the updating term is used for weight updating. • It is therefore reasonable to take the magnitude of the updating term into account as well. • The GSPT-AS LMS algorithm, which takes the magnitude of the updating term into account, can be applied directly to weight updating in the neural network.
GSPT LMS Algorithm • Reduces the complexity of both the linear filter and the coefficient-updating block in an adaptive filter.
GSPT-AS LMS Algorithm • Q(z) denotes the power-of-two value that is closest to z but not larger than z, and g is the group size.
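To make the rule concrete, here is a minimal floating-point sketch of Q(·) and of one GSPT-AS LMS iteration. The function and variable names (Q applied to magnitudes, mu, etc.) are mine, and the group size g (the number of signed power-of-two terms used to represent each coefficient in hardware) is not modeled in this sketch.

```python
import numpy as np

def Q(z):
    """Q(.) from the slide: the power-of-two value closest to z but not
    larger than z, applied here to magnitudes (Q(0) is taken as 0)."""
    z = np.abs(np.asarray(z, dtype=float))
    out = np.zeros_like(z)
    nz = z > 0
    out[nz] = 2.0 ** np.floor(np.log2(z[nz]))
    return out

def gspt_as_lms_step(w, x, d, mu=2**-6):
    """One GSPT-AS LMS iteration (a sketch, not the fixed-point design):
    the usual updating term mu*e*x is replaced by its sign times a
    power-of-two magnitude, so each tap moves by exactly one SPT digit."""
    e = d - np.dot(w, x)             # a-priori error
    u = mu * e * x                   # conventional LMS updating term
    return w + np.sign(u) * Q(u), e  # power-of-two-quantized update
```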
Coefficient Updater for GSPT-AS LMS • Based on the magnitude of the updating term, the proper updating unit is chosen to receive the carry-in/borrow-in signal.
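A rough software analogue of the updater, assuming each coefficient is stored as ternary SPT digits in {-1, 0, +1}: the magnitude of the updating term selects the digit position (updating unit) that receives the carry-in/borrow-in, which then ripples upward like a carry chain. The digit encoding and all names here are my assumptions about the hardware, not the paper's exact design.

```python
import numpy as np

def update_coefficient(digits, u, lsb_exp):
    """digits[i] in {-1, 0, +1} has weight 2**(lsb_exp + i); the coefficient
    value is sum(digits[i] * 2**(lsb_exp + i)). Overflow past the MSB digit
    is silently dropped in this sketch."""
    if u == 0:
        return digits
    step = 1 if u > 0 else -1               # carry-in (+1) or borrow-in (-1)
    k = int(np.floor(np.log2(abs(u)))) - lsb_exp
    k = max(0, min(k, len(digits) - 1))     # clamp to an existing updating unit
    while 0 <= k < len(digits):
        digits[k] += step
        if -1 <= digits[k] <= 1:            # digit still valid: done
            break
        digits[k] -= 2 * step               # wrap +2 -> 0 (or -2 -> 0) ...
        k += 1                              # ... and propagate the carry/borrow
    return digits
```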
Power Amplifier Model • To simulate a solid-state power amplifier, the model shown below is used for the AM/AM conversion. • The AM/PM conversion of a solid-state power amplifier is small enough to be neglected. • A good approximation of existing amplifiers is obtained by choosing p in the range of 2 to 3.
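The equation on this slide did not survive extraction. Its description (solid-state PA, negligible AM/PM, a smoothness parameter p chosen between 2 and 3) matches Rapp's widely used solid-state PA model, so the intended AM/AM conversion is presumably

```latex
F(r) = \frac{r}{\left[\, 1 + \left( r / A_{\mathrm{sat}} \right)^{2p} \right]^{1/(2p)}}
```

where r is the input envelope, A_sat is the output saturation amplitude, and p is the smoothness parameter.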
Neural Network Structure • The neural network structure used is an MLP with one hidden layer. • The input layer has 1 neuron and 1 bias neuron. • The hidden layer has 10 neurons and 1 bias neuron. • The output layer has 1 neuron.
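A minimal NumPy sketch of this 1-10-1 network, reused in the backpropagation sketch on the next slide. The tanh hidden activation and linear output are my assumptions; the slide does not name the activation functions.

```python
import numpy as np

rng = np.random.default_rng(0)

# 1 input neuron (+ bias), 10 hidden neurons (+ bias), 1 output neuron.
W1 = rng.normal(scale=0.5, size=(10, 1))   # input  -> hidden weights
b1 = np.zeros(10)                          # weights from the input-layer bias neuron
W2 = rng.normal(scale=0.5, size=(1, 10))   # hidden -> output weights
b2 = np.zeros(1)                           # weight from the hidden-layer bias neuron

def forward(x):
    """Forward pass; the hidden activations are returned for backprop."""
    h = np.tanh(W1 @ np.atleast_1d(x) + b1)   # assumed tanh hidden layer
    y = W2 @ h + b2                           # assumed linear output
    return y, h
```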
GSPT-AS-based Backpropagation Algorithm • As in the GSPT-AS LMS algorithm, Q(z) denotes the power-of-two value that is closest to z but not larger than z.
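Continuing the MLP sketch above (and reusing Q from the GSPT-AS LMS sketch), one hedged training step: gradients are computed by standard backpropagation, and every weight update is then replaced by its sign times Q(|mu·gradient|), mirroring the GSPT-AS LMS rule. This is a floating-point sketch of the idea, not the fixed-point design.

```python
def gspt_as_backprop_step(x, d, mu=2**-6):
    """One GSPT-AS-based backprop step for the 1-10-1 MLP defined above."""
    y, h = forward(x)
    e = d - y                                   # output error
    # Descent directions for the squared error 0.5*e^2:
    g_W2 = np.outer(e, h)
    g_b2 = e
    delta = (W2.T @ e) * (1.0 - h ** 2)         # back-propagated through tanh
    g_W1 = np.outer(delta, np.atleast_1d(x))
    g_b1 = delta
    # Replace each updating term mu*g by a signed power of two:
    for w, g in ((W1, g_W1), (b1, g_b1), (W2, g_W2), (b2, g_b2)):
        w += np.sign(g) * Q(mu * g)             # in-place GSPT-AS update
    return e
```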
Complexity Analysis • N: The number of neurons in the hidden layer.
Conclusions • A low-complexity GSPT-AS-based neural network predistorter for nonlinear PAs has been designed and simulated. • Simulation results show that the GSPT-AS-based neural network predistorter achieves performance very close to that of the floating-point neural network predistorter, at much lower complexity.
References • C. N. Chen, K. H. Chen, and T. D. Chiueh, "Algorithm and Architecture Design for a Low-Complexity Adaptive Equalizer," in Proc. IEEE ISCAS '03, 2003, pp. 304-307. • R. van Nee and R. Prasad, OFDM for Wireless Multimedia Communications, Artech House, 2000. • F. Langlet, H. Abdulkader, D. Roviras, A. Mallet, and F. Castanié, "Adaptive Predistortion for Solid State Power Amplifier using Multi-layer Perceptron," in Proc. IEEE GLOBECOM '01, vol. 1, Nov. 2001, pp. 325-329.