
GSPT-AS-based Neural Network Design



  1. GSPT-AS-based Neural Network Design Presenter: Kuan-Hung Chen Adviser: Tzi-Dar Chiueh October 13, 2003

  2. Outline • Motivation • GSPT-AS LMS Algorithm • Power Amplifier Model • Predistorter Architecture • Simulation Results and Complexity Analysis • Conclusions

  3. Motivation • Initial simulation results show that the GSPT-based neural network fails to converge. • The reason is that, when only the sign of the updating term is used for weight updating, all weights end up with magnitudes of approximately the same order. • Therefore, the magnitude of the updating term should also be taken into account. • It is straightforward to apply the GSPT-AS LMS algorithm, which takes the magnitude of the updating term into account, to weight updating in the neural network.

  4. Basic Structure of an LMS Adaptive Filter

  5. GSPT LMS Algorithm • Reduces the complexity of both the linear filter and the coefficient-updating block in an adaptive filter

  6. GSPT-AS LMS Algorithm • Q(z) represents the power-of-2 value that is closest to z but not larger than z, and g is the group size.
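The quantizer Q(·) defined on this slide can be sketched in Python; this illustrates the definition only (the g-term grouped signed-power-of-two representation of the coefficients themselves is not modeled here):

```python
import math

def q_pow2(z):
    """Q(z): the power-of-2 value closest to z but not larger than z (z > 0).
    For any z in [2**k, 2**(k+1)) this returns 2**k."""
    assert z > 0
    return 2.0 ** math.floor(math.log2(z))
```

For example, q_pow2(0.3) gives 0.25 and q_pow2(6.0) gives 4.0, since each input is rounded down to the nearest power of 2.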

  7. Coefficient Updater for GSPT-AS LMS • Based on the magnitude of the updating term, the proper updating unit is chosen to receive the carry-in/borrow-in signal.

  8. Power Amplifier Model • To simulate a solid-state power amplifier, the following model is used for the AM/AM conversion: • The AM/PM conversion of a solid-state power amplifier is small enough to be neglected. • A good approximation of existing amplifiers is obtained by choosing p in the range of 2 to 3.
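The AM/AM equation itself was lost in transcription. The surrounding description (solid-state PA, negligible AM/PM, a good fit to real amplifiers for p between 2 and 3) matches the widely used Rapp SSPA model, reproduced here only as a plausible reconstruction, with A the input amplitude, A_sat an assumed saturation amplitude, and p the smoothness parameter:

```latex
F(A) = \frac{A}{\left[\,1 + \left(\dfrac{A}{A_{\mathrm{sat}}}\right)^{2p}\right]^{1/2p}}
```

For small A this is nearly linear, F(A) ≈ A, while for large A it saturates at A_sat, which produces the compressed outer constellation points shown on slide 10.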

  9. Transfer Function of AM/AM Conversion

  10. 64-QAM Constellations Distorted by PA Model

  11. Predistorter Architecture

  12. Neural Network Structure • The neural network structure used is a MLP with one hidden layer. • The input layer has 1 neuron and 1 bias neuron. • The hidden layer has 10 neurons and 1 bias neuron. • The output layer has 1 neuron.

  13. Backpropagation Algorithm
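A minimal sketch of standard backpropagation for the 1-10-1 MLP of slide 12. The tanh hidden activations, linear output neuron, learning rate, random seed, and training target are illustrative assumptions, not taken from the slides:

```python
import math
import random

random.seed(0)  # illustrative seed, not from the slides

H = 10  # hidden neurons, as on slide 12 (plus one bias weight per neuron)

# hidden layer: H neurons, each with one input weight and one bias weight
w_h = [[random.uniform(-0.5, 0.5), random.uniform(-0.5, 0.5)] for _ in range(H)]
# output layer: one linear neuron with H weights and one bias weight
w_o = [random.uniform(-0.5, 0.5) for _ in range(H + 1)]

def forward(x):
    """Forward pass through the 1-10-1 MLP (tanh hidden, linear output)."""
    h = [math.tanh(w[0] * x + w[1]) for w in w_h]
    y = sum(w_o[j] * h[j] for j in range(H)) + w_o[H]
    return h, y

def backprop_step(x, d, mu=0.05):
    """One stochastic-gradient backpropagation update toward target d."""
    h, y = forward(x)
    e = d - y  # output error
    # hidden deltas use the pre-update output weights (tanh' = 1 - h^2)
    deltas = [e * w_o[j] * (1.0 - h[j] ** 2) for j in range(H)]
    for j in range(H):
        w_o[j] += mu * e * h[j]
    w_o[H] += mu * e
    for j in range(H):
        w_h[j][0] += mu * deltas[j] * x
        w_h[j][1] += mu * deltas[j]

# train on an illustrative smooth compressive curve (stand-in for the
# inverse PA characteristic a real predistorter would learn)
for _ in range(2000):
    x = random.uniform(0.0, 1.0)
    backprop_step(x, math.tanh(1.5 * x))
```

In the floating-point predistorter these updates are carried out at full precision; slide 14 then replaces each updating term with its GSPT-AS quantized form.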

  14. GSPT-AS-based Backpropagation Algorithm • Let Q(z) represent a power-of-2 value which is closest to z but not larger than z.
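A minimal sketch of the GSPT-AS idea stated here: each backpropagation updating term is replaced by a signed power of 2 given by Q(·), so every weight update reduces to a single shift-and-add. The dead-zone threshold u_min is a hypothetical parameter added so that tiny update terms are dropped rather than quantized; it is not from the slides:

```python
import math

def q_pow2(z):
    """Q(z): the power-of-2 value closest to z but not larger than z (z > 0)."""
    return 2.0 ** math.floor(math.log2(z))

def gspt_as_term(u, u_min=2.0 ** -16):
    """Replace an updating term u by a signed power of 2 of the same sign.
    u_min is an assumed dead-zone below which the update is skipped."""
    if abs(u) < u_min:
        return 0.0
    return math.copysign(q_pow2(abs(u)), u)

# a weight update then becomes, e.g.:
#   w = w + gspt_as_term(mu * delta * x)
# which in hardware is one shift of w plus a carry-in/borrow-in,
# as in the coefficient updater of slide 7.
```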

  15. Simulation Results (1)

  16. Simulation Results (2)

  17. 64-QAM Constellation with GSPT-AS-based Predistortion

  18. Floating-Point Scheme vs. GSPT-AS-based Scheme

  19. Complexity Analysis • N: The number of neurons in the hidden layer.

  20. Conclusions • A low-complexity GSPT-AS-based neural network predistorter for a nonlinear PA has been designed and simulated. • Simulation results show that the GSPT-AS-based predistorter achieves performance very close to that of the floating-point neural network predistorter at much lower complexity.

  21. References • C. N. Chen, K. H. Chen, and T. D. Chiueh, "Algorithm and Architecture Design for a Low-Complexity Adaptive Equalizer," in Proc. IEEE ISCAS '03, 2003, pp. 304-307. • R. van Nee and R. Prasad, OFDM for Wireless Multimedia Communications, Artech House, 2000. • F. Langlet, H. Abdulkader, D. Roviras, A. Mallet, and F. Castanié, "Adaptive Predistortion for Solid State Power Amplifier using Multi-layer Perceptron," in Proc. IEEE GLOBECOM '01, vol. 1, Nov. 2001, pp. 325-329.
