ECE/CS/ME 539 Artificial Neural Networks Final Project

Presentation Transcript


  1. ECE/CS/ME 539 Artificial Neural Networks Final Project

  2. A Comparison of a Learning Decision Tree and a 2-Layer Back-Propagation Neural Network on Classifying a Car Purchase, with the Neural Network Constructed in Java. Steve Ludwig, 12-19-03

  3. Introduction/Motivation
  • Studied decision learning trees
  • They serve the same purpose as pattern-classifying BP neural nets
  • Wanted to compare/contrast the two approaches on identical data
  • Built my own 2-layer back-propagation neural network in Java with customizable parameters (a sketch follows this slide)
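
  The following is a minimal sketch of what a 2-layer (one hidden layer, one sigmoid output) back-propagation network written in Java could look like. The class name BackPropNet, its fields, and the stochastic squared-error update are assumptions chosen for illustration, not the project's actual code.

    import java.util.Random;

    /** Minimal 2-layer back-propagation network (one hidden layer, one sigmoid output).
     *  Names and details are illustrative, not taken from the original project. */
    public class BackPropNet {
        private final int nIn, nHidden;
        private final double[][] wHidden;  // [nHidden][nIn + 1]; last column is the bias
        private final double[] wOut;       // [nHidden + 1]; last entry is the bias
        private final double learningRate;

        public BackPropNet(int nIn, int nHidden, double learningRate, long seed) {
            this.nIn = nIn;
            this.nHidden = nHidden;
            this.learningRate = learningRate;
            Random rnd = new Random(seed);
            wHidden = new double[nHidden][nIn + 1];
            wOut = new double[nHidden + 1];
            for (int j = 0; j < nHidden; j++)
                for (int i = 0; i <= nIn; i++)
                    wHidden[j][i] = rnd.nextDouble() - 0.5;  // small random initial weights
            for (int j = 0; j <= nHidden; j++)
                wOut[j] = rnd.nextDouble() - 0.5;
        }

        private static double sigmoid(double x) { return 1.0 / (1.0 + Math.exp(-x)); }

        /** Forward pass: returns the network output in [0, 1]. */
        public double predict(double[] x) { return output(hidden(x)); }

        private double[] hidden(double[] x) {
            double[] h = new double[nHidden];
            for (int j = 0; j < nHidden; j++) {
                double s = wHidden[j][nIn];  // bias term
                for (int i = 0; i < nIn; i++) s += wHidden[j][i] * x[i];
                h[j] = sigmoid(s);
            }
            return h;
        }

        private double output(double[] h) {
            double s = wOut[nHidden];  // bias term
            for (int j = 0; j < nHidden; j++) s += wOut[j] * h[j];
            return sigmoid(s);
        }

        /** One stochastic back-propagation update for a single (x, target) pair,
         *  using squared error and sigmoid activations. */
        public void train(double[] x, double target) {
            double[] h = hidden(x);
            double o = output(h);
            double deltaOut = (o - target) * o * (1 - o);  // output-layer error term
            double[] deltaHidden = new double[nHidden];    // hidden-layer error terms, computed
            for (int j = 0; j < nHidden; j++)              // before the output weights change
                deltaHidden[j] = deltaOut * wOut[j] * h[j] * (1 - h[j]);
            for (int j = 0; j < nHidden; j++) wOut[j] -= learningRate * deltaOut * h[j];
            wOut[nHidden] -= learningRate * deltaOut;
            for (int j = 0; j < nHidden; j++) {
                for (int i = 0; i < nIn; i++)
                    wHidden[j][i] -= learningRate * deltaHidden[j] * x[i];
                wHidden[j][nIn] -= learningRate * deltaHidden[j];
            }
        }
    }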

  4. Data
  • The learning tree uses text-based attributes/values
  • It constructs a 'tree' whose internal nodes are attributes
  • Leaf nodes classify a case as positive or negative
  • The data had to be converted to numeric values for the BP neural net, e.g. acceptable case = 1, unacceptable case = 0 (an encoding sketch follows this slide)
  • Neural net parameters could be customized
  • Tried different learning rates, epoch counts, and permutations of the training set (to avoid overfitting)
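
  One possible way to convert the text-based attribute values into numbers for the neural net is sketched below. The attribute names (price, safety), their ordinal scales, and the label strings are hypothetical; only the acceptable = 1 / unacceptable = 0 label mapping comes from the slide.

    import java.util.Map;

    /** Illustrative text-to-numeric conversion for the BP neural net.
     *  Attribute names and scales are assumptions, not the project's actual encoding. */
    public class CarEncoder {
        // Hypothetical ordinal scales for two car attributes
        private static final Map<String, Double> PRICE  = Map.of("low", 0.0, "med", 0.5, "high", 1.0);
        private static final Map<String, Double> SAFETY = Map.of("low", 0.0, "med", 0.5, "high", 1.0);

        /** Encode one text-valued example as a numeric feature vector. */
        public static double[] encode(String price, String safety) {
            return new double[] { PRICE.get(price), SAFETY.get(safety) };
        }

        /** Class labels: acceptable case = 1, unacceptable case = 0 (as on the slide). */
        public static double encodeLabel(String label) {
            return label.equals("acceptable") ? 1.0 : 0.0;
        }
    }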

  5. Results
  • The neural net and the learning tree had almost identical test-set classification rates
  • Learning tree = 95.789 %
  • BP neural net = 95.105 %
  • The learning tree runs much faster and gives consistent results every run
  • The neural net is consistent only when the training set is not permuted (see the evaluation sketch after this slide)
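
  The sketch below shows one way the test-set classification rate and the per-epoch permutation of the training set could be implemented. It reuses the hypothetical BackPropNet class from the earlier sketch, and the class and method names here are likewise assumptions.

    import java.util.ArrayList;
    import java.util.Collections;
    import java.util.List;
    import java.util.Random;

    /** Evaluation helpers: optional per-epoch shuffling of the training set and a
     *  test-set classification rate. Built around the hypothetical BackPropNet class. */
    public class Evaluate {
        /** Percentage of test examples whose thresholded output matches the label. */
        public static double classificationRate(BackPropNet net, List<double[]> testX, List<Double> testY) {
            int correct = 0;
            for (int i = 0; i < testX.size(); i++) {
                int predicted = net.predict(testX.get(i)) >= 0.5 ? 1 : 0;  // threshold the sigmoid output
                if (predicted == testY.get(i).intValue()) correct++;
            }
            return 100.0 * correct / testX.size();
        }

        /** Train for a number of epochs, optionally permuting the presentation order each epoch;
         *  the permutation is what makes results vary from run to run. */
        public static void trainEpochs(BackPropNet net, List<double[]> trainX, List<Double> trainY,
                                       int epochs, boolean permute, long seed) {
            List<Integer> order = new ArrayList<>();
            for (int i = 0; i < trainX.size(); i++) order.add(i);
            Random rnd = new Random(seed);
            for (int e = 0; e < epochs; e++) {
                if (permute) Collections.shuffle(order, rnd);
                for (int idx : order) net.train(trainX.get(idx), trainY.get(idx));
            }
        }
    }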

  6. Conclusions
  • The learning tree runs faster, achieves excellent accuracy, and can use text-based attributes directly
  • The BP neural net offers more flexibility and can be modified to perform better (e.g. more hidden layers), while still achieving a good classification rate
