Extreme Learning Machine
Outline
• Experimental Results
• ELM
• Weighted ELM
• Locally Weighted ELM
• Problem
Experiment
• All training data are chosen at random
• Targets are normalized to [-1, 1]
• Features are normalized to [0, 1]
• Models are compared with the RMSE criterion (see the sketch below)
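The slides do not spell out the preprocessing or the error measure; the Python/NumPy sketch below shows one plausible reading. The array names X and y and the min-max scaling are assumptions, not taken from the slides.

import numpy as np

def normalize_features(X):
    # Scale each feature column to [0, 1] (assumed min-max scaling).
    x_min, x_max = X.min(axis=0), X.max(axis=0)
    return (X - x_min) / (x_max - x_min)

def normalize_targets(y):
    # Scale the target vector to [-1, 1] (assumed min-max scaling).
    y_min, y_max = y.min(), y.max()
    return 2.0 * (y - y_min) / (y_max - y_min) - 1.0

def rmse(y_true, y_pred):
    # Root-mean-square error, the criterion used to compare the models.
    return np.sqrt(np.mean((y_true - y_pred) ** 2))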
Experimental results
• Sinc function:
• x = -10:0.05:10
• Train: 351
• Test: 50
• (hidden neurons, h, k)
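For concreteness, the sinc dataset and the random 351/50 split could be generated as follows; the sin(x)/x form of the sinc function and the fixed random seed are assumptions.

import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-10, 10, 401)      # x = -10:0.05:10 gives 401 samples
y = np.sinc(x / np.pi)             # sin(x)/x; np.sinc(t) is sin(pi*t)/(pi*t)
idx = rng.permutation(len(x))      # training data are chosen at random
x_train, y_train = x[idx[:351]], y[idx[:351]]
x_test, y_test = x[idx[351:]], y[idx[351:]]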
Function:
• x = -5:0.05:5
• Train: 151
• Test: 50
• (hidden neurons, h, k)
Function:
• x1, x2, x3 = -1:0.005:1
• Train: 351
• Test: 50
• (hidden neurons, h, k)
Machine CPU
• Features: 6
• Train: 100
• Test: 109
• (hidden neurons, h, k)
Auto Price
• Features: 15 (1 nominal, 14 continuous)
• Train: 80
• Test: 79
• (hidden neurons, h, k)
Cancer
• Features: 32
• Train: 100
• Test: 94
• (hidden neurons, h, k)
ELM
[Network diagram: input layer → hidden layer → output layer]
The weights between the input layer and the hidden layer, and the biases of the hidden-layer neurons, are chosen at random; only the output weights are learned (a minimal sketch follows).
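A minimal NumPy sketch of this single-hidden-layer ELM for regression, assuming a sigmoid activation; the random ranges, shapes, and function names are illustrative, not from the slides. The output weights beta are obtained from the Moore-Penrose pseudoinverse of the hidden-layer output matrix H.

import numpy as np

def elm_train(X, y, n_hidden, seed=0):
    # Input weights W and hidden biases b are drawn at random and never trained.
    rng = np.random.default_rng(seed)
    W = rng.uniform(-1.0, 1.0, size=(X.shape[1], n_hidden))
    b = rng.uniform(-1.0, 1.0, size=n_hidden)
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))   # hidden-layer output matrix (sigmoid)
    beta = np.linalg.pinv(H) @ y             # output weights via least squares
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta

For the sinc experiment above, X would be the training inputs as a column matrix, e.g. x_train[:, None].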
Locally Weighted ELM
• Find the k training samples nearest to each test sample (see the sketch below)
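One way to read this slide: for each test sample, select its k nearest training samples (Euclidean distance assumed) and fit an ELM on that neighbourhood only. The sketch reuses elm_train and elm_predict from the ELM sketch above; how the neighbours are further weighted is not stated on the slide, so they are treated uniformly here.

import numpy as np

def lw_elm_predict(X_train, y_train, X_test, n_hidden, k):
    preds = []
    for x in X_test:
        d = np.linalg.norm(X_train - x, axis=1)   # distances to all training samples
        nn = np.argsort(d)[:k]                    # indices of the k nearest ones
        W, b, beta = elm_train(X_train[nn], y_train[nn], n_hidden)
        preds.append(elm_predict(x[None, :], W, b, beta))
    return np.array(preds).ravel()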
Problem
• Data reported in the paper
• Random weights and biases
• The output of the nearest data
• (feature selection…?)