adaboost
Category: Artificial Intelligence / Neural Networks / Deep Learning
Development tool: MATLAB
File size: 7186 KB
Downloads: 289
Upload date: 2008-06-11 18:11:14
Uploader: Jack_Li
Description: A rare collection of AdaBoost programs, including a 3-D AdaBoost demo, AdaBoost with RBF, and more; well suited to beginners interested in AdaBoost.
File list:
adaboost\3维adaboost\adaboost.m (6696, 2008-06-06)
adaboost\3维adaboost\adaboostM1.m (8321, 2008-06-06)
adaboost\3维adaboost\adaboostM1_revise1.m (8529, 2008-06-06)
adaboost\3维adaboost\adaboostM1_revise2.m (8592, 2008-06-06)
adaboost\3维adaboost\addnoise.m (326, 2008-06-06)
adaboost\3维adaboost\circle.dat (34919, 2008-06-06)
adaboost\3维adaboost\cross.dat (35254, 2008-06-06)
adaboost\3维adaboost\exp1.m (2578, 2008-06-06)
adaboost\3维adaboost\exp2.m (1986, 2008-06-06)
adaboost\3维adaboost\exp3.m (1353, 2008-06-06)
adaboost\3维adaboost\exp4.m (819, 2008-06-06)
adaboost\3维adaboost\exp5.m (1117, 2008-06-06)
adaboost\3维adaboost\knn.m (622, 2008-06-06)
adaboost\3维adaboost\lda_tr.m (1616, 2008-06-06)
adaboost\3维adaboost\lda_ts.m (817, 2008-06-06)
adaboost\3维adaboost\line.dat (34848, 2008-06-06)
adaboost\3维adaboost\mytest.m (50, 2008-06-06)
adaboost\3维adaboost\nb_tr.m (1356, 2008-06-06)
adaboost\3维adaboost\nb_ts.m (1292, 2008-06-06)
adaboost\3维adaboost\normalize.m (768, 2008-06-06)
adaboost\3维adaboost\readfeature.m (974, 2008-06-06)
adaboost\3维adaboost\rec.dat (34986, 2008-06-06)
adaboost\3维adaboost\Report.pdf (91472, 2008-06-06)
adaboost\3维adaboost\resample.m (1253, 2008-06-06)
adaboost\3维adaboost\split.m (922, 2008-06-06)
adaboost\3维adaboost\tri.dat (34976, 2008-06-06)
adaboost\3维adaboost\usps.mat (514824, 2008-06-06)
adaboost\adaboost and rbf\abr_v1\@adabooster\adabooster.m (1194, 2008-06-06)
adaboost\adaboost and rbf\abr_v1\@adabooster\calc_output.m (1134, 2008-06-06)
adaboost\adaboost and rbf\abr_v1\@adabooster\calc_output_step.m (1466, 2008-06-06)
adaboost\adaboost and rbf\abr_v1\@adabooster\calc_output_steps.m (1776, 2008-06-06)
adaboost\adaboost and rbf\abr_v1\@adabooster\comp_distr.m (611, 2008-06-06)
adaboost\adaboost and rbf\abr_v1\@adabooster\comp_weight.m (1344, 2008-06-06)
adaboost\adaboost and rbf\abr_v1\@adabooster\CVS\Entries (1015, 2008-06-06)
adaboost\adaboost and rbf\abr_v1\@adabooster\CVS\Repository (21, 2008-06-06)
adaboost\adaboost and rbf\abr_v1\@adabooster\CVS\Root (51, 2008-06-06)
adaboost\adaboost and rbf\abr_v1\@adabooster\display.m (468, 2008-06-06)
adaboost\adaboost and rbf\abr_v1\@adabooster\do_learn.m (2266, 2008-06-06)
adaboost\adaboost and rbf\abr_v1\@adabooster\finish_learn.m (934, 2008-06-06)
... ...
Part 1:
For each weak learner (a tree stump), pick the single best feature and best threshold C. This works better than building a stump for every feature and then deciding by a vote across all features; i.e., each weak learner is exactly one tree stump.
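The stump selection described above can be sketched as follows. This is an illustrative Python/NumPy translation, not the package's MATLAB code; the function name `fit_stump` and its interface are assumptions for the example.

```python
import numpy as np

def fit_stump(X, y, w):
    """Pick the single best (feature, threshold, polarity) under sample weights w.

    X: (N, D) feature matrix; y: (N,) labels in {-1, +1};
    w: (N,) nonnegative sample weights summing to 1.
    """
    N, D = X.shape
    best = (np.inf, 0, 0.0, 1)  # (weighted error, feature, threshold, polarity)
    for d in range(D):                       # scan every feature ...
        for thr in np.unique(X[:, d]):       # ... and every candidate threshold C
            for pol in (1, -1):              # ... in both polarities
                pred = np.where(pol * (X[:, d] - thr) >= 0, 1, -1)
                err = w[pred != y].sum()     # weighted misclassification rate
                if err < best[0]:
                    best = (err, d, thr, pol)
    return best
```

On a linearly separable toy set the selected stump reaches zero weighted error; AdaBoost then reweights the samples and repeats the search.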
Part 2:
I = ceil(N * rand([N,1])) obtains the bootstrap sample indices (drawn uniformly with replacement) for N training samples.
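For reference, the same one-liner re-expressed in Python/NumPy (an illustration, not part of the package):

```python
import numpy as np

N = 8                                  # number of training samples
rng = np.random.default_rng(0)

# MATLAB's ceil(N * rand([N,1])) maps N uniform (0,1) draws to integers 1..N,
# i.e. N indices drawn uniformly with replacement -- a bootstrap sample.
I = np.ceil(N * rng.random(N)).astype(int)   # 1-based, like the MATLAB code
```

Training on `X[I - 1]` (0-based in Python) rather than `X` gives the "stochastic AdaBoost" variant whose results appear below.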
Part 3:
Randomly use 90% of the data for training and 10% for evaluation, and randomly flip 5% of the training labels to the opposite class. Compute the test-set errors of classifiers trained on the clean and on the noisy training data, using at most 50 tree stumps. Repeat this 50 times and average the errors.
DATA Error(%,clean) Error(%,noise) Increase(%)
Ionosphere 17.3 17.9 0.6
Sonar 19.1 22.8 3.7
When using bootstrap samples (stochastic AdaBoost):
DATA Error(%,clean) Error(%,noise) Increase(%)
Ionosphere 15.6 17.0 1.4
Sonar 20.3 22.4 2.1
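The 5% label-flip noise injection used in the experiment above can be sketched like this; `flip_labels` is a hypothetical helper written for illustration (the package's own routine is `addnoise.m`, whose interface is not shown here).

```python
import numpy as np

def flip_labels(y, frac=0.05, rng=None):
    """Return a copy of y with a random `frac` of {-1, +1} labels flipped."""
    if rng is None:
        rng = np.random.default_rng()
    y = y.copy()
    k = int(round(frac * len(y)))                 # number of labels to corrupt
    idx = rng.choice(len(y), size=k, replace=False)
    y[idx] = -y[idx]                              # flip to the alternate class
    return y
```

Training on `flip_labels(y_train)` versus `y_train` and comparing test errors reproduces the clean/noisy comparison reported in the tables.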
#######################################################################################
main.m: entry point.
adaboost_train.m: AdaBoost training (with and without bootstrap sampling).
adaboost_test.m: makes the final decision by voting over the M tree stumps.
eva_stump.m: subroutine that evaluates a tree stump on one sample.
data_partition.m: partitions the original data set into training and test sets.
debug_test.m: used for debugging only.
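The final vote over the M stumps (the role of adaboost_test.m) is the standard AdaBoost decision rule, sign(sum_m alpha_m * h_m(x)). A minimal Python/NumPy sketch, with stumps represented as (feature, threshold, polarity) triples, all names assumed for illustration:

```python
import numpy as np

def adaboost_predict(stumps, alphas, X):
    """Final decision: sign of the alpha-weighted vote of the M tree stumps.

    stumps: list of (feature, threshold, polarity); alphas: list of stump
    weights; X: (N, D) feature matrix. Returns (N,) labels in {-1, +1}.
    """
    total = np.zeros(len(X))
    for (d, thr, pol), a in zip(stumps, alphas):
        # Each stump casts a {-1, +1} vote, scaled by its weight alpha.
        total += a * np.where(pol * (X[:, d] - thr) >= 0, 1, -1)
    return np.sign(total)
```

With a single stump of weight 1 this reduces to the stump's own prediction; with many stumps the weighted majority dominates.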