adaboost
Category: MATLAB programming
Development tool: MATLAB
File size: 7188 KB
Downloads: 374
Upload date: 2009-06-04 23:18:39
Uploader: CSWXL
Description: face recognition; a MATLAB-based face-recognition training sample
(recognition of face)
File list:
adaboost\adaboost_test.m (329, 2004-12-04)
adaboost\adaboost_train.m (4376, 2004-12-05)
adaboost\data_partition.m (926, 2004-12-05)
adaboost\debug_test.m (972, 2004-12-04)
adaboost\eva_stump.m (725, 2004-12-05)
adaboost\main.m (1696, 2009-03-21)
adaboost\part3_data.mat (196784, 2004-12-04)
adaboost\3维adaboost\adaboost.m (6696, 2004-10-10)
adaboost\3维adaboost\adaboostM1.m (8321, 2004-10-10)
adaboost\3维adaboost\adaboostM1_revise1.m (8529, 2004-10-10)
adaboost\3维adaboost\adaboostM1_revise2.m (8592, 2004-10-10)
adaboost\3维adaboost\addnoise.m (326, 2004-10-10)
adaboost\3维adaboost\circle.dat (34919, 2004-10-10)
adaboost\3维adaboost\cross.dat (35254, 2004-10-10)
adaboost\3维adaboost\exp1.m (2578, 2004-10-10)
adaboost\3维adaboost\exp2.m (1986, 2004-10-10)
adaboost\3维adaboost\exp3.m (1353, 2004-10-10)
adaboost\3维adaboost\exp4.m (819, 2004-10-10)
adaboost\3维adaboost\exp5.m (1117, 2004-10-10)
adaboost\3维adaboost\knn.m (622, 2004-10-10)
adaboost\3维adaboost\lda_tr.m (1616, 2004-10-10)
adaboost\3维adaboost\lda_ts.m (817, 2004-10-10)
adaboost\3维adaboost\line.dat (34848, 2004-10-10)
adaboost\3维adaboost\mytest.m (50, 2005-03-15)
adaboost\3维adaboost\nb_tr.m (1356, 2004-10-10)
adaboost\3维adaboost\nb_ts.m (1292, 2004-10-10)
adaboost\3维adaboost\normalize.m (768, 2004-10-10)
adaboost\3维adaboost\readfeature.m (974, 2004-10-10)
adaboost\3维adaboost\rec.dat (34986, 2004-10-10)
adaboost\3维adaboost\Report.pdf (91472, 2004-10-10)
adaboost\3维adaboost\resample.m (1253, 2004-10-10)
adaboost\3维adaboost\split.m (922, 2004-10-10)
adaboost\3维adaboost\tri.dat (34976, 2004-10-10)
adaboost\3维adaboost\usps.mat (514824, 2002-07-25)
adaboost\adaboost and rbf\abr_v1\COPYRIGHTS.TXT (3411, 2004-10-11)
adaboost\adaboost and rbf\abr_v1\mytest.m (1145, 2005-03-11)
adaboost\adaboost and rbf\abr_v1\sample_adaboost_reg.m (2167, 1999-12-11)
adaboost\adaboost and rbf\abr_v1\sample_rbf_classif.m (1881, 1999-12-10)
... ...
Part 1:
For each weak learner (a decision tree stump), pick the single best feature and best threshold C. This works better than building a stump for every feature and voting across all the features; i.e., each weak learner is just one tree stump.
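The stump selection described above can be sketched as follows. This is a minimal illustration, not the package's adaboost_train.m; the function and variable names are assumptions.

```matlab
% Sketch: pick the best (feature, threshold, polarity) for one decision
% stump under sample weights w. X is N-by-D, y is N-by-1 in {-1,+1},
% and w sums to 1. Illustrative only; names are not from the package.
function [best_f, best_c, best_p, best_err] = train_stump(X, y, w)
[N, D] = size(X);
best_err = inf;
for f = 1:D
    thresholds = unique(X(:, f));
    for c = thresholds'                       % candidate thresholds
        for p = [-1, 1]                       % polarity of the split
            pred = p * sign(X(:, f) - c);
            pred(pred == 0) = p;              % break ties consistently
            err = sum(w(pred ~= y));          % weighted 0/1 error
            if err < best_err
                best_err = err; best_f = f; best_c = c; best_p = p;
            end
        end
    end
end
end
```

Scanning every (feature, threshold) pair and keeping only the single best stump is what distinguishes this from the per-feature-voting alternative the text rejects.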
Part 2:
I = ceil(N * rand([N, 1])) draws the bootstrap sample indices (sampling with replacement) for N training samples.
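A short usage sketch of that index trick, with illustrative variable names (Xtrain/ytrain are assumptions, not from the package):

```matlab
% Sketch: one bootstrap resample of a training set.
N = size(Xtrain, 1);
I = ceil(N * rand([N, 1]));   % N indices drawn uniformly in 1..N, with replacement
Xb = Xtrain(I, :);            % bootstrap features
yb = ytrain(I);               % bootstrap labels
```

Since rand is uniform on (0, 1), ceil(N * rand) never returns 0 and each index 1..N is equally likely.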
Part 3:
Randomly use 90% of the data for training and 10% for evaluation, and randomly flip 5% of the training labels to the alternate class. Compute the errors on the test set using classifiers trained on the clean and the noisy training data. The maximum number of tree stumps is 50. Repeat this 50 times and average the errors.
DATA        Error(%, clean)   Error(%, noisy)   Increase(%)
Ionosphere  17.3              17.9              0.6
Sonar       19.1              22.8              3.7

When using bootstrap samples (stochastic AdaBoost):

DATA        Error(%, clean)   Error(%, noisy)   Increase(%)
Ionosphere  15.6              17.0              1.4
Sonar       20.3              22.4              2.1
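The Part 3 protocol (90/10 split, then 5% label flips) can be sketched like this. It is an illustrative stand-in, not the package's data_partition.m or addnoise.m; the function name and signature are assumptions, and labels are assumed to be in {-1, +1}.

```matlab
% Sketch of the Part 3 protocol: random 90/10 train/test split,
% then flip a fraction flip_rate (e.g. 0.05) of the training labels.
function [Xtr, ytr, Xte, yte] = split_and_corrupt(X, y, flip_rate)
N = size(X, 1);
perm = randperm(N);                 % random permutation of sample indices
ntr = round(0.9 * N);
tr = perm(1:ntr);  te = perm(ntr+1:end);
Xtr = X(tr, :);  ytr = y(tr);
Xte = X(te, :);  yte = y(te);
nflip = round(flip_rate * ntr);     % number of training labels to corrupt
fi = randperm(ntr);                 % choose which training samples to flip
fi = fi(1:nflip);
ytr(fi) = -ytr(fi);                 % flip to the alternate class in {-1,+1}
end
```

Running this 50 times and averaging the clean-vs-noisy test errors reproduces the shape of the experiment behind the tables above.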
#######################################################################################
main.m: entry point.
adaboost_train.m: AdaBoost training (with and without bootstrap sampling).
adaboost_test.m: makes the final decision by voting over the M tree stumps.
eva_stump.m: subroutine for evaluating a tree stump on one sample.
data_partition.m: partitions the original data set into training and test sets.
debug_test.m: used for debugging only.
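The final weighted vote over the M trained stumps can be sketched as below. This is an assumption about the general AdaBoost decision rule, not the actual code of adaboost_test.m; all names are illustrative.

```matlab
% Sketch: classify one sample x by a weighted vote over M stumps.
% Stump m is (feature f(m), threshold c(m), polarity p(m)) with vote
% weight alpha(m) = 0.5 * log((1 - err_m) / err_m) from training.
function yhat = stump_vote(x, f, c, p, alpha)
M = numel(alpha);
s = 0;
for m = 1:M
    h = p(m) * sign(x(f(m)) - c(m));   % stump m's prediction in {-1,+1}
    if h == 0, h = p(m); end           % same tie-breaking as training
    s = s + alpha(m) * h;              % accumulate the weighted vote
end
yhat = sign(s);                        % final decision
end
```

Stumps with lower weighted training error receive larger alpha, so they dominate the vote.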