H-ELM

Category: Numerical Algorithms / Artificial Intelligence
Development tool: MATLAB
File size: 4KB
Downloads: 48
Upload date: 2018-10-12 11:14:48
Uploader: lanlianmeng
Description: It can be used for data classification and fitting. The deep (hierarchical) extreme learning machine combines the representational advantages of deep learning with the fast training speed of ELM.

File list:
H-ELM\demo_MNIST.m (1596, 2015-08-11)
H-ELM\demo_NORB.m (1732, 2015-08-09)
H-ELM\helm_train.m (2356, 2015-08-09)
H-ELM\LICENSE.txt (1110, 2015-08-10)
H-ELM\result_tra.m (76, 2015-08-09)
H-ELM\sparse_elm_autoencoder.m (406, 2015-08-09)
H-ELM (0, 2018-10-12)

The demo consists of two parts: source code and data. To reproduce the exact results of the experiments in our paper, please use the same simulation data, including the testing datasets (MNIST and NORB) and the random matrices, which can be provided upon request via email: cwdeng@bit.edu.cn

To use these codes, simply unzip all files into the same path and then run "demo_MNIST.m" and "demo_NORB.m". The main training function "helm_train()" can be called as follows:

Example:
[TrainingAccuracy, TestingAccuracy, Training_time, Testing_time] = helm_train(train_x, train_y, test_x, test_y, b1, b2, b3, s, C);
% train_x is the training data and train_y is the training label.
% test_x is the testing data and test_y is the testing label.
% b1, b2 and b3 are the random matrices; they are pre-stored in our demo and can be used by loading random*.mat.
% C is the L2 penalty of the last-layer ELM and s is the scaling factor of the activation function.

Please cite our paper in your publications if it helps your research:

@article{tang2015helm,
  author={Tang, Jiexiong and Deng, Chenwei and Huang, Guang-Bin},
  journal={IEEE Transactions on Neural Networks and Learning Systems},
  title={Extreme Learning Machine for Multilayer Perceptron},
  year={2015},
  doi={10.1109/TNNLS.2015.2424995},
  ISSN={2162-237X},
}

Enjoy, :-).
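To make the roles of the s and C parameters concrete, here is a minimal single-layer ELM sketch in Python/NumPy (not the authors' MATLAB code; the function names and the toy data are illustrative only). It shows the core ELM recipe the README's parameters refer to: fixed random hidden weights, an activation scaled by s, and output weights solved by ridge regression with L2 penalty controlled by C.

```python
import numpy as np

def elm_train(train_x, train_y, n_hidden, s, C, seed=None):
    """Illustrative single-layer ELM (not the repo's helm_train).

    s scales the pre-activation of the random hidden layer;
    C is the L2 (ridge) penalty used when solving for the output weights.
    """
    rng = np.random.default_rng(seed)
    n_features = train_x.shape[1]
    # Random input weights and biases are drawn once and never trained.
    W = rng.uniform(-1.0, 1.0, size=(n_features, n_hidden))
    b = rng.uniform(-1.0, 1.0, size=n_hidden)
    H = np.tanh(s * (train_x @ W + b))  # hidden-layer output matrix
    # Ridge-regularized least squares: beta = (H'H + I/C)^-1 H' Y
    beta = np.linalg.solve(H.T @ H + np.eye(n_hidden) / C, H.T @ train_y)
    return W, b, beta

def elm_predict(x, W, b, beta, s):
    # Same random projection and activation, then the learned readout.
    return np.tanh(s * (x @ W + b)) @ beta

# Toy usage: two Gaussian blobs with one-hot labels.
rng = np.random.default_rng(0)
x0 = rng.normal(-2.0, 1.0, size=(100, 2))
x1 = rng.normal(2.0, 1.0, size=(100, 2))
X = np.vstack([x0, x1])
Y = np.vstack([np.tile([1.0, 0.0], (100, 1)), np.tile([0.0, 1.0], (100, 1))])
W, b, beta = elm_train(X, Y, n_hidden=50, s=0.8, C=1.0, seed=1)
pred = elm_predict(X, W, b, beta, s=0.8).argmax(axis=1)
acc = (pred == Y.argmax(axis=1)).mean()
```

H-ELM stacks several such layers: sparse ELM autoencoders (see sparse_elm_autoencoder.m) learn the intermediate representations, and a final ELM of this form, regularized by C, produces the classification output.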
