ivmSoftware4.2
Category: Other
Development tool: MATLAB
File size: 7392 KB
Downloads: 3
Upload date: 2018-07-01 21:20:46
Uploader: anna_zhao
Description: MATLAB function package for the Import Vector Machine, including a demo and interface.
File list:
ivmSoftware4.2 (0, 2018-07-01)
ivmSoftware4.2\eigen-eigen-b23437e61a07.zip (776524, 2015-12-13)
ivmSoftware4.2\ivmSoftware4.2 (0, 2018-07-01)
ivmSoftware4.2\ivmSoftware4.2\D2_D3.m (320, 2013-03-25)
ivmSoftware4.2\ivmSoftware4.2\D3_D2.m (131, 2013-03-18)
ivmSoftware4.2\ivmSoftware4.2\X.mat (1176, 2007-08-07)
ivmSoftware4.2\ivmSoftware4.2\check_input.m (1334, 2012-08-03)
ivmSoftware4.2\ivmSoftware4.2\compute_kernel.cpp (1708, 2011-05-11)
ivmSoftware4.2\ivmSoftware4.2\compute_kernel.mexw64 (11264, 2015-12-14)
ivmSoftware4.2\ivmSoftware4.2\dna.mat (63859, 2012-06-01)
ivmSoftware4.2\ivmSoftware4.2\eigen (0, 2018-07-01)
ivmSoftware4.2\ivmSoftware4.2\eigen\.hg_archival.txt (94, 2010-02-12)
ivmSoftware4.2\ivmSoftware4.2\eigen\.hgignore (170, 2010-02-12)
ivmSoftware4.2\ivmSoftware4.2\eigen\.hgtags (472, 2010-02-12)
ivmSoftware4.2\ivmSoftware4.2\eigen\.krazy (42, 2010-02-12)
ivmSoftware4.2\ivmSoftware4.2\eigen\CMakeLists.txt (3664, 2010-02-12)
ivmSoftware4.2\ivmSoftware4.2\eigen\COPYING (35147, 2010-02-12)
ivmSoftware4.2\ivmSoftware4.2\eigen\COPYING.LESSER (7639, 2010-02-12)
ivmSoftware4.2\ivmSoftware4.2\eigen\CTestConfig.cmake (548, 2010-02-12)
ivmSoftware4.2\ivmSoftware4.2\eigen\Doxyfile (10260, 2010-02-12)
ivmSoftware4.2\ivmSoftware4.2\eigen\Eigen (0, 2018-07-01)
ivmSoftware4.2\ivmSoftware4.2\eigen\Eigen\Array (1156, 2010-02-12)
ivmSoftware4.2\ivmSoftware4.2\eigen\Eigen\CMakeLists.txt (827, 2010-02-12)
ivmSoftware4.2\ivmSoftware4.2\eigen\Eigen\Cholesky (1988, 2010-02-12)
ivmSoftware4.2\ivmSoftware4.2\eigen\Eigen\Core (4710, 2010-02-12)
ivmSoftware4.2\ivmSoftware4.2\eigen\Eigen\Dense (140, 2010-02-12)
ivmSoftware4.2\ivmSoftware4.2\eigen\Eigen\Eigen (35, 2010-02-12)
ivmSoftware4.2\ivmSoftware4.2\eigen\Eigen\Geometry (1269, 2010-02-12)
ivmSoftware4.2\ivmSoftware4.2\eigen\Eigen\LU (634, 2010-02-12)
ivmSoftware4.2\ivmSoftware4.2\eigen\Eigen\LeastSquares (523, 2010-02-12)
ivmSoftware4.2\ivmSoftware4.2\eigen\Eigen\NewStdVector (6454, 2010-02-12)
ivmSoftware4.2\ivmSoftware4.2\eigen\Eigen\QR (2257, 2010-02-12)
ivmSoftware4.2\ivmSoftware4.2\eigen\Eigen\QtAlignedMalloc (934, 2010-02-12)
ivmSoftware4.2\ivmSoftware4.2\eigen\Eigen\SVD (556, 2010-02-12)
ivmSoftware4.2\ivmSoftware4.2\eigen\Eigen\Sparse (2803, 2010-02-12)
ivmSoftware4.2\ivmSoftware4.2\eigen\Eigen\StdVector (5534, 2010-02-12)
ivmSoftware4.2\ivmSoftware4.2\eigen\Eigen\src (0, 2018-07-01)
ivmSoftware4.2\ivmSoftware4.2\eigen\Eigen\src\Array (0, 2018-07-01)
ivmSoftware4.2\ivmSoftware4.2\eigen\Eigen\src\Array\BooleanRedux.h (4167, 2010-02-12)
ivmSoftware4.2\ivmSoftware4.2\eigen\Eigen\src\Array\CMakeLists.txt (128, 2010-02-12)
... ...
MATLAB implementation of a revised version of the Import Vector Machine classifier (Zhu and Hastie 2005)
Table of Contents
=================
- Introduction
- Installation
- Usage
- Determining the kernel and regularization parameter
Introduction
============
This implementation is based on the Import Vector Machine algorithm of Zhu and Hastie.
The algorithm is a sparse, probabilistic, and discriminative Kernel Logistic Regression model.
It achieves accuracy similar to that of Support Vector Machines, but yields a probabilistic output.
The model is also sparser, so the classification step is faster.
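The core idea can be sketched in a binary setting: the kernel expansion runs only over a small set of import vectors, so prediction is cheap, and passing the decision value through a sigmoid yields a posterior probability. The following is an illustrative Python/NumPy sketch, not part of the MATLAB package; the function and parameter names are our own.

```python
import numpy as np

def rbf_kernel(X, Z, sigma):
    """RBF kernel matrix between the rows of X (n x d) and Z (m x d)."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def klr_predict_proba(X, IV, alpha, sigma):
    """Posterior P(y=1 | x) of a kernel logistic regression model whose
    kernel expansion runs only over the import vectors IV (m x d)."""
    f = rbf_kernel(X, IV, sigma) @ alpha   # sparse decision values
    return 1.0 / (1.0 + np.exp(-f))       # sigmoid -> probabilities
```

Because `IV` is much smaller than the full training set, evaluating `f` costs only one kernel row per import vector, which is where the speed-up over a dense kernel expansion comes from.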
Installation
============
On Unix systems, we recommend using GNU g++ as your compiler.
On all systems, simply type 'make' in your MATLAB command window to build the MEX files.
The implementation uses the Eigen library (http://eigen.tuxfamily.org/dox/).
Usage
=====
params = init;
- initialization of all parameters
result = ivm(data, params);
Input:
- phi
(M x N) feature matrix with an (M x 1) homogeneous feature vector for each training data point,
i.e., the first element of each vector must be 1
- c
(N x 1) vector of training labels, valid labels are 1, 2, ..., C
- params
parameters
- phit (optional)
(M x N) feature matrix with an (M x 1) homogeneous feature vector for each testing data point,
i.e., the first element of each vector must be 1
- ct (optional)
(N x 1) vector of testing labels; valid labels are 1, 2, ..., C. If the labels of the test
data are unknown, arbitrary values can be used
Output:
- result:
P: probabilities of test data
trainTime: training time
testTime: testing time
model: stored model (see below)
confMat: confusion matrix (rows: true label, columns: estimated label)
perc: user's and producer's accuracies
acc: overall accuracy (given in percentage)
kappa: kappa coefficient
nIV: number of used import vectors
testAcc: number of correctly classified testing points
Ntest: number of testing data
- result.model
indTrain: indices of used training points
P: probabilities of train data
log: log file of all iterations
L: function values of optimization function
lambda: regularization parameter
err_train: training error
params: used params (output from init)
trainError: last training error
trainTime: training time
IV: feature vectors of the import vectors
kernelSigma: used kernel parameter
lambda: used regularization parameter
C: number of classes
c: true labels of import vectors
S: indices of the import vectors
nIV: number of used import vectors
alpha: parameters of the decision hyperplane
fval: last function value of objective function
A toy example is given in the function main_toyExample.m with the Ripley dataset.
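The homogeneous convention required for phi and phit above (first element of each feature vector equal to 1) amounts to prepending a row of ones to the raw feature matrix. A Python/NumPy sketch, independent of the MATLAB package:

```python
import numpy as np

# Raw data: N points with M-1 features each, stored as columns
# (features-in-rows, points-in-columns, as in the package's phi).
X = np.array([[0.5, 1.2, -0.3],
              [2.0, 0.1,  0.7]])                 # (M-1) x N

# Prepend a row of ones so every column is a homogeneous feature vector.
phi = np.vstack([np.ones((1, X.shape[1])), X])   # M x N, phi[0, :] == 1
```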
Determining the kernel and regularization parameter
===================================================
The kernel and regularization parameters are determined via grid search and cross-validation.
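The grid search over parameter pairs with k-fold cross-validation can be sketched as follows. This is a generic Python sketch, not the package's code; `fit_score` stands in for whatever routine trains a model on the training fold and scores it on the held-out fold, and all names are our own.

```python
import numpy as np

def cv_grid_search(fit_score, X, y, sigmas, lambdas, k=5, seed=0):
    """Pick the (sigma, lambda) pair maximizing mean k-fold CV score.
    fit_score(Xtr, ytr, Xte, yte, sigma, lam) -> score on the held-out fold."""
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(len(y)), k)
    best, best_score = None, -np.inf
    for sigma in sigmas:
        for lam in lambdas:
            scores = []
            for i in range(k):
                te = folds[i]
                tr = np.concatenate([folds[j] for j in range(k) if j != i])
                scores.append(fit_score(X[tr], y[tr], X[te], y[te], sigma, lam))
            m = np.mean(scores)
            if m > best_score:
                best, best_score = (sigma, lam), m
    return best, best_score
```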
Greedy selection of import vectors
==================================
We use a hybrid forward/backward strategy that successively adds import vectors to the set
(forward step), but also tests in each step whether import vectors can be removed (backward step).
Import vectors tend to lie in the acceptance region of their class label, as defined by the
decision boundary. Therefore, an import vector lying in the wrong acceptance region is tested
for replacement by the nearest import vector from the correct class (backward II step).
Since we start with an empty import-vector set and only add import vectors sequentially,
the decision boundary in the first iterations can be very different from its final position.
Therefore, removing import points can lead to a sparser and more accurate solution than
using forward selection steps alone.
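The forward step of this strategy can be sketched as a generic greedy loop: at each iteration, evaluate the training objective with every remaining candidate added and keep the one that lowers it most. This Python sketch shows the forward step only (the backward and backward II steps described above are omitted), and `loss_of_set` and all other names are our own, not the package's.

```python
def forward_select(candidates, loss_of_set, n_max, tol=1e-4):
    """Greedy forward selection: repeatedly add the candidate index whose
    inclusion lowers the loss the most; stop when no candidate improves
    the loss by at least tol, or when n_max vectors are selected.
    loss_of_set(S) evaluates the (regularized) training loss for index set S."""
    S = []
    current = loss_of_set(S)
    while len(S) < n_max:
        best_i, best_loss = None, current
        for i in candidates:
            if i in S:
                continue
            loss = loss_of_set(S + [i])
            if loss < best_loss:
                best_i, best_loss = i, loss
        if best_i is None or current - best_loss < tol:
            break
        S.append(best_i)
        current = best_loss
    return S, current
```

In the actual algorithm, each candidate evaluation refits the kernel logistic regression restricted to the current import-vector set, so the backward steps matter: early greedy choices made against a crude decision boundary can later be removed or replaced.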
Publication of the Import Vector Machine:
=========================================
Zhu, Ji / Hastie, Trevor:
Kernel Logistic Regression and the Import Vector Machine
In: Journal of Computational and Graphical Statistics 14, 1, 2005, pp. 185-205.
http://pubs.amstat.org/doi/abs/10.11***/106186005X25619.
Roscher, Ribana / Förstner, Wolfgang / Waske, Björn:
I^2VM: Incremental Import Vector Machines
In: Image and Vision Computing 30, 4-5, 2012, pp. 263-278.
(Please cite this paper if you use the software)