bilinear

Category: MATLAB programming
Development tool: MATLAB
File size: 11KB
Downloads: 48
Upload date: 2013-12-21 13:40:52
Uploader: Nithesh
Description: In this paper, we introduce a new machine-learning-based data classification algorithm that is applied to network intrusion detection. The basic task is to classify network activities (recorded as connection records in the network log) as normal or abnormal while minimizing misclassification. Although different classification models have been developed for network intrusion detection, each of them has its strengths and weaknesses, including the most commonly applied Support Vector Machine (SVM) method and the Clustering based on Self-Organized Ant Colony Network (CSOACN). Our new approach combines the SVM method with CSOACNs to take advantage of both while avoiding their weaknesses. Our algorithm is implemented and evaluated using the standard benchmark KDD99 data set. Experiments show that CSVAC (Combining Support Vectors with Ant Colony) outperforms SVM alone or CSOACN alone in terms of both classification rate and run-time efficiency.

File list:
toyexample.m (1313, 2013-01-15)
private\parseparam.m (463, 2012-04-10)
bilinsvmpredict.m (506, 2013-01-15)
bilinsvmtrain.m (9258, 2013-01-15)
COPYRIGHT (1474, 2013-01-14)
hMKLpredict.m (595, 2013-01-15)
hMKLtrain.m (1924, 2013-01-15)
initw.m (1190, 2013-01-15)
smooth_regularizer.m (2041, 2013-01-15)
svmbysmo.m (540, 2012-12-12)

This is a MATLAB software package for bilinear optimization used in the paper: T. Kobayashi and N. Otsu, "Efficient Optimization For Low-Rank Integrated Bilinear Classifiers," Proc. European Conference on Computer Vision (ECCV), pp. 474-487, 2012.

Table of Contents
=================
- Installation
- `bilinsvmtrain' Usage
- `bilinsvmpredict' Usage
- `hMKLtrain' Usage
- `hMKLpredict' Usage
- Examples
- Reference

Installation
============
Unzip bilinear.zip in the folder you like. In the default setting, this package requires the SMO package, which can be downloaded from
http://staff.aist.go.jp/takumi.kobayashi/code/SMO.zip
So, unzip SMO.zip and add its path to the MATLAB search path.

`bilinsvmtrain' Usage
=====================
The bilinear classifier y = <W, X> - rho is optimized by

  min  0.5( <Wh,Wh> + <Ww,Ww> + <Wh, Jh*Wh> + <Ww, Jw*Ww> ) + C sum_i (xi_i)
  s.t. y_i ( <W, X_i> - rho ) >= 1 - xi_i,  xi_i >= 0,  W = Wh*Ww'

Usage: model = bilinsvmtrain(y, X, params)

Input:
 y      - Label vector {-1,+1} [n x 1]
 X      - Feature matrices [h x w x n]
          NOTE: samples are given in the form of a matrix (h x w).
 params - Parameters [struct]
  .C   - Balancing parameter [scalar] (default 1)
  .Jh  - Constraint matrix for row weights [h x h] (default [])
  .Jw  - Constraint matrix for column weights [w x w] (default [])
         These matrices Jh and Jw are used, e.g., for the smoothing regularization
         (see Sec. 3.1 in the paper) produced by `smooth_regularizer.m'.
         If Jh=[] or Jw=[], the corresponding regularization is ignored.
  .Ww0 - Initial column weights [w x r] (default [])
         Initialization by SVM is produced by `initw.m'. If Ww0=[], it is randomly initialized.
  .svmsolver - Linear SVM solver of the form "[alpha rho cost] = func(y, col_vectors, C)"
         [function_handle] (default @svmbysmo)
         The bilinear optimization requires an inner SVM optimization (see Sec. 2.2 in the paper).
         You can give your own SVM solver in this parameter field.
  .verbose - Verbose level, 0: silent, 1: verbose (default 0)

Output:
 model - Classifier model
  .W   - Matrix weights [h x w], W = Wh*Ww'
  .rho - Bias [scalar], y = <W, X> - rho
  .Wh  - Vertical (row) weights [h x r]
  .Ww  - Horizontal (column) weights [w x r]

`bilinsvmpredict' Usage
=======================
Bilinear classification, y = <W, X> - rho

Usage: [pl val] = bilinsvmpredict(X, model)

Input:
 X     - Feature matrices [h x w x #test]
         NOTE: samples are given in the form of a matrix (h x w).
 model - Classifier model produced by `bilinsvmtrain.m'

Output:
 pl  - Predicted label vector {-1,+1} [#test x 1]
 val - Classification values [#test x 1]

`hMKLtrain' Usage
=================
Heterogeneous multiple kernel learning (see Sec. 3.2 in the paper)

Usage: model = hMKLtrain(y, K, params)

Input:
 y      - Label vector {-1,+1} [n x 1]
 K      - Multiple kernel matrices [n x n x #kernel]
 params - Parameters for `bilinsvmtrain.m' [struct]
  .C  - Balancing parameter [scalar] (default 1)
  .Jw - Constraint matrix for kernel weights [w x w] (default [])
        If Jw=[], the regularization is ignored.
  .svmsolver - Linear SVM solver of the form "[alpha rho cost] = func(y, col_vectors, C)"
        [function_handle] (default @svmbysmo)
        The bilinear optimization requires an inner SVM optimization (see Sec. 2.2 in the paper).
        You can give your own SVM solver in this parameter field.
  .verbose - Verbose level, 0: silent, 1: verbose (default 0)

Output:
 model - Classifier model
  .W     - Matrix weights [n x #kernel], W = Wh*Ww'
  .rho   - Bias [scalar]
  .Wh    - Vertical (row) weights [n x r]
  .Ww    - Horizontal (column) weights [#kernel x r]
  .hinvK - K^(-1/2) [n x n x #kernel]

`hMKLpredict' Usage
===================
Classification using heterogeneous multiple kernel learning

Usage: [pl val] = hMKLpredict(K, model)

Input:
 K     - Multiple kernel matrices [n x #test x #kernel]
 model - Classifier model produced by `hMKLtrain.m'

Output:
 pl  - Predicted label vector {-1,+1} [#test x 1]
 val - Classification values [#test x 1]

Examples
========
For bilinear classification,
>> addpath('../SMO');
>> X = cat(3, rand(10,10,100), rand(10,10,100)+1);
>> y = [ones(100,1);-ones(100,1)];
>> model = bilinsvmtrain(y, X, struct('C',1, 'verbose',1));
>> [pl, val] = bilinsvmpredict(X, model);
>> plot([val,y]);

For heterogeneous MKL,
>> addpath('../SMO');
>> X = [rand(100,10);rand(100,10)+1];
>> y = [ones(100,1);-ones(100,1)];
>> rbf = @(x,y,gam) exp(-gam*(bsxfun(@plus,sum(x.^2,2),sum(y.^2,2)')-2*x*y'));
>> K = cat(3, rbf(X,X,0.1),rbf(X,X,1),rbf(X,X,10));
>> model = hMKLtrain(y, K, struct('verbose',1));
>> Xt = [rand(50,10);rand(50,10)+1];
>> yt = [ones(50,1);-ones(50,1)];
>> Kt = cat(3, rbf(X,Xt,0.1),rbf(X,Xt,1),rbf(X,Xt,10));
>> [pl, val] = hMKLpredict(Kt, model);
>> plot([val,yt]);

Reference
=========
If you use this package, please cite it as
T. Kobayashi and N. Otsu, "Efficient Optimization For Low-Rank Integrated Bilinear Classifiers," Proc. European Conference on Computer Vision (ECCV), pp. 474-487, 2012.
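For bilinear classification with the smoothing regularization, a minimal sketch is given below. It only relies on the documented `bilinsvmtrain'/`bilinsvmpredict' interface; the constraint matrix J is a hand-built second-difference (Laplacian-like) matrix used purely for illustration, since the calling convention of `smooth_regularizer.m' is not documented above and the package's own constraint matrices may differ.

>> addpath('../SMO');
>> X = cat(3, rand(10,10,100), rand(10,10,100)+1);  % 10 x 10 feature matrices, two classes
>> y = [ones(100,1);-ones(100,1)];
>> D = diff(eye(10));   % first-difference operator [9 x 10]
>> J = D'*D;            % illustrative smoothing constraint matrix [10 x 10]
>> model = bilinsvmtrain(y, X, struct('C',1, 'Jh',J, 'Jw',J, 'verbose',1));
>> [pl, val] = bilinsvmpredict(X, model);
>> plot([val,y]);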
