Information-Estimators

Category: Graphics & Image Processing
Development tool: Matlab
File size: 1564KB
Downloads: 24
Upload date: 2012-11-30 10:27:30
Uploader: yanncnrs
Description: A package for estimating information entropy and computing mutual information, built on a modular design and already heavily optimized.
(ITE is capable of estimating many different variants of entropy, mutual information and divergence measures. Thanks to its highly modular design, ITE additionally supports (i) combinations of the estimation techniques, (ii) easy construction and embedding of novel information theoretical estimators, and (iii) their immediate application in information theoretical optimization problems.)

File list:
Information Theoretical Estimators (ITE) Toolbox\szzoli-ite-0da37f919f4c.zip (1761094, 2012-11-21)
Information Theoretical Estimators (ITE) Toolbox (0, 2012-11-21)

https://bitbucket.org/szzoli/ite/downloads
https://bitbucket.org/szzoli/ite/overview

ITE is capable of estimating many different variants of entropy, mutual information and divergence measures. Thanks to its highly modular design, ITE additionally supports (i) combinations of the estimation techniques, (ii) easy construction and embedding of novel information theoretical estimators, and (iii) their immediate application in information theoretical optimization problems.

ITE is (i) written in Matlab/Octave, (ii) multi-platform (tested extensively on Windows and Linux), and (iii) free and open source (released under the GNU GPLv3(>=) license).

ITE can estimate: Shannon, Rényi and Tsallis entropy; generalized variance, kernel canonical correlation analysis, kernel generalized variance, the Hilbert-Schmidt independence criterion, Shannon, L2, Rényi and Tsallis mutual information, copula-based kernel dependency, the multivariate version of Hoeffding's Phi, and Schweizer-Wolff's sigma and kappa; complex variants of entropy and mutual information; L2, Rényi, Tsallis and Kullback-Leibler divergence; Hellinger and Bhattacharyya distance; maximum mean discrepancy; and J-distance.

ITE offers solution methods for Independent Subspace Analysis (ISA) and its extensions to linear, controlled, post-nonlinear, complex-valued and partially observed systems, as well as to systems with nonparametric source dynamics.

Notes: the evolution of the ITE code is briefly summarized in CHANGELOG.txt. Become a Follower to be always up-to-date with ITE. If you have an entropy, mutual information or divergence estimator/subtask solver with a GPLv3(>=)-compatible license that you would like to be embedded into ITE, feel free to contact me.
