Aksalim

Points: 501
Uploads: 9
Downloads: 81
Registered: 2016-01-21 21:44:11

Upload list
RobotPUMA560.rar - Robotics 3, PUMA: geometric modeling of the Acma H80 robot, 2016-02-07 03:12:26, downloaded 1 time
Puma2.rar - Robotics 3, PUMA 2: geometric modeling of the Acma H80 robot, 2016-02-07 03:11:47, downloaded 1 time
Puma.rar - Robotics 2: geometric modeling of the Acma H80 robot, 2016-02-07 03:10:51, downloaded 1 time
MGDPPrb-inverse.rar - Robotics: geometric modeling of the Acma H80 robot (direct and inverse), 2016-02-07 03:09:20, downloaded 1 time
LMMSE.zip - In statistics and signal processing, a minimum mean square error (MMSE) estimator is an estimation method that minimizes the mean square error (MSE) of the fitted values of a dependent variable, a common measure of estimator quality. In the Bayesian setting, the term MMSE more specifically refers to estimation with a quadratic cost function; in that case the MMSE estimator is the posterior mean of the parameter to be estimated. Since the posterior mean is often cumbersome to calculate, the MMSE estimator is usually constrained to lie within a certain class of functions. Linear MMSE estimators are a popular choice since they are easy to calculate and very versatile, and they have given rise to many popular estimators such as the Wiener-Kolmogorov and Kalman filters., 2016-01-21 22:21:28, downloaded 4 times
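The linear MMSE rule described above can be sketched in a few lines of NumPy. This is a toy scalar example under assumed statistics (Gaussian parameter, additive Gaussian noise), not the uploaded archive; all variable names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: estimate x from the noisy observation y = x + noise using the
# linear MMSE rule  x_hat = m_x + (C_xy / C_yy) * (y - m_y)  (scalar case).
n = 100_000
x = rng.normal(loc=2.0, scale=1.0, size=n)   # parameter, variance 1
y = x + rng.normal(scale=0.5, size=n)        # observation, noise variance 0.25

m_x, m_y = x.mean(), y.mean()
C = np.cov(x, y)                             # sample covariance matrix
c_xy, c_yy = C[0, 1], C[1, 1]

x_hat = m_x + (c_xy / c_yy) * (y - m_y)      # LMMSE estimate

mse_lmmse = np.mean((x - x_hat) ** 2)        # theory: 1 - c_xy^2/c_yy = 0.2
mse_raw = np.mean((x - y) ** 2)              # theory: 0.25
print(mse_lmmse, mse_raw)
```

The LMMSE estimate shrinks the observation toward the prior mean and always achieves an MSE no larger than using the raw observation directly.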
ROOT_MUSIC.rar - The root-MUSIC method is based on the eigenvectors of the sensor-array correlation matrix. It obtains the signal estimates by examining the roots of the spectrum polynomial: the peaks of the MUSIC spectrum correspond to roots of the polynomial lying close to the unit circle., 2016-01-21 22:17:08, downloaded 1 time
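As a minimal sketch of the root-MUSIC idea (assumed setup: a single complex exponential in white noise; this is an illustration, not the uploaded code), one can build the spectrum polynomial from the noise-subspace projector and pick the root closest to the unit circle:

```python
import numpy as np

rng = np.random.default_rng(1)

# One complex exponential at normalized frequency f0 = 0.2 in white noise.
f0, M, N = 0.2, 8, 200
n = np.arange(N)
x = np.exp(2j * np.pi * f0 * n) \
    + 0.1 * (rng.normal(size=N) + 1j * rng.normal(size=N))

# Estimate the M x M autocorrelation matrix from overlapping snapshots.
snaps = np.lib.stride_tricks.sliding_window_view(x, M)
R = snaps.T @ snaps.conj() / snaps.shape[0]

# Noise subspace: eigenvectors of the M - p smallest eigenvalues (p = 1 source).
p = 1
w, V = np.linalg.eigh(R)             # eigenvalues in ascending order
En = V[:, : M - p]                   # columns span the noise subspace

# Spectrum polynomial D(z): sum the diagonals of the projector En En^H.
C = En @ En.conj().T
coeffs = np.array([np.trace(C, offset=k) for k in range(M - 1, -M, -1)])
roots = np.roots(coeffs)

# Roots pair up across the unit circle; the inside root closest to the
# circle carries the signal frequency in its angle.
inside = roots[np.abs(roots) < 1]
best = inside[np.argmax(np.abs(inside))]
f_est = np.angle(best) / (2 * np.pi)
print(f_est)                         # expected to lie near f0 = 0.2
```

Unlike spectral MUSIC, no grid search is needed: polynomial rooting locates the frequency directly.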
AIC_MDL.rar - AIC and MDL. The Akaike information criterion (AIC) is a measure of the relative quality of statistical models for a given set of data. Given a collection of models for the data, AIC estimates the quality of each model relative to each of the others; hence, AIC provides a means for model selection. AIC is founded on information theory: it offers a relative estimate of the information lost when a given model is used to represent the process that generated the data. In doing so, it deals with the trade-off between the goodness of fit and the complexity of the model. AIC does not provide a test of a model in the sense of testing a null hypothesis, i.e., AIC can say nothing about the quality of a model in an absolute sense. If all the candidate models fit poorly, AIC gives no warning of that., 2016-01-21 22:10:17, downloaded 12 times
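The fit-versus-complexity trade-off can be illustrated with a small NumPy sketch, assuming a standard least-squares form of the criterion, AIC = 2k + n ln(RSS/n); the polynomial-regression setup here is a made-up example, not the uploaded code:

```python
import numpy as np

rng = np.random.default_rng(2)

# Noisy quadratic data; fit polynomials of increasing degree and pick the
# degree minimizing AIC = 2k + n * ln(RSS / n), where k is the number of
# fitted coefficients.
n = 200
x = np.linspace(-1, 1, n)
y = 1.0 + 2.0 * x - 3.0 * x**2 + rng.normal(scale=0.3, size=n)

aic = []
for degree in range(6):
    coeffs = np.polyfit(x, y, degree)
    rss = np.sum((y - np.polyval(coeffs, x)) ** 2)
    k = degree + 1                   # estimated coefficients
    aic.append(2 * k + n * np.log(rss / n))

best_degree = int(np.argmin(aic))
print(best_degree)                   # the true model is quadratic
```

Higher degrees keep lowering the RSS, but once the true structure is captured, the 2k penalty dominates and AIC stops rewarding the extra coefficients.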
MUSIC.rar - MUSIC estimates the frequency content of a signal or autocorrelation matrix using an eigenspace method. The method assumes that a signal, x(n), consists of p complex exponentials in the presence of Gaussian white noise. Given an M x M autocorrelation matrix R_x, if the eigenvalues are sorted in decreasing order, the eigenvectors corresponding to the p largest eigenvalues (i.e., the directions of largest variability) span the signal subspace; the remaining M - p eigenvectors span the orthogonal subspace, which contains only noise. Note that for M = p + 1, MUSIC is identical to Pisarenko harmonic decomposition; the general idea is to use averaging to improve on the Pisarenko estimator., 2016-01-21 22:07:02, downloaded 3 times
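The subspace split described above can be sketched as a spectral MUSIC scan. This is a toy two-tone example under assumed parameters (it is not the uploaded MATLAB code): the pseudospectrum 1 / ||En^H a(f)||^2 peaks where the steering vector a(f) is orthogonal to the noise subspace:

```python
import numpy as np

rng = np.random.default_rng(3)

# p = 2 complex exponentials in white Gaussian noise.
M, N, p = 8, 400, 2
f_true = np.array([0.1, 0.3])
n = np.arange(N)
x = sum(np.exp(2j * np.pi * f * n) for f in f_true)
x = x + 0.2 * (rng.normal(size=N) + 1j * rng.normal(size=N))

# M x M autocorrelation estimate from overlapping snapshots.
snaps = np.lib.stride_tricks.sliding_window_view(x, M)
R = snaps.T @ snaps.conj() / snaps.shape[0]

w, V = np.linalg.eigh(R)                     # ascending eigenvalues
En = V[:, : M - p]                           # noise-subspace eigenvectors

# MUSIC pseudospectrum over a frequency grid.
freqs = np.linspace(0, 0.5, 1000)
m = np.arange(M)
A = np.exp(2j * np.pi * np.outer(m, freqs))  # steering vectors a(f), M x F
P = 1.0 / np.sum(np.abs(En.conj().T @ A) ** 2, axis=0)

# The two strongest local maxima should sit near the true frequencies.
peak_idx = [i for i in range(1, len(P) - 1)
            if P[i] > P[i - 1] and P[i] > P[i + 1]]
peak_idx.sort(key=lambda i: P[i])
top = np.sort(freqs[peak_idx[-2:]])
print(top)
```

Larger M sharpens the peaks (more averaging over the noise subspace), which is exactly the improvement over Pisarenko's M = p + 1 case noted in the description.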
Bartlet_Capon_Music.rar - Bartlett, Capon, and MUSIC spectral estimators; the MUSIC description is the same as for MUSIC.rar above., 2016-01-21 22:05:51, downloaded 1 time

Recent downloads

Favorites