
Talk Announcement


Minimum Discrepancy Parameter Estimation

Abstract

In statistical estimation theory, a satisfactory estimator should embody the available information, whether known a priori or provided by the data, so that the loss of information incurred by using the estimator is minimal. Recently, an estimator based on the discrepancy between the estimator's error covariance and its information lower bound has been proposed. The existence and uniqueness conditions of this minimum discrepancy estimator (MDE) are studied under certain regularity conditions. It is shown that the MDE is consistent and asymptotically efficient. As a result, the MDE is ensured to have minimum loss of information in finite samples and no loss of information as the sample size tends to infinity. Examples indicate that when the prior information is vague, the MDE is superior to the minimum variance estimator in terms of information loss, and when the prior distribution is suitable, the MDE is superior to the maximum likelihood estimator on the basis of deficiency.
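For concreteness, a minimal sketch of the underlying idea follows; the talk does not specify the discrepancy measure here, so the particular choice of D below is only an assumed illustration. Under standard regularity conditions, the error covariance of an unbiased estimator is bounded below by the inverse Fisher information (the Cramér–Rao bound), and a minimum discrepancy estimator is chosen to make the gap between the two matrices as small as possible:

% Cramér–Rao information lower bound (standard regularity conditions):
%   Cov_theta(\hat\theta) \succeq I(\theta)^{-1} in the positive-semidefinite order.
% An MDE-type estimator minimizes a measure D of the gap over a class C of
% estimators; the trace-of-difference D used here is an assumed illustration,
% not the definition given in the talk.
\[
  \hat{\theta}_{\mathrm{MD}}
  \;=\;
  \operatorname*{arg\,min}_{\hat{\theta} \in \mathcal{C}}\;
  D\!\left(\operatorname{Cov}_{\theta}(\hat{\theta}),\, I(\theta)^{-1}\right),
  \qquad
  D(A,B) \;=\; \operatorname{tr}(A - B) \;\ge\; 0 .
\]

On this reading, a discrepancy of zero in the limit corresponds to asymptotic efficiency, which is consistent with the abstract's claim of no loss of information as the sample size grows.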
