
Seminar Announcement


Measuring Information when Fisher's Fails

  • 2000-06-23 (Fri.), 10:30 AM
  • Lounge, 2F
  • Prof. N. Balakrishnan
  • Department of Mathematics and Statistics, McMaster University

Abstract

Fisher's information measure and its relationship to the problem of best estimation are very well known. There are, however, many situations in which the Fisher information cannot be determined because the likelihood function is not differentiable with respect to the parameter. For this case, I will discuss a simple linear sensitivity measure (proposed earlier by Tukey), explain how it can be computed, and describe its relationship to the problem of optimal estimation. In the second part of the talk, I will propose a multiparameter version of this measure and prove some of its properties, including non-negative definiteness, weak additivity, monotonicity, and convexity. I will then point out its relationship to the Fisher information (when it can be determined) and to the Best Linear Unbiased Estimators of the parameters. Finally, I will present some examples illustrating these results.
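As a concrete illustration of the setting above (my own sketch, not material from the talk): for a Uniform(0, θ) sample the support depends on θ, so the likelihood is not differentiable in θ and the usual Fisher-information regularity conditions fail. Assuming the linear sensitivity measure takes the form commonly attributed to Tukey, S(θ) = dᵀ Σ⁻¹ d, where μ(θ) and Σ(θ) are the mean vector and covariance matrix of the order statistics and d = ∂μ/∂θ, the following computes S and checks that the induced linear estimator is unbiased with variance 1/S:

```python
import numpy as np

# Hypothetical illustration: Uniform(0, theta) order statistics, where
# Fisher's information is unavailable (support depends on theta).
# Assumed definition of the linear sensitivity measure:
#   S(theta) = d^T Sigma^{-1} d,  d = d mu / d theta.

n, theta = 5, 2.0
i = np.arange(1, n + 1)

# Means of the order statistics: mu_i = i * theta / (n + 1)
mu = i * theta / (n + 1)
d = i / (n + 1)                       # derivative of mu w.r.t. theta

# Covariances: sigma_ij = i * (n + 1 - j) * theta^2 / ((n+1)^2 (n+2)), i <= j
I, J = np.meshgrid(i, i, indexing="ij")
lo, hi = np.minimum(I, J), np.maximum(I, J)
Sigma = lo * (n + 1 - hi) * theta**2 / ((n + 1) ** 2 * (n + 2))

S = d @ np.linalg.solve(Sigma, d)     # linear sensitivity measure

# Best linear estimator induced by S: weights a = Sigma^{-1} d / S.
# Since mu = theta * d, it is unbiased (a @ mu == theta) and its
# variance a @ Sigma @ a equals 1/S, mirroring the BLUE connection.
a = np.linalg.solve(Sigma, d) / S
print(S, a @ mu, a @ Sigma @ a)
```

The identities checked here hold by construction (μ = θ·d for this family), so 1/S plays the role that the inverse Fisher information plays in regular problems.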
