Document retrieved from a search-engine cache. Original URL: http://www.sao.ru/precise/Midas_doc/doc/94NOV/vol2/node14.html
Last modified: Fri Feb 23 14:01:43 1996
Estimation




A number of different statistical methods are used for estimating parameters from a data set. The most commonly used is the least squares method, which estimates the parameters a by minimizing the function

    S(a) = \sum_{i=1}^{N} ( y_i - f(x_i; a) )^2

where the y_i are the dependent and the x_i the independent variables, while f is a given model function. The sum can be expanded to more parameters if needed. For functions f that are linear in the parameters, an analytic solution can be derived, whereas an iteration scheme must be applied in most non-linear cases. Several conditions must be fulfilled for the method to give a reliable estimate of a. The most important assumptions are that the errors in the dependent variable are normally distributed, that the variance is homogeneous, and that the independent variables are error-free and uncorrelated.
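For the linear case, the least-squares minimum has an analytic solution via the normal equations. The following sketch (data, model, and noise level are all hypothetical, not taken from the manual) fits a straight line f(x; a, b) = a*x + b with numpy:

```python
import numpy as np

# Hypothetical data: straight line y = 2x + 1 with small normal noise.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.1, x.size)

# The model is linear in the parameters (a, b), so the least-squares
# minimum can be found analytically; lstsq solves the normal equations.
A = np.column_stack([x, np.ones_like(x)])
(a, b), *_ = np.linalg.lstsq(A, y, rcond=None)
print(a, b)  # close to the true values 2 and 1
```

For a model that is non-linear in its parameters, an iterative scheme (e.g. a general-purpose minimizer) would replace the analytic solve, as the text notes.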

The other main technique for parameter estimation is the maximum likelihood method, in which the joint probability of the data as a function of the parameters a, the likelihood

    L(a) = \prod_{i=1}^{N} P(y_i; a)

is maximized. Here P denotes the probability density of the individual data points. Normally the log-likelihood is used to simplify the maximization procedure. This method can be used for any given distribution; for a normal distribution the two methods give the same result.
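A minimal sketch of the maximum likelihood method, assuming normally distributed data with hypothetical true parameters. The log-likelihood is maximized here with a coarse grid search purely for illustration; a real fit would use a proper optimizer:

```python
import numpy as np

# Hypothetical sample drawn from N(mu=5, sigma=2).
rng = np.random.default_rng(1)
data = rng.normal(5.0, 2.0, 1000)

def log_like(mu, sigma):
    # log L(mu, sigma) = sum_i log P(y_i; mu, sigma) for a normal density
    return -0.5 * np.sum(np.log(2 * np.pi * sigma**2)
                         + (data - mu) ** 2 / sigma**2)

# Coarse grid search over (mu, sigma).
mus = np.linspace(3.0, 7.0, 201)
sigmas = np.linspace(0.5, 4.0, 176)
grid = np.array([[log_like(m, s) for s in sigmas] for m in mus])
i, j = np.unravel_index(np.argmax(grid), grid.shape)
mu_hat, sigma_hat = mus[i], sigmas[j]
print(mu_hat, sigma_hat)  # near the true 5 and 2
```

Because the data are normal, the maximum likelihood estimate of mu coincides with the least-squares estimate (the sample mean), illustrating the equivalence of the two methods stated above.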



Pascal Ballester
Tue Mar 28 16:52:29 MET DST 1995