Estimation

A number of different statistical methods are used for estimating parameters from a data set. The most commonly used one is the least squares method, which estimates a parameter $a$ by minimizing the function

$$ S(a) = \sum_{i=1}^{N} \left[ y_i - f(x_i; a) \right]^2 , \eqno{(1)} $$

where $y$ is the dependent and $x$ the independent variable, while $f$ is a given function. Equation (1) can be expanded to more parameters if needed. For linear functions $f$ an analytic solution can be derived, whereas an iteration scheme must be applied in most non-linear cases. Several conditions must be fulfilled for the method to give a reliable estimate of $a$. The most important assumptions are that the errors in the dependent variable are normally distributed, that the variance is homogeneous, and that the independent variables are error-free and uncorrelated.
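As an illustration, the following is a minimal sketch of the analytic linear case, assuming the hypothetical model $f(x; a, b) = ax + b$ and synthetic data (the variable names, noise level, and true parameter values are invented for the example):

    import numpy as np

    rng = np.random.default_rng(0)
    x = np.linspace(0.0, 10.0, 50)                     # independent variable (assumed error-free)
    y = 2.5 * x + 1.0 + rng.normal(0.0, 0.5, x.size)   # dependent variable with normal errors

    # For a linear f the minimization of S(a) has an analytic solution:
    # solve the normal equations min ||A c - y||^2 with a least-squares solver.
    A = np.column_stack([x, np.ones_like(x)])
    coeffs, residuals, rank, sv = np.linalg.lstsq(A, y, rcond=None)
    a_hat, b_hat = coeffs
    print(f"estimated slope = {a_hat:.3f}, intercept = {b_hat:.3f}")

For a non-linear $f$, an iterative scheme (for instance scipy.optimize.curve_fit) would replace the direct linear solve.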

The other main technique for parameter estimation is the maximum likelihood method, in which the joint probability of the data, regarded as a function of the parameter $a$,

$$ L(a) = \prod_{i=1}^{N} P(y_i; a) , \eqno{(2)} $$

is maximized. In Equation (2), $P$ denotes the probability density of the individual data points. Normally, the logarithm of the likelihood is used to simplify the maximization procedure. This method can be used for any given distribution. For a normal distribution the two methods give the same result.
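The following is a minimal sketch of the normal-distribution case, assuming hypothetical data with known standard deviation sigma and unknown mean mu (all names and values are invented for the example). It also illustrates the agreement with least squares, since the maximum likelihood estimate coincides with the sample mean:

    import numpy as np
    from scipy.optimize import minimize_scalar

    rng = np.random.default_rng(1)
    sigma = 0.5
    data = rng.normal(3.0, sigma, 100)   # hypothetical sample

    def neg_log_likelihood(mu):
        # For normal errors, -log L(mu) = sum_i (y_i - mu)^2 / (2 sigma^2)
        # up to an additive constant, so maximizing L is a least-squares problem.
        return np.sum((data - mu) ** 2) / (2.0 * sigma ** 2)

    result = minimize_scalar(neg_log_likelihood)
    print(f"ML estimate = {result.x:.3f}, sample mean = {data.mean():.3f}")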


