Given a set of data points x[0..ndata-1], y[0..ndata-1] with individual standard deviations
sig[0..ndata-1], fit them to a straight line y = a + bx by minimizing χ². Returned are
a, b and their respective probable uncertainties siga and sigb, the chi-square chi2, and the
goodness-of-fit probability q (the probability that the fit would have χ² this large or larger). If mwt=0 on
input, then the standard deviations are assumed to be unavailable: q is returned as 1.0 and
the normalization of chi2 is to unit standard deviation on all points.
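The core of such a fit can be sketched in Python. This is an illustrative implementation of the weighted least-squares formulas for a straight line, not the routine's actual source; the function name `fit` and the tuple return are assumptions, and the goodness-of-fit probability q is omitted because it requires the incomplete gamma function:

```python
import math

def fit(x, y, sig=None):
    """Least-squares fit of y = a + b*x.

    If sig is None (the mwt=0 case), all standard deviations are
    taken as 1 and chi2 is normalized to unit standard deviation.
    Returns (a, b, siga, sigb, chi2).
    """
    n = len(x)
    mwt = sig is not None
    # Weights w_i = 1/sig_i^2; unit weights when sig is unavailable.
    w = [1.0 / s**2 for s in sig] if mwt else [1.0] * n

    S = sum(w)
    Sx = sum(wi * xi for wi, xi in zip(w, x))
    Sy = sum(wi * yi for wi, yi in zip(w, y))

    # Work in the shifted variable t_i = (x_i - Sx/S)/sig_i,
    # which reduces round-off error.
    sxos = Sx / S
    t = [(xi - sxos) * math.sqrt(wi) for xi, wi in zip(x, w)]
    St2 = sum(ti**2 for ti in t)

    b = sum(ti * yi * math.sqrt(wi)
            for ti, yi, wi in zip(t, y, w)) / St2
    a = (Sy - Sx * b) / S
    siga = math.sqrt((1.0 + Sx**2 / (S * St2)) / S)
    sigb = math.sqrt(1.0 / St2)

    # Chi-square of the fitted line.
    chi2 = sum(wi * (yi - a - b * xi)**2
               for wi, xi, yi in zip(w, x, y))

    if not mwt and n > 2:
        # No sig given: rescale the uncertainties assuming a good
        # fit with a uniform, unknown measurement error.
        sigdat = math.sqrt(chi2 / (n - 2))
        siga *= sigdat
        sigb *= sigdat
    return a, b, siga, sigb, chi2
```

For points lying exactly on y = 1 + 2x, the sketch returns a ≈ 1, b ≈ 2, and chi2 ≈ 0.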
- A collection of series objects. For example, to evaluate this indicator for two series,
you must pass a series collection containing those two series.
- A series which contains the standard deviations of the given data points.
- When this parameter is 0, the standard deviations are not taken into account.