Trend lines are a measure of the urgency of a disease or risk factor. If the rate of an indicator is increasing over time, it may be considered more urgent than indicators whose rates are decreasing.
As with the rates themselves, there is random variation in the trend lines of rates, so a line that slopes upward may not represent a statistically significant increase, particularly if the rates are based on small numbers. For that reason, we test to determine whether we can be at least 95 percent confident that what appears to be an increase or decrease is real, not just the result of random fluctuation.
The measure used is the coefficient of the slope of the regression line over the time period. This coefficient is estimated by the least-squares method from the rates for each year and then tested against zero using Student's t-test. If the absolute value of the t statistic is less than the critical t value at the 95 percent level for the appropriate degrees of freedom, the slope is not considered statistically different from zero and the coefficient is set to zero. The degrees of freedom are the number of years minus two. Setting all non-significant coefficients to zero places them between the increasing and decreasing values, with equal weight.
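The procedure described above can be sketched in code. The function name `significant_slope` and the choice to pass the critical t value in as a parameter are illustrative assumptions, not part of the original method; the statistical steps (least-squares slope, t statistic with n - 2 degrees of freedom, zeroing non-significant slopes) follow the description.

```python
import math

def significant_slope(years, rates, t_crit):
    """Least-squares slope of rates over years, set to 0.0 when the
    slope is not statistically different from zero at the confidence
    level implied by t_crit (degrees of freedom = number of years - 2)."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(rates) / n
    # Sums of squares and cross-products for the least-squares fit
    sxx = sum((x - mean_x) ** 2 for x in years)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, rates))
    b1 = sxy / sxx                      # slope coefficient
    b0 = mean_y - b1 * mean_x           # intercept
    # Residual sum of squares around the fitted line
    sse = sum((y - (b0 + b1 * x)) ** 2 for x, y in zip(years, rates))
    df = n - 2
    se_b1 = math.sqrt(sse / df / sxx)   # standard error of the slope
    # t statistic; a perfect fit (se = 0) is trivially significant
    t = b1 / se_b1 if se_b1 > 0 else math.inf
    return b1 if abs(t) >= t_crit else 0.0
```

For example, with ten years of data the degrees of freedom are 8, and the two-tailed 95 percent critical t value is about 2.306; a steadily rising series keeps its slope, while a flat, noisy series has its coefficient set to zero.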