Distributed lag

In statistics and econometrics, a distributed lag model is a model for time series data in which a regression equation predicts the current value of a dependent variable from the current and lagged (past) values of an explanatory variable. In general, y_t = \alpha + \sum_{i=0}^{n} \beta_i x_{t-i} + \epsilon_t, where y_t is the dependent series, x_t is the explanatory series, the \beta_i are the lag weights, and \epsilon_t is the error term.
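
A minimal sketch in Python of how such a model can be estimated by ordinary least squares on lagged regressors; the synthetic data, the lag length n, and the use of NumPy's least-squares routine are illustrative assumptions rather than any standard implementation.

import numpy as np

# Illustrative finite distributed lag model (assumed lag length n = 3):
#   y_t = alpha + beta_0 x_t + beta_1 x_{t-1} + ... + beta_n x_{t-n} + eps_t
rng = np.random.default_rng(0)
T, n = 200, 3
x = rng.normal(size=T)                       # explanatory series x_t
true_beta = np.array([0.5, 0.3, 0.2, 0.1])   # assumed lag weights beta_0..beta_3
y = np.array([
    1.0 + sum(true_beta[i] * x[t - i] for i in range(n + 1)) + rng.normal(scale=0.1)
    for t in range(n, T)
])                                           # dependent series for t = n, ..., T-1

# Design matrix: a constant column plus x_t, x_{t-1}, ..., x_{t-n}.
X = np.column_stack([np.ones(T - n)] + [x[n - i:T - i] for i in range(n + 1)])

# Ordinary least squares estimates of (alpha, beta_0, ..., beta_n).
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
print(coeffs)  # close to [1.0, 0.5, 0.3, 0.2, 0.1]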