|
Classical change point analysis aims at (1) detecting abrupt changes in the mean of a possibly non-stationary time series and at (2) identifying regions where the mean exhibits a piecewise constant behavior. In many applications, however, it is more reasonable to assume that the mean changes gradually in a smooth way. These gradual changes may either be non-relevant (i.e., small) or relevant for a specific problem at hand, and the present paper develops statistical methodology to detect the latter. More precisely, we consider the common nonparametric regression model $X_i = \mu(i/n) + \varepsilon_i$ with possibly non-stationary errors and propose a test for the null hypothesis that the maximum absolute deviation of the regression function $\mu$ from a functional $g(\mu)$ (such as the value $\mu(0)$ or the integral $\int_0^1 \mu(t)\,dt$) is smaller than a given threshold on a given interval $[x_0, x_1] \subseteq [0, 1]$. A test for this type of hypothesis is developed using an appropriate estimator, say $\hat{d}_{\infty,n}$, of the maximum deviation $d_\infty = \sup_{t \in [x_0, x_1]} |\mu(t) - g(\mu)|$. We derive the limiting distribution of an appropriately standardized version of $\hat{d}_{\infty,n}$, where the standardization depends on the Lebesgue measure of the set of extremal points of the function $\mu(\cdot) - g(\mu)$. A refined procedure based on an estimate of this set is developed and its consistency is proved. The results are illustrated by means of a simulation study and a data example.
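To make the plug-in idea concrete, the following is a minimal sketch (not the authors' procedure) of how the maximum deviation $d_\infty$ could be estimated: smooth the observations with a Nadaraya-Watson kernel estimator $\hat{\mu}$, approximate the integral functional $g(\mu) = \int_0^1 \mu(t)\,dt$ by the grid average of $\hat{\mu}$, and take the maximum of $|\hat{\mu}(t) - g(\hat{\mu})|$ over a grid in $[x_0, x_1]$. The Gaussian kernel, the bandwidth, the grid size, and the simulated AR(1)-type errors are illustrative assumptions.

```python
import numpy as np

def nw_estimate(x, t_grid, h):
    """Nadaraya-Watson estimate of mu on t_grid from observations x at
    design points i/n, using a Gaussian kernel with bandwidth h."""
    n = len(x)
    design = np.arange(1, n + 1) / n
    # weight matrix of shape (len(t_grid), n)
    w = np.exp(-0.5 * ((t_grid[:, None] - design[None, :]) / h) ** 2)
    return (w @ x) / w.sum(axis=1)

def max_deviation(x, x0=0.0, x1=1.0, h=0.08, grid_size=500):
    """Plug-in estimate of d_inf = sup_{t in [x0, x1]} |mu(t) - g(mu)|,
    with g(mu) taken to be the integral of mu over [0, 1]."""
    t_full = np.linspace(0.0, 1.0, grid_size)
    mu_hat = nw_estimate(x, t_full, h)
    g_hat = mu_hat.mean()                  # grid approximation of the integral
    in_window = (t_full >= x0) & (t_full <= x1)
    return np.max(np.abs(mu_hat[in_window] - g_hat))

# Illustrative data: gradual change mu(t) = 0.5 t with mildly dependent errors.
rng = np.random.default_rng(0)
n = 500
eps = rng.standard_normal(n)
for i in range(1, n):
    eps[i] += 0.3 * eps[i - 1]             # AR(1)-type serial dependence
x = 0.5 * np.arange(1, n + 1) / n + 0.2 * eps
print(max_deviation(x, x0=0.0, x1=1.0))
```

A test of the paper's null hypothesis would compare a suitably standardized version of this estimate with a critical value; that step depends on the limiting distribution and the set of extremal points discussed in the article and is not reproduced in this sketch.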
|
|
|