Classical change point analysis aims at (1) detecting abrupt changes in the mean of a possibly non-stationary time series and at (2) identifying regions where the mean exhibits a piecewise constant behavior. In many applications, however, it is more reasonable to assume that the mean changes gradually in a smooth way. These gradual changes may either be non-relevant (i.e., small) or relevant for a specific problem at hand, and the present paper develops statistical methodology to detect the latter. More precisely, we consider the common nonparametric regression model $X_i = \mu(i/n) + \varepsilon_i$ with possibly non-stationary errors and propose a test for the null hypothesis that the maximum absolute deviation of the regression function $\mu$ from a functional $g(\mu)$ (such as the value $\mu(0)$ or the integral $\int_0^1 \mu(t)\,dt$) is smaller than a given threshold on a given interval $[x_0, x_1] \subseteq [0,1]$. A test for hypotheses of this type is developed using an appropriate estimator, say $\hat d_{\infty,n}$, for the maximum deviation $d_\infty = \sup_{t \in [x_0, x_1]} |\mu(t) - g(\mu)|$. We derive the limiting distribution of an appropriately standardized version of $\hat d_{\infty,n}$, where the standardization depends on the Lebesgue measure of the set of extremal points of the function $\mu(\cdot) - g(\mu)$. A refined procedure based on an estimate of this set is developed, and its consistency is proved. The results are illustrated by means of a simulation study and a data example.
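To make the quantity under test concrete, the following is a minimal sketch of a plug-in estimate of the maximal deviation $d_\infty$ in a simulated instance of the model, with $g(\mu) = \int_0^1 \mu(t)\,dt$. It is not the paper's procedure: the Nadaraya-Watson smoother, Gaussian kernel, bandwidth, and the i.i.d. errors are placeholder assumptions chosen for brevity.

```python
import numpy as np

# Hypothetical illustration (not the paper's estimator): plug-in estimate of
# d_infty = sup_{t in [x0, x1]} |mu(t) - g(mu)|, with g(mu) = int_0^1 mu(t) dt.

def local_average(x, t_grid, h):
    """Nadaraya-Watson estimate of mu on t_grid (Gaussian kernel, bandwidth h)."""
    n = len(x)
    design = np.arange(1, n + 1) / n  # design points i/n
    weights = np.exp(-0.5 * ((t_grid[:, None] - design[None, :]) / h) ** 2)
    return (weights @ x) / weights.sum(axis=1)

rng = np.random.default_rng(0)
n = 500
t = np.arange(1, n + 1) / n
mu = 0.5 * np.sin(2 * np.pi * t)            # smooth, gradually changing mean
x = mu + 0.2 * rng.standard_normal(n)       # i.i.d. errors here, for simplicity only

x0, x1 = 0.1, 0.9
grid = np.linspace(x0, x1, 400)
mu_hat = local_average(x, grid, h=0.05)

# Estimate g(mu) = int_0^1 mu(t) dt by averaging the fitted curve on [0, 1].
g_mu_hat = np.mean(local_average(x, np.linspace(0.0, 1.0, 400), h=0.05))

d_hat = np.max(np.abs(mu_hat - g_mu_hat))   # plug-in sup-deviation on [x0, x1]
print(f"estimated maximal deviation: {d_hat:.3f}")
```

The relevant-change test of the paper then compares a suitably standardized version of such a statistic with a given threshold; the standardization involving the set of extremal points of $\mu(\cdot) - g(\mu)$ is beyond this sketch.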