In this paper we propose a new approach to the sequential monitoring of a parameter of a d-dimensional time series. We consider a closed-end method motivated by the likelihood ratio test principle and compare the new method with two alternative procedures. We also incorporate self-normalization, so that estimation of the long-run variance is not necessary. We prove that for a large class of testing problems the new detection scheme has the correct asymptotic level and is consistent. The asymptotic theory is illustrated for the important cases of monitoring a change in the mean, the variance and the correlation. A simulation study demonstrates that the new test outperforms the currently available procedures for these problems.