In this paper, we investigate the problem of testing semiparametric hypotheses in locally stationary processes. The proposed method is based on an empirical version of the L2-distance between the true time-varying spectral density and its best approximation under the null hypothesis. Because this approach only requires estimation of integrals of the time-varying spectral density and its square, we do not have to choose a smoothing bandwidth for local estimation of the spectral density, in contrast to most other procedures discussed in the literature. Asymptotic normality of the test statistic is derived under both the null hypothesis and the alternative. We also propose a bootstrap procedure to obtain critical values for small sample sizes. Additionally, we investigate the finite-sample properties of the new method and compare it with currently available procedures by means of a simulation study. Finally, we illustrate the performance of the new test in two data examples, one concerning log returns of the S&P 500 and the other a well-known series of weekly egg prices.
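The L2-distance underlying the test can be sketched as follows; the notation $f$, $f_\theta$, and $\Theta$ is introduced here for illustration and is an assumption, not taken from the abstract itself:

```latex
% Minimal L^2-distance between the time-varying spectral density
% f(u,\lambda) and a parametric family \{ f_\theta : \theta \in \Theta \}
% representing the null hypothesis (notation assumed for illustration):
D^2 \;=\; \min_{\theta \in \Theta}
      \int_0^1 \int_{-\pi}^{\pi}
      \bigl( f(u,\lambda) - f_\theta(u,\lambda) \bigr)^2
      \, d\lambda \, du .
```

Expanding the square shows that $D^2$ involves only integrals of $f$ and of $f^2$ (besides purely parametric terms), which is why, as stated above, no smoothing bandwidth for local spectral density estimation is needed.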