Time series are typically demeaned before sample autocorrelation functions are computed. By the same logic, it would seem appealing to remove seasonal means from seasonal time series before computing sample autocorrelation functions. Yet standard practice is to remove only the overall mean, ignoring the possibility of seasonal mean shifts in the data. Whether or not a time series is seasonally demeaned has important consequences for the asymptotic behavior of its sample autocorrelation function. We investigate the effect of seasonal mean shifts, and of their removal, on the asymptotic distribution of sample autocorrelation functions, and we discuss the practical consequences of these theoretical results. We also examine the small-sample behavior of autocorrelation function estimates through Monte Carlo simulations.
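To make the two demeaning conventions contrasted above concrete, the following minimal Python sketch (not from the paper; the function names, the quarterly frequency, and the seasonal mean pattern are all illustrative assumptions) compares the sample autocorrelation function of a series after removing only the overall mean with the one obtained after removing season-specific means.

```python
import numpy as np

def sample_acf(x, nlags):
    """Sample autocorrelations of an already-demeaned series x."""
    n = len(x)
    denom = np.sum(x * x)
    return np.array([np.sum(x[k:] * x[:n - k]) / denom
                     for k in range(1, nlags + 1)])

def overall_demean(y):
    """Standard practice: subtract the single overall mean."""
    return y - y.mean()

def seasonal_demean(y, s):
    """Subtract each season's own mean (period s), season by season."""
    out = np.asarray(y, dtype=float).copy()
    for j in range(s):
        out[j::s] -= out[j::s].mean()
    return out

# Illustration: quarterly white noise plus a (hypothetical) seasonal mean pattern.
rng = np.random.default_rng(0)
n, s = 200, 4
seasonal_means = np.array([0.0, 2.0, 0.0, -2.0])
y = np.tile(seasonal_means, n // s) + rng.standard_normal(n)

print("ACF, overall mean removed:  ", np.round(sample_acf(overall_demean(y), 4), 2))
print("ACF, seasonal means removed:", np.round(sample_acf(seasonal_demean(y, s), 4), 2))
```

In this toy example the underlying noise is serially uncorrelated, yet removing only the overall mean leaves the periodic pattern in the data, so the estimated autocorrelations show a spurious seasonal cycle (negative at lag 2, positive at lag 4); removing the seasonal means yields autocorrelations near zero, illustrating why the choice between the two conventions matters.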