Mismatch negativity (MMN) is conventionally measured by subtracting the averaged response to a set of standard stimuli from the averaged response to rarer deviant stimuli, and taking the amplitude of this difference wave in a given time window. This method is problematic when used to evaluate individuals, because it provides no estimate of variance. We describe a new approach, in which independent components with high trial-by-trial variance are first removed. Next, the preceding standard response is subtracted from each deviant response, giving a set of single-trial difference waves. We illustrate this approach in an analysis of MMN to brief tones in 17 adults. The best criterion for MMN combined a t-test with an index of inter-trial coherence, giving significant MMN in 14 (82%) of the 17 individuals. Single-trial methods can thus indicate which individuals show MMN. However, some clinically normal individuals showed no MMN, despite good behavioral discrimination of the stimuli.
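The single-trial subtraction and statistical test described above can be sketched as follows. This is a minimal illustration using simulated data, not the authors' pipeline: the array names, window, and effect size are hypothetical, and the inter-trial coherence component of the criterion is not implemented here, only the t-test on per-trial mean amplitudes.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated single-trial epochs (trials x time samples); hypothetical data.
# Each deviant epoch is assumed paired with its immediately preceding standard.
n_trials, n_times = 100, 50
standards = rng.normal(0.0, 1.0, (n_trials, n_times))
deviants = rng.normal(0.0, 1.0, (n_trials, n_times))
deviants[:, 20:30] -= 1.0  # inject a negative deflection (a toy "MMN") in a window

# Single-trial difference waves: each deviant minus its preceding standard
diff_waves = deviants - standards

# Mean amplitude of each trial's difference wave in the MMN time window
window = slice(20, 30)
trial_amps = diff_waves[:, window].mean(axis=1)

# One-sample t-test against zero: is the difference wave reliably negative
# across trials for this individual?
t_stat, p_val = stats.ttest_1samp(trial_amps, 0.0)
print(f"t = {t_stat:.2f}, p = {p_val:.4g}")
```

Because the variance is estimated across single-trial difference waves, the test yields an individual-level significance decision rather than relying on a group average.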