Summary. Typically, T-optimality is used to construct optimal designs for discriminating between homoscedastic models with normally distributed observations. Extensions of this criterion to the heteroscedastic case and to binary response models have been made in the literature. In this paper, a new criterion based on the Kullback–Leibler distance is proposed to discriminate between rival models with non-normally distributed observations. The criterion is coherent with the approaches mentioned above. An equivalence theorem is provided for this criterion, and an algorithm for computing optimal designs is developed. The criterion is applied to discriminate between the popular Michaelis–Menten model and a typical extension of it under the log-normal and the gamma distributions.
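To give a rough feel for the kind of criterion the paper proposes, the following is a minimal numerical sketch, not the paper's actual algorithm. It assumes a log-normal error with known log-scale variance (so the Kullback–Leibler divergence between the two response distributions at a point x reduces to the squared difference of log-means over twice the variance), takes an illustrative extended model of Michaelis–Menten form plus a linear term, and replaces the inner minimization over the rival model's parameters by a crude grid search; all parameter values and design points are made up for illustration.

```python
import math

def eta_mm(x, V, K):
    # Michaelis-Menten mean response
    return V * x / (K + x)

def eta_ext(x, V, K, F):
    # Illustrative extended model (assumption): Michaelis-Menten plus a linear term
    return V * x / (K + x) + F * x

def kl_lognormal(m1, m2, sigma2=0.1):
    # KL divergence between two log-normal observations sharing the same
    # log-scale variance sigma2: (log m1 - log m2)^2 / (2 * sigma2)
    return (math.log(m1) - math.log(m2)) ** 2 / (2 * sigma2)

def kl_criterion(design, V1=1.0, K1=1.0, F1=0.2):
    # design: list of (support point, weight); the "true" model is the extended one.
    # Criterion value = min over rival Michaelis-Menten parameters (V2, K2)
    # of the design-weighted KL divergence (crude grid search, for illustration).
    best = float("inf")
    for i in range(41):
        V2 = 0.5 + 0.05 * i
        for j in range(57):
            K2 = 0.2 + 0.05 * j
            total = sum(w * kl_lognormal(eta_ext(x, V1, K1, F1),
                                         eta_mm(x, V2, K2))
                        for x, w in design)
            best = min(best, total)
    return best

# A three-point design cannot be matched exactly by the two-parameter rival,
# so its criterion value stays strictly positive.
d3 = [(0.2, 1 / 3), (1.0, 1 / 3), (5.0, 1 / 3)]
d1 = [(1.0, 1.0)]
print(kl_criterion(d3), kl_criterion(d1))
```

In a design-of-experiments implementation, the grid search over the rival parameters would be replaced by a proper optimizer, and the design itself would be optimized with an exchange-type algorithm guided by the equivalence theorem.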