• Artificial neural networks;
  • Lagrange multiplier tests;
  • neglected nonlinearity tests;
  • heteroskedasticity;
  • wild bootstrap;
  • GARCH models;
  • simulation

The purpose of this article is to investigate the empirical performance of various statistical techniques for detecting the optimal structure of a neural network (NN) regression model. We are particularly concerned with specifying the NN architecture when the error component exhibits particular statistical properties, such as heteroskedasticity and non-normality. We consider the sequential testing procedure based on standard Lagrange multiplier (LM) tests for neglected nonlinearity and also examine three modifications of this test that are robust to heteroskedasticity. By means of Monte Carlo simulations, we investigate the ability of these procedures to identify the correct NN structure under different types of heteroskedasticity and noise distributions. The simulation results show that the robustified LM tests allow the researcher to control the complexity of the NN without having to model explicitly every statistical aspect of the data-generating process, which is generally not feasible with the standard LM test. Combining robust regression-based testing with bootstrapping and generalized autoregressive conditional heteroskedasticity (GARCH) modelling techniques increases the efficiency of the sequential statistical procedure in eliciting the optimal NN architecture.
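To make the ingredients of such a sequential procedure concrete, the following sketch implements a Lee–White–Granger-style LM test for neglected nonlinearity, together with a heteroskedasticity-robust variant in the spirit of Wooldridge and a wild-bootstrap p-value with Rademacher multipliers. This is a minimal illustration, not the article's exact procedure: the function names, the use of `tanh` hidden units with random weights, and the tuning parameters `q` and `B` are all assumptions made for the example.

```python
# Illustrative sketch of an LM test for neglected nonlinearity with a
# heteroskedasticity-robust variant and a wild-bootstrap p-value.
# All names and tuning choices here are assumptions for exposition.
import numpy as np

def lm_neglected_nonlinearity(y, X, q=3, robust=True, seed=0):
    """LM statistic; robust=True uses a regression-based robust form."""
    rng = np.random.default_rng(seed)
    n = len(y)
    Z = np.column_stack([np.ones(n), X])          # regressors under the null
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    e = y - Z @ beta                              # OLS residuals under H0
    G = rng.standard_normal((Z.shape[1], q))      # random hidden-unit weights
    Psi = np.tanh(Z @ G)                          # hidden-unit test regressors
    # Purge the test regressors of the linear part (residuals of Psi on Z)
    Pz, *_ = np.linalg.lstsq(Z, Psi, rcond=None)
    R = Psi - Z @ Pz
    if robust:
        # Robust form: LM = n - SSR from regressing a vector of ones
        # on the products of residuals and purged test regressors
        W = e[:, None] * R
        c, *_ = np.linalg.lstsq(W, np.ones(n), rcond=None)
        u = np.ones(n) - W @ c
        return n - u @ u
    # Standard form: n * R^2 from regressing e on (Z, R)
    A = np.column_stack([Z, R])
    g, *_ = np.linalg.lstsq(A, e, rcond=None)
    fit = A @ g
    return n * (fit @ fit) / (e @ e)

def wild_bootstrap_pvalue(y, X, q=3, B=199, seed=0):
    """Wild-bootstrap p-value: resample residuals with Rademacher signs."""
    rng = np.random.default_rng(seed)
    n = len(y)
    Z = np.column_stack([np.ones(n), X])
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    e = y - Z @ beta
    stat = lm_neglected_nonlinearity(y, X, q=q, seed=seed)
    boot = np.empty(B)
    for b in range(B):
        v = rng.choice([-1.0, 1.0], size=n)       # Rademacher multipliers
        y_b = Z @ beta + e * v                    # regenerate under the null
        boot[b] = lm_neglected_nonlinearity(y_b, X, q=q, seed=seed)
    return (1 + np.sum(boot >= stat)) / (B + 1)
```

The wild bootstrap preserves each observation's residual magnitude while randomizing its sign, which is why it remains valid under heteroskedasticity of unknown form; fixing `seed` inside the statistic keeps the random hidden-unit directions identical across bootstrap replications.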