The backpropagation algorithm is one of the most widely used methods for training artificial neural networks. However, it can be very slow in some practical applications, and many techniques have been proposed to speed it up and allow its use in an even broader range of applications. Although the backpropagation algorithm has been used for decades, we present here a set of computational results suggesting that replacing the traditional sigmoid activation functions with bihyperbolic functions improves its performance. To the best of our knowledge, this finding has not been previously published in the open literature. The efficiency and discrimination capacity of the proposed methodology are demonstrated through a set of computational experiments on traditional benchmark problems from the literature.
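To make the comparison concrete, the sketch below contrasts the traditional sigmoid with one common formulation of the symmetric bihyperbolic function, built as the difference of two hyperbola branches with a slope parameter λ and a curvature parameter τ. This is an illustrative assumption: the exact parameterisation and function names used in the experiments are not given in the abstract, and the paper's formulation may differ.

```python
import math

def sigmoid(x):
    """Traditional sigmoid activation: maps the reals onto (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def bihyperbolic(x, lam=1.0, tau=1.0):
    """Symmetric bihyperbolic activation (one common formulation; assumed here).

    Difference of two hyperbola branches centred at +/- 1/(4*tau).
    With lam == tau it maps the reals onto (0, 1) like the sigmoid,
    but slope (lam) and curvature (tau) remain independently tunable.
    """
    a = 1.0 / (4.0 * tau)
    return (math.sqrt(lam ** 2 * (x + a) ** 2 + tau ** 2)
            - math.sqrt(lam ** 2 * (x - a) ** 2 + tau ** 2)
            + 0.5)
```

Both functions equal 0.5 at the origin and share the same (0, 1) range when lam == tau, so the bihyperbolic function can be dropped into a standard backpropagation implementation in place of the sigmoid; the extra parameters are what give it room to outperform the fixed-shape sigmoid.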