We present a detailed comparison between the 2–10 keV hard X-ray and infrared (IR) luminosity functions (LFs) of active galactic nuclei (AGNs). The composite X-ray-to-IR spectral energy distributions (SEDs) of AGNs used to link the hard X-ray LF (HXLF) and the IRLF are modelled with a simple but well-tested torus model based on the radiative transfer and photoionization code Cloudy. We consider four observational determinations of the evolution of the 2–10 keV HXLF and six evolution models of the obscured type 2 AGN fraction (f2). The 8.0- and 15-μm LFs for unobscured type 1, obscured type 2 and all AGNs are predicted from the HXLFs and then compared with the measurements currently available. We find that the IRLFs predicted from the HXLFs tend to underestimate the number of the most IR-luminous AGNs. This result is independent of the choice of HXLF and f2, and it is even more pronounced for the most recently measured HXLFs. We show that the discrepancy between the HXLFs and IRLFs can be largely resolved once the anticorrelation between the ultraviolet (UV)-to-X-ray slope αox and the UV luminosity LUV is properly taken into account. We also discuss other possible explanations for the discrepancy, such as a missing population of Compton-thick AGNs and a possible contribution of star formation in the host galaxy to the mid-IR emission. Meanwhile, we find that the HXLFs and IRLFs of AGNs might be more consistent with each other if the obscuration mechanisms of quasars and Seyferts are assumed to differ, corresponding to their different triggering and fuelling mechanisms. To clarify these issues, it would be very helpful to obtain more accurate measurements of the AGN IRLFs, especially in narrower redshift bins, and to separate more accurately the measurements for type 1 and type 2 AGNs.