Functional additive models provide a flexible yet simple framework for regression with functional predictors. Using a data-driven basis in an additive rather than linear structure naturally extends the classical functional linear model. However, the critical issue of selecting the non-linear additive components has received little attention. In this work, we propose a new regularization framework for structure estimation in the context of reproducing kernel Hilbert spaces. The proposed approach builds on functional principal components, which greatly facilitates both implementation and theoretical analysis. Selection and estimation are carried out jointly by penalized least squares, with a penalty that encourages a sparse structure among the additive components. Theoretical properties, such as the rate of convergence, are investigated, and the empirical performance is demonstrated through simulation studies and a real data application.
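To make the pipeline concrete, the following is a minimal sketch of the two-stage idea described above: extract functional principal component scores from discretized curves, expand each score in a small basis, and fit the additive components by penalized least squares with a group-sparsity penalty. All data, basis, and tuning choices here (simulated Fourier curves, a cubic polynomial basis, a group-lasso penalty solved by proximal gradient) are illustrative assumptions, not the paper's actual settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- simulate functional predictors on a grid (illustrative, not the paper's data) ---
n, m, p = 200, 50, 4                    # n curves, m grid points, p FPC scores retained
t = np.linspace(0, 1, m)
phi = np.stack([np.sqrt(2) * np.cos(np.pi * k * t) for k in range(1, 5)])   # (4, m)
xi = rng.normal(size=(n, 4)) * np.array([2.0, 1.0, 0.5, 0.25])              # true scores
X = xi @ phi                                                                # (n, m)

# response depends non-linearly on the leading score only
y = xi[:, 0] + 0.5 * xi[:, 0] ** 2 + 0.05 * rng.normal(size=n)
y = y - y.mean()                        # center the response (no intercept below)

# --- step 1: FPCA via SVD of the centered curves ---
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:p].T                  # estimated FPC scores, (n, p)

# --- step 2: expand each score in a small (centered) polynomial basis ---
def basis(z, d=3):
    z = (z - z.mean()) / z.std()
    cols = np.column_stack([z ** k for k in range(1, d + 1)])
    return cols - cols.mean(axis=0)

B = [basis(scores[:, j]) for j in range(p)]   # one design block per additive component
d = B[0].shape[1]
Ball = np.hstack(B)                            # (n, p*d)

# --- step 3: group-lasso penalized least squares via proximal gradient (ISTA) ---
lam = 0.1                                      # sparsity penalty level (assumed)
L = np.linalg.norm(Ball, 2) ** 2 / n           # Lipschitz constant of the gradient
beta = np.zeros(p * d)
for _ in range(3000):
    grad = Ball.T @ (Ball @ beta - y) / n
    beta = beta - grad / L
    for j in range(p):                         # block soft-thresholding
        blk = slice(j * d, (j + 1) * d)
        nrm = np.linalg.norm(beta[blk])
        if nrm > 0:
            beta[blk] *= max(0.0, 1 - lam / (L * nrm))

norms = [float(np.linalg.norm(beta[j * d:(j + 1) * d])) for j in range(p)]
print([round(v, 3) for v in norms])   # the leading component should dominate; the rest shrink to zero
```

The block soft-thresholding step sets entire coefficient groups to zero, which is what selects whole additive components rather than individual basis coefficients; in this simulated example only the component for the first score carries signal, so its group norm should be the only sizeable one.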