The analysis of scientific data requires accurate regression algorithms that minimize prediction error. Many machine learning algorithms, such as neural networks, rule-based methods, regression trees, and certain lazy learners, are used to meet this need. In recent years, various ensemble regression strategies have been developed to obtain stronger predictors with lower forecasting errors. Ensemble methods combine accurate models that make errors on different parts of the analyzed data. There are two main approaches to constructing ensemble regression algorithms: boosting and bagging. The aim of this article is to evaluate a boosting-based ensemble approach, forward stage-wise additive modelling (FSAM), for improving the predictive ability of several widely used base regressors. We applied 10 regression algorithms of four different types to 10 diverse datasets from different scientific areas and compared the experimental results in terms of correlation coefficient, mean absolute error, and root mean squared error. Furthermore, we used scatter plots to illustrate the effect of ensemble modelling on the prediction accuracy of the evaluated algorithms. We found empirically that FSAM generally improves the accuracy of the base regressors, or at least maintains the base regressor's performance.
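To make the boosting idea concrete, the following is a minimal, self-contained sketch of forward stage-wise additive modelling for regression: each stage fits a weak learner to the current residuals, and the ensemble prediction is the (shrunken) sum of the staged fits. The decision-stump base learner, the number of stages, and the shrinkage rate here are illustrative choices, not the article's experimental setup.

```python
import numpy as np

def fit_stump(x, y):
    """Fit a one-split regression stump minimizing squared error."""
    best = (np.inf, None, float(y.mean()), float(y.mean()))
    for t in np.unique(x):
        left, right = y[x <= t], y[x > t]
        if len(left) == 0 or len(right) == 0:
            continue
        err = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if err < best[0]:
            best = (err, t, float(left.mean()), float(right.mean()))
    _, t, lv, rv = best
    if t is None:  # no valid split: predict the mean
        return lambda z: np.full(np.shape(z), lv, dtype=float)
    return lambda z: np.where(z <= t, lv, rv)

def fsam(x, y, n_stages=20, lr=0.5):
    """Forward stage-wise additive modelling: fit stages to residuals."""
    stages = []
    resid = y.astype(float).copy()
    for _ in range(n_stages):
        h = fit_stump(x, resid)   # weak learner for current residuals
        stages.append(h)
        resid -= lr * h(x)        # update residuals with shrunken fit
    return lambda z: lr * sum(h(z) for h in stages)

# Toy example: noisy step function
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 100)
y = (x > 0.5).astype(float) + 0.05 * rng.standard_normal(100)

model = fsam(x, y)
rmse = np.sqrt(np.mean((model(x) - y) ** 2))
baseline_rmse = np.sqrt(np.mean((y - y.mean()) ** 2))  # mean predictor
```

On this toy data the staged ensemble drives the training RMSE well below the mean-predictor baseline, illustrating how residual fitting strengthens a weak base learner stage by stage.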