REFERENCES

  • Alleyne, I. R., “Optimal Grade Transition Policies for a High Pressure EVA Polymerization Plant,” Ph.D. Thesis, University of Alberta (2006).
  • Bao, Y. and Z. Liu, “A Fast Grid Search Method in Support Vector Regression Forecasting Time Series,” in “Intelligent Data Engineering and Automated Learning,” E. Corchado, H. Yin, V. Botti and C. Fyfe, Eds., Springer-Verlag, Berlin (2006), pp. 504–511.
  • Candes, E. J. and M. B. Wakin, “An Introduction to Compressive Sampling,” IEEE Signal Process. Mag. 25(2), 21–30 (2008).
  • Chang, C.-C. and C.-J. Lin, “LIBSVM: A Library for Support Vector Machines,” Software available at http://www.csie.ntu.edu.tw/∼cjlin/libsvm (2001).
  • Cherkassky, V. and Y. Ma, “Practical Selection of SVM Parameters and Noise Estimation for SVM Regression,” Neural Netw. 17(1), 113–126 (2004).
  • De Moor, B. L. R., “DaISy: Database for the Identification of Systems,” Department of Electrical Engineering, ESAT/SISTA, K.U. Leuven, Belgium. http://homes.esat.kuleuven.be/∼smc/daisy/ [used data set: pH Data, Section: Process Industry Systems, code number: 96-014] (2010).
  • Drucker, H., C. Burges, L. Kaufman, A. Smola and V. Vapnik, “Support Vector Regression Machines,” Proceedings of the 1996 Conference on Advances in Neural Information Processing Systems, 9, 155–161, MIT Press, Cambridge, MA (1997).
  • Flake, G. W. and S. Lawrence, “Efficient SVM Regression Training With SMO,” Mach. Learn. 46(1–3), 271–290 (2002).
  • Han, I.-S., C. Han and C.-B. Chung, “Melt Index Modeling With Support Vector Machines, Partial Least Squares, and Artificial Neural Networks,” J. Appl. Polym. Sci. 95, 967–974 (2005).
  • Hsu, C.-W., C.-C. Chang and C.-J. Lin, “A Practical Guide to Support Vector Classification,” Technical Report, Department of Computer Science and Information Engineering, National Taiwan University. Available at http://www.csie.ntu.edu.tw/∼cjlin/papers/guide/guide.pdf (2004).
  • Huber, P. J., “Robust Estimation of a Location Parameter,” Ann. Math. Stat. 35(1), 73–101 (1964).
  • Iqbal, M. H., U. Sundararaj and S. L. Shah, “A New Approach to Develop Dynamic Grey Box Model for a Plasticating Twin Screw Extruder,” Ind. Eng. Chem. Res. 49(2), 648–657 (2010).
  • Janssen, L. P. B. M., “Twin Screw Extrusion,” Elsevier Scientific Publishing Company, New York (1977).
  • Joseph, B. and C. B. Brosilow, “Inferential Control of Processes: Part I. Steady State Analysis and Design,” AIChE J. 24(3), 485–492 (1978).
  • Li, H., X. Zhu and B. Shi, “Nonlinear Identification Based on Least Squares Support Vector Machine,” 8th Control, Automation, Robotics and Vision Conference (ICARCV 2004), Vol. 3, pp. 2331–2335 (2004).
  • Ljung, L., “System Identification: Theory for the User,” Prentice Hall PTR, Upper Saddle River, NJ (1999).
  • Ljung, L., “Identification of Nonlinear Systems,” 9th International Conference of IEEE on Control, Automation, Robotics and Vision (ICARCV 2006) (2006).
  • Ljung, L., Q. Zhang, P. Lindskog, A. Juditsky and R. Singh, “An Integrated System Identification Toolbox for Linear and Non-Linear Models,” Proc. 14th IFAC Symposium on System Identification, Newcastle, Australia (2006).
  • McAvoy, T. J., E. Hsu and S. Lowenthal, “Dynamics of pH in Controlled Stirred Tank Reactor,” Ind. Eng. Chem. Process Des. Dev. 11(1), 68–70 (1972).
  • Mercer, J., “Functions of Positive and Negative Type, and Their Connection With the Theory of Integral Equations,” Phil. Trans. R. Soc. Lond. Ser. A 209, 415–446 (1909).
  • Nandi, S., Y. Badhe, J. Lonari, U. Sridevi, B. Rao, S. S. Tambe and B. D. Kulkarni, “Hybrid Process Modeling and Optimization Strategies Integrating Neural Networks/Support Vector Regression and Genetic Algorithms: Study of Benzene Isopropylation on hbeta Catalyst,” Chem. Eng. J. 97(2–3), 115–129 (2004).
  • Ohshima, M. and M. Tanigaki, “Quality Control of Polymer Production Processes,” J. Process Control 10(2–3), 135–148 (2000).
  • Park, J. and I. W. Sandberg, “Universal Approximation Using Radial-Basis-Function Networks,” Neural Comput. 3(2), 246–257 (1991).
  • Pearson, R. K., “Nonlinear Empirical Modeling Techniques,” Comput. Chem. Eng. 30, 1514–1528 (2006).
  • Platt, J. C., “Fast Training of Support Vector Machines Using Sequential Minimal Optimization,” in “Advances in Kernel Methods: Support Vector Learning,” B. Schölkopf, C. J. Burges and A. J. Smola, Eds., MIT Press, Cambridge, MA, pp. 185–208 (1999).
  • Pontil, M., S. Mukherjee and F. Girosi, “On the Noise Model of Support Vector Machines Regression,” Algorithmic Learning Theory, 11th International Conference, ALT 2000, Sydney, Australia, December 2000, Proceedings, Vol. 1968, pp. 316–324, Springer-Verlag, Berlin (2000).
  • Rhodes, C. and M. Morari, “Determining the Model Order of Nonlinear Input/Output Systems,” AIChE J. 44(1), 151–163 (1998).
  • Rojo-Álvarez, J. L., M. Martínez-Ramón, M. de Prado-Cumplido, A. Artés-Rodríguez and A. R. Figueiras-Vidal, “Support Vector Method for Robust ARMA System Identification,” IEEE Trans. Signal Process. 52(1), 155–164 (2004).
  • Saunders, G., A. Gammerman and V. Vovk, “Ridge Regression Learning Algorithm in Dual Variables,” Proc. 15th International Conf. on Machine Learning, Morgan Kaufmann, San Francisco, CA, pp. 515–521 (1998).
  • Schölkopf, B., K.-K. Sung, C. Burges, F. Girosi, P. Niyogi, T. Poggio and V. Vapnik, “Comparing Support Vector Machines With Gaussian Kernels to Radial Basis Function Classifiers,” IEEE Trans. Signal Process. 45(11), 2758–2765 (1997).
  • Schölkopf, B., P. Bartlett, A. Smola and R. Williamson, “Support Vector Regression With Automatic Accuracy Control,” in “Proceedings of Eighth International Conference on Artificial Neural Networks,” L. Niklasson, M. Boden and T. Ziemke, Eds., Prentice Hall PTR, Upper Saddle River, NJ, pp. 111–116 (1998).
  • Shi, J. and X. Liu, “Melt Index Prediction by Weighted Least Squares Support Vector Machines,” J. Appl. Polym. Sci. 101(1), 285–289 (2006).
  • Smola, A. J. and B. Schölkopf, “A Tutorial on Support Vector Regression,” Technical Report NC2-TR-1998-030, NeuroCOLT2, ESPRIT Working Group on Neural and Computational Learning Theory, Royal Holloway College, University of London (2003).
  • Suykens, J. A. K., “Support Vector Machines and Kernel-Based Learning for Dynamical Systems Modelling,” Proceedings of the 15th IFAC Symposium on System Identification, Saint-Malo, France, pp. 1029–1037 (2009).
  • Suykens, J. A. K. and J. Vandewalle, “Least Squares Support Vector Machine Classifiers,” Neural Process. Lett. 9(3), 293–300 (1999).
  • Vapnik, V. N., “Statistical Learning Theory,” John Wiley & Sons, New York (1998).
  • Vapnik, V. N. and A. Chervonenkis, “Theory of Pattern Recognition (In Russian),” Nauka, Moscow (1974).
  • Vogt, M., K. Spreitzer and V. Kecman, “Identification of a High Efficiency Boiler by Support Vector Machines Without Bias Term,” System Identification 2003: A Proceedings Volume from the 13th IFAC Symposium on System Identification, Rotterdam, The Netherlands, August 27–29, 2003, Vol. 1, pp. 465–471 (2004).
  • Wang, J., Q. Chen and Y. Chen, “RBF Kernel Based Support Vector Machine With Universal Approximation and Its Application,” in “Advances in Neural Networks—ISNN 2004, Part I,” F. Yin, J. Wang and C. Guo, Eds., Springer-Verlag, Berlin, pp. 512–517 (2004).
  • Weston, J., A. Gammerman, M. O. Stitson, V. Vapnik, V. Vovk and C. Watkins, “Support Vector Density Estimation,” in “Advances in Kernel Methods—Support Vector Learning,” B. Schölkopf, C. J. C. Burges and A. J. Smola, Eds., MIT Press, Cambridge, MA, pp. 293–305 (1999).
  • Xi, X.-C., A.-N. Poo and S.-K. Chou, “Support Vector Regression Model Predictive Control on a HVAC Plant,” Control Eng. Pract. 15(8), 897–908 (2007).
  • Yan, W., H. Shao and X. Wang, “Soft Sensing Modeling Based on Support Vector Machine and Bayesian Model Selection,” Comput. Chem. Eng. 28(8), 1489–1498 (2004).