\input zb-basic \input zb-ioport \iteman{io-port 05588603} \itemau{Ha, Ming-Hu; Pedrycz, Witold; Zhang, Zhi-Ming; Tian, Da-Zeng} \itemti{The theoretical foundations of statistical learning theory of complex random samples.} \itemso{Far East J. Appl. Math. 34, No. 3, 315-336 (2009).} \itemab Summary: Statistical learning theory is commonly regarded as a sound framework for handling a variety of learning problems in the presence of small data samples, and it has become a rapidly progressing research area in machine learning. The theory is based on real random samples, however, and as such is not equipped to deal with statistical learning problems involving complex random samples, which arise in real-world scenarios. This paper develops statistical learning theory for complex random samples and in this sense generalizes the existing foundations. First, the definitions of a complex random variable and of the primary norm are introduced. Next, the mathematical expectation and variance of complex random variables are defined and some of their properties are established; for complex random variables, a number of fundamental results are discussed, including Markov's inequality, Chebyshev's inequality and Khinchine's law of large numbers. Then, the complex empirical risk functional, the complex expected risk functional and the complex empirical risk minimization principle are proposed. Finally, the key theorem of learning theory based on complex random samples is proved, and bounds on the rate of uniform convergence of the learning process are constructed. These investigations help lay the essential theoretical foundations for a systematic and comprehensive development of complex statistical learning theory. \itemrv{~} \itemcc{} \itemut{complex random variable; norm; complex empirical risk minimization principle; the key theorem; bounds on the rate of uniform convergence} \itemli{http://pphmj.com/abstract/3799.htm} \end