Abstract: The stochastic configuration network (SCN) is a novel incremental learning model that differs from other randomized neural network models: the parameters of its hidden-layer nodes are determined through a supervisory mechanism, which ensures fast convergence. Owing to its high learning efficiency, low degree of human intervention, and strong generalization ability, SCN has attracted considerable research interest from scholars worldwide and has developed rapidly since it was proposed in 2017. This paper presents, for the first time, a comprehensive survey of SCN covering its basic theory, typical algorithmic variants, application fields, and future research directions. First, the algorithmic principle, universal approximation capability, and advantages of SCN are analyzed theoretically. Second, various SCN variants are summarized, including DeepSCN, 2DSCN, Robust SCN, Ensemble SCN, Distributed parallel SCN, and Regularized SCN. Then, applications of SCN in different fields are introduced, including hardware implementation, computer vision, biomedical data analysis, fault detection and diagnosis, and system modeling and prediction. Finally, the development potential of SCN in convolutional neural network architectures, semi-supervised learning, unsupervised learning, multi-view learning, fuzzy neural networks, and recurrent neural networks is pointed out.
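The supervisory mechanism referred to above can be illustrated with a minimal sketch. The Python code below (a hedged illustration under the standard SCN formulation, not the authors' reference implementation; the function name `scn_fit` and the parameter choices are assumptions) incrementally adds sigmoid hidden nodes for single-output regression: each candidate node's random input weights and bias are kept only if they satisfy the supervisory inequality, and all output weights are then recomputed by global least squares in the style of SC-III.

```python
import numpy as np

def scn_fit(X, T, max_nodes=50, candidates=100, lam=10.0, r=0.99,
            tol=1e-2, seed=0):
    """Minimal single-output SCN sketch (names and defaults are assumed).

    Hidden nodes are added one at a time; a candidate node is admissible
    only if it satisfies the SCN supervisory inequality, and the output
    weights are refit globally by least squares after each addition.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    e = T.copy()                       # current residual error
    H = np.empty((n, 0))               # hidden-layer output matrix
    beta = np.empty((0,))
    params = []                        # stored (w, b) of accepted nodes
    for L in range(max_nodes):
        if np.linalg.norm(e) < tol:
            break                      # target accuracy reached
        mu = (1.0 - r) / (L + 1)       # slack sequence, mu_L -> 0
        best_xi, best_h, best_p = -np.inf, None, None
        for _ in range(candidates):
            # random configuration of the candidate node
            w = rng.uniform(-lam, lam, d)
            b = rng.uniform(-lam, lam)
            z = np.clip(X @ w + b, -50.0, 50.0)   # avoid exp overflow
            h = 1.0 / (1.0 + np.exp(-z))          # sigmoid activation
            # supervisory mechanism: xi_L >= 0 keeps convergence
            xi = (e @ h) ** 2 / (h @ h) - (1.0 - r - mu) * (e @ e)
            if xi > best_xi:
                best_xi, best_h, best_p = xi, h, (w, b)
        if best_xi < 0:
            break                      # no admissible candidate found
        H = np.column_stack([H, best_h])
        params.append(best_p)
        # global least-squares refit of all output weights
        beta, *_ = np.linalg.lstsq(H, T, rcond=None)
        e = T - H @ beta
    return params, beta, e
```

As a usage example, fitting a sine wave on [0, 1] with this sketch drives the residual norm well below that of the zero model, illustrating the convergence that the supervisory mechanism is designed to guarantee.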