The Stochastic Configuration Network (SCN) is a universal approximator that stochastically configures the input weights and biases of its hidden nodes under a supervisory mechanism. This paper extends the original SCN to improve learning efficiency by optimizing the configuration of hidden node parameters and by expanding the training set with unlabeled data. First, randomly configuring hidden node parameters introduces uncertainty and rarely yields optimal values. To address this limitation, this paper introduces an improved Pelican Optimization Algorithm (IPOA) with enhanced global optimization capability, which is then applied to optimize the hidden node configuration; this improvement boosts network learning efficiency and yields a more lightweight structure. Second, to address the challenge of limited labeled data for fully supervised SCN models, a semi-supervised learning approach is employed in which a small amount of labeled data is used to classify the unlabeled data. Finally, the improved SCN (ISCN) algorithm is applied to a real-world industrial process: predicting the setpoint of coal mill outlet temperature in a power system. Simulation results and real-world power plant applications demonstrate that the ISCN achieves faster convergence and better generalization ability than the original SCN.
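To make the baseline concrete, the following is a minimal sketch of the original SCN's incremental construction with its supervisory mechanism: candidate hidden-node parameters are drawn at random, a candidate is admitted only when it satisfies the supervisory inequality, and output weights are refit by least squares after each addition. The function name, schedule parameters (`lam`, `r`), and toy data are illustrative assumptions, not the paper's implementation; the IPOA-based parameter search described above would replace the random candidate sampling loop.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def scn_fit(X, y, L_max=25, T_max=100, lam=1.0, r=0.999, seed=0):
    """Incrementally build an SCN: hidden-node (w, b) pairs are drawn at
    random but accepted only if they satisfy the supervisory inequality.
    Hyperparameter names and defaults here are illustrative assumptions."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    e = y.copy()                       # current training residual
    H = np.empty((n, 0))               # hidden-layer output matrix
    for L in range(L_max):
        mu = (1.0 - r) / (L + 1)       # relaxing tolerance sequence
        best_xi, best_h = -np.inf, None
        for _ in range(T_max):         # random candidate configuration
            w = rng.uniform(-lam, lam, size=d)
            b = rng.uniform(-lam, lam)
            h = sigmoid(X @ w + b)
            # supervisory mechanism: xi >= 0 certifies the candidate
            # node reduces the residual error
            xi = (e @ h) ** 2 / (h @ h) - (1.0 - r - mu) * (e @ e)
            if xi > best_xi:
                best_xi, best_h = xi, h
        if best_xi <= 0:               # no admissible candidate this round
            continue
        H = np.column_stack([H, best_h])
        # refit output weights by least squares after adding the node
        beta, *_ = np.linalg.lstsq(H, y, rcond=None)
        e = y - H @ beta
    return H, e

# Toy regression problem to exercise the sketch
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X[:, 0]) + 0.05 * rng.standard_normal(200)
_, e = scn_fit(X, y)
print(float(np.mean(e ** 2)))  # residual MSE after training
```

The inner loop over `T_max` random candidates is the uncertainty that the abstract criticizes: the accepted node is only the best of a random batch, which motivates replacing this sampling with the IPOA search.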