Abstract:
Accurate short-term prediction of indoor air temperature is a critical prerequisite for intelligent, proactive environmental control in modern laying hen houses. However, the thermal environment inside poultry facilities is governed by complex dynamics that exhibit pronounced nonlinearity, non-stationarity, and multi-scale temporal characteristics. These characteristics arise from the combined effects of slowly varying processes, such as diurnal and seasonal cycles, and comparatively rapid fluctuations induced by management operations and environmental control equipment. Such complexity poses significant challenges for conventional statistical models and standard deep learning approaches, which often struggle to capture both the long-term trends and the short-term variations in agricultural environmental time series. To address these challenges, this study proposes a hybrid temperature prediction framework that integrates Empirical Mode Decomposition (EMD), Long Short-Term Memory (LSTM) networks, and an attention mechanism, referred to as the EMD-LSTM-Attention model. The proposed approach aims to enhance the representation and learning of the multi-scale temporal features inherent in laying hen house temperature data. Specifically, EMD is first employed as an adaptive signal decomposition technique to decompose the original non-stationary temperature time series into a finite number of intrinsic mode functions (IMFs), each corresponding to oscillatory components at a different characteristic time scale. These IMFs are then treated as multi-channel inputs to an LSTM-based prediction network equipped with an attention mechanism. In this way, the model learns temporal dependencies within each decomposed component while dynamically assigning greater importance to the more informative historical time steps during prediction. The proposed model was evaluated using continuous indoor temperature data collected from a commercial laying hen house over an entire year, with measurements recorded at 10-minute intervals. To examine model robustness under different seasonal conditions, the dataset was organized into five separate, temporally continuous subsets: full-year, spring, summer, autumn, and winter. For each subset, a chronological split was applied, allocating 60% of the data for training, 20% for validation, and 20% for testing. All models were trained under identical experimental settings using the Adam optimizer and an early stopping strategy to ensure fair comparison and prevent overfitting. Model performance was assessed using three widely adopted evaluation metrics: mean absolute percentage error (MAPE), root mean squared error (RMSE), and the coefficient of determination (R²).
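For reference, these three metrics follow their standard definitions; the notation below (observed temperature y_t, predicted temperature ŷ_t, observation mean ȳ, and n test samples) is introduced here for illustration and is not taken verbatim from the paper:

\[
\mathrm{MAPE} = \frac{100\%}{n}\sum_{t=1}^{n}\left|\frac{y_t-\hat{y}_t}{y_t}\right|, \qquad
\mathrm{RMSE} = \sqrt{\frac{1}{n}\sum_{t=1}^{n}\left(y_t-\hat{y}_t\right)^2}, \qquad
R^2 = 1-\frac{\sum_{t=1}^{n}\left(y_t-\hat{y}_t\right)^2}{\sum_{t=1}^{n}\left(y_t-\bar{y}\right)^2}
\]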
To comprehensively evaluate the effectiveness of the proposed framework, the EMD-LSTM-Attention model was compared with multiple benchmark models, including traditional autoregressive (AR) models and several commonly used deep learning architectures: RNN, GRU, BiGRU, BiLSTM, and Transformer. In addition, ablation experiments were conducted to quantify the individual contributions of the EMD and attention modules. The results demonstrate that the proposed EMD-LSTM-Attention model consistently outperforms the benchmark models across all datasets. Under the default experimental configuration, with an input sequence length of 4 h and a prediction horizon of 10 min, the proposed model achieved a full-year MAPE of 1.43%, an RMSE of 0.291 °C, and an R² of 0.952.
Compared with the baseline models under the same conditions, the proposed approach exhibited superior overall prediction accuracy. Furthermore, the model maintained relatively low prediction errors across the spring, summer, autumn, and winter datasets, indicating stable performance under varying seasonal thermal conditions. A parameter sensitivity analysis was conducted to further investigate the influence of key input settings on model performance. Within the tested range of input sequence lengths, the model achieved its lowest prediction error with a 1.0 h historical input window; under this configuration, the full-year MAPE decreased from 1.43% (with the 4.0 h input window) to 0.76%. The impact of the prediction horizon was also examined by extending the forecast length from 0.5 h to 2.5 h. As the horizon increased, model errors exhibited an overall upward trend, yet prediction accuracy remained relatively high throughout the tested range. Overall, the findings of this study indicate that effectively integrating signal decomposition techniques with deep learning models can substantially enhance the accuracy and robustness of short-term temperature prediction in laying hen houses. The proposed EMD-LSTM-Attention framework provides a viable data-driven solution for capturing complex thermal dynamics and offers practical support for the development of predictive, intelligent environmental control systems in poultry production. Future research should focus on validating the proposed approach across multiple housing facilities and incorporating additional environmental variables to further improve model generalization and applicability.
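As an illustration of the kind of pipeline described above, the following minimal sketch decomposes a temperature series with EMD and feeds the resulting IMFs as multi-channel input to an LSTM with a simple attention layer. It is not the authors' implementation: the PyEMD and PyTorch libraries, the synthetic input series, the single-layer LSTM, the additive attention form, and all hyperparameters are assumptions made for illustration only.

import numpy as np
import torch
import torch.nn as nn
from PyEMD import EMD  # assumed library; the paper's own EMD implementation is not specified

# Synthetic stand-in for the 10-min indoor temperature series (one week here, not the paper's data).
steps = 7 * 144  # 144 ten-minute steps per day
t = np.arange(steps)
temperature = 22 + 3 * np.sin(2 * np.pi * t / 144) + 0.3 * np.random.randn(steps)

# 1) Adaptive decomposition into intrinsic mode functions (IMFs).
imfs = EMD()(temperature)  # shape: (n_imfs, steps)

# 2) Sliding windows: 24 past steps (4 h at 10-min resolution) -> next-step temperature.
seq_len = 24
X = np.stack([imfs[:, i:i + seq_len].T for i in range(steps - seq_len)])  # (samples, seq_len, n_imfs)
y = temperature[seq_len:]  # regression target, aligned with each window

# 3) Multi-channel LSTM whose hidden states are pooled by softmax attention over time steps.
class EMDLSTMAttention(nn.Module):
    def __init__(self, n_imfs, hidden_size=64):
        super().__init__()
        self.lstm = nn.LSTM(input_size=n_imfs, hidden_size=hidden_size, batch_first=True)
        self.score = nn.Linear(hidden_size, 1)   # attention score for each time step
        self.head = nn.Linear(hidden_size, 1)    # one-step-ahead temperature

    def forward(self, x):                        # x: (batch, seq_len, n_imfs)
        h, _ = self.lstm(x)                      # h: (batch, seq_len, hidden_size)
        w = torch.softmax(self.score(h), dim=1)  # weights over the seq_len time steps
        context = (w * h).sum(dim=1)             # attention-weighted summary of the window
        return self.head(context).squeeze(-1)

model = EMDLSTMAttention(n_imfs=imfs.shape[0])
pred = model(torch.tensor(X, dtype=torch.float32))  # untrained forward pass, for shape checking only
loss = nn.functional.mse_loss(pred, torch.tensor(y, dtype=torch.float32))  # loss that Adam would minimize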