Abstract:
In flood forecasting and reservoir flood-control operation, the stage-discharge relation of mainstreams subject to multi-tributary water jacking exhibits significant nonlinear characteristics that conventional linear models fail to capture accurately. To address this limitation, we develop a Long Short-Term Memory (LSTM)-based stage-discharge modeling framework that incorporates multi-tributary water-jacking influences. The advantages of the deep learning method are compared and verified against Gradient Boosting Regression Tree (GBRT), Random Forest (RF), and Support Vector Regression (SVR) benchmark models. SHapley Additive exPlanations (SHAP) interpretability techniques are employed to quantify contribution weights and elucidate the mechanistic impacts of tributary water-jacking factors. Application results at Xiangjiaba Hydrological Station show that the LSTM model effectively characterizes the complex stage-discharge relationship affected by multi-tributary water jacking, achieving a Nash-Sutcliffe Efficiency (NSE) of 0.948 during the test period and outperforming the GBRT, RF, and SVR counterparts. Interpretability analysis reveals critical water-jacking thresholds: significant tributary water jacking occurs when Hengjiang Station's discharge exceeds 300 m³/s concurrently with Gaochang Station's discharge surpassing 3 000 m³/s. The integrated LSTM-SHAP methodology achieves accurate simulation of stage-discharge relationships under multi-tributary water jacking and quantitative identification of critical discharge thresholds, providing a technical foundation for real-time flood-control operations that account for tributary water interactions.
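As a minimal illustrative sketch of the modeling setup described above (not the authors' implementation): mainstream and tributary discharges serve as inputs to a data-driven regressor of stage, and the fit is scored with the Nash-Sutcliffe Efficiency (NSE). The data here are synthetic, the feature names are hypothetical, and a GBRT benchmark model stands in for the LSTM so the example stays self-contained; the backwater term in the synthetic stage activates only past the thresholds reported in the abstract (Hengjiang > 300 m³/s and Gaochang > 3 000 m³/s).

```python
# Hypothetical sketch: stage-discharge regression with tributary inputs,
# scored by Nash-Sutcliffe Efficiency. Data are synthetic; a GBRT stands
# in for the LSTM described in the paper.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
n = 2000
q_main = rng.uniform(1000, 20000, n)  # mainstream discharge, m^3/s (synthetic)
q_heng = rng.uniform(0, 1000, n)      # Hengjiang tributary discharge (synthetic)
q_gao = rng.uniform(500, 8000, n)     # Gaochang tributary discharge (synthetic)

# Synthetic stage: a log rating curve plus a water-jacking (backwater) term
# that activates only beyond the thresholds stated in the abstract.
jacking = 0.3 * np.log1p(q_heng) * (q_heng > 300) * (q_gao > 3000)
stage = 260.0 + 2.5 * np.log(q_main) + jacking + rng.normal(0, 0.05, n)

X = np.column_stack([q_main, q_heng, q_gao])
model = GradientBoostingRegressor(random_state=0).fit(X[:1500], stage[:1500])

# Nash-Sutcliffe Efficiency on the held-out test split:
# NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)
obs, sim = stage[1500:], model.predict(X[1500:])
nse = 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)
print(f"NSE = {nse:.3f}")
```

An NSE of 1 indicates a perfect fit and values above roughly 0.9 are generally considered very good; the same metric, computed this way, is what the abstract's 0.948 refers to. Feature attributions for a fitted model of this form could then be obtained with a SHAP explainer to probe the threshold behavior.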