Adaptive Hybrid Architecture for Retail Demand Forecasting: Synergising XGBOOST and LSTM via Dynamic Weighing

Authors

  • Daksh Jain, School of Computing Science & Engineering, Galgotias University, Greater Noida, UP, India
  • Dipanshu Dedha, School of Computing Science & Engineering, Galgotias University, Greater Noida, UP, India
  • M. Navysari, School of Computing Science & Engineering, Galgotias University, Greater Noida, UP, India

Keywords:

Demand Forecasting, XGBoost, LSTM, Dynamic Weighing, Hybrid Architecture, Retail Analytics

Abstract

Inaccurate demand forecasting in the retail sector creates significant financial pressure, manifesting as capital tied up in overstock or sales lost to understock. Conventional forecasting methods, often reliant on simple historical averaging or linear statistical models, fail to account for the complex, nonlinear factors influencing modern customer behaviour, as well as seasonal promotions and holiday trends. This research addresses this gap by providing an AI-powered hybrid forecasting system. Using a publicly available Kaggle retail dataset, we developed a novel architecture that combines the computational efficiency of XGBoost (gradient boosting) with the sequential-dependency capture of Long Short-Term Memory (LSTM) networks. The primary contribution of this study is a dynamic weighting mechanism that adaptively balances predictions from the two models based on contextual volatility. Experimental results indicate that this adaptive hybrid approach outperforms either constituent model in isolation, offering a pathway towards a more resilient and profitable retail environment.
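The core idea of the dynamic weighting mechanism can be illustrated with a minimal sketch. The paper does not specify its exact weighting rule, so the scheme below is an assumption: recent demand volatility (coefficient of variation over a trailing window) is used as the blending weight, shifting emphasis toward the LSTM prediction in volatile periods and toward the XGBoost prediction in stable ones. The function name, window size, and volatility measure are all illustrative choices, not the authors' implementation.

```python
import statistics


def dynamic_weight_blend(xgb_preds, lstm_preds, history, window=7):
    """Blend per-step forecasts from two models.

    Illustrative sketch of volatility-driven weighting (assumed rule,
    not the paper's exact mechanism): the more volatile the recent
    demand history, the more weight the LSTM forecast receives.
    """
    blended = []
    for t, (xg, ls) in enumerate(zip(xgb_preds, lstm_preds)):
        # Trailing demand window ending at step t.
        recent = history[max(0, t - window + 1): t + 1]
        mean = statistics.fmean(recent)
        # Coefficient of variation as a simple volatility proxy.
        vol = statistics.pstdev(recent) / mean if mean else 0.0
        w_lstm = min(1.0, vol)  # cap the LSTM weight at 1.0
        blended.append(w_lstm * ls + (1.0 - w_lstm) * xg)
    return blended
```

With a perfectly stable history the volatility is zero, so the blend reduces to the XGBoost forecast alone; as the window becomes noisier, the LSTM forecast dominates.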

Published

13-03-2026

How to Cite

Jain, D., Dedha, D., & Navysari, M. (2026). Adaptive Hybrid Architecture for Retail Demand Forecasting: Synergising XGBOOST and LSTM via Dynamic Weighing. DMPedia Lecture Notes in Multidisciplinary Research, IMPACT26, 699-704. https://digitalmanuscriptpedia.com/conferences/index.php/DMP-LNMR/article/view/23