Abstract
This study introduces an advanced algorithm based on the Generalized Least Deviation Method (GLDM) tailored for the univariate time series analysis of COVID-19 data. At the core of this approach is the optimization of a loss function, strategically designed to enhance the accuracy of the model's predictions. The algorithm leverages second-order terms, crucial for capturing the complexities inherent in time series data. Our findings reveal that by optimizing the loss function and effectively utilizing second-order model dynamics, there is a marked improvement in predictive performance. This advancement leads to a robust and practical forecasting tool, significantly enhancing the accuracy and reliability of univariate time series forecasts in the context of monitoring COVID-19 trends.
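To make the abstract's description concrete, the following is a minimal sketch (not the paper's implementation) of fitting a second-order quasilinear recurrence to a univariate series by minimizing the sum of absolute deviations, the loss family the GLDM is built around. The specific feature set, lag depth, linear-programming formulation, and the use of `scipy.optimize.linprog` are illustrative assumptions.

```python
# Sketch: least-deviation fit of a second-order quasilinear recurrence
# y_t ~ f(y_{t-1}, y_{t-2}), with f linear in the coefficients but
# quadratic in the lagged values. Assumed formulation, not the authors' code.
import numpy as np
from scipy.optimize import linprog


def quasilinear_features(y_lag1, y_lag2):
    """Second-order quasilinear terms: linear in the coefficients,
    nonlinear (up to degree 2) in the lagged observations."""
    return np.column_stack([
        np.ones_like(y_lag1),    # intercept
        y_lag1, y_lag2,          # first-order lag terms
        y_lag1**2, y_lag2**2,    # squared lags
        y_lag1 * y_lag2,         # cross term
    ])


def fit_least_deviation(y):
    """Estimate coefficients by minimizing the sum of absolute residuals
    (least deviation), posed as a linear program with slack variables u_t
    bounding |residual_t| from above."""
    y = np.asarray(y, dtype=float)
    X = quasilinear_features(y[1:-1], y[:-2])   # lags 1 and 2
    target = y[2:]
    n, k = X.shape
    c = np.concatenate([np.zeros(k), np.ones(n)])          # minimize sum(u)
    A_ub = np.block([[X, -np.eye(n)], [-X, -np.eye(n)]])   # |y - Xb| <= u
    b_ub = np.concatenate([target, -target])
    bounds = [(None, None)] * k + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
    return res.x[:k]


def forecast(y, coef, horizon=7):
    """Iterate the fitted recurrence to produce multi-step forecasts."""
    history = list(map(float, y))
    for _ in range(horizon):
        x = quasilinear_features(np.array([history[-1]]), np.array([history[-2]]))
        history.append((x @ coef).item())
    return history[-horizon:]


if __name__ == "__main__":
    # Synthetic example series (hypothetical counts, not real COVID-19 data).
    series = np.array([100, 120, 150, 190, 240, 300, 370, 450], dtype=float)
    coef = fit_least_deviation(series)
    print(forecast(series, coef, horizon=3))
```

The least-deviation (L1) objective is reformulated as a linear program so it can be solved exactly with an off-the-shelf solver; this is one standard way to minimize absolute residuals, chosen here purely for illustration.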
Recommended Citation
Abotaleb, Mostafa (2024) "Soft Computing-Based Generalized Least Deviation Method Algorithm for Modeling and Forecasting COVID-19 using Quasilinear Recurrence Equations," Iraqi Journal for Computer Science and Mathematics: Vol. 5: Iss. 3, Article 39.
DOI: https://doi.org/10.52866/ijcsm.2024.05.03.028
Available at: https://ijcsm.researchcommons.org/ijcsm/vol5/iss3/39