Forecasting Financial Time Series Using A Low Complexity Recurrent Neural Network And Evolutionary Learning Approach

Source: Journal of King Saud University - Computer and Information Sciences
Publisher: King Saud University
Main Author: Rout, Ajit Kumar (Author)
Other Authors: Dash, P. K. (Co-Author), Dash, Rajashree (Co-Author), Bisoi, Ranjeeta (Co-Author)
Volume/Issue: Vol. 29, No. 4
Peer-Reviewed: Yes
Country: Saudi Arabia
Publication Year: 2017
Pages: 536 - 552
DOI: 10.33948/0584-029-004-011
ISSN: 1319-1578
MD Number: 974274
Content Type: Research Articles
Language: English
Databases: science
Author Keywords:
Low Complexity FLANN Models | Recurrent Computationally Efficient FLANN | Differential Evolution | Hybrid Moderate Random Search PSO
Abstract: The paper presents a low complexity recurrent Functional Link Artificial Neural Network (FLANN) for predicting financial time series data, such as stock market indices, over horizons ranging from one day to one month ahead. Although different types of basis functions have previously been used in low complexity neural networks for stock market prediction, a comparative study is needed to choose the optimal combination for a reasonably accurate forecast. Several evolutionary learning methods, namely Particle Swarm Optimization (PSO), its hybrid moderate random search variant (HMRPSO), and Differential Evolution (DE), are adopted here to find the optimal weights of the recurrent computationally efficient functional link neural network (RCEFLANN), which uses a combination of linear and hyperbolic tangent basis functions. The performance of the RCEFLANN model is compared with that of low complexity neural networks using Trigonometric, Chebyshev, Laguerre, Legendre, and hyperbolic tangent basis functions in predicting stock prices from the Bombay Stock Exchange and Standard & Poor's 500 data sets, each trained with the different evolutionary methods. The results clearly show that the recurrent FLANN model trained with DE outperforms all other FLANN models trained in the same way.
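The abstract describes a functional link expansion (linear plus hyperbolic tangent basis terms) whose weights are tuned by an evolutionary search such as DE. The sketch below is a minimal Python illustration of that general idea, not the authors' RCEFLANN implementation: the recurrent feedback term and the PSO/HMRPSO alternatives are omitted, the toy series and all names (expand, mse, lags, bounds) are assumptions introduced here, and SciPy's differential_evolution stands in for the paper's DE learner.

    # Minimal sketch (assumptions noted above): a functional link expansion with
    # linear and tanh basis terms, whose weights are found by differential
    # evolution to make one-step-ahead predictions on a toy series.
    import numpy as np
    from scipy.optimize import differential_evolution

    def expand(windows):
        # Functional expansion: each lagged input appears once linearly
        # and once through a tanh basis function.
        return np.hstack([windows, np.tanh(windows)])

    def mse(weights, features, targets):
        # Linear output neuron over the expanded features plus a bias term.
        preds = features @ weights[:-1] + weights[-1]
        return np.mean((preds - targets) ** 2)

    # Toy series standing in for a normalized stock index.
    rng = np.random.default_rng(0)
    series = np.sin(np.linspace(0, 20, 300)) + 0.05 * rng.standard_normal(300)

    lags = 5  # number of past samples fed to the network
    windows = np.array([series[i:i + lags] for i in range(len(series) - lags)])
    targets = series[lags:]
    features = expand(windows)

    n_weights = 2 * lags + 1          # linear + tanh features + bias
    bounds = [(-2.0, 2.0)] * n_weights

    result = differential_evolution(mse, bounds, args=(features, targets),
                                    maxiter=100, seed=0, tol=1e-7)
    print("training MSE:", result.fun)

Multi-step horizons (the paper's one-day to one-month range) would be handled by changing the target offset, and the recurrent variant would additionally feed past outputs back into the expansion.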
