A joint journal of the IEEE and the CAA, publishing high-quality papers in English on original theoretical/experimental research and development in all areas of automation
Volume 8 Issue 5
May 2021

IEEE/CAA Journal of Automatica Sinica

  • JCR Impact Factor: 11.8, Top 4% (SCI Q1)
  • CiteScore: 17.6, Top 3% (Q1)
  • Google Scholar h5-index: 77, Top 5
Citation: X. R. Hou, K. Wang, C. Zhong, and Z. Wei, "ST-Trader: A Spatial-Temporal Deep Neural Network for Modeling Stock Market Movement," IEEE/CAA J. Autom. Sinica, vol. 8, no. 5, pp. 1015-1024, May 2021. doi: 10.1109/JAS.2021.1003976

ST-Trader: A Spatial-Temporal Deep Neural Network for Modeling Stock Market Movement

doi: 10.1109/JAS.2021.1003976
  • Stocks that are fundamentally connected with each other tend to move together, and accounting for such common trends is believed to benefit stock movement forecasting. However, these signals are not trivial to model because the connections among stocks are not physically observable and must be estimated from volatile data. Motivated by this observation, we propose a framework that incorporates the inter-connection of firms to forecast stock prices. To effectively utilize a large set of fundamental features, we further design a novel pipeline. First, we use a variational autoencoder (VAE) to reduce the dimension of stock fundamental information and then cluster stocks into a graph structure (fundamental clustering). Second, a hybrid model of graph convolutional network and long short-term memory network (GCN-LSTM), with an adjacency matrix learnt from the VAE, is proposed for graph-structured stock market forecasting. Experiments on minute-level U.S. stock market data demonstrate that our model effectively captures both spatial and temporal signals and achieves a substantial improvement over baseline methods. The proposed model is promising for other applications in which a hidden spatial dependency can be exploited to improve time-series prediction.
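The fundamental-clustering step described above can be illustrated with a short sketch. The snippet below is not the authors' code: it assumes a per-stock matrix of fundamental features, compresses it with a small variational autoencoder, clusters the latent means with K-means, and connects stocks that share a cluster in the adjacency matrix later passed to the GCN. The layer sizes, latent dimension, and cluster count are illustrative assumptions.

```python
# Sketch of VAE-based fundamental clustering -> stock adjacency matrix.
# Not the paper's implementation; dimensions and hyper-parameters are guesses.
import numpy as np
import torch
import torch.nn as nn
from sklearn.cluster import KMeans

class VAE(nn.Module):
    def __init__(self, n_features: int, latent_dim: int = 8):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_features, 64), nn.ReLU())
        self.mu = nn.Linear(64, latent_dim)
        self.logvar = nn.Linear(64, latent_dim)
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 64), nn.ReLU(), nn.Linear(64, n_features))

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization trick
        return self.decoder(z), mu, logvar

def vae_loss(x, x_hat, mu, logvar):
    recon = nn.functional.mse_loss(x_hat, x, reduction="sum")      # reconstruction error
    kld = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())  # KL divergence
    return recon + kld

def fundamental_adjacency(fundamentals: np.ndarray, n_clusters: int = 10,
                          epochs: int = 200) -> np.ndarray:
    """fundamentals: (n_stocks, n_features) -> binary (n_stocks, n_stocks) adjacency."""
    x = torch.tensor(fundamentals, dtype=torch.float32)
    vae = VAE(x.shape[1])
    opt = torch.optim.Adam(vae.parameters(), lr=1e-3)
    for _ in range(epochs):
        opt.zero_grad()
        x_hat, mu, logvar = vae(x)
        vae_loss(x, x_hat, mu, logvar).backward()
        opt.step()
    with torch.no_grad():
        _, mu, _ = vae(x)                                        # latent mean as the embedding
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(mu.numpy())
    adj = (labels[:, None] == labels[None, :]).astype(float)     # same cluster -> edge
    np.fill_diagonal(adj, 1.0)                                   # keep self-loops for the GCN
    return adj
```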

     

  • [1]
    I. Ben-David, F. Franzoni, and R. Moussawi, “Do ETFs increase volatility?” J. Finance, vol. 73, no. 6, pp. 2471–2535, Dec. 2018. doi: 10.1111/jofi.12727
    [2]
    B. C. Greenwald, J. Kahn, P. D. Sonkin, and M. Van Biema, Value Investing: From Graham to Buffett and Beyond. New York: John Wiley & Sons, 2004.
    [3]
    L. Chen, X. M. Hu, W. Tian, H. Wang, D. P. Cao, and F. Y. Wang, “Parallel planning: A new motion planning framework for autonomous driving,” IEEE/CAA J. Autom. Sinica, vol. 6, no. 1, pp. 236–246, Jan. 2019. doi: 10.1109/JAS.2018.7511186
    [4]
    Z. P. Tan, J. Chen, Q. Kang, M. C. Zhou, A. Abusorrah, and K. Sedraoui, “Dynamic embedding projection-gated convolutional neural networks for text classification,” IEEE Trans. Neural Netw. Learn. Syst., 2021.
    [5]
    R. Ying, R. He, K. Chen, P. Eksombatchai, W. L. Hamilton, and J. Leskovec, “Graph convolutional neural networks for web-scale recommender systems,” in Proc. 24th ACM SIGKDD International Conf. Knowledge Discovery & Data Mining, London, United Kingdom, 2018, pp. 974−983.
    [6]
    T. Xie and J. C. Grossman, “Crystal graph convolutional neural networks for an accurate and interpretable prediction of material properties,” Phys. Rev. Lett., vol. 120, no. 14, Article No. 145301, Apr. 2018. doi: 10.1103/PhysRevLett.120.145301
    [7]
    D. Rapach and G. F. Zhou, “Forecasting stock returns,” in Handbook of Economic Forecasting. Amsterdam: Elsevier, 2013, vol. 2, pp. 328−383.
    [8]
    B. Graham, The Intelligent Investor. New Delhi: Prabhat Prakashan, 1965.
    [9]
    R. W. Banz, “The relationship between return and market value of common stocks,” J. Finan. Econom., vol. 9, no. 1, pp. 3–18, Mar. 1981. doi: 10.1016/0304-405X(81)90018-0
    [10]
    S. Basu, “Investment performance of common stocks in relation to their price-earnings ratios: A test of the efficient market hypothesis,” J. Finance, vol. 32, no. 3, pp. 663–682, Jun. 1977. doi: 10.1111/j.1540-6261.1977.tb01979.x
    [11]
    N. Jegadeesh, “Evidence of predictable behavior of security returns,” J. Finance, vol. 45, no. 3, pp. 881–898, Jul. 1990. doi: 10.1111/j.1540-6261.1990.tb05110.x
    [12]
    E. F. Fama and K. R. French, “A five-factor asset pricing model,” J. Finan. Econom., vol. 116, no. 1, pp. 1–22, Apr. 2015. doi: 10.1016/j.jfineco.2014.10.010
    [13]
    W. Huang, Y. Nakamori, and S. Y. Wang, “Forecasting stock market movement direction with support vector machine,” Comput. Operat. Res., vol. 32, no. 10, pp. 2513–2522, Oct. 2005. doi: 10.1016/j.cor.2004.03.016
    [14]
    Y. L. Lin, H. X. Guo, and J. L. Hu, “An SVM-based approach for stock market trend prediction,” in Proc. Int. Joint Conf. Neural Networks, Dallas, TX, USA, 2013, pp. 1−7.
    [15]
    L. Khaidem, S. Saha, and S. R. Dey, “Predicting the direction of stock market prices using random forest,” arXiv preprint arXiv: 1605.00003, 2016.
    [16]
    A. Krizhevsky, I. Sutskever, and G. E. Hinton, “ImageNet classification with deep convolutional neural networks,” in Proc. 25th Int. Conf. Neural Information Processing Systems, Red Hook, NY, United States, 2012, pp. 1097−1105.
    [17]
    K. Cho, B. Van Merrienboer, C. Gulcehre, D. Bahdanau, F. Bougares, H. Schwenk, and Y. Bengio, “Learning phrase representations using RNN encoder-decoder for statistical machine translation,” arXiv preprint arXiv: 1406.1078, 2014
    [18]
    Q. Cao, K. B. Leggio, and M. J. Schniederjans, “A comparison between Fama and French’s model and artificial neural networks in predicting the Chinese stock market,” Comput. Operat. Res., vol. 32, no. 10, pp. 2499–2512, Oct. 2005. doi: 10.1016/j.cor.2004.03.015
    [19]
    D. Olson and C. Mossman, “Neural network forecasts of Canadian stock returns using accounting ratios,” Int. J. Forecasting, vol. 19, no. 3, pp. 453–465, Jul.–Sept. 2003. doi: 10.1016/S0169-2070(02)00058-4
    [20]
    M. Abe and H. Nakayama, “Deep learning for forecasting stock returns in the cross-section,” in Proc. Pacific-Asia Conf. Knowledge Discovery and Data Mining. Melbourne, VIC, Australia: Springer, 2018, pp. 273−284.
    [21]
    C. Krauss, X. A. Do, and N. Huck, “Deep neural networks, gradient-boosted trees, random forests: Statistical arbitrage on the S&P 500,” Eur. J. Oper. Res., vol. 259, no. 2, pp. 689–702, Jun. 2017. doi: 10.1016/j.ejor.2016.10.031
    [22]
    T. Fischer and C. Krauss, “Deep learning with long short-term memory networks for financial market predictions,” Eur. J. Oper. Res., vol. 270, no. 2, pp. 654–669, Oct. 2018. doi: 10.1016/j.ejor.2017.11.054
    [23]
    D. M. Q. Nelson, A. C. M. Pereira, and R. A. de Oliveira, “Stock market’s price movement prediction with LSTM neural networks,” in Proc. Int. Joint Conf. Neural Networks, Anchorage, AK, USA, 2017, pp. 1419−1426.
    [24]
    J. Z. Wang, J. J. Wang, Z. G. Zhang, and S. P. Guo, “Forecasting stock indices with back propagation neural network,” Exp. Syst. Appl., vol. 38, no. 11, pp. 14346–14355, Oct. 2011.
    [25]
    T. J. Hsieh, H. F. Hsiao, and W. C. Yeh, “Forecasting stock markets using wavelet transforms and recurrent neural networks: An integrated system based on artificial bee colony algorithm,” Appl. Soft Comput., vol. 11, no. 2, pp. 2510–2525, Mar. 2011. doi: 10.1016/j.asoc.2010.09.007
    [26]
    M. Khashei and M. Bijari, “An artificial neural network (p, d, q) model for timeseries forecasting,” Exp. Syst. Appl., vol. 37, no. 1, pp. 479–489, Jan. 2010. doi: 10.1016/j.eswa.2009.05.044
    [27]
    A. P. Ratto, S. Merello, L. Oneto, Y. K. Ma, L. Malandri, and E. Cambria, “Ensemble of technical analysis and machine learning for market trend prediction,” in Proc. IEEE Symp. Series on Computational Intelligence, Bangalore, India, 2018, pp. 2090−2096.
    [28]
    C. R. Cheng, F. Tan, X. R. Hou, and Z. Wei, “Success prediction on crowdfunding with multimodal deep learning,” in Proc. 28th Int. Joint Conf. Artificial Intelligence, Macau, China, 2019, pp. 2158−2164.
    [29]
    F. Tan, Z. Wei, J. He, X. Wu, B. Peng, H. R. Liu, and Z. Y. Yan, “A blended deep learning approach for predicting user intended actions,” in Proc. IEEE Int. Conf. Data Mining, Singapore, 2018, pp. 487−496.
    [30]
    X. R. Hou, K. Wang, J. Zhang, and Z. Wei, “An enriched time-series forecasting framework for long-short portfolio strategy,” IEEE Access, vol. 8, pp. 31992–32002, Feb. 2020. doi: 10.1109/ACCESS.2020.2973037
    [31]
    J. J. Wang, J. Z. Wang, Z. G. Zhang, and S. P. Guo, “Stock index forecasting based on a hybrid model,” Omega, vol. 40, no. 6, pp. 758–766, Dec. 2012. doi: 10.1016/j.omega.2011.07.008
    [32]
    Y. K. Kwon and B. R. Moon, “A hybrid neurogenetic approach for stock forecasting,” IEEE Trans. Neural Netw., vol. 18, no. 3, pp. 851–864, May 2007. doi: 10.1109/TNN.2007.891629
    [33]
    R. de A. Araujo, A. R. L. Junior, and T. A. E. Ferreira, “A quantum-inspired intelligent hybrid method for stock market forecasting,” in Proc. IEEE Congr. Evolutionary Computation (IEEE World Congr. Computational Intelligence), Hong Kong, China, 2008, pp. 1348−1355.
    [34]
    S. B. Wankhade, D. Surana, N. J. Mansatta, and K. Shah, “Hybrid model based on unification of technical analysis and sentiment analysis for stock price prediction,” Int. J. Comput. Technol., vol. 11, no. 9, pp. 3025–3033, Dec. 2013. doi: 10.24297/ijct.v11i9.3415
    [35]
    J. Masci, D. Boscaini, M. M. Bronstein, and P. Vandergheynst, “Geodesic convolutional neural networks on riemannian manifolds,” in Proc. IEEE Int. Conf. Computer Vision Workshops, Santiago, Chile, 2015, pp. 832−840.
    [36]
    M. Niepert, M. Ahmed, and K. Kutzkov, “Learning convolutional neural networks for graphs,” in Proc. 33rd Int. Conf. Machine Learning, New York, 2016, pp. 2014−2023.
    [37]
    J. Bruna, W. Zaremba, A. Szlam, and Y. LeCun, “Spectral networks and locally connected networks on graphs,” arXiv preprint arXiv: 1312.6203, 2013.
    [38]
    M. Defferrard, X. Bresson, and P. Vandergheynst, “Convolutional neural networks on graphs with fast localized spectral filtering,” in Proc. 30th Int. Conf. Neural Information Processing Systems, Red Hook, NY, United States, 2016, pp. 3844−3852.
    [39]
    T. N. Kipf and M. Welling, “Semi-supervised classification with graph convolutional networks,” arXiv preprint arXiv: 1609.02907, 2016.
    [40]
    B. Yu, H. T. Yin, and Z. X. Zhu, “Spatio-temporal graph convolutional networks: A deep learning framework for traffic forecasting,” arXiv preprint arXiv: 1709.04875, 2017.
    [41]
    C. L. Li, Z. Cui, W. M. Zheng, C. Y. Xu, and J. Yang, “Spatio-temporal graph convolution for skeleton based action recognition,” arXiv preprint arXiv: 1802.09834, 2018.
    [42]
    A. R. Rivera and O. Chae, “Spatiotemporal directional number transitional graph for dynamic texture recognition,” IEEE Trans. Pattern Anal. Mach. Intell., vol. 37, no. 10, pp. 2146–2152, Oct. 2015. doi: 10.1109/TPAMI.2015.2392774
    [43]
    J. T. Ke, H. Y. Zheng, H. Yang, and X. M. Chen, “Short-term forecasting of passenger demand under on-demand ride services: A spatio-temporal deep learning approach,” Transp. Res. C:Emerg. Technol., vol. 85, pp. 591–608, Dec. 2017. doi: 10.1016/j.trc.2017.10.016
    [44]
    M. Khodayar and J. H. Wang, “Spatio-temporal graph deep neural network for short-term wind speed forecasting,” IEEE Trans. Sustain. Energy, vol. 10, no. 2, pp. 670–681, Apr. 2019. doi: 10.1109/TSTE.2018.2844102
    [45]
    M. Khodayar, S. Mohammadi, M. E. Khodayar, J. H. Wang, and G. Y. Liu, “Convolutional graph autoencoder: A generative deep neural network for probabilistic spatio-temporal solar irradiance forecasting,” IEEE Trans. Sustain. Energy, vol. 11, no. 2, pp. 571–583, Apr. 2020. doi: 10.1109/TSTE.2019.2897688
    [46]
    F. R. K. Chung, Spectral Graph Theory. Providence: American Mathematical Society, 1997.
    [47]
    D. K. Hammond, P. Vandergheynst, and R. Gribonval, “Wavelets on graphs via spectral graph theory,” Appl. Comput. Harmon. Anal., vol. 30, no. 2, pp. 129–150, Mar. 2011. doi: 10.1016/j.acha.2010.04.005
    [48]
    D. P. Kingma and J. Ba, “Adam: A method for stochastic optimization,” arXiv preprint arXiv: 1412.6980, 2014.
    [49]
    T. Tieleman and G. Hinton, “Lecture 6.5-rmsprop: Divide the gradient by a running average of its recent magnitude,” COURSERA, vol. 4, no. 2, pp. 26–31, 2012.
    [50]
    P. F. Pai and C. S. Lin, “A hybrid ARIMA and support vector machines model in stock price forecasting,” Omega, vol. 33, no. 6, pp. 497–505, Dec. 2005. doi: 10.1016/j.omega.2004.07.024
    [51]
    R. J. Hyndman and A. B. Koehler, “Another look at measures of forecast accuracy,” Int. J. Forecasting, vol. 22, no. 4, pp. 679–688, Oct.-Dec. 2006. doi: 10.1016/j.ijforecast.2006.03.001
    [52]
    S. Chopra, T. Thampy, J. Leahy, A. Caplin, and Y. LeCun, “Discovering the hidden structure of house prices with a non-parametric latent manifold model,” in Proc. 13th ACM SIGKDD Int. Conf. Knowledge Discovery and Data Mining, San Jose, California, USA, 2007, pp. 173−182.
    [53]
    D. X. Deng, C. Shahabi, U. Demiryurek, L. H. Zhu, R. Yu, and Y. Liu, “Latent space model for road networks to predict time-varying traffic,” in Proc. 22nd ACM SIGKDD Int. Conf. Knowledge Discovery and Data Mining, San Francisco, California, USA, 2016, pp. 1525−1534.
    [54]
    S. C. Gao, M. C. Zhou, Y. R. Wang, J. J. Cheng, H. Yachi, and J. H. Wang, “Dendritic neuron model with effective learning algorithms for classification, approximation, and prediction,” IEEE Trans. Neural Netw. Learn. Syst., vol. 30, no. 2, pp. 601–614, Feb. 2019. doi: 10.1109/TNNLS.2018.2846646
    [55]
    J. J. Wang and T. Kumbasar, “Parameter optimization of interval type-2 fuzzy neural networks based on PSO and BBBC methods,” IEEE/CAA J. Autom. Sinica, vol. 6, no. 1, pp. 247–257, Jan. 2019. doi: 10.1109/JAS.2019.1911348


    Highlights

    • Learn latent features of companies' fundamental variables via a variational autoencoder (VAE)
    • Model graph-structured interactions among stocks and price fluctuations simultaneously (a sketch follows this list)
    • Experiments on both numerical stock price prediction and binary stock movement classification
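A minimal sketch of the GCN-LSTM hybrid named in the second highlight, again not the paper's implementation: a single Kipf-Welling-style graph convolution, using the normalized adjacency produced by the fundamental-clustering step, is applied at each time step of a price window; an LSTM then runs over the time dimension and a linear head emits one next-step forecast per stock. The class name, layer sizes, and prediction head are assumptions for illustration.

```python
# Sketch of a GCN-LSTM forecaster over a graph of fundamentally related stocks.
# Not the paper's implementation; shapes and layer sizes are illustrative.
import torch
import torch.nn as nn

def normalize_adj(adj: torch.Tensor) -> torch.Tensor:
    # Symmetric normalization D^{-1/2} A D^{-1/2}; adj is assumed to contain self-loops.
    deg = adj.sum(dim=1)
    d_inv_sqrt = torch.diag(deg.clamp(min=1e-12).pow(-0.5))
    return d_inv_sqrt @ adj @ d_inv_sqrt

class GCNLSTM(nn.Module):
    def __init__(self, n_stocks: int, in_dim: int = 1, gcn_dim: int = 16, lstm_dim: int = 32):
        super().__init__()
        self.gcn = nn.Linear(in_dim, gcn_dim)               # per-node weight of the graph conv
        self.lstm = nn.LSTM(n_stocks * gcn_dim, lstm_dim, batch_first=True)
        self.head = nn.Linear(lstm_dim, n_stocks)           # one forecast per stock

    def forward(self, x: torch.Tensor, adj_hat: torch.Tensor) -> torch.Tensor:
        # x: (batch, window, n_stocks, in_dim); adj_hat: normalized (n_stocks, n_stocks)
        b, t, n, _ = x.shape
        h = torch.relu(self.gcn(adj_hat @ x))                # graph convolution at every step
        h = h.reshape(b, t, n * h.shape[-1])                 # flatten nodes for the LSTM
        out, _ = self.lstm(h)
        return self.head(out[:, -1])                         # next-step prediction per stock

# Usage sketch:
#   adj_hat = normalize_adj(torch.tensor(adj, dtype=torch.float32))
#   model = GCNLSTM(n_stocks=adj.shape[0])
#   y_hat = model(price_window, adj_hat)   # price_window: (batch, window, n_stocks, 1)
```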
