A Survey on Text Prediction Techniques



EOI: 10.11242/viva-tech.01.02.05




Citation

Deepal S. Thakur, Rajiv N. Tarsarya, Akshay A. Vaskar, Ashwini Save, "A Survey on Text Prediction Techniques", VIVA-IJRI Volume 1, Issue 2, Article 5, pp. 1-6, 2019. Published by Computer Engineering Department, VIVA Institute of Technology, Virar, India.

Abstract

Writing long sentences can be tedious, but text prediction built into keyboard technology has made the task simpler. The learning technology behind the keyboard is developing fast and has become more accurate. Learning technologies such as machine learning and deep learning play an important role in predicting text. Current deep learning techniques have opened the door for this kind of data analysis, and emerging architectures such as Region-based CNN (R-CNN) and Recurrent CNN have been under consideration for it. Many techniques have been used for text sequence prediction, such as Convolutional Neural Networks (CNN), Recurrent Neural Networks (RNN), and Recurrent Convolutional Neural Networks (RCNN). This paper aims to provide a comparative study of the different techniques used for text prediction.
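To make the task concrete, the sketch below shows the kind of next-word prediction the surveyed techniques address, using a small LSTM language model in PyTorch. This is a minimal illustrative example, not code from the paper or any of the referenced works; the toy corpus, model sizes, and training settings are assumptions chosen only for demonstration.

```python
# Minimal sketch: next-word prediction with an LSTM language model (PyTorch).
# Toy corpus, dimensions, and training loop are illustrative assumptions.
import torch
import torch.nn as nn

corpus = "the quick brown fox jumps over the lazy dog".split()
vocab = sorted(set(corpus))
stoi = {w: i for i, w in enumerate(vocab)}
itos = {i: w for w, i in stoi.items()}

# Build (current word, next word) training pairs from consecutive tokens.
data = [(stoi[corpus[i]], stoi[corpus[i + 1]]) for i in range(len(corpus) - 1)]

class NextWordLSTM(nn.Module):
    def __init__(self, vocab_size, embed_dim=16, hidden_dim=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, vocab_size)

    def forward(self, x):
        # x: (batch, seq_len) token ids -> logits over the vocabulary
        emb = self.embed(x)
        out, _ = self.lstm(emb)
        return self.fc(out[:, -1, :])  # predict from the last time step

model = NextWordLSTM(len(vocab))
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# Train briefly on the toy corpus.
for epoch in range(200):
    xs = torch.tensor([[x] for x, _ in data])  # shape (batch, seq_len=1)
    ys = torch.tensor([y for _, y in data])
    optimizer.zero_grad()
    loss = loss_fn(model(xs), ys)
    loss.backward()
    optimizer.step()

# Predict the word most likely to follow "lazy".
with torch.no_grad():
    logits = model(torch.tensor([[stoi["lazy"]]]))
    print("next word after 'lazy':", itos[int(logits.argmax())])
```

A CNN- or RCNN-based predictor, as compared in the survey, would replace the recurrent layer with convolutional feature extraction over the token sequence while keeping the same vocabulary-sized output layer.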

Keywords

CNN, Deep Learning, LSTM, Machine Learning, RCNN, RNN.
