RECURRENT NEURAL NETWORKS: ARCHITECTURE, DEVELOPMENT, AND PRACTICAL APPLICATIONS

Authors

  • Tojimamatov Isroil Nurmamatovich, Asqaraliyeva Gulzoda Murodjon qizi — Fergana State University

Keywords:

Recurrent neural networks, RNN, LSTM, GRU, natural language processing, time series, speech recognition, artificial intelligence, machine learning, neural networks.

Abstract

This article examines the architecture, development, and applications of recurrent neural networks (RNNs). It reviews the advances recurrent networks have enabled in natural language processing (NLP), time-series forecasting, and speech recognition. The article analyzes how RNNs differ from ordinary feedforward neural networks, the operating principles and advantages of advanced variants such as LSTM and GRU, and the current achievements and limitations of recurrent networks. Future research and development directions, as well as the prospects of recurrent networks in the field of artificial intelligence, are also discussed.
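The key difference from a feedforward network mentioned in the abstract — that an RNN carries a hidden state forward through time — can be illustrated with a minimal sketch. This is not code from the article; it is a generic vanilla RNN step in NumPy with illustrative names and toy dimensions, showing how the same weights are reused at every time step while the hidden state accumulates context:

```python
import numpy as np

def rnn_step(x_t, h_prev, W_xh, W_hh, b_h):
    """One step of a vanilla RNN: the new hidden state depends on the
    current input x_t AND the previous hidden state h_prev (recurrence).
    LSTM and GRU cells replace this single update with gated updates
    that control what is kept, forgotten, and written."""
    return np.tanh(x_t @ W_xh + h_prev @ W_hh + b_h)

# Toy dimensions, purely for illustration.
rng = np.random.default_rng(0)
input_dim, hidden_dim, seq_len = 3, 4, 5
W_xh = rng.normal(scale=0.1, size=(input_dim, hidden_dim))  # input-to-hidden
W_hh = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))  # hidden-to-hidden
b_h = np.zeros(hidden_dim)

# Process a sequence: the same weights are applied at every step,
# and h carries information from earlier steps forward in time.
h = np.zeros(hidden_dim)
xs = rng.normal(size=(seq_len, input_dim))
for x_t in xs:
    h = rnn_step(x_t, h, W_xh, W_hh, b_h)
```

Because gradients flow through the repeated `W_hh` multiplication, long sequences suffer from vanishing or exploding gradients — the limitation that motivated the gated LSTM and GRU architectures discussed in the article.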

References

Hochreiter, S., & Schmidhuber, J. (1997). Long Short-Term Memory. Neural Computation, 9(8), 1735-1780.

Cho, K., Van Merriënboer, B., Bahdanau, D., & Bengio, Y. (2014). On the Properties of Neural Machine Translation: Encoder-Decoder Approaches. arXiv preprint arXiv:1409.1259.

Goodfellow, I., Bengio, Y., & Courville, A. (2016). Deep Learning. MIT Press.

Graves, A., Mohamed, A.-r., & Hinton, G. (2013). Speech Recognition with Deep Recurrent Neural Networks. In 2013 IEEE International Conference on Acoustics, Speech and Signal Processing.

Lipton, Z. C., Berkowitz, J., & Elkan, C. (2015). A Critical Review of Recurrent Neural Networks for Sequence Learning. arXiv preprint arXiv:1506.00019.

Sutskever, I., Vinyals, O., & Le, Q. V. (2014). Sequence to Sequence Learning with Neural Networks. arXiv preprint arXiv:1409.3215.

Chung, J., Gulcehre, C., Cho, K., & Bengio, Y. (2014). Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling. arXiv preprint arXiv:1412.3555.

Mikolov, T., Karafiát, M., Burget, L., Černocký, J., & Khudanpur, S. (2010). Recurrent Neural Network Based Language Model. In INTERSPEECH.

Bahdanau, D., Cho, K., & Bengio, Y. (2014). Neural Machine Translation by Jointly Learning to Align and Translate. arXiv preprint arXiv:1409.0473.

Greff, K., Srivastava, R. K., Koutník, J., Steunebrink, B. R., & Schmidhuber, J. (2017). LSTM: A Search Space Odyssey. IEEE Transactions on Neural Networks and Learning Systems, 28(10), 2222-2232.

Olah, C. (2015). Understanding LSTM Networks. http://colah.github.io/posts/2015-08-Understanding-LSTMs/

Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A. N., ... & Polosukhin, I. (2017). Attention Is All You Need. In Advances in Neural Information Processing Systems.

Karpathy, A., & Fei-Fei, L. (2015). Deep Visual-Semantic Alignments for Generating Image Descriptions. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition.

Graves, A. (2013). Generating Sequences With Recurrent Neural Networks. arXiv preprint arXiv:1308.0850.

LeCun, Y., Bengio, Y., & Hinton, G. (2015). Deep Learning. Nature, 521(7553), 436-444.

Published

2024-05-20