Optimizing the User Experience of the Game Elden Ring through BERT-Based Sentiment Analysis

M. Fadhilatur Ramadhan, Ade Irma Purnamasari, Agus Bahtiar

Abstract


The rapid growth of the video game industry, particularly on Steam with its 30 million active users, underscores the importance of understanding user sentiment in order to improve the gaming experience. This study focuses on sentiment analysis of Elden Ring reviews using the BERT model. A total of 2,000 reviews posted between January and September 2024 were collected, with 80% used for training and 20% for testing. Preprocessing steps included text cleaning, tokenization, stopword removal, and normalization. The results show predominantly positive sentiment, with BERT reaching 99% accuracy on the classification task. Model evaluation yielded strong metrics: an accuracy of 0.9900; precision of 0.96, recall of 0.89, and F1-score of 0.92 for negative sentiment; and precision of 0.99, recall of 1.00, and F1-score of 0.99 for positive sentiment. Reviews generally praised the gameplay and graphics, while the camera mechanics and the game's difficulty level drew mixed opinions. A temporal analysis revealed fluctuating patterns over the year, with technical issues emerging mid-year and subsequent improvement following updates. Based on these findings, it is recommended to adjust the game mechanics to address concerns about camera functionality and to balance the difficulty level. Furthermore, strengthening the narrative elements could further improve the user experience.
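The preprocessing steps and per-class evaluation metrics described above can be sketched in Python. This is a minimal illustration, not the paper's actual code: the stopword list, sample review, and labels below are invented for demonstration, and the paper's full pipeline additionally involves fine-tuning a BERT classifier.

```python
import re

# Illustrative English stopword list; the paper's actual list is not specified.
STOPWORDS = {"the", "a", "an", "is", "it", "and", "of", "to"}

def preprocess(review):
    """Text cleaning, tokenization, stopword removal, and normalization
    (lowercasing), mirroring the preprocessing steps named in the abstract."""
    text = re.sub(r"[^a-z\s]", " ", review.lower())  # strip punctuation/digits
    return [tok for tok in text.split() if tok not in STOPWORDS]

def per_class_metrics(y_true, y_pred, label):
    """Precision, recall, and F1-score for one sentiment class, as reported
    separately for positive and negative sentiment in the evaluation."""
    tp = sum(t == label and p == label for t, p in zip(y_true, y_pred))
    fp = sum(t != label and p == label for t, p in zip(y_true, y_pred))
    fn = sum(t == label and p != label for t, p in zip(y_true, y_pred))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Toy example, not data from the study.
tokens = preprocess("The graphics are AMAZING, combat is great!")
# -> ['graphics', 'are', 'amazing', 'combat', 'great']

y_true = ["pos", "pos", "neg", "pos", "neg"]
y_pred = ["pos", "pos", "neg", "pos", "pos"]
print(per_class_metrics(y_true, y_pred, "neg"))
```

Computing precision, recall, and F1 per class, rather than accuracy alone, matters here because the dataset is imbalanced toward positive reviews: a trivial all-positive classifier would score high accuracy but zero recall on the negative class.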


Keywords


sentiment analysis; Elden Ring; BERT; video game reviews; user experience

References


Al-Omari, H., Abdullah, M. A., & Shaikh, S. (2020). EmoDet2: Emotion Detection in English Textual Dialogue using BERT and BiLSTM Models. 2020 11th International Conference on Information and Communication Systems (ICICS), 226–232. https://doi.org/10.1109/ICICS49469.2020.239539

Devlin, J., Chang, M. W., Lee, K., & Toutanova, K. (2019). BERT: Pre-training of deep bidirectional transformers for language understanding. NAACL HLT 2019 - 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies - Proceedings of the Conference, 1, 4171–4186.

FromSoftware, Inc. (2023). Award winning action-RPG ELDEN RING sold 20 million units worldwide. In Bandai Namco Entertainment Inc.

Geetha, M. P., & Karthika Renuka, D. (2021). Improving the performance of aspect based sentiment analysis using fine-tuned Bert Base Uncased model. International Journal of Intelligent Networks, 2, 64–69. https://doi.org/10.1016/j.ijin.2021.06.005

Handrizal, Manik, F. Y., & Misbah, H. A. (2024). Sentiment analysis based on PUBGM player aspects from App Store reviews using Bidirectional Encoder Representations from Transformers (BERT). Journal of Theoretical and Applied Information Technology, 102(4), 1740–1749.

Imron, S., Setiawan, E. I., Santoso, J., & Purnomo, M. H. (2023). Aspect Based Sentiment Analysis Marketplace Product Reviews Using BERT, LSTM, and CNN. Jurnal RESTI (Rekayasa Sistem Dan Teknologi Informasi), 7(3), 586–591. https://doi.org/10.29207/resti.v7i3.4751

Koto, F., Rahimi, A., Lau, J. H., & Baldwin, T. (2020). IndoLEM and IndoBERT: A Benchmark Dataset and Pre-trained Language Model for Indonesian NLP. https://doi.org/10.48550/

Kusnadi, R., Yusuf, Y., Andriantony, A., Ardian Yaputra, R., & Caintan, M. (2021). Analisis Sentimen Terhadap Game Genshin Impact Menggunakan BERT. RABIT: Jurnal Teknologi dan Sistem Informasi Univrab, 6(2), 122–129. https://doi.org/10.36341/rabit.v6i2.1765

Mosbach, M., Andriushchenko, M., & Klakow, D. (2021). On the stability of fine-tuning BERT: Misconceptions, explanations, and strong baselines. ICLR 2021 - 9th International Conference on Learning Representations.

Mudding, A. A. (2024). Mengungkap Opini Publik: Pendekatan BERT-based-caused untuk Analisis Sentimen pada Komentar Film. Journal of System and Computer Engineering (JSCE), 5(1), 36–43. https://doi.org/10.61628/jsce.v5i1.1060

Potamias, R. A., Siolas, G., & Stafylopatis, A.-G. (2020). A transformer-based approach to irony and sarcasm detection. Neural Computing and Applications, 32(23), 17309–17320. https://doi.org/10.1007/s00521-020-05102-3

Rietzler, A., Stabinger, S., Opitz, P., & Engl, S. (2020). Adapt or get left behind: Domain adaptation through BERT language model finetuning for aspect-target sentiment classification. LREC 2020 - 12th International Conference on Language Resources and Evaluation, Conference Proceedings, 4933–4941.

Sayeed, M. S., Mohan, V., & Muthu, K. S. (2023). BERT: A Review of Applications in Sentiment Analysis. HighTech and Innovation Journal, 4(2), 453–462. https://doi.org/10.28991/HIJ-2023-04-02-015

Sun, C., Huang, L., & Qiu, X. (2019). Utilizing BERT for aspect-based sentiment analysis via constructing auxiliary sentence. NAACL HLT 2019 - 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies - Proceedings of the Conference, 1, 380–385.

Talaat, A. S. (2023). Sentiment analysis classification system using hybrid BERT models. Journal of Big Data, 10(1), 110. https://doi.org/10.1186/s40537-023-00781-w

Wijman, T. (2023). Global Games Market Report 2023. In Newzoo.

Xu, H., Liu, B., Shu, L., & Yu, P. S. (2019). BERT post-training for review reading comprehension and aspect-based sentiment analysis. Proceedings of NAACL-HLT 2019, 1, 2324–2335. https://doi.org/10.18653/v1/N19-1242

Zhang, L., Wang, S., & Liu, B. (2018). Deep learning for sentiment analysis: A survey. WIREs Data Mining and Knowledge Discovery, 8(4), e1253. https://doi.org/10.1002/widm.1253




DOI: https://doi.org/10.30998/jrami.v6i03.13839
