BERT Uncased and LSTM Multiclass Classification Model for Traffic Violation Text Classification
DOI:
https://doi.org/10.24843/LKJITI.2024.v15.i02.p04
Keywords:
Multiclass Classification, BERT Uncased, LSTM, Traffic Violations
Abstract
The increasing volume of internet content makes it difficult for users to find information through search functions alone. This problem can be addressed by classifying news according to its context, avoiding material that is open to many interpretations. This research combines the uncased Bidirectional Encoder Representations from Transformers (BERT) model with a Long Short-Term Memory (LSTM) architecture to build a text classification model that categorizes news articles about traffic violations. Data were collected by crawling the API of an online media application, yielding both an unmodified and a modified dataset. The BERT Uncased-LSTM model with the best hyperparameter combination scenario, batch size 16, learning rate 2e-5, and average pooling, obtained Precision, Recall, and F1 values of 97.25%, 96.90%, and 98.10%, respectively. The results show that test scores on the unmodified dataset are higher than on the modified dataset, because restricting the modified dataset to words with high information value makes it harder for the model to understand context during text classification.
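To make the architecture described in the abstract concrete, the following is a minimal sketch (not the authors' code) of a BERT Uncased + LSTM multiclass classifier with the reported best hyperparameters (batch size 16, learning rate 2e-5, average pooling), assuming PyTorch and the Hugging Face transformers library; the class count, LSTM hidden size, and example input are illustrative placeholders.

import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

class BertLstmClassifier(nn.Module):
    def __init__(self, num_classes: int, lstm_hidden: int = 256):
        super().__init__()
        # Pretrained uncased BERT provides contextual token embeddings.
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        # An LSTM reads the BERT token sequence; bidirectionality
        # doubles the output width. Hidden size is an assumption.
        self.lstm = nn.LSTM(self.bert.config.hidden_size, lstm_hidden,
                            batch_first=True, bidirectional=True)
        self.classifier = nn.Linear(2 * lstm_hidden, num_classes)

    def forward(self, input_ids, attention_mask):
        token_states = self.bert(
            input_ids=input_ids,
            attention_mask=attention_mask).last_hidden_state
        lstm_out, _ = self.lstm(token_states)
        # Average pooling over non-padding positions, matching the
        # pooling strategy of the best scenario in the abstract.
        mask = attention_mask.unsqueeze(-1).float()
        pooled = (lstm_out * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1)
        return self.classifier(pooled)

# Hyperparameters from the best scenario reported in the abstract;
# the number of classes is a placeholder.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertLstmClassifier(num_classes=3)
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
BATCH_SIZE = 16

# Example forward pass on a single hypothetical headline.
enc = tokenizer("Driver ran a red light and was ticketed by police",
                return_tensors="pt", padding="max_length",
                truncation=True, max_length=64)
logits = model(enc["input_ids"], enc["attention_mask"])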
Published
2025-10-12
How to Cite
[1] Komang Ayu Triana Indah, I Ketut Gede Darma Putra, Made Sudarma, Rukmi Sari Hartati, and Minho Jo, “BERT Uncased and LSTM Multiclass Classification Model for Traffic Violation Text Classification”, LKJITI, vol. 15, no. 02, pp. 112–123, Oct. 2025.
Issue
Section
Articles
License
Copyright (c) 2025 Lontar Komputer : Jurnal Ilmiah Teknologi Informasi

This work is licensed under a Creative Commons Attribution 4.0 International License.
