Author: Yildirim, Savas
Date available: 2024-07-18
Date issued: 2020
ISBN (electronic): 978-981-15-1216-2
ISBN (print): 978-981-15-1215-5
ISSN: 2524-7565
eISSN: 2524-7573
DOI: https://doi.org/10.1007/978-981-15-1216-2_12
Handle: https://hdl.handle.net/11411/7023

Abstract: The traditional bag-of-words (BOW) model draws on distributional theory to represent documents. Its main drawback is high dimensionality; however, this can be mitigated by dimensionality reduction techniques such as principal component analysis (PCA) or singular value decomposition (SVD). Neural network-based approaches, by contrast, do not suffer from this dimensionality problem: they can represent documents or words with short, dense vectors. In particular, recurrent neural network (RNN) architectures have attracted considerable attention for short-sequence representation. In this study, we compare the traditional BOW representation with RNN-based architectures in terms of their ability to solve the sentiment analysis problem. Traditional methods represent text with the BOW approach and produce one-hot encodings; well-known linear machine learning algorithms, such as logistic regression and the Naive Bayes classifier, can then learn the decision boundary between the data points. RNN-based models, on the other hand, take text as a sequence of words and transform the sequence using hidden and recurrent states, so that the input text is ultimately represented by a short, dense vector. On top of this, a final neural layer maps the dense representation to a sentiment label. We discuss our findings through several in-depth experiments, comprehensively comparing the traditional representation and deep learning models on a Turkish sentiment benchmark dataset covering five topics, such as books and kitchen.

Language: en
Rights: info:eu-repo/semantics/closedAccess
Keywords: BOW; Deep Learning; RNN; Turkish Language
Title: Comparing Deep Neural Networks to Traditional Models for Sentiment Analysis in Turkish Language
Type: Book Chapter
Pages: 311–319
WOS ID: WOS:000627405700013
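The traditional pipeline described in the abstract (BOW features fed to a Naive Bayes classifier) can be sketched as follows. This is a minimal pure-Python illustration, not the chapter's actual implementation; the tiny Turkish sentences and the "pos"/"neg" labels are invented toy data, and the benchmark dataset used in the study is not reproduced here.

```python
# Minimal sketch (hypothetical toy data) of the BOW + Naive Bayes baseline:
# multinomial Naive Bayes over bag-of-words counts with add-one smoothing.
from collections import Counter, defaultdict
import math

def train_nb(docs, labels):
    """Count word occurrences per class and document frequencies per class."""
    word_counts = defaultdict(Counter)   # class -> word -> count
    class_counts = Counter(labels)       # class -> number of documents
    vocab = set()
    for doc, y in zip(docs, labels):
        for w in doc.split():
            word_counts[y][w] += 1
            vocab.add(w)
    return word_counts, class_counts, vocab

def predict_nb(model, doc):
    """Pick the class maximizing log prior + summed log word likelihoods."""
    word_counts, class_counts, vocab = model
    total_docs = sum(class_counts.values())
    best, best_lp = None, -math.inf
    for y in class_counts:
        lp = math.log(class_counts[y] / total_docs)          # log prior
        denom = sum(word_counts[y].values()) + len(vocab)    # add-one smoothing
        for w in doc.split():
            if w in vocab:                                   # skip unseen words
                lp += math.log((word_counts[y][w] + 1) / denom)
        if lp > best_lp:
            best, best_lp = y, lp
    return best

# Toy corpus (illustrative only): two positive and two negative sentences.
docs = ["bu kitap harika",        # "this book is great"
        "harika bir deneyim",     # "a great experience"
        "cok kotu bir urun",      # "a very bad product"
        "kotu ve sikici kitap"]   # "a bad and boring book"
labels = ["pos", "pos", "neg", "neg"]

model = train_nb(docs, labels)
print(predict_nb(model, "harika kitap"))  # -> pos
```

The one-hot/BOW representation is implicit in the word counts; a logistic regression baseline would differ only in how the decision boundary over the same features is learned.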
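The RNN-based pipeline in the abstract (words embedded, a recurrent hidden state updated per word, and a final layer mapping the resulting dense vector to sentiment scores) can be sketched as a forward pass. The weights below are random and untrained, and the vocabulary and dimensions are invented; this illustrates only the data flow the abstract describes, not the trained model from the chapter.

```python
# Forward-pass sketch of the described RNN architecture (untrained,
# hypothetical toy vocabulary): embed -> tanh recurrence -> linear output.
import math
import random

random.seed(0)
VOCAB = {"bu": 0, "kitap": 1, "harika": 2, "kotu": 3}   # toy vocabulary
EMB, HID, CLASSES = 4, 5, 2                              # tiny dimensions

def rand_matrix(rows, cols):
    return [[random.uniform(-0.1, 0.1) for _ in range(cols)] for _ in range(rows)]

E  = rand_matrix(len(VOCAB), EMB)   # embedding table
Wx = rand_matrix(HID, EMB)          # input-to-hidden weights
Wh = rand_matrix(HID, HID)          # hidden-to-hidden (recurrent) weights
Wo = rand_matrix(CLASSES, HID)      # hidden-to-output weights

def matvec(M, v):
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def rnn_sentiment_scores(sentence):
    h = [0.0] * HID                               # initial hidden state
    for word in sentence.split():
        x = E[VOCAB[word]]                        # embed the word
        pre = [a + b for a, b in zip(matvec(Wx, x), matvec(Wh, h))]
        h = [math.tanh(p) for p in pre]           # recurrent state update
    return matvec(Wo, h)                          # final layer: one score per class

scores = rnn_sentiment_scores("bu kitap harika")
print(scores)  # two raw class scores; argmax would give the predicted label
```

In a trained model these weights would be learned by backpropagation through time, and the final hidden state is exactly the "dense and short vector" the abstract refers to.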