Topic-Controlled Text Generation


Çağlayan C., Karakaya K. M.

6th International Conference on Computer Science and Engineering, UBMK 2021, Ankara, Türkiye, 15 - 17 September 2021, pp. 533-536

  • Publication Type: Conference Paper / Full Text
  • DOI Number: 10.1109/ubmk52708.2021.9558910
  • City of Publication: Ankara
  • Country of Publication: Türkiye
  • Page Numbers: pp. 533-536
  • Keywords: Controllable text generation, Review generation, Text generation, Topic-controlled text generation
  • TED University Affiliated: Yes

Abstract

Text generation has become an increasingly important topic in Natural Language Processing (NLP). In particular, the quality of generated text has reached high levels with the emergence of new transformer-based models, and controllable text generation has consequently become an important research area. Various methods have been proposed for controllable text generation, but because most of them were applied to the formerly common Recurrent Neural Network (RNN) based encoder-decoder models, studies using transformer-based models remain scarce. Transformer-based models handle long sequences very effectively thanks to their ability to process tokens in parallel. This study aimed to generate Turkish reviews on desired topics using a transformer-based language model. We adopted the method of adding topic information to the sequential input: we concatenated the input token embedding and the topic (control) embedding at each time step during training. As a result, we were able to generate Turkish reviews on the specified topics.
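The conditioning scheme described in the abstract can be illustrated with a short sketch. The snippet below is a hypothetical PyTorch illustration, not the authors' implementation: the class and parameter names (TopicConditionedEmbedding, d_topic, the projection layer) are assumptions, and it only shows how a topic (control) embedding can be concatenated to each token embedding at every time step and projected back to the model dimension before being fed to a transformer decoder.

```python
import torch
import torch.nn as nn

class TopicConditionedEmbedding(nn.Module):
    """Concatenates a topic (control) embedding to every token embedding,
    then projects the result back to the decoder's model dimension."""

    def __init__(self, vocab_size, num_topics, d_model, d_topic=64):
        super().__init__()
        self.token_emb = nn.Embedding(vocab_size, d_model)
        self.topic_emb = nn.Embedding(num_topics, d_topic)
        # Projection so the concatenated vector matches the decoder width.
        self.proj = nn.Linear(d_model + d_topic, d_model)

    def forward(self, token_ids, topic_id):
        # token_ids: (batch, seq_len); topic_id: (batch,)
        tok = self.token_emb(token_ids)                     # (B, T, d_model)
        top = self.topic_emb(topic_id)                      # (B, d_topic)
        top = top.unsqueeze(1).expand(-1, tok.size(1), -1)  # repeat per time step
        return self.proj(torch.cat([tok, top], dim=-1))     # (B, T, d_model)


# Usage: the resulting embeddings can be fed to any transformer decoder.
emb = TopicConditionedEmbedding(vocab_size=32000, num_topics=5, d_model=512)
tokens = torch.randint(0, 32000, (2, 16))  # dummy batch of token ids
topics = torch.tensor([1, 3])              # one topic label per review
hidden = emb(tokens, topics)               # shape: (2, 16, 512)
```

Concatenating the control embedding at every time step (rather than only as a prefix) keeps the topic signal present throughout the sequence, which is the intuition behind the method summarized in the abstract.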