JCSE, vol. 14, no. 4, pp. 154-162, 2020
DOI: http://dx.doi.org/10.5626/JCSE.2020.14.4.154
Semantic Vector Learning Using Pretrained Transformers in Natural Language Understanding
Sangkeun Jung
Chungnam National University, Daejeon, Korea
Abstract: Natural language understanding (NLU) is a core technology for implementing natural interfaces. To implement and support
robust NLU, previous studies introduced a neural network approach to learn semantic vector representation by
employing the correspondence between text and semantic frames as extracted semantic knowledge. In their work,
long short-term memory (LSTM)-based text and semantic frame readers were used to encode both texts and semantic frames. However,
there exists significant room for performance improvement using recent pretrained transformer encoders. In the present
work, as a key contribution, we have extended Jung's framework to work with pretrained transformers for both text and
semantic frame readers. In particular, a novel semantic frame processing method is proposed to directly feed the structural
form of the semantic frame to transformers. We conducted extensive experiments combining various types of
LSTM- or transformer-based text and semantic frame readers on the ATIS, SNIPS, Sim-M, Sim-R, and Weather datasets
to find the most suitable configurations for learning effective semantic vector representations. Through these experiments,
we concluded that transformer-based text and semantic frame readers show a stable and rapid learning curve as well as
the best performance in similarity-based intent classification and semantic search tasks.
Keywords:
Semantic vector; Semantic vector learning; Natural language understanding; Transformer
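Illustration (not from the paper): the abstract describes feeding the structural form of a semantic frame directly to a pretrained transformer and comparing the resulting vector with that of the input text. The minimal Python sketch below shows one way this could look, assuming the HuggingFace Transformers library, a bert-base-uncased encoder, and a hypothetical linearization of an intent/slot frame; the paper's actual serialization and training objective may differ.

    # Sketch: encode an utterance and a linearized semantic frame with a
    # pretrained transformer, then compare the two semantic vectors.
    import torch
    from transformers import AutoTokenizer, AutoModel

    MODEL_NAME = "bert-base-uncased"  # assumed encoder; any pretrained transformer could be used
    tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
    encoder = AutoModel.from_pretrained(MODEL_NAME)

    def linearize_frame(intent, slots):
        # Hypothetical linearization: flatten the intent and slot/value pairs into text.
        slot_text = " ".join(f"{name} = {value}" for name, value in slots.items())
        return f"intent : {intent} {slot_text}"

    def encode(text):
        # Use the [CLS] hidden state as the semantic vector.
        inputs = tokenizer(text, return_tensors="pt", truncation=True)
        with torch.no_grad():
            outputs = encoder(**inputs)
        return outputs.last_hidden_state[:, 0]  # shape: (1, hidden_size)

    utterance_vec = encode("show me flights from denver to boston")
    frame_vec = encode(linearize_frame(
        "atis_flight",
        {"fromloc.city_name": "denver", "toloc.city_name": "boston"}))

    # Cosine similarity between the text vector and the frame vector; training would
    # pull corresponding text/frame pairs together in this vector space.
    similarity = torch.cosine_similarity(utterance_vec, frame_vec)
    print(float(similarity))

In such a setup, similarity-based intent classification and semantic search reduce to nearest-neighbor lookups over these vectors.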