Gating mechanism based Natural Language Generation for spoken dialogue systems
Gating mechanism based Natural Language Generation for spoken dialogue systems is a scholarly work, published in 2019 in ''Neurocomputing''. The main subjects of the publication include recurrent neural networks, statistical machine translation, natural language processing, spoken dialogue systems, natural language generation, natural language understanding, spoken language, artificial intelligence, and computer science. The paper presents an approach that leverages gating mechanisms: the authors incrementally add three semantic cells to a traditional RNN model, namely a Refinement cell that filters the sequential inputs before the RNN computation, and an Adjustment cell and an Output cell that select semantic elements and gate a feature vector during generation.
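The three-cell idea described above can be sketched in code. The following is a minimal, illustrative NumPy sketch, not the paper's actual architecture: all dimensions, parameter matrices, and the exact gating equations (elementwise sigmoid gates over a dialogue-act vector `d`) are assumptions chosen for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical sizes: embedding, hidden state, dialogue-act vector.
E, H, D = 8, 16, 6

# Randomly initialised parameters for the sketch.
Wr = rng.normal(scale=0.1, size=(E, E + D))  # Refinement gate
Wh = rng.normal(scale=0.1, size=(H, E + H))  # vanilla RNN recurrence
Wa = rng.normal(scale=0.1, size=(D, H))      # Adjustment gate
Wo = rng.normal(scale=0.1, size=(H, H))      # Output gate

def step(x, h, d):
    """One decoding step with the three gating cells (illustrative only)."""
    # Refinement cell: filter the input embedding using the DA vector
    # before it enters the RNN computation.
    r = sigmoid(Wr @ np.concatenate([x, d]))
    x_ref = r * x
    # Standard RNN recurrence on the refined input.
    h_new = np.tanh(Wh @ np.concatenate([x_ref, h]))
    # Adjustment cell: decay the DA vector as semantic slots are realised.
    a = sigmoid(Wa @ h_new)
    d_new = a * d
    # Output cell: gate the hidden feature vector before word prediction.
    o = sigmoid(Wo @ h_new)
    return o * h_new, h_new, d_new

x = rng.normal(size=E)   # current word embedding
h = np.zeros(H)          # RNN hidden state
d = np.ones(D)           # dialogue-act / slot indicator vector
out, h, d = step(x, h, d)
```

Because the Adjustment gate multiplies `d` by values in (0, 1), repeated steps shrink the dialogue-act vector toward zero, mirroring the intuition that already-generated semantic elements should stop influencing later words.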