Chapter 6. Chatbots
Practical Natural Language Processing
Applications
Shopping and e-commerce, News and content discovery, Customer service, Medical, Legal
Taxonomy
Goal-Oriented Dialog
FAQ bots
fixed set of responses
no dependency among responses
flow-based bots
understand and track user-provided information throughout the conversation
Chitchats
open-ended bots
converse with the user about various topics
A Pipeline for Building Dialog Systems
Speech recognition (IVR: Google ASR)
voice to text
Natural language understanding (NLU)
gathering all possible information from text: sentiment detection, named entity extraction, coreference resolution
Dialog and task manager
gathers and systematically decides which pieces of information are important or not
Natural language generation
generates a response in a human-readable form according to the strategy devised by the dialog manager
Speech synthesis
text back to speech
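The five stages above can be sketched end to end with stub components. This is only an illustration of how the stages hand off to one another; the intent names, slot names, and rules are hypothetical, and a real system would plug in an ASR engine, a trained NLU model, and a learned dialog policy:

```python
# Minimal sketch of the dialog-system pipeline. Each stage is a stub;
# real systems replace these with trained components.

def speech_to_text(audio):
    # Stub ASR: assume the audio is already transcribed text.
    return audio

def nlu(text):
    # Stub NLU: detect a toy "order_pizza" intent and a size entity.
    intent = "order_pizza" if "pizza" in text.lower() else "unknown"
    entities = {"size": s for s in ("small", "medium", "large")
                if s in text.lower()}
    return {"intent": intent, "entities": entities}

def dialog_manager(parsed, state):
    # Decide the next action from the parsed input and the current state.
    state.update(parsed["entities"])
    if parsed["intent"] == "order_pizza" and "size" not in state:
        return "ask_size"
    return "confirm_order" if parsed["intent"] == "order_pizza" else "fallback"

def nlg(action, state):
    # Template-based generation following the dialog manager's decision.
    templates = {
        "ask_size": "What size pizza would you like?",
        "confirm_order": "Confirming your {size} pizza order.",
        "fallback": "Sorry, I didn't understand that.",
    }
    return templates[action].format(**state)

def respond(audio, state):
    parsed = nlu(speech_to_text(audio))
    action = dialog_manager(parsed, state)
    return nlg(action, state)  # speech synthesis would voice this string

state = {}
print(respond("I want a pizza", state))        # What size pizza would you like?
print(respond("a large pizza please", state))  # Confirming your large pizza order.
```

Note how the dialog manager consults accumulated state, not just the latest utterance: the second turn succeeds only because the pipeline remembered the pending order.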
Dialog Systems in Detail - pizza example using Dialogflow
Dialog act or intent
the aim of a user command
Slot or entity
the fixed ontological construct that holds information regarding specific entities related to the intent
Dialog state
contains both the information about the dialog act as well as state-value pairs
Context
a set of dialog states that also captures previous dialog states as history
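The four concepts above can be modeled as small data structures for the pizza example. The class and field names here are illustrative, not Dialogflow's actual API:

```python
# Sketch of intent, slots, dialog state, and context as data structures.
from dataclasses import dataclass, field

@dataclass
class DialogState:
    intent: str                                # dialog act, e.g. "order_pizza"
    slots: dict = field(default_factory=dict)  # slot-value pairs

@dataclass
class Context:
    # Keeps previous dialog states as history, per the definition above.
    history: list = field(default_factory=list)

    def update(self, state: DialogState):
        self.history.append(state)

    def current(self) -> DialogState:
        return self.history[-1]

ctx = Context()
ctx.update(DialogState(intent="order_pizza", slots={"size": "large"}))
ctx.update(DialogState(intent="add_topping", slots={"topping": "mushroom"}))
print(ctx.current().intent)  # add_topping
print(len(ctx.history))      # 2
```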
Deep Dive into Components of a Dialog System
Natural Language Understanding (NLU)
Dialog Act Classification
CNN
learns dense feature representations; n-gram patterns are often indicative of the dialog act
RNN
good at modeling sequential data
BERT
since BERT is pre-trained on large corpora, its contextual representations are much richer
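As a lightweight stand-in for these neural classifiers, a toy bag-of-words intent classifier illustrates what dialog act classification does; the intents and training utterances below are made up, and a real system would use a CNN, RNN, or fine-tuned BERT instead of word overlap:

```python
# Toy dialog act classifier: scores each intent by word overlap with
# its training utterances. A stand-in for the neural models above.
from collections import Counter

TRAINING = {
    "order_pizza": ["i want a pizza", "order a large pizza", "get me a pizza"],
    "greeting": ["hello there", "hi how are you", "good morning"],
    "goodbye": ["bye", "see you later", "goodbye"],
}

def build_profiles(training):
    # One bag-of-words profile per intent.
    return {intent: Counter(w for utt in utts for w in utt.split())
            for intent, utts in training.items()}

def classify(text, profiles):
    words = text.lower().split()
    # Pick the intent whose vocabulary overlaps the input most.
    return max(profiles, key=lambda i: sum(profiles[i][w] for w in words))

profiles = build_profiles(TRAINING)
print(classify("I would like to order a pizza", profiles))  # order_pizza
print(classify("hi there", profiles))                       # greeting
```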
Identifying Slots - NER(named entity recognition)
CRF (conditional random fields), e.g. CRF++
a popular sequence labeling technique, used heavily in information extraction
BERT
may not represent named entities well; pre-trained models can overfit on smaller datasets, where handcrafted features may generalize better
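Slot identification is usually framed as BIO sequence labeling (B-egin, I-nside, O-utside). The sketch below tags tokens from a tiny hand-made gazetteer instead of a trained CRF or BERT tagger, purely to show the labeling format:

```python
# Slot identification as BIO tagging, using a toy gazetteer lookup
# in place of a trained sequence labeler (illustrative only).
GAZETTEER = {
    "size": {"small", "medium", "large"},
    "topping": {"mushroom", "pepperoni", "olives"},
}

def bio_tag(tokens):
    tags = []
    for tok in tokens:
        # First slot type whose value set contains this token, if any.
        label = next((slot for slot, values in GAZETTEER.items()
                      if tok.lower() in values), None)
        tags.append(f"B-{label}" if label else "O")
    return tags

tokens = "I want a large pepperoni pizza".split()
print(list(zip(tokens, bio_tag(tokens))))
# large -> B-size, pepperoni -> B-topping, everything else -> O
```

A learned tagger would also emit I- tags for multi-token slot values ("extra large"), which a gazetteer lookup like this cannot do.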
Response Generation
Fixed responses
Use of templates
Templates are very useful when the follow-up response is a clarifying question
Automatic generation
conditional generative model
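The first two strategies can be sketched together: templates with slot placeholders, falling back to a clarifying question when a required slot is missing. The template texts and slot names are hypothetical:

```python
# Template-based response generation with a clarifying-question fallback.
TEMPLATES = {
    "confirm": "So that's a {size} pizza with {topping}, correct?",
    "clarify_size": "What size would you like?",
    "clarify_topping": "Which topping would you like?",
}

def generate(slots):
    # Ask a clarifying question for the first missing required slot.
    for slot in ("size", "topping"):
        if slot not in slots:
            return TEMPLATES[f"clarify_{slot}"]
    return TEMPLATES["confirm"].format(**slots)

print(generate({"size": "large"}))                       # Which topping would you like?
print(generate({"size": "large", "topping": "olives"}))  # So that's a large pizza with olives, correct?
```

A conditional generative model replaces the template lookup with text generated token by token, conditioned on the dialog state.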
Other Dialog Pipelines
End-to-End Approach
seq2seq
take a sequence as input and output another sequence
generally LSTM based
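The seq2seq loop can be shown with a stub "model": an encoder summarizes the input, and the decoder emits one token at a time until an end-of-sequence token. The stub simply echoes its input; a real system would use LSTM or Transformer encoders/decoders returning token probabilities:

```python
# Sketch of seq2seq greedy decoding with stub encoder/decoder functions.
EOS = "<eos>"

def encode(tokens):
    # Stub encoder: the "context" is just the input sequence itself.
    return tuple(tokens)

def decode_step(context, generated):
    # Stub decoder: echo the input token at the current position, then
    # emit EOS. A trained model would return a distribution over the
    # vocabulary here, and we would pick the argmax (greedy decoding).
    i = len(generated)
    return context[i] if i < len(context) else EOS

def seq2seq(tokens, max_len=10):
    context = encode(tokens)
    generated = []
    while len(generated) < max_len:
        tok = decode_step(context, generated)
        if tok == EOS:
            break
        generated.append(tok)
    return generated

print(seq2seq(["how", "are", "you"]))  # ['how', 'are', 'you']
```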
Deep Reinforcement Learning for Dialogue Generation
optimizes generated utterances for long-term conversational success, not just the immediate response
combines goal-oriented dialog with seq2seq-based generation
Human-in-the-Loop
Rasa NLU
Context-based conversations
performs NLU and captures required slots and their values
Interactive learning
create more training data and provide human feedback
Data annotation
API integration
Customized models in Rasa
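The interactive-learning idea above can be sketched as a feedback loop: when a human corrects the bot's predicted intent, the corrected example becomes new training data. This is a conceptual sketch, not Rasa's actual API; the stub "model" just predicts the most frequent intent seen so far:

```python
# Sketch of human-in-the-loop interactive learning: human corrections
# are appended to the training set (stub model, illustrative only).
training_data = [("order a pizza", "order_pizza")]

def predict(text):
    # Stub model: guess the most common intent in the training data.
    intents = [intent for _, intent in training_data]
    return max(set(intents), key=intents.count)

def interactive_step(text, human_label):
    guess = predict(text)
    if guess != human_label:
        # Human feedback becomes a new training example.
        training_data.append((text, human_label))
    return guess

interactive_step("hello", "greeting")  # stub guessed wrong; example added
print(training_data[-1])               # ('hello', 'greeting')
```

In Rasa this loop also drives data annotation: each corrected turn is exported back into the training files, so the model improves with every conversation it gets wrong.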