- Summary generation using the [attention mechanism] #summary
- Natural Language Processing with Deep Learning, p.136: "A Neural Attention Model for Abstractive Sentence Summarization"
- [attention mechanism]: instead of an RNN, the decoder is a feed-forward network over a fixed-length context of C previous output words (see the sketch after this list).
- Later studies showed RNNs to be more accurate, so this is mostly a historical note on the model as it was first proposed. With context length C, the conditional probability of an output summary Y given an input sentence X can be written as follows.
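Following the formulation in Rush et al. (2015), with $Y = (y_1, \ldots, y_N)$:

$$P(Y \mid X) \approx \prod_{i=0}^{N-1} P\bigl(y_{i+1} \mid X,\, y_{i-C+1}, \ldots, y_i\bigr)$$

That is, each next word is conditioned on the input X and only the C most recent output words, rather than on the full output history.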
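Below is a minimal NumPy sketch of this architecture in the spirit of Rush et al. (2015): an attention-based encoder that weights input word embeddings by their relevance to the last C output words, feeding a feed-forward (non-recurrent) decoder. All names, dimensions, and weight matrices here are illustrative assumptions, and the paper's local smoothing of the input embeddings is omitted.

```python
import numpy as np

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

# Hypothetical toy sizes; all weights are random placeholders, not trained.
V_SIZE, D, H, C = 100, 16, 32, 5  # vocab, embedding dim, hidden dim, context length

rng = np.random.default_rng(0)
E_x = rng.normal(size=(V_SIZE, D))   # input-side word embeddings
E_y = rng.normal(size=(V_SIZE, D))   # output-side word embeddings
P   = rng.normal(size=(D, C * D))    # attention projection
U   = rng.normal(size=(H, C * D))    # decoder hidden layer
Vo  = rng.normal(size=(V_SIZE, H))   # output weights from hidden state
Wo  = rng.normal(size=(V_SIZE, D))   # output weights from attention context

def attention_encoder(x_ids, yc_ids):
    """Weight input embeddings by relevance to the C most recent output words."""
    x_emb = E_x[x_ids]                 # (M, D) embedded input sentence
    yc_emb = E_y[yc_ids].reshape(-1)   # (C*D,) embedded output context
    scores = x_emb @ (P @ yc_emb)      # (M,) one score per input word
    p = softmax(scores)                # attention weights over the input
    return p @ x_emb                   # (D,) weighted context vector

def next_word_distribution(x_ids, yc_ids):
    """Feed-forward decoder over a fixed window of C previous output words."""
    yc_emb = E_y[yc_ids].reshape(-1)
    h = np.tanh(U @ yc_emb)            # hidden state from context window only
    enc = attention_encoder(x_ids, yc_ids)
    return softmax(Vo @ h + Wo @ enc)  # p(y_{i+1} | X, y_c)

# Toy usage: an 8-word input and the C=5 previous output words.
x = rng.integers(0, V_SIZE, size=8)
y_c = rng.integers(0, V_SIZE, size=C)
probs = next_word_distribution(x, y_c)
print(probs.shape, probs.sum())        # (100,) 1.0
```

Because the decoder sees only a fixed window of C words, the whole model is a plain feed-forward network, which is what distinguishes it from the later RNN-based approaches mentioned above.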
This page is auto-translated from /nishio/注意機構を用いた要約生成 using DeepL. If you find something interesting but the auto-translated English is not good enough to understand it, feel free to let me know at @nishio_en. I'm very happy to spread my thoughts to non-Japanese readers.