- [[Extractive Summary Generation]] →Related: [[direct quotation model]].
- [[Abstractive Summary Generation]]
- Feed-forward NN (a model that takes the last C output words as input); a sketch of the windowed conditional is given after this list. A. M. Rush, S. Chopra, and J. Weston. A neural attention model for abstractive sentence summarization. In Proceedings of EMNLP, pages 379-389, 2015.
- Copy mechanism. C. Gulcehre et al. Pointing the unknown words. In Proceedings of ACL, pages 140-149, 2016.
- Using the copy mechanism for summary generation with seq2seq; see the copy-mechanism sketch after this list. R. Nallapati, B. Xiang, and B. Zhou. Sequence-to-sequence RNNs for text summarization. arXiv:1602.06023, 2016.
- AMR (abstract meaning representation); an example AMR is shown after this list. S. Takase et al. Neural headline generation on abstract meaning representation. In Proceedings of EMNLP, pages 1054-1059, 2016.
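A rough sketch of how the feed-forward model above is set up (my summary of the Rush et al. formulation, not a quote from the paper): the headline is generated left to right, and each next word is predicted from the source $\mathbf{x}$ and only the window of the $C$ most recent output words, so the model approximates

$$
p(\mathbf{y} \mid \mathbf{x}; \theta) \approx \prod_{i=0}^{N-1} p\left(y_{i+1} \mid \mathbf{x},\, y_{i-C+1}, \ldots, y_{i};\, \theta\right),
$$

where each local conditional is computed by a feed-forward network combined with an attention-based encoder of $\mathbf{x}$.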
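A minimal Python sketch of the copy-mechanism idea from the two copy-mechanism papers above (names and shapes here are my own assumptions, and the cited papers use variants such as a switching network rather than this soft mixture): the decoder mixes a softmax over the vocabulary with the attention distribution over source positions, so rare or unknown words can be copied directly from the input.

```python
import numpy as np

def copy_step(vocab_logits, attention_scores, source_token_ids, p_gen):
    """One decoding step of a generate/copy mixture (illustrative only).

    vocab_logits:      (vocab_size,) scores from the generator output layer
    attention_scores:  (src_len,)    unnormalized attention over source positions
    source_token_ids:  (src_len,)    vocabulary id of each source token
    p_gen:             scalar in [0, 1]; probability of generating vs. copying
    """
    p_vocab = np.exp(vocab_logits - vocab_logits.max())
    p_vocab /= p_vocab.sum()       # "generate" distribution over the vocabulary
    p_attn = np.exp(attention_scores - attention_scores.max())
    p_attn /= p_attn.sum()         # "copy" distribution over source positions

    # Mix the two: copied probability mass is scattered from source
    # positions back onto the vocabulary ids of those source tokens.
    p_final = p_gen * p_vocab
    np.add.at(p_final, source_token_ids, (1.0 - p_gen) * p_attn)
    return p_final                 # (vocab_size,) distribution, sums to 1

# Toy usage: 5-word vocabulary, 3-token source sentence.
rng = np.random.default_rng(0)
print(copy_step(rng.normal(size=5), rng.normal(size=3),
                np.array([2, 4, 4]), p_gen=0.7))
```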
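For reference on the last item: an AMR represents a sentence as a rooted graph of concepts and relations, usually written in PENMAN notation. The standard example from the AMR literature for "The boy wants to go" looks roughly like this (reproduced from memory, so treat the details as approximate):

```
(w / want-01
   :ARG0 (b / boy)
   :ARG1 (g / go-01
            :ARG0 b))
```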
This page is auto-translated from /nishio/要約生成 using DeepL. If you find something interesting but the auto-translated English is not good enough to understand it, feel free to let me know at @nishio_en. I'm very happy to share my thoughts with non-Japanese readers.