Unsupervised Dual-Cascade Learning with Pseudo-Feedback Distillation for Query-Focused Extractive Summarization

Author(s): Haggai Roitman, Guy Feigenblat, Doron Cohen, Odellia Boni, David Konopnicki
Symmetry, 2021, Vol. 13 (4), pp. 600
Author(s): Ping Li, Jiong Yu

We present an extractive summarization model based on BERT and a dynamic memory network. The BERT-based component uses the Transformer to extract text features and uses the pre-trained model to construct sentence embeddings. It labels sentences automatically without any hand-crafted features, and the datasets are labeled symmetrically. We also present a dynamic memory network method for extractive summarization. Experiments are conducted on several summarization benchmark datasets, and our model achieves performance comparable to other extractive summarization methods.
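
As an illustration only (not the authors' implementation), the sketch below shows one common way pre-trained BERT sentence embeddings can be used to score and extract sentences: each sentence is mean-pooled from the last hidden states, ranked by similarity to the document centroid, and the top-k sentences are returned. It assumes the Hugging Face transformers and PyTorch libraries and the bert-base-uncased checkpoint.

```python
# Minimal sketch (assumption): BERT sentence embeddings + centroid scoring
# for extractive summarization. Illustrative only, not the paper's model.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def embed(sentences):
    # One vector per sentence: mean-pool the last hidden states over tokens.
    batch = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**batch).last_hidden_state        # (n, seq_len, dim)
    mask = batch["attention_mask"].unsqueeze(-1)          # (n, seq_len, 1)
    return (hidden * mask).sum(1) / mask.sum(1)           # (n, dim)

def extract_summary(sentences, k=3):
    # Rank sentences by cosine similarity to the document centroid,
    # then return the top-k in their original order.
    emb = embed(sentences)
    centroid = emb.mean(dim=0, keepdim=True)
    scores = torch.nn.functional.cosine_similarity(emb, centroid)
    top = scores.topk(min(k, len(sentences))).indices.sort().values
    return [sentences[i] for i in top.tolist()]

doc = [
    "BERT produces contextual representations for each sentence.",
    "A dynamic memory network can refine sentence selection over several hops.",
    "Extractive summarization selects the most salient sentences verbatim.",
]
print(extract_summary(doc, k=2))
```

In the paper's setting, the centroid-similarity scorer would be replaced by the dynamic memory network, which iteratively updates a memory over the sentence embeddings before selecting sentences.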


2021, Vol. 28 (2), pp. 532-553
Author(s): Ryuji Kano, Tomoki Taniguchi, Tomoko Ohkuma
