Personal Notes: ACL 2019

Posted by cyq on November 29, 2020

(Completed) ACL 2019

  • RankQA: Neural Question Answering with Answer Re-Ranking
  • Episodic Memory Reader: Learning What to Remember for Question Answering from Streaming Data
  • Retrieve, Read, Rerank: Towards End-to-End Multi-Document Reading Comprehension
  • Latent Retrieval for Weakly Supervised Open Domain Question Answering
  • Multi-Hop Paragraph Retrieval for Open-Domain Question Answering
  • Real-Time Open-Domain Question Answering with Dense-Sparse Phrase Index
  • A cross-sentence latent variable model for semi-supervised sequence matching
  • Answering while Summarizing: Multi-task Learning for Multi-hop QA with Evidence Extraction
  • Learning a Matching Model with Co-teaching for Multi-turn Response Selection in Retrieval-based Dialogue Systems
  • Matching Article Pairs with Graphical Decomposition and Convolutions
  • RE2: Simple and Effective Text Matching with Richer Alignment Features

  • A Lightweight Recurrent Network for Sequence Modeling
  • Encouraging Paragraph Embeddings to Remember Sentence Identity Improves Classification
  • Pretraining Methods for Dialog Context Representation Learning
  • Relational Word Embeddings
  • Training Neural Response Selection for Task-Oriented Dialogue Systems
  • Learning Transferable Feature Representations Using Neural Networks

First Reading List

  • !! (Multi-paragraph MRC) Multi-hop reading comprehension across multiple documents by reasoning over heterogeneous graphs

  • !! (Generative MRC) Multi-style Generative Reading Comprehension [Transformer-based]

  • (Bias in NLI datasets) Don’t Take the Premise for Granted: Mitigating Artifacts in Natural Language Inference [Given the hypothesis and the label, predict the premise; a novel method, though I didn’t fully understand the write-up]

  • (MRC, reasoning) Inferential Machine Comprehension: Answering Questions by Recursively Deducing the Evidence Chain from Text [A very complex model, mainly aimed at multi-step reasoning]

Second Paper List (20 papers)

  • !! (MRC, external knowledge) Explicit Utilization of General Knowledge in Machine Reading Comprehension [Proposes a way to use the external knowledge base WordNet]
  • !! (MRC) Token-level Dynamic Self-Attention Network for Multi-Passage Reading Comprehension [Proposes a token-level dynamic self-attention mechanism for multi-passage MRC]
  • !! (Multi-task learning) Multi-Task Deep Neural Networks for Natural Language Understanding [Jointly trains on all GLUE tasks on top of BERT]
  • (Unsupervised MRC dataset generation) Unsupervised Question Answering by Cloze Translation
  • !! (BERT + knowledge graph) ERNIE: Enhanced Language Representation with Informative Entities
  • !! (Question generation) Learning to Ask Unanswerable Questions for Machine Reading Comprehension [I didn’t fully understand the method from the forum post, but it seems valuable]
  • !! (QA-pair generation) Generating Question-Answer Hierarchies [I didn’t fully understand the method from the forum post, but it seems valuable]
  • !! (MRC) Exploiting Explicit Paths for Multi-hop Reading Comprehension
  • !! (Self-attention) Analyzing Multi-Head Self-Attention: Specialized Heads Do the Heavy Lifting, the Rest Can Be Pruned
  • !! (Dialogue) One Time of Interaction May Not Be Enough: Go Deep with an Interaction-over-Interaction Network for Response Selection in Dialogues [I didn’t fully understand the method from the forum post, but it seems valuable]

Third Paper List

  • !! (QA model robustness) Improving the Robustness of Question Answering Systems to Question Paraphrasing
  • !! (Question generation for dialogue) Reinforced Dynamic Reasoning for Conversational Question Generation
  • **Transformer-XL: Attentive Language Models beyond a Fixed-Length Context** [Recurrence mechanism + relative position encoding]
  • !!! (Multi-hop MRC) Cognitive Graph for Multi-Hop Reading Comprehension at Scale [Uses two systems to perform the reasoning in multi-hop MRC]
  • !!! (Open-domain QA) Improving Question Answering over Incomplete KBs with Knowledge-Aware Reader [Since a KB alone cannot supply all the knowledge open-domain QA needs, unstructured text knowledge is added as a supplement]
  • !! (Multi-hop MRC) Multi-hop Reading Comprehension through Question Decomposition and Rescoring
  • !! (MRC) Enhancing Pre-Trained Language Representations with Rich Knowledge for Machine Reading Comprehension [Adds a knowledge base on top of BERT]
  • (Conversational QA) MC^2: Multi-perspective Convolutional Cube for Conversational Machine Reading Comprehension
  • MultiQA: An Empirical Investigation of Generalization and Transfer in Reading Comprehension
  • !! (Dialogue) Are Training Samples Correlated? Learning to Generate Dialogue Responses with Multiple References [Models and exploits the relationships among multiple valid responses]
  • !! (Dialogue) Conversing by Reading: Contentful Neural Conversation with On-demand Machine Reading [Incorporates external knowledge to help the dialogue system generate responses; also provides a new dataset that includes external knowledge]

  • !! (Dialogue) Improving Multi-turn Dialogue Modelling with Utterance ReWriter

  • (Dialogue) Do Neural Dialog Systems Use the Conversation History Effectively? An Empirical Study

  • !! (Dialogue systems) Reading Turn by Turn: Hierarchical Attention Architecture for Spoken Dialogue Comprehension

  • (Dialogue, response generation) Incremental Transformer with Deliberation Decoder for Document Grounded Conversations

  • (Dialogue, response selection) Constructing Interpretive Spatio-Temporal Features for Multi-Turn Responses Selection

  • (Dialogue generation) ReCoSa: Detecting the Relevant Contexts with Self-Attention for Multi-turn Dialogue Generation

  • (Response generation) Retrieval-Enhanced Adversarial Training for Neural Response Generation

  • (Response generation) Learning to Abstract for Memory-augmented Conversational Response Generation

  • (Response generation) Neural Response Generation with Meta-words

  • (Dialogue, meta-learning) Domain Adaptive Dialog Generation via Meta Learning

  • !! (Answer-agnostic question generation) Self-Attention Architectures for Answer-Agnostic Neural Question Generation

  • (Graph networks, multi-hop reasoning) Dynamically Fused Graph Network for Multi-hop Reasoning

  • !! (Multi-step reasoning) Compositional Questions Do Not Necessitate Multi-hop Reasoning

  • (MRC)Simple and Effective Curriculum Pointer-Generator Networks for Reading Comprehension over Long Narratives

  • (MRC)Explore, Propose, and Assemble: An Interpretable Model for Multi-Hop Reading Comprehension

  • !! (Pre-training, transfer learning, natural language generation) Large-Scale Transfer Learning for Natural Language Generation

  • Learning Compressed Sentence Representations for On-Device Text Processing

  • (BERT interpretability) What Does BERT Learn about the Structure of Language?

  • Dual Supervised Learning for Natural Language Understanding and Generation

  • (Gradient reversal, domain adaptation) Reversing Gradients in Adversarial Domain Adaptation for Question Deduplication and Textual Entailment Tasks

  • (Knowledge distillation, multi-task learning) BAM! Born-Again Multi-Task Networks for Natural Language Understanding