Personal Notes: ICLR 2018

Posted by cyq on November 29, 2020

(Completed) ICLR 2018

(2.3% oral presentations, 31.4% poster acceptances, 9% workshop)

  • !!!!(DIIN)Natural Language Inference over Interaction Space

  • !!!!(IR, multi-task learning)Multi-Task Learning for Document Ranking and Query Suggestion

  • !!!!(representation learning)An efficient framework for learning sentence representations

  • !!(representation learning, multi-task learning)Learning General Purpose Distributed Sentence Representations via Large Scale Multi-task Learning

  • !!!!(sequence modeling)Bi-Directional Block Self-Attention for Fast and Memory-Efficient Sequence Modeling

  • !!!!(open-domain QA)Evidence Aggregation for Answer Re-Ranking in Open-Domain Question Answering

  • !!!(QA, reinforcement learning)Ask the Right Questions: Active Question Reformulation with Reinforcement Learning

  • !!(MRC)QANet: Combining Local Convolution with Global Self-Attention for Reading Comprehension

  • !!(MRC)DCN+: Mixed Objective And Deep Residual Coattention for Question Answering

  • !!(MRC)FusionNet: Fusing via Fully-aware Attention with Application to Machine Comprehension

  • (MRC)Multi-Mention Learning for Reading Comprehension with Neural Cascades

  • (summarization)Generating Wikipedia by Summarizing Long Sequences

  • (representation learning)A New Method of Region Embedding for Text Classification

  • (language modeling)Breaking the Softmax Bottleneck: A High-Rank RNN Language Model

  • (language modeling)Neural Language Modeling by Jointly Learning Syntax and Lexicon

  • (text generation, GAN)MaskGAN: Better Text Generation via Filling in the ___

  • (summarization)A Deep Reinforced Model for Abstractive Summarization

  • (LSTM interpretability, sentiment analysis)Beyond Word Importance: Contextual Decomposition to Extract Interactions from LSTMs

  • (transfer learning)Minimal-Entropy Correlation Alignment for Unsupervised Deep Domain Adaptation

  • (graph convolution + self-attention)Graph Attention Networks

  • (multi-task learning)Routing Networks: Adaptive Selection of Non-Linear Functions for Multi-Task Learning

  • All-but-the-Top: Simple and Effective Postprocessing for Word Representations

  • !!(a logical entailment dataset)Can Neural Networks Understand Logical Entailment?

  • Spherical CNNs

  • (multi-task learning)Beyond Shared Hierarchies: Deep Multitask Learning through Soft Layer Ordering

  • (transformer-based)Non-Autoregressive Neural Machine Translation