A big list of machine learning survey papers, covering natural language processing, recommender systems, computer vision, deep learning, reinforcement learning, and more.
Since the source repo was a bit thin on NLP surveys, I added some articles I found worthwhile and reorganized everything in AI-Surveys[1].
- ml-surveys: https://github.com/eugeneyan/ml-surveys
- AI-Surveys: https://github.com/KaiyuanGao/AI-Surveys
Part of the "bookmarking counts as reading" series — let's see what's in here. Enjoy!
Natural Language Processing
- Deep Learning: Recent Trends in Deep Learning Based Natural Language Processing[2]
- Text Classification: Deep Learning Based Text Classification: A Comprehensive Review[3]
- Text Generation: Survey of the SOTA in Natural Language Generation: Core tasks, applications and evaluation[4]
- Text Generation: Neural Language Generation: Formulation, Methods, and Evaluation[5]
- Transfer Learning: Exploring Transfer Learning with T5: the Text-To-Text Transfer Transformer[6] (Paper[7])
- Transfer Learning: Neural Transfer Learning for Natural Language Processing[8]
- Knowledge Graphs: A Survey on Knowledge Graphs: Representation, Acquisition and Applications[9]
- Named Entity Recognition: A Survey on Deep Learning for Named Entity Recognition[10]
- Relation Extraction: More Data, More Relations, More Context and More Openness: A Review and Outlook for Relation Extraction[11]
- Sentiment Analysis: Deep Learning for Sentiment Analysis: A Survey[12]
- Aspect-Based Sentiment Analysis: Deep Learning for Aspect-Level Sentiment Classification: Survey, Vision, and Challenges[13]
- Text Matching: Neural Network Models for Paraphrase Identification, Semantic Textual Similarity, Natural Language Inference, and Question Answering[14]
- Reading Comprehension: Neural Reading Comprehension And Beyond[15]
- Reading Comprehension: Neural Machine Reading Comprehension: Methods and Trends[16]
- Machine Translation: Neural Machine Translation: A Review[17]
- Machine Translation: A Survey of Domain Adaptation for Neural Machine Translation[18]
- Pre-trained Models: Pre-trained Models for Natural Language Processing: A Survey[19]
- Attention: An Attentive Survey of Attention Models[20]
- Attention: An Introductory Survey on Attention Mechanisms in NLP Problems[21]
- Attention: Attention in Natural Language Processing[22]
- BERT: A Primer in BERTology: What we know about how BERT works[23]
- Beyond Accuracy: Behavioral Testing of NLP Models with CheckList[24]
- Evaluation of Text Generation: A Survey[25]
Recommender Systems
- Recommender systems survey[26]
- Deep Learning based Recommender System: A Survey and New Perspectives[27]
- Are We Really Making Progress? A Worrying Analysis of Neural Recommendation Approaches[28]
- A Survey of Serendipity in Recommender Systems[29]
- Diversity in Recommender Systems – A survey[30]
- A Survey of Explanations in Recommender Systems[31]
Deep Learning
- A State-of-the-Art Survey on Deep Learning Theory and Architectures[32]
- Knowledge Distillation: Knowledge Distillation: A Survey[33]
- Model Compression: Compression of Deep Learning Models for Text: A Survey[34]
- Transfer Learning: A Survey on Deep Transfer Learning[35]
- Neural Architecture Search: A Comprehensive Survey of Neural Architecture Search: Challenges and Solutions[36]
- Neural Architecture Search: Neural Architecture Search: A Survey[37]
Computer Vision
- Object Detection: Object Detection in 20 Years[38]
- Adversarial Attacks: Threat of Adversarial Attacks on Deep Learning in Computer Vision[39]
- Autonomous Driving: Computer Vision for Autonomous Vehicles: Problems, Datasets and State of the Art[40]
Reinforcement Learning
- A Brief Survey of Deep Reinforcement Learning[41]
- Transfer Learning for Reinforcement Learning Domains[42]
- Review of Deep Reinforcement Learning Methods and Applications in Economics[43]
Embeddings
- Graph: A Comprehensive Survey of Graph Embedding: Problems, Techniques and Applications[44]
- Text: From Word to Sense Embeddings: A Survey on Vector Representations of Meaning[45]
- Text: Diachronic Word Embeddings and Semantic Shifts[46]
- Text: Word Embeddings: A Survey[47]
- A Survey on Contextual Embeddings[48]
Meta-learning & Few-shot Learning
- A Survey on Knowledge Graphs: Representation, Acquisition and Applications[49]
- Meta-learning for Few-shot Natural Language Processing: A Survey[50]
- Learning from Few Samples: A Survey[51]
- Meta-Learning in Neural Networks: A Survey[52]
- A Comprehensive Overview and Survey of Recent Advances in Meta-Learning[53]
- Baby steps towards few-shot learning with multiple semantics[54]
- Meta-Learning: A Survey[55]
- A Perspective View And Survey Of Meta-learning[56]
Others
- A Survey on Transfer Learning[57]
References
[1]AI-Surveys: https://github.com/KaiyuanGao/AI-Surveys
[2]Recent Trends in Deep Learning Based Natural Language Processing: https://arxiv.org/pdf/1708.02709.pdf
[3]Deep Learning Based Text Classification: A Comprehensive Review: https://arxiv.org/pdf/2004.03705
[4]Survey of the SOTA in Natural Language Generation: Core tasks, applications and evaluation: https://www.jair.org/index.php/jair/article/view/11173/26378
[5]Neural Language Generation: Formulation, Methods, and Evaluation: https://arxiv.org/pdf/2007.15780.pdf
[6]Exploring Transfer Learning with T5: the Text-To-Text Transfer Transformer: https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html
[7]Paper: https://arxiv.org/abs/1910.10683
[8]Neural Transfer Learning for Natural Language Processing: https://aran.library.nuigalway.ie/handle/10379/15463
[9]A Survey on Knowledge Graphs: Representation, Acquisition and Applications: https://arxiv.org/abs/2002.00388
[10]A Survey on Deep Learning for Named Entity Recognition: https://arxiv.org/abs/1812.09449
[11]More Data, More Relations, More Context and More Openness: A Review and Outlook for Relation Extraction: https://arxiv.org/abs/2004.03186
[12]Deep Learning for Sentiment Analysis: A Survey: https://arxiv.org/abs/1801.07883
[13]Deep Learning for Aspect-Level Sentiment Classification: Survey, Vision, and Challenges: https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=8726353
[14]Neural Network Models for Paraphrase Identification, Semantic Textual Similarity, Natural Language Inference, and Question Answering: https://www.aclweb.org/anthology/C18-1328/
[15]Neural Reading Comprehension And Beyond: https://stacks.stanford.edu/file/druid:gd576xb1833/thesis-augmented.pdf
[16]Neural Machine Reading Comprehension: Methods and Trends: https://arxiv.org/abs/1907.01118
[17]Neural Machine Translation: A Review: https://arxiv.org/abs/1912.02047
[18]A Survey of Domain Adaptation for Neural Machine Translation: https://www.aclweb.org/anthology/C18-1111.pdf
[19]Pre-trained Models for Natural Language Processing: A Survey: https://arxiv.org/abs/2003.08271
[20]An Attentive Survey of Attention Models: https://arxiv.org/pdf/1904.02874.pdf
[21]An Introductory Survey on Attention Mechanisms in NLP Problems: https://arxiv.org/abs/1811.05544
[22]Attention in Natural Language Processing: https://arxiv.org/abs/1902.02181
[23]A Primer in BERTology: What we know about how BERT works: https://arxiv.org/pdf/2002.12327.pdf
[24]Beyond Accuracy: Behavioral Testing of NLP Models with CheckList: https://arxiv.org/pdf/2005.04118.pdf
[25]Evaluation of Text Generation: A Survey: https://arxiv.org/pdf/2006.14799.pdf
[26]Recommender systems survey: http://irntez.ir/wp-content/uploads/2016/12/sciencedirec.pdf
[27]Deep Learning based Recommender System: A Survey and New Perspectives: https://arxiv.org/pdf/1707.07435.pdf
[28]Are We Really Making Progress? A Worrying Analysis of Neural Recommendation Approaches: https://arxiv.org/pdf/1907.06902.pdf
[29]A Survey of Serendipity in Recommender Systems: https://www.researchgate.net/publication/306075233_A_Survey_of_Serendipity_in_Recommender_Systems
[30]Diversity in Recommender Systems – A survey: https://papers-gamma.link/static/memory/pdfs/153-Kunaver_Diversity_in_Recommender_Systems_2017.pdf
[31]A Survey of Explanations in Recommender Systems: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.418.9237&rep=rep1&type=pdf
[32]A State-of-the-Art Survey on Deep Learning Theory and Architectures: https://www.mdpi.com/2079-9292/8/3/292/htm
[33]Knowledge Distillation: A Survey: https://arxiv.org/pdf/2006.05525.pdf
[34]Compression of Deep Learning Models for Text: A Survey: https://arxiv.org/pdf/2008.05221.pdf
[35]A Survey on Deep Transfer Learning: https://arxiv.org/pdf/1808.01974.pdf
[36]A Comprehensive Survey of Neural Architecture Search: Challenges and Solutions: https://arxiv.org/abs/2006.02903
[37]Neural Architecture Search: A Survey: https://arxiv.org/abs/1808.05377
[38]Object Detection in 20 Years: https://arxiv.org/pdf/1905.05055.pdf
[39]Threat of Adversarial Attacks on Deep Learning in Computer Vision: https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=8294186
[40]Computer Vision for Autonomous Vehicles: Problems, Datasets and State of the Art: https://arxiv.org/pdf/1704.05519.pdf
[41]A Brief Survey of Deep Reinforcement Learning: https://arxiv.org/pdf/1708.05866.pdf
[42]Transfer Learning for Reinforcement Learning Domains: http://www.jmlr.org/papers/volume10/taylor09a/taylor09a.pdf
[43]Review of Deep Reinforcement Learning Methods and Applications in Economics: https://arxiv.org/pdf/2004.01509.pdf
[44]A Comprehensive Survey of Graph Embedding: Problems, Techniques and Applications: https://arxiv.org/pdf/1709.07604
[45]From Word to Sense Embeddings: A Survey on Vector Representations of Meaning: https://www.jair.org/index.php/jair/article/view/11259/26454
[46]Diachronic Word Embeddings and Semantic Shifts: https://arxiv.org/pdf/1806.03537.pdf
[47]Word Embeddings: A Survey: https://arxiv.org/abs/1901.09069
[48]A Survey on Contextual Embeddings: https://arxiv.org/abs/2003.07278
[49]A Survey on Knowledge Graphs: Representation, Acquisition and Applications: https://arxiv.org/abs/2002.00388
[50]Meta-learning for Few-shot Natural Language Processing: A Survey: https://arxiv.org/abs/2007.09604
[51]Learning from Few Samples: A Survey: https://arxiv.org/abs/2007.15484
[52]Meta-Learning in Neural Networks: A Survey: https://arxiv.org/abs/2004.05439
[53]A Comprehensive Overview and Survey of Recent Advances in Meta-Learning: https://arxiv.org/abs/2004.11149
[54]Baby steps towards few-shot learning with multiple semantics: https://arxiv.org/abs/1906.01905
[55]Meta-Learning: A Survey: https://arxiv.org/abs/1810.03548
[56]A Perspective View And Survey Of Meta-learning: https://www.researchgate.net/publication/2375370_A_Perspective_View_And_Survey_Of_Meta-Learning
[57]A Survey on Transfer Learning: http://202.120.39.19:40222/wp-content/uploads/2018/03/A-Survey-on-Transfer-Learning.pdf
Author: kaiyuan. Source: NewBeeNLP.