A Collection of Deep NLP Models Implemented in TensorFlow (with Resources)
This post collects a set of deep NLP models for deep learning / machine learning, implemented in TensorFlow.
TensorFlow-based natural language processing models: machine learning and TensorFlow deep learning models collected for natural language processing problems, 100% Jupyter Notebooks with very concise code throughout.
The resources were collected from the web; original source:
https://github.com/huseinzol05
Table of Contents
Text classification
Chatbot
Neural Machine Translation
Embedded
Entity-Tagging
POS-Tagging
Dependency-Parser
Question-Answers
Supervised Summarization
Unsupervised Summarization
Stemming
Generator
Language detection
OCR (optical character recognition)
Speech to Text
Text to Speech
Text Similarity
Miscellaneous
Attention
Objective
The original implementations are somewhat complex and can be hard for beginners, so I have tried to simplify most of them. Many papers also remain to be implemented; one step at a time.
Contents
Text Classification:
Link:
https://github.com/huseinzol05/NLP-Models-Tensorflow/tree/master/text-classification
1. Basic cell RNN
2. Bidirectional RNN
3. LSTM cell RNN
4. GRU cell RNN
5. LSTM RNN + Conv2D
6. K-max Conv1d
7. LSTM RNN + Conv1D + Highway
8. LSTM RNN with Attention
9. Neural Turing Machine
10. Seq2Seq
11. Bidirectional Transformers
12. Dynamic Memory Network
13. Residual Network using Atrous CNN + Bahdanau Attention
14. Transformer-XL
The full list contains 66 notebooks.
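Several notebooks in this section combine RNNs with convolutional tricks such as k-max pooling ("K-max Conv1d" above). As a rough, framework-free sketch of the idea (my own illustration, not code from the repository), k-max pooling keeps the k largest activations along the time axis while preserving their original temporal order:

```python
import numpy as np

def k_max_pooling(x, k):
    """Keep the k largest values along the time axis (axis 0),
    preserving their original temporal order.

    x: array of shape (time, channels)
    returns: array of shape (k, channels)
    """
    # indices of the top-k values per channel, then re-sorted by time
    top_idx = np.argsort(x, axis=0)[-k:, :]  # (k, channels), unordered
    top_idx = np.sort(top_idx, axis=0)       # restore temporal order
    return np.take_along_axis(x, top_idx, axis=0)

# Example: a single-channel sequence
seq = np.array([[1.0], [5.0], [2.0], [4.0], [3.0]])
print(k_max_pooling(seq, 2).ravel())  # prints [5. 4.]
```

Unlike global max pooling, this keeps k values per channel, so downstream layers still see some sequential structure.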
Chatbot:
Link:
https://github.com/huseinzol05/NLP-Models-Tensorflow/tree/master/chatbot
1. Seq2Seq-manual
2. Seq2Seq-API Greedy
3. Bidirectional Seq2Seq-manual
4. Bidirectional Seq2Seq-API Greedy
5. Bidirectional Seq2Seq-manual + backward Bahdanau + forward Luong
6. Bidirectional Seq2Seq-API + backward Bahdanau + forward Luong + Stack Bahdanau Luong Attention + Beam Decoder
7. Bytenet
8. Capsule layers + LSTM Seq2Seq-API + Luong Attention + Beam Decoder
9. End-to-End Memory Network
10. Attention is All you need
11. Transformer-XL + LSTM
12. GPT-2 + LSTM
The full list contains 51 notebooks.
Neural Machine Translation (English to Vietnamese):
Link:
https://github.com/huseinzol05/NLP-Models-Tensorflow/tree/master/neural-machine-translation
1. Seq2Seq-manual
2. Seq2Seq-API Greedy
3. Bidirectional Seq2Seq-manual
4. Bidirectional Seq2Seq-API Greedy
5. Bidirectional Seq2Seq-manual + backward Bahdanau + forward Luong
6. Bidirectional Seq2Seq-API + backward Bahdanau + forward Luong + Stack Bahdanau Luong Attention + Beam Decoder
7. Bytenet
8. Capsule layers + LSTM Seq2Seq-API + Luong Attention + Beam Decoder
9. End-to-End Memory Network
10. Attention is All you need
The full list contains 49 notebooks.
Word Embeddings:
Link:
https://github.com/huseinzol05/NLP-Models-Tensorflow/tree/master/embedded
1. Word Vector using CBOW sample softmax
2. Word Vector using CBOW noise contrastive estimation
3. Word Vector using skipgram sample softmax
4. Word Vector using skipgram noise contrastive estimation
5. Lda2Vec Tensorflow
6. Supervised Embedded
7. Triplet-loss + LSTM
8. LSTM Auto-Encoder
9. Batch-All Triplet-loss LSTM
10. Fast-text
11. ELMO (biLM)
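For orientation, the skip-gram notebooks above train word vectors by predicting context words from a center word. A minimal sketch of the (center, context) pair-generation step (an illustrative example, not code from the repository):

```python
def skipgram_pairs(tokens, window=2):
    """Generate (center, context) training pairs for skip-gram.

    For each position, every word within `window` steps (left or right)
    becomes a context word for the center word at that position.
    """
    pairs = []
    for i, center in enumerate(tokens):
        lo = max(0, i - window)
        hi = min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

sentence = ["the", "quick", "brown", "fox"]
print(skipgram_pairs(sentence, window=1))
# [('the', 'quick'), ('quick', 'the'), ('quick', 'brown'),
#  ('brown', 'quick'), ('brown', 'fox'), ('fox', 'brown')]
```

In the actual notebooks these pairs feed a softmax (or sampled softmax / noise-contrastive estimation) loss over the vocabulary.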
POS Tagging:
Link:
https://github.com/huseinzol05/NLP-Models-Tensorflow/tree/master/pos-tagging
1. Bidirectional RNN + Bahdanau Attention + CRF
2. Bidirectional RNN + Luong Attention + CRF
3. Bidirectional RNN + CRF
Entity Tagging:
Link:
https://github.com/huseinzol05/NLP-Models-Tensorflow/tree/master/entity-tagging
1. Bidirectional RNN + Bahdanau Attention + CRF
2. Bidirectional RNN + Luong Attention + CRF
3. Bidirectional RNN + CRF
4. Char Ngrams + Bidirectional RNN + Bahdanau Attention + CRF
5. Char Ngrams + Residual Network + Bahdanau Attention + CRF
Dependency Parsing:
Link:
https://github.com/huseinzol05/NLP-Models-Tensorflow/tree/master/dependency-parser
1. Bidirectional RNN + Bahdanau Attention + CRF
2. Bidirectional RNN + Luong Attention + CRF
3. Residual Network + Bahdanau Attention + CRF
4. Residual Network + Bahdanau Attention + Char Embedded + CRF
Question Answering:
Link:
https://github.com/huseinzol05/NLP-Models-Tensorflow/tree/master/question-answer
1. End-to-End Memory Network + Basic cell
2. End-to-End Memory Network + GRU cell
3. End-to-End Memory Network + LSTM cell
Stemming:
Link:
https://github.com/huseinzol05/NLP-Models-Tensorflow/tree/master/stemming
1. LSTM + Seq2Seq + Beam
2. GRU + Seq2Seq + Beam
3. LSTM + BiRNN + Seq2Seq + Beam
4. GRU + BiRNN + Seq2Seq + Beam
5. DNC + Seq2Seq + Greedy
Supervised Summarization:
Link:
https://github.com/huseinzol05/NLP-Models-Tensorflow/tree/master/summarization
1. LSTM Seq2Seq using topic modelling
2. LSTM Seq2Seq + Luong Attention using topic modelling
3. LSTM Seq2Seq + Beam Decoder using topic modelling
4. LSTM Bidirectional + Luong Attention + Beam Decoder using topic modelling
5. LSTM Seq2Seq + Luong Attention + Pointer Generator
6. Bytenet
Unsupervised Summarization:
Link:
https://github.com/huseinzol05/NLP-Models-Tensorflow/tree/master/unsupervised-summarization
1. Skip-thought Vector (unsupervised)
2. Residual Network using Atrous CNN (unsupervised)
3. Residual Network using Atrous CNN + Bahdanau Attention (unsupervised)
OCR (Optical Character Recognition):
Link:
https://github.com/huseinzol05/NLP-Models-Tensorflow/tree/master/ocr
1. CNN + LSTM RNN
Speech to Text:
Link:
https://github.com/huseinzol05/NLP-Models-Tensorflow/tree/master/speech-to-text
1. Tacotron
2. Bidirectional RNN + Greedy CTC
3. Bidirectional RNN + Beam CTC
4. Seq2Seq + Bahdanau Attention + Beam CTC
5. Seq2Seq + Luong Attention + Beam CTC
6. Bidirectional RNN + Attention + Beam CTC
7. Wavenet
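Most of the speech-to-text notebooks above decode with CTC. As a quick illustrative sketch (not the repository's code) of CTC's greedy "best path" decoding rule: take the most likely label per frame, merge consecutive repeats, then drop blanks:

```python
def ctc_greedy_collapse(frame_labels, blank=0):
    """Collapse a per-frame label sequence under the CTC rule:
    merge consecutive repeated labels, then remove blank symbols."""
    out = []
    prev = None
    for lab in frame_labels:
        # emit only on a change of label, and never emit the blank
        if lab != prev and lab != blank:
            out.append(lab)
        prev = lab
    return out

# frames: c c _ a a _ t  (with 0 as the blank symbol)
print(ctc_greedy_collapse([3, 3, 0, 1, 1, 0, 20]))  # -> [3, 1, 20]
```

The blank symbol is what lets CTC represent genuinely repeated characters: a blank between two identical labels keeps them from being merged.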
Text to Speech:
Link:
https://github.com/huseinzol05/NLP-Models-Tensorflow/tree/master/text-to-speech
1. Tacotron
2. Wavenet
3. Seq2Seq + Luong Attention
4. Seq2Seq + Bahdanau Attention
Generator:
Link:
https://github.com/huseinzol05/NLP-Models-Tensorflow/tree/master/generator
1. Character-wise RNN + LSTM
2. Character-wise RNN + Beam search
3. Character-wise RNN + LSTM + Embedding
4. Word-wise RNN + LSTM
5. Word-wise RNN + LSTM + Embedding
6. Character-wise + Seq2Seq + GRU
7. Word-wise + Seq2Seq + GRU
8. Character-wise RNN + LSTM + Bahdanau Attention
9. Character-wise RNN + LSTM + Luong Attention
Language Detection:
Link:
https://github.com/huseinzol05/NLP-Models-Tensorflow/tree/master/language-detection
1. Fast-text Char N-Grams
Text Similarity:
Link:
https://github.com/huseinzol05/NLP-Models-Tensorflow/tree/master/text-similarity
1. Character wise similarity + LSTM + Bidirectional
2. Word wise similarity + LSTM + Bidirectional
3. Character wise similarity Triplet loss + LSTM
4. Word wise similarity Triplet loss + LSTM
Attention:
Link:
https://github.com/huseinzol05/NLP-Models-Tensorflow/tree/master/attention
1. Bahdanau
2. Luong
3. Hierarchical
4. Additive
5. Soft
6. Attention-over-Attention
7. Bahdanau API
8. Luong API
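For readers comparing the two main variants above: Bahdanau attention is additive, scoring score(s, h_i) = v^T tanh(W1 s + W2 h_i), while Luong's basic form is multiplicative (s^T W h_i). A minimal NumPy sketch of the additive variant (the function name, dimensions, and weights are illustrative, not from the repository):

```python
import numpy as np

def bahdanau_attention(query, keys, W1, W2, v):
    """Additive (Bahdanau) attention.

    query: decoder state, shape (d,)
    keys:  encoder states, shape (T, d)
    W1, W2: (a, d) projection matrices; v: (a,) scoring vector
    Returns (context_vector, attention_weights).
    """
    # score_i = v . tanh(W1 @ query + W2 @ key_i), for each time step i
    scores = np.tanh(keys @ W2.T + query @ W1.T) @ v  # (T,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                          # softmax over time
    context = weights @ keys                          # weighted sum, (d,)
    return context, weights

rng = np.random.default_rng(0)
d, a, T = 4, 3, 5
q = rng.normal(size=d)
K = rng.normal(size=(T, d))
ctx, w = bahdanau_attention(q, K, rng.normal(size=(a, d)),
                            rng.normal(size=(a, d)), rng.normal(size=a))
print(ctx.shape, w.shape)  # (4,) (5,)
```

Because the score passes the query and key through a small feed-forward layer, the additive form can attend over keys and queries of different dimensionalities, which the plain dot-product form cannot.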
Miscellaneous:
Link:
https://github.com/huseinzol05/NLP-Models-Tensorflow/tree/master/misc
1. Attention heatmap on Bahdanau Attention
2. Attention heatmap on Luong Attention
Non-Deep Learning:
Link:
https://github.com/huseinzol05/NLP-Models-Tensorflow/tree/master/not-deep-learning
1. Markov chatbot
2. Decomposition summarization (3 notebooks)