NLP / NLU: Things to Learn About
Tasks
A task can be considered complete when I have reviewed the material AND added notes here.
- Make sure I genuinely understand the topics I told Michelle I understand (RNNs, LSTMs, GRUs, embedding layers, attention layers, encoder and decoder networks)
- Review the LSTM and GRU architectures (see the shape-check sketch after this list)
- Watch the unwatched lectures in the NLP Nanodegree
- Read about transformers
- Read about different embedding approaches
- Read AIMA NLP chapters
- Skim the NLP textbook and choose which chapters to dive into
- Make a list of resources from the Spirit Slack channels
- Read about BERT and understand its architecture in depth
- Read papers referenced in the RNV response document
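
Quick self-check note for the RNN/LSTM/GRU/embedding items above: a minimal PyTorch sketch (my own toy example, not from any course; layer sizes are arbitrary) wiring an embedding layer into an LSTM and a GRU so I can compare the states each one returns.

```python
import torch
import torch.nn as nn

vocab_size, embed_dim, hidden_dim = 1000, 64, 128
tokens = torch.randint(0, vocab_size, (8, 20))      # (batch, seq_len), random token ids

embedding = nn.Embedding(vocab_size, embed_dim)
lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
gru = nn.GRU(embed_dim, hidden_dim, batch_first=True)

x = embedding(tokens)                               # (8, 20, 64)
lstm_out, (h_n, c_n) = lstm(x)                      # LSTM returns hidden AND cell state
gru_out, h_gru = gru(x)                             # GRU returns a single hidden state

print(lstm_out.shape, h_n.shape, c_n.shape)         # (8, 20, 128) (1, 8, 128) (1, 8, 128)
print(gru_out.shape, h_gru.shape)                   # (8, 20, 128) (1, 8, 128)
```

The shape check is the reminder I actually need: both layers produce the same output shape, but only the LSTM carries a separate cell state.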
Udacity Sections
NLP Nanodegree
- Intro to NLP
- Text Processing
- Spam Classifier with Naive Bayes (toy sketch of the idea after this list)
- Part of Speech Tagging with HMMs
- Using an HMM to determine part of speech (project)
- Feature Extraction and Embeddings
- Topic Modeling
- Sentiment Analysis
- Sequence to Sequence
- Deep Learning Attention
- Machine Translation (project)
- Intro to Voice User Interfaces
- Speech Recognition
- DNN Speech Recognizer (project)
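
Note on the Spam Classifier with Naive Bayes lesson: a toy sketch (made-up data, assumes scikit-learn; not the project code) of the core idea, bag-of-words counts fed to a multinomial Naive Bayes model.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

messages = ["win a free prize now", "meeting moved to 3pm",
            "free entry claim your prize", "see you at lunch"]
labels = [1, 0, 1, 0]                      # 1 = spam, 0 = ham (toy labels)

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(messages)     # sparse word-count matrix
clf = MultinomialNB().fit(X, labels)

print(clf.predict(vectorizer.transform(["claim your free prize"])))  # expect [1]
```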
Deep Learning Nanodegree
- Sentiment Analysis with Andrew Trask
- Recurrent Neural Networks
- Long Short-Term Memory Networks
- Implementation of RNN and LSTM
- Hyperparameters
- Embeddings and Word2Vec (see the Word2Vec sketch after this list)
- Sentiment Prediction RNN
- Generate TV Scripts (project)
- Attention
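
Note for the Embeddings and Word2Vec lessons: a minimal gensim sketch (toy corpus, assumes gensim 4.x is installed) of training skip-gram Word2Vec vectors, just to keep the API fresh.

```python
from gensim.models import Word2Vec

sentences = [["the", "cat", "sat", "on", "the", "mat"],
             ["the", "dog", "sat", "on", "the", "rug"]]

model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1)  # sg=1 -> skip-gram
print(model.wv["cat"].shape)                   # (50,)
print(model.wv.most_similar("cat", topn=2))    # nearest words in the toy embedding space
```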