CITS4012 Natural Language Processing
Lab01: Conda Environment and Python Refresher
1. CITS4012 Base Environment
2. CITS4012 MISC Environment
3. Using the Lab Machines
4. Python Basics
5. Iterables
6. Numpy
7. Matplotlib
Lab02: NLTK
1. Starting with NLTK
2. A Closer Look at Python: Texts as Lists of Words
3. Computing with Language: Simple Statistics
4. Back to Python: Making Decisions and Taking Control
5. Exercises
Lab03: spaCy NLP pipelines
1. Container Objects in spaCy
2. NLP Pipelines
3. Finding Patterns
4. Your first chatbot
5. Exercise
Lab04: Count-Based Models
1. TF-IDF in scikit-learn and Gensim
2. Document Classification
Lab05: Introduction to Neural Networks and PyTorch
1. Linear Models in Numpy
2. Introduction to PyTorch Tensors
3. Linear Models in PyTorch
Lab06: Neural Network Building Blocks
1. Activation Functions and Their Derivatives
2. Loss Functions
3. Dynamic Computational Graph in PyTorch
4. Neural Networks in PyTorch
Lab07: Word Embeddings
1. Word Vectors from Word-Word Co-occurrence Matrix
2. GloVe: Global Vectors for Word Representation
3. Word2Vec
Lab08: Document Classification
1. Perceptron
2. Dataset and DataLoader
3. Yelp Dataset at a Glance
4. Yelp Review Dataset - Document Classification
Lab09: CNN for NLP
1. Frankenstein Dataset at a Glance
2. Learning Embeddings with Continuous Bag of Words (CBOW)
3. Convolution Basics
4. AG News Dataset at a Glance
5. Using CNN for Document Classification with Embeddings
Lab10: RNN for NLP
1. Recurrent Neural Networks - Introduction
2. Classifying Synthetic Sequences - The Square Model
3. Surname Dataset Processing
4. Surname Classification with RNN
Lab11: GRU, LSTM and Seq2Seq
1. Gated Recurrent Units (GRUs)
2. Long Short-Term Memories (LSTMs)
3. The Square Model Using GRU and LSTM
4. Sequence to Sequence Models
5. Surname Generation - Unconditioned
6. Surname Generation - Conditioned
Lab12: Sequence to Sequence Learning with Attention
1. Attention Mechanism
2. Self-Attention
3. Neural Machine Translation
4. Packed Sequences
5. Neural Machine Translation - No Sampling
6. Neural Machine Translation - Scheduled Sampling
Extra: Transformers
1. Positional Encoding
2. Layer Normalization
3. Transform and Roll Out