CS 565: Intelligent Systems and Interfaces

Jan-May, 2017


Table of Contents

Syllabus and References

Event | Date | Description | References
Introductory Lecture | Jan 10 | Course Introduction | [Lecture Slides]
Lecture 1 | Jan 11 | Getting Started with NLP | [Lecture Slides]
Lecture 2 | Jan 17 | Words - Collocations | [Lecture Slides]
Lecture 3 | Jan 18 | Words - Finding Collocations | [Lecture Slides] [Text Reference: Maximum Likelihood Estimate, DHS, Chapter 3-3.2]
Lecture 4 | Jan 24 | Language Modeling | [Lecture Slides] [References: Prof. Collins, Columbia University Lecture Notes; SLP (3rd ed.), Chapter 4]
Lecture 5 | Jan 25 | Smoothing Techniques (a minimal add-k sketch follows this schedule) | [Laplace, Add-k, Witten-Bell, Backoff and Interpolation]
Lecture 6 | Jan 27 | Smoothing Techniques | [Backoff and Interpolation, Absolute Discounting]
Project guideline | - | - | [Slide]
Lecture 7 | Jan 31 | Introduction to the Neural Language Model: improving over the n-gram model; Introduction to Neural Networks | [Video Lectures (all 1.*)]
Lecture 8 | Feb 1 | 1. Summary of the probabilistic neural language model with flat and hierarchical output layers; 2. Vector semantics | [Video Lectures (all 2.*)] [Vector semantics reference]
Lecture 9 | Feb 7 | Neural Network Language Model | [Video Lectures: 10.5 (NLP-LM), 10.6 (NNLM), and 10.7 (Hierarchical output layer)] [Optional Video Lectures: 10.1 - 10.7]
- | Feb 10 | No Class | -
Hands-on Session | Feb 12 (2 - 5 pm) | A gentle introduction to Neural Networks and TensorFlow | [Reference: Hands-on session]
Lecture 10 | Feb 14 | Vector Semantics: short and dense representations (skip-gram with negative sampling) | [References: 1. Semantics with Dense Vectors; 2. word2vec Explained, Goldberg and Levy]
Lecture 11 | Feb 15 | GloVe (Global Vectors) | [Reference: Global Vectors for Word Representation]
Lecture 12 | Feb 17 | 1. Skip-gram with negative sampling as implicit matrix factorization; 2. Improving vector representations further: Retrofitting and Counter-fitting | [References: 1. Neural Word Embedding as Implicit Matrix Factorization; 2. Retrofitting Word Vectors to Semantic Lexicons; 3. Counter-fitting Word Vectors to Linguistic Constraints]
Lecture 13 | Feb 21 | Sequence Tagging Problem and HMMs | [Reference: Collins notes on "Tagging Problems and HMMs"]
Lecture 14 | Feb 22 | Sequence Tagging Problem and MEMMs | [References: 1. Collins notes on "MEMMs (Log-Linear Tagging Models)"; 2. Log-Linear Models]
Mid Sem | Feb 27 - Mar 5 | To be updated | -
Lecture 15 | Mar 7 | Sequence Tagging Problem and Linear-Chain Conditional Random Fields (CRFs) | [Reference: Log-Linear Models, MEMMs and CRFs]
Lecture 16 | Mar 8 | Neural network architectures for sequence labeling | [Reference: NLP (Almost) from Scratch]
Lecture 17 | Apr 4 | Syntactic Parsing | [References: 1. SLP (3rd ed.), Chapter 12; 2. Collins notes on PCFGs]
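
As a small, optional illustration of the add-k (Laplace) smoothing discussed in Lectures 4-6, the sketch below builds a tiny bigram language model in Python. The toy corpus, the helper name add_k_prob, and the choice of k are assumptions made for illustration only; they are not course material.

    from collections import Counter

    # Toy corpus (illustrative assumption); <s> and </s> mark sentence boundaries.
    corpus = [
        ["<s>", "the", "cat", "sat", "</s>"],
        ["<s>", "the", "dog", "sat", "</s>"],
        ["<s>", "the", "cat", "ran", "</s>"],
    ]

    # Count bigrams and how often each word appears as a bigram context.
    bigrams = Counter((s[i], s[i + 1]) for s in corpus for i in range(len(s) - 1))
    context_counts = Counter()
    for (prev, _), count in bigrams.items():
        context_counts[prev] += count
    vocab = {w for s in corpus for w in s}

    def add_k_prob(prev, word, k=0.5):
        """P(word | prev) = (c(prev, word) + k) / (c(prev) + k * |V|)."""
        return (bigrams[(prev, word)] + k) / (context_counts[prev] + k * len(vocab))

    print(add_k_prob("the", "cat"))  # seen bigram: relatively high probability
    print(add_k_prob("the", "sat"))  # unseen bigram still receives non-zero mass

With k = 1 this reduces to plain Laplace smoothing; smaller values of k shift less probability mass to unseen events, which is the trade-off behind the Add-k and Witten-Bell discussion in Lecture 5.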

Assignments

  1. Assignment 1 (Due date: 07/02/2017): Questions   Solutions
  2. Assignment 2 (Due date: 10/03/2017): Questions   Solutions

Text and Reference Book(s)

  1. FSNLP: Manning, Christopher D., and Hinrich Schütze. Foundations of Statistical Natural Language Processing. MIT Press, Cambridge, MA, 1999. Companion Website
  2. DHS: Duda, Richard O., Peter E. Hart, and David G. Stork. Pattern Classification. John Wiley & Sons, 2012. Companion Website
  3. SLP: Jurafsky, Dan, and James H. Martin. Speech and Language Processing. Pearson Education India, 2000. Companion Website
  4. NNLM: Haykin, Simon O. Neural Networks and Learning Machines. Pearson Education India, 2009. Companion Website

NLP Tools

  1. Five open source NLP tools: Link
  2. Tools for different NLP tasks: Link

Tutorials: NLP + Python

  1. Natural Language Toolkit (NLTK) Tutorial: Book, Set Up (a short collocation sketch follows this list)
  2. Python Numpy Tutorial: Stanford CS231n
  3. python-crfsuite Tutorial: Official Homepage
  4. Theano Tutorial: Speeding up your Neural Network with Theano and the GPU
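
The NLTK tutorial above pairs naturally with the collocation material from Lectures 2-3. The following is a minimal sketch (not part of the course material) of finding bigram collocations with NLTK's collocation utilities; the sample sentence and the use of PMI as the ranking measure are illustrative assumptions.

    import nltk
    from nltk.collocations import BigramAssocMeasures, BigramCollocationFinder

    # The tokenizer model is fetched on first use (assumes network access).
    nltk.download("punkt", quiet=True)

    text = ("Natural language processing studies how computers handle natural "
            "language. Collocations such as strong tea or New York occur "
            "together more often than chance alone would predict.")

    tokens = [w.lower() for w in nltk.word_tokenize(text)]

    measures = BigramAssocMeasures()
    finder = BigramCollocationFinder.from_words(tokens)

    # Rank candidate bigrams by pointwise mutual information (PMI); on a real
    # corpus you would also drop rare bigrams, e.g. finder.apply_freq_filter(5).
    print(finder.nbest(measures.pmi, 5))

On a text this small, PMI mostly surfaces pairs that happen to co-occur once, which is exactly the sparsity problem that frequency filters and hypothesis tests address in the finding-collocations lecture.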

Similar Courses

  1. Columbia University, Advanced NLP by Prof. Collins
  2. Stanford University, Deep Learning for Natural Language Processing
  3. Stanford University, Natural Language Understanding
  4. IIT Delhi, NLP by Dr. Mausam
  5. Stanford University, Convolutional Neural Networks for Visual Recognition
  6. Stanford University, Natural Language Processing with Deep Learning

NLP Conference Calendar

Click here to access the unofficially official conference calendar for the fields of Computational Linguistics and Natural Language Processing.