
CS 698O: Special Topics in Natural Language Processing

Units: 3-0-0-0-9
Pre-requisites:

Instructor's consent. Must: Introduction to Machine Learning (CS771); proficiency in linear algebra, probability, and statistics; proficiency in Python programming

Desirable:

Probabilistic Machine Learning (CS772), Topics in Probabilistic Modeling and Inference (CS775), Deep Learning for Computer Vision (CS776)

Departments which may be interested:

CSE, EE, MTH, IME, ECO

Level of the course:

Senior UG and PG (6xx level)

Course Description:

Natural language (NL) refers to the language spoken or written by humans and is their primary mode of communication. With the growth of the World Wide Web, data in the form of natural language text has grown exponentially. This calls for the development of algorithms and techniques for processing natural language, both for automation and for building intelligent machines. This course will primarily focus on understanding and developing techniques, learning algorithms, and models for processing text. We will take a statistical approach to Natural Language Processing (NLP), learning how one can build natural language understanding models from regularities in large corpora of natural language text.
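To make the statistical approach concrete, here is a minimal, illustrative Python sketch (not course material): it estimates a bigram language model, with add-one (Laplace) smoothing, from raw corpus counts. The toy corpus is made up for illustration; n-gram models and smoothing are covered under Topic 2 below.

    from collections import Counter

    # Toy corpus: in practice the counts would come from a large text corpus.
    corpus = [
        ["the", "cat", "sat", "on", "the", "mat"],
        ["the", "dog", "sat", "on", "the", "rug"],
    ]

    unigram_counts = Counter()
    bigram_counts = Counter()
    for sentence in corpus:
        tokens = ["<s>"] + sentence + ["</s>"]  # sentence boundary markers
        unigram_counts.update(tokens)
        bigram_counts.update(zip(tokens, tokens[1:]))

    vocab_size = len(unigram_counts)

    def bigram_prob(prev, word):
        # P(word | prev) estimated with add-one (Laplace) smoothing,
        # so unseen bigrams still receive non-zero probability.
        return (bigram_counts[(prev, word)] + 1) / (unigram_counts[prev] + vocab_size)

    print(bigram_prob("the", "cat"))  # seen bigram: relatively high
    print(bigram_prob("cat", "rug"))  # unseen bigram: small but non-zero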

Tentative Topics:
  1. Introduction to Natural Language (NL): why NL is hard to process, linguistics fundamentals, etc.
  2. Language Models: n-grams, smoothing, class-based models, Brown clustering
  3. Sequence Labeling: HMMs, MaxEnt, CRFs, applications of these models, e.g., part-of-speech tagging
  4. Parsing: CFG, Lexicalized CFG, PCFGs, Dependency parsing
  5. Applications: Named Entity Recognition, Coreference Resolution, text classification, toolkits, e.g., spaCy
  6. Distributional Semantics: distributional hypothesis, vector space models, etc. (see the sketch after this list)
  7. Distributed Representations: Neural Networks (NN), Backpropagation, Softmax, Hierarchical Softmax
  8. Word Vectors: Feedforward NN, Word2Vec, GloVe, contextualization (ELMo, etc.), subword information (FastText, etc.)
  9. Deep Models: RNNs, LSTMs, Attention, CNNs, applications in language, etc.
  10. Sequence-to-sequence models: machine translation and other applications
  11. Transformers: BERT, transfer learning and applications
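
The sketch referenced in Topic 6 above: an illustrative Python example, on a made-up toy corpus, of the distributional hypothesis, where words occurring in similar contexts receive similar co-occurrence vectors, compared here with cosine similarity in a simple vector space model.

    import math
    from collections import Counter, defaultdict

    # Toy corpus, for illustration only.
    corpus = [
        ["the", "cat", "drinks", "milk"],
        ["the", "dog", "drinks", "water"],
        ["the", "cat", "chases", "the", "dog"],
    ]

    # Co-occurrence counts within a +/-1 token window.
    cooccurrence = defaultdict(Counter)
    for sentence in corpus:
        for i, word in enumerate(sentence):
            for j in (i - 1, i + 1):
                if 0 <= j < len(sentence):
                    cooccurrence[word][sentence[j]] += 1

    def cosine(u, v):
        # Cosine similarity between two sparse count vectors.
        dot = sum(u[w] * v[w] for w in u)
        norm_u = math.sqrt(sum(x * x for x in u.values()))
        norm_v = math.sqrt(sum(x * x for x in v.values()))
        return dot / (norm_u * norm_v)

    # "cat" and "dog" share contexts ("the ...", "... drinks"), so their
    # vectors are more similar than those of "cat" and "milk".
    print(cosine(cooccurrence["cat"], cooccurrence["dog"]))
    print(cosine(cooccurrence["cat"], cooccurrence["milk"]))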
References:

The course is based on the following textbooks:

  1. Introduction to Natural Language Processing, Jacob Eisenstein
  2. Speech and Language Processing, Daniel Jurafsky, James H. Martin
  3. Foundations of Statistical Natural Language Processing, Christopher D. Manning, Hinrich Schütze
  4. Natural Language Understanding, James Allen

Beyond the textbooks mentioned above, this course draws on a variety of sources, such as books, research papers, and other courses. Relevant references will be suggested in the lectures.