CDSC530 module: Natural Language Processing

This page and its links are still under construction and subject to change

Class Time and Location: Monday/Wednesday 10:25 - 11:40, Mel. 269

Course Description:

This module is an introduction to natural language processing, with emphasis on computational techniques for deriving the structure and meaning of natural language.
Topics include English phrase structure, dependency structure, parsing with traditional algorithms and with neural nets, representing meaning and knowledge, and a brief introduction to inference.

Instructor:

Len Schubert
Department of Computer Science
Room 3003, Wegmans Hall
University of Rochester
E-mail: lastname at cs dot rochester dot edu
Office Hours: Tuesday and Thursday 5:00 - 6:30 (or by appointment)

Prerequisites: No formal prerequisites, but programming experience and mathematical maturity are presupposed.

Syllabus, Quizzes, Assignments and Grading

Some Broad-Coverage Sources for NLU:

  • James Allen, Natural Language Understanding, Benjamin/Cummings, 2nd edition, 1995. (Some excerpts will be supplied.)

  • Lenhart Schubert, Computational Linguistics, Stanford Encyclopedia of Philosophy. (The discussion of connectionism and neural nets needs updating.)

  • D. Jurafsky and J.H. Martin, Speech and Language Processing, Prentice-Hall, 3rd edition draft, 2018.

  • C.D. Manning and H. Schuetze, Foundations of Statistical Natural Language Processing, MIT Press, 1999.

  • Yoav Goldberg, Neural Network Methods for Natural Language Processing, Morgan & Claypool, 2017.
       (Very comprehensive, up to 2017, but excludes semantic interpretation of language.)

Academic honesty policy:

In your problem-solving and programming, you are expected to use your own ideas, your own words, and your own code, unless otherwise indicated (e.g., where you are advised to use some existing code). Where you do use someone else's solutions or code (other than what the instructor prescribes), cite your source explicitly.

Assignments:

Assignment 1
Assignment 2

Supplementary Course Materials:

Some course materials may be made available through this web page.

Meaning, significance, and methods of NLP

Top-down parse example
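To give a concrete feel for the linked example, here is a minimal top-down (recursive-descent) recognizer in Python; the toy grammar and sentences are invented for illustration and are not taken from the course materials.

    # Minimal top-down recognizer with backtracking (toy grammar).
    # Note: left-recursive rules would make this naive strategy loop forever.
    GRAMMAR = {
        "S":   [["NP", "VP"]],
        "NP":  [["Det", "N"]],
        "VP":  [["V", "NP"], ["V"]],
        "Det": [["the"]],
        "N":   [["dog"], ["cat"]],
        "V":   [["chased"], ["slept"]],
    }

    def parse(symbols, words):
        """True iff the symbol sequence can derive exactly `words`."""
        if not symbols:
            return not words                  # succeed only if input is used up
        first, rest = symbols[0], symbols[1:]
        if first in GRAMMAR:                  # nonterminal: try each expansion
            return any(parse(exp + rest, words) for exp in GRAMMAR[first])
        # terminal: must match the next word of the input
        return bool(words) and words[0] == first and parse(rest, words[1:])

    print(parse(["S"], "the dog chased the cat".split()))   # True
    print(parse(["S"], "the cat slept".split()))            # True
    print(parse(["S"], "dog the chased".split()))           # False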

Bottom-up parse example
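For contrast, the same toy grammar can be recognized bottom-up with shift and reduce moves plus backtracking; again an illustrative sketch, not the algorithm from the linked example.

    # Minimal bottom-up (shift-reduce) recognizer via depth-first search.
    RULES = [            # (lhs, rhs) pairs; same toy grammar as above
        ("S",  ("NP", "VP")),
        ("NP", ("Det", "N")),
        ("VP", ("V", "NP")),
        ("VP", ("V",)),
        ("Det", ("the",)),
        ("N", ("dog",)), ("N", ("cat",)),
        ("V", ("chased",)), ("V", ("slept",)),
    ]

    def parse(stack, words):
        """True iff some shift/reduce sequence turns the input into a lone S."""
        if stack == ("S",) and not words:
            return True
        for lhs, rhs in RULES:       # reduce: rewrite a matching stack top
            n = len(rhs)
            if stack[-n:] == rhs and parse(stack[:-n] + (lhs,), words):
                return True
        # shift: move the next input word onto the stack
        return bool(words) and parse(stack + (words[0],), words[1:])

    print(parse((), tuple("the dog chased the cat".split())))   # True
    print(parse((), tuple("the cat slept".split())))            # True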

Neural Networks
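As a notational warm-up for the linked material, the forward pass of a one-hidden-layer network fits in a few lines of numpy; the layer sizes and random weights below are arbitrary.

    import numpy as np

    rng = np.random.default_rng(0)
    x  = rng.normal(size=3)             # input vector (3 features)
    W1 = rng.normal(size=(4, 3))        # input -> hidden weights
    b1 = np.zeros(4)
    W2 = rng.normal(size=(2, 4))        # hidden -> output weights
    b2 = np.zeros(2)

    h = np.tanh(W1 @ x + b1)            # hidden layer: affine map + nonlinearity
    z = W2 @ h + b2                     # output scores (logits)
    p = np.exp(z) / np.exp(z).sum()     # softmax turns scores into probabilities
    print(p, p.sum())                   # two probabilities summing to 1.0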

Long Short-Term Memory (LSTM) NNs for Natural Language Processing
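For quick reference alongside the slides (notation may differ slightly from theirs), the standard LSTM cell computes, for input x_t, hidden state h_t, and cell state c_t:

    \begin{align*}
    i_t &= \sigma(W_i x_t + U_i h_{t-1} + b_i)        &&\text{(input gate)}\\
    f_t &= \sigma(W_f x_t + U_f h_{t-1} + b_f)        &&\text{(forget gate)}\\
    o_t &= \sigma(W_o x_t + U_o h_{t-1} + b_o)        &&\text{(output gate)}\\
    \tilde{c}_t &= \tanh(W_c x_t + U_c h_{t-1} + b_c) &&\text{(candidate cell)}\\
    c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t  &&\text{(new cell state)}\\
    h_t &= o_t \odot \tanh(c_t)                       &&\text{(new hidden state)}
    \end{align*}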

Peephole LSTM (from Wikipedia)
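In the peephole variant, as formulated in the Wikipedia article linked above, the three gates read the previous cell state c_{t-1} in place of the hidden state h_{t-1}:

    \begin{align*}
    i_t &= \sigma(W_i x_t + U_i c_{t-1} + b_i)\\
    f_t &= \sigma(W_f x_t + U_f c_{t-1} + b_f)\\
    o_t &= \sigma(W_o x_t + U_o c_{t-1} + b_o)\\
    c_t &= f_t \odot c_{t-1} + i_t \odot \tanh(W_c x_t + b_c)\\
    h_t &= o_t \odot \sigma_h(c_t) \qquad (\sigma_h \text{ is often the identity here})
    \end{align*}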

NN-based inorder parsing example

Transition-based parser architecture

Shift-project-reduce actions for inorder parsing

Inorder parsing in symbolic form

Stack-LSTM figure
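To make the shift-project-reduce actions above concrete, the sketch below replays a hand-written oracle action sequence on a toy stack machine; the sentence, tree, and data structures are illustrative assumptions, not the neural architecture from the slides (which would predict the actions with a network such as a stack-LSTM).

    # Toy replay of shift / project / reduce actions for inorder parsing.
    class Open:
        """Marker for a projected, still-open nonterminal."""
        def __init__(self, label):
            self.label = label
        def __repr__(self):
            return "<" + self.label       # prints like an open bracket

    def run(actions):
        stack = []
        for act in actions:
            if act[0] == "shift":         # push the next input word
                stack.append(act[1])
            elif act[0] == "project":     # project a label above its first child
                stack.append(Open(act[1]))
            else:                         # "reduce": close newest open constituent
                later = []                # children that followed the label
                while not isinstance(stack[-1], Open):
                    later.append(stack.pop())
                label = stack.pop().label
                first = stack.pop()       # the child the label was projected over
                stack.append((label, first, *reversed(later)))
            print(act, "->", stack)
        return stack[0]

    # Hand-written oracle sequence for "the dog barks" with intended tree
    # (S (NP the dog) (VP barks)):
    tree = run([
        ("shift", "the"), ("project", "NP"), ("shift", "dog"), ("reduce",),
        ("project", "S"), ("shift", "barks"), ("project", "VP"), ("reduce",),
        ("reduce",),
    ])
    print(tree)   # ('S', ('NP', 'the', 'dog'), ('VP', 'barks'))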

Neural-Network-Based Relation Inference

NN-Based Inference, Slide 1

NN-Based Inference, Slide 2
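The two slides above are PDFs; as a generic illustration only (an assumption, not necessarily the model in the slides), relation inference is often set up as scoring a (head, relation, tail) triple with learned embeddings, for instance with a bilinear score:

    import numpy as np

    # Bilinear triple scoring -- a generic sketch with made-up entities.
    rng = np.random.default_rng(0)
    d = 8
    entity   = {e: rng.normal(size=d) for e in ("dog", "cat", "animal")}
    relation = {r: rng.normal(size=(d, d)) for r in ("is_a",)}

    def score(h, r, t):
        """Bilinear plausibility score for the triple (h, r, t)."""
        return entity[h] @ relation[r] @ entity[t]

    # Training (not shown) would adjust the embeddings and relation matrices
    # so that true triples like ("dog", "is_a", "animal") outscore corrupted
    # ones like ("animal", "is_a", "dog").
    print(score("dog", "is_a", "animal"))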

James Allen's basic parser is located here
(or look for it on James Allen's web page).

Slides by James Allen

Overview of NLU, Syntax, and Chart Parsing

Allen's NLU, Chap. 2 copy

Allen's NLU, Chap. 3 copy
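The chart-parsing idea in the overview slides above can be illustrated with a minimal CKY-style recognizer over a toy grammar in Chomsky normal form; this is only a sketch of the bottom-up chart idea, not Allen's own chart parser (which works with active arcs).

    from itertools import product

    # CKY-style chart recognizer; same toy grammar as the earlier sketches.
    UNARY  = {"the": {"Det"}, "dog": {"N"}, "cat": {"N"}, "chased": {"V"}}
    BINARY = {("Det", "N"): {"NP"}, ("V", "NP"): {"VP"}, ("NP", "VP"): {"S"}}

    def cky(words):
        n = len(words)
        # chart[i][j] = set of labels that can span words[i:j]
        chart = [[set() for _ in range(n + 1)] for _ in range(n + 1)]
        for i, w in enumerate(words):
            chart[i][i + 1] |= UNARY.get(w, set())
        for span in range(2, n + 1):          # build longer spans from shorter
            for i in range(n - span + 1):
                j = i + span
                for k in range(i + 1, j):     # all split points
                    for l, r in product(chart[i][k], chart[k][j]):
                        chart[i][j] |= BINARY.get((l, r), set())
        return "S" in chart[0][n]

    print(cky("the dog chased the cat".split()))  # True
    print(cky("dog the chased".split()))          # False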

Additional lecture slides:

01: Basic linguistic concepts

02: English phrase structure

03: Syntactic analysis