Stanford University. — Coursera, 2012. — eLearning (Video + PDF
slides) 960x540 / H.264 ~54 kbps / AAC ~70.5 kbps
A video course on processing texts written in natural languages.
The course is organized by week of study, and each week covers a
different number of topics. The slides for each week are collected
in PDF files and PPTX presentations, which can be used to review
and reinforce the material.
The archive is split by week into eight parts, each of which can be
used on its own. This archive contains the materials for Week 1.
This course covers a broad range of topics in natural language
processing, including word and sentence tokenization, text
classification and sentiment analysis, spelling correction,
information extraction, parsing, meaning extraction, and question
answering. We will also introduce the underlying theory from
probability, statistics, and machine learning that is crucial for
the field, and cover fundamental algorithms like n-gram language
modeling, Naive Bayes and maximum entropy classifiers, sequence
models like Hidden Markov Models, probabilistic dependency and
constituent parsing, and vector-space models of meaning.
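By way of illustration only (not part of the course materials), here is a minimal Python sketch of the classic dynamic-programming minimum edit distance, one of the fundamental algorithms covered in Week 1. It assumes unit costs for insertions, deletions, and substitutions; the example strings are arbitrary.

    # Minimal sketch of minimum edit distance (Levenshtein distance),
    # assuming unit costs for insertion, deletion, and substitution.
    def edit_distance(source: str, target: str) -> int:
        """Minimum number of single-character edits turning source into target."""
        m, n = len(source), len(target)
        # dp[i][j] = edit distance between source[:i] and target[:j]
        dp = [[0] * (n + 1) for _ in range(m + 1)]
        for i in range(m + 1):
            dp[i][0] = i          # delete all i characters of source
        for j in range(n + 1):
            dp[0][j] = j          # insert all j characters of target
        for i in range(1, m + 1):
            for j in range(1, n + 1):
                cost = 0 if source[i - 1] == target[j - 1] else 1
                dp[i][j] = min(
                    dp[i - 1][j] + 1,         # deletion
                    dp[i][j - 1] + 1,         # insertion
                    dp[i - 1][j - 1] + cost,  # substitution (or match)
                )
        return dp[m][n]

    if __name__ == "__main__":
        print(edit_distance("intention", "execution"))  # 5 with unit costs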
We are offering this course on Natural Language Processing free and
online to students worldwide, continuing Stanford's exciting forays
into large scale online instruction. Students have access to
screencast lecture videos, are given quiz questions, assignments
and exams, receive regular feedback on progress, and can
participate in a discussion forum. Those who successfully complete
the course will receive a statement of accomplishment. Taught by
Professors Jurafsky and Manning, the curriculum draws from
Stanford's courses in Natural Language Processing. You will need a
decent internet connection for accessing course materials, but
should be able to watch the videos on your smartphone.
Course contents:
Week 1 - Course Introduction
Week 1 - Basic Text Processing
Week 1 - Edit Distance
Week 2 - Language Modeling
Week 2 - Spelling Correction
Week 3 - Text Classification
Week 3 - Sentiment Analysis
Week 4 - Discriminative classifiers: Maximum Entropy classifiers
Week 4 - Named entity recognition and Maximum Entropy Sequence Models
Week 4 - Relation Extraction
Week 5 - Advanced Maximum Entropy Models
Week 5 - POS Tagging
Week 5 - Parsing Introduction
Week 5 - Instructor Chat
Week 6 - Probabilistic Parsing
Week 6 - Lexicalized Parsing
Week 6 - Dependency Parsing (Optional)
Week 7 - Information Retrieval
Week 7 - Ranked Information Retrieval
Week 8 - Semantics
Week 8 - Question Answering
Week 8 - Summarization
Week 8 - Instructor Chat II