Probabilistic Graphical Models

Daphne Koller, Professor, Stanford University

In this class, you will learn the basics of the PGM representation and how to construct these models, using both human knowledge and machine learning techniques.

What are Probabilistic Graphical Models?

Uncertainty is unavoidable in real-world applications: we can almost never predict with certainty what will happen in the future, and even in the present and the past, many important aspects of the world are not observed with certainty. Probability theory gives us the basic foundation to model our beliefs about the different possible states of the world, and to update these beliefs as new evidence is obtained. These beliefs can be combined with individual preferences to help guide our actions, and even to select which observations to make. While probability theory has existed since the 17th century, our ability to use it effectively on large problems involving many inter-related variables is fairly recent, and is due largely to the development of a framework known as Probabilistic Graphical Models (PGMs). This framework, which spans methods such as Bayesian networks and Markov random fields, uses ideas from discrete data structures in computer science to efficiently encode and manipulate probability distributions over high-dimensional spaces, often involving hundreds or even many thousands of variables. These methods have been used in an enormous range of application domains, which include: web search, medical and fault diagnosis, image understanding, reconstruction of biological networks, speech recognition, natural language processing, decoding of messages sent over a noisy communication channel, robot navigation, and many more. The PGM framework provides an essential tool for anyone who wants to learn how to reason coherently from limited and noisy observations.
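
To make the idea of compact encoding concrete, here is a minimal sketch in Python of how a Bayesian network stores a joint distribution as a product of small conditional probability tables rather than as one exponentially large table. The three-variable network and all numbers below are invented for illustration and are not taken from the course.

    # Toy Bayesian network over three binary variables:
    # Difficulty (D), Intelligence (I), Grade (G), factored as
    # P(D, I, G) = P(D) * P(I) * P(G | D, I).
    # All probabilities are made-up illustrative numbers.

    P_D = {0: 0.6, 1: 0.4}                       # P(D)
    P_I = {0: 0.7, 1: 0.3}                       # P(I)
    P_G_given_DI = {                             # P(G | D, I), indexed by (d, i)
        (0, 0): {0: 0.3, 1: 0.7},
        (0, 1): {0: 0.1, 1: 0.9},
        (1, 0): {0: 0.7, 1: 0.3},
        (1, 1): {0: 0.4, 1: 0.6},
    }

    def joint(d, i, g):
        """P(D=d, I=i, G=g) computed from the factorization."""
        return P_D[d] * P_I[i] * P_G_given_DI[(d, i)][g]

    # Free parameters: 1 (for D) + 1 (for I) + 4 (for G given D, I) = 6,
    # versus 2**3 - 1 = 7 for the full joint table; with many variables the
    # factored form grows roughly linearly while the full table is exponential.
    p_g1 = sum(joint(d, i, 1) for d in (0, 1) for i in (0, 1))
    print(p_g1)  # marginal P(G = 1) obtained by summing out D and I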

In this class, you will learn the basics of the PGM representation and how to construct these models, using both human knowledge and machine learning techniques; you will also learn algorithms for using a PGM to reach conclusions about the world from limited and noisy evidence, and for making good decisions under uncertainty. The class covers both the theoretical underpinnings of the PGM framework and practical skills needed to apply these techniques to new problems.

Syllabus

Topics covered include:

  1. The Bayesian network and Markov network representation, including extensions for reasoning over domains that change over time and over domains with a variable number of entities
  2. Reasoning and inference methods, including exact inference (variable elimination, clique trees) and approximate inference (belief propagation message passing, Markov chain Monte Carlo methods); a toy variable-elimination sketch follows this list
  3. Learning parameters and structure in PGMs
  4. Using a PGM for decision making under uncertainty.
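
As a companion to item 2 above, here is a minimal, self-contained sketch of variable elimination in Python on a toy chain A -> B -> C. The factor representation and all probabilities are invented for illustration and do not come from the course assignments.

    # Variable elimination on a toy chain A -> B -> C with binary variables.
    # A factor is a pair (variables, table), where table maps assignments
    # (tuples of 0/1, in the same order as variables) to real values.
    from itertools import product

    def factor_product(f, g):
        """Multiply two factors into a factor over the union of their variables."""
        f_vars, f_tab = f
        g_vars, g_tab = g
        vars_ = f_vars + tuple(v for v in g_vars if v not in f_vars)
        tab = {}
        for assignment in product((0, 1), repeat=len(vars_)):
            val = dict(zip(vars_, assignment))
            tab[assignment] = (f_tab[tuple(val[v] for v in f_vars)]
                               * g_tab[tuple(val[v] for v in g_vars)])
        return vars_, tab

    def sum_out(f, var):
        """Eliminate one variable from a factor by summing it out."""
        f_vars, f_tab = f
        keep = tuple(v for v in f_vars if v != var)
        tab = {}
        for assignment, value in f_tab.items():
            key = tuple(a for v, a in zip(f_vars, assignment) if v != var)
            tab[key] = tab.get(key, 0.0) + value
        return keep, tab

    # CPTs of the chain, written as factors (all numbers are illustrative).
    P_A = (("A",), {(0,): 0.6, (1,): 0.4})
    P_B_given_A = (("A", "B"), {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.8})
    P_C_given_B = (("B", "C"), {(0, 0): 0.7, (0, 1): 0.3, (1, 0): 0.1, (1, 1): 0.9})

    # Eliminate A, then B; the remaining factor over C is the marginal P(C).
    tau_B = sum_out(factor_product(P_A, P_B_given_A), "A")
    P_C = sum_out(factor_product(tau_B, P_C_given_B), "B")
    print(P_C)  # (('C',), {(0,): 0.472, (1,): 0.528})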

There will be short weekly review quizzes and programming assignments (Octave/Matlab) focusing on case studies and applications of PGMs to real-world problems:

  1. Credit Scoring and Factors
  2. Modeling Genetic Inheritance and Disease
  3. Markov Networks and Optical Character Recognition (OCR)
  4. Inference: Belief Propagation
  5. Markov Chain Monte Carlo and Image Segmentation
  6. Decision Theory: Arrhythmogenic Right Ventricular Dysplasia
  7. Conditional Random Field Learning for OCR
  8. Structure Learning for Identifying Skeleton Structure
  9. Human Action Recognition with Kinect

To prepare for the class in advance, you may consider reading through the following sections of the textbook by Daphne Koller and Nir Friedman (MIT Press; discount code DKPGM12):

  1. Introduction and Overview. Chapters 1, 2.1.1 - 2.1.4, 4.2.1.
  2. Bayesian Network Fundamentals. Chapters 3.1 - 3.3.
  3. Markov Network Fundamentals. Chapters 4.1, 4.2.2, 4.3.1, 4.4, 4.6.1.
  4. Structured CPDs. Chapters 5.1 - 5.5.
  5. Template Models. Chapters 6.1 - 6.4.1.

These will be covered in the first two weeks of the online class.

The slides for the whole class can be found here.

FAQ

  • Will I get a statement of accomplishment after completing this class?

    Yes. Students who successfully complete the class will receive a statement of accomplishment signed by the instructor.

  • What are the pre-requisites for the class?

    You should be able to program in at least one programming language and have a computer (Windows, Mac or Linux) with internet access (programming assignments will be conducted in Matlab or Octave). It also helps to have some previous exposure to basic concepts in discrete probability theory (independence, conditional independence, and Bayes' rule).
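
    As a rough self-check of that probability background, here is a short Bayes' rule calculation in Python; the disease/test numbers are invented for illustration and are not part of the course materials.

        # Bayes' rule: P(disease | positive) =
        #   P(positive | disease) * P(disease) / P(positive).
        # All numbers below are made up for illustration.
        p_disease = 0.01               # prior P(disease)
        p_pos_given_disease = 0.95     # test sensitivity
        p_pos_given_healthy = 0.05     # false positive rate

        p_pos = (p_pos_given_disease * p_disease
                 + p_pos_given_healthy * (1 - p_disease))
        p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
        print(round(p_disease_given_pos, 3))  # about 0.161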

  • What textbook should I buy?

    Although the lectures are designed to be self-contained, students wanting to expand their knowledge beyond what we can cover in a one-quarter class can find a much more extensive coverage of this topic in the book "Probabilistic Graphical Models", by Koller and Friedman, published by MIT Press. MIT Press has generously provided a discount code (DKPGM12) for students enrolled in this course.

  • How difficult is the class?

    This class does require some abstract thinking and mathematical skills. However, it is designed to require fairly little background, and a motivated student can pick up the background material as the concepts are introduced. We hope that, using our new learning platform, it should be possible for everyone to understand all of the core material.

Sessions:
  • April 8, 2013, 11 weeks
  • September 24, 2012, 11 weeks
  • March 19, 2012, 10 weeks
Details:
  • Language: English

Already in lists:
  • Machine Learning
    Machine learning: from the basics to advanced topics. Includes statistics...
  • NVIDIA

More on this topic:
  • Machine Learning
    6.867 is an introductory course on machine learning which gives an overview...
  • Natural Language Processing
    Have you ever wondered how to build a system that automatically translates between...
  • Bioinformatics and Proteomics
    This interdisciplinary course provides a hands-on approach to students in the...
  • Data Analysis and Statistical Inference
    The Coursera course, Data Analysis and Statistical Inference has been revised...
  • Monte Carlo Methods for Inference and Data Analysis
    Monte Carlo methods are a diverse class of algorithms that rely on...

More under "Computer Science":
  • Artificial Intelligence (AI)
    Learn the fundamentals of Artificial Intelligence (AI), and apply them. Design...
  • Machine Learning
    Master the essentials of machine learning and algorithms to help improve learning...
  • Animation and CGI Motion
    Learn the science behind movie animation from the Director of Columbia’s Computer...
  • Robotics
    Learn the core techniques for representing robots that perform physical tasks...
  • Neo4j Koans
    A koan-style tutorial in Java for Neo4j. This set of Koans provides a hands...

More from Coursera:
  • First Year Teaching (Secondary Grades) - Success from the Start
    Success with your students starts on Day 1. Learn from NTC's 25 years developing...
  • Understanding 9/11: Why Did al Qai’da Attack America?
    This course will explore the forces that led to the 9/11 attacks and the policies...
  • Aboriginal Worldviews and Education
    This course will explore indigenous ways of knowing and how this knowledge can...
  • Analytic Combinatorics
    Analytic Combinatorics teaches a calculus that enables precise quantitative...
  • Accountable Talk®: Conversation that Works
    Designed for teachers and learners in every setting - in school and out, in...
