Probabilistic graphical models (PGMs) are a rich framework for encoding probability distributions over complex domains: joint (multivariate) distributions over large numbers of random variables that interact with each other. These representations sit at the intersection of statistics and computer science, relying on concepts from probability theory, graph algorithms, machine learning, and more. They are the basis for the state-of-the-art methods in a wide variety of applications, such as medical diagnosis, image understanding, speech recognition, natural language processing, and many, many more. They are also a foundational tool in formulating many machine learning problems.
Stanford University (officially Leland Stanford Junior University) is a private American research university in Stanford, California, on an 8,180-acre (3,310 ha) campus near Palo Alto.
Top reviews from PROBABILISTIC GRAPHICAL MODELS 3: LEARNING
Very good course for learning PGMs and the concepts behind machine learning programming. Some descriptions in the final-exam quiz are a bit unclear, though, which leads to some confusion.
Great course! Very informative course videos and challenging yet rewarding programming assignments. I hope the mentors can respond to questions more promptly.
Great course, especially the programming assignments. The textbook is pretty much necessary for some quizzes, and definitely for the final one.
Great course! It is pretty difficult - be prepared to study. Leave plenty of time before the final exam.
About the Probabilistic Graphical Models Specialization
Access to lectures and assignments depends on your type of enrollment. If you take a course in audit mode, you will be able to see most course materials for free. To access graded assignments and to earn a Certificate, you will need to purchase the Certificate experience, during or after your audit. If you don't see the audit option:
- The course may not offer an audit option. You can try a Free Trial instead, or apply for Financial Aid.
- The course may offer 'Full Course, No Certificate' instead. This option lets you see all course materials, submit required assessments, and get a final grade. This also means that you will not be able to purchase a Certificate experience.
Learning Outcomes: By the end of this course, you will be able to:
- Compute the sufficient statistics of a data set that are necessary for learning a PGM from data
- Implement both maximum likelihood and Bayesian parameter estimation for Bayesian networks
- Implement maximum likelihood and MAP parameter estimation for Markov networks
- Formulate a structure learning problem as a combinatorial optimization task over a space of network structures, and evaluate which scoring function is appropriate for a given situation
- Utilize PGM inference algorithms in ways that support more effective parameter estimation for PGMs
- Implement the Expectation Maximization (EM) algorithm for Bayesian networks
- Honors-track learners will get hands-on experience implementing both EM and structure learning for tree-structured networks, and will apply them to real-world tasks
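To make the first two outcomes concrete, here is a minimal sketch (not course code; all names are illustrative) of estimating a conditional probability table for a toy two-node network Parent -> Child. The sufficient statistics are just the joint counts; the maximum-likelihood estimate divides by the parent's marginal count, while the Bayesian estimate smooths every cell with a Dirichlet pseudo-count:

```python
from collections import Counter

def cpd_estimates(data, alpha=1.0):
    """Estimate P(child | parent) for a two-node network Parent -> Child.

    Sufficient statistics: joint counts N[parent, child] and marginal
    counts N[parent]. The MLE is the raw frequency; the Bayesian
    estimate adds a Dirichlet pseudo-count `alpha` to each cell.
    """
    joint = Counter(data)                        # N[parent, child]
    parent_counts = Counter(p for p, _ in data)  # N[parent]
    child_vals = sorted({c for _, c in data})

    mle = {pc: n / parent_counts[pc[0]] for pc, n in joint.items()}
    bayes = {
        (p, c): (joint[(p, c)] + alpha)
                / (parent_counts[p] + alpha * len(child_vals))
        for p in parent_counts for c in child_vals
    }
    return mle, bayes

# Toy data: (parent, child) observations
data = [("a", 0), ("a", 0), ("a", 1), ("b", 1)]
mle, bayes = cpd_estimates(data)
print(mle[("a", 0)])    # = 2/3, the pure frequency estimate
print(bayes[("b", 0)])  # = (0 + 1) / (1 + 2) = 1/3, smoothed toward uniform
```

Note how the Bayesian estimate assigns nonzero probability to the unseen pair ("b", 0), whereas the MLE would leave it at zero.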
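The EM outcome can likewise be sketched with the classic two-coin mixture, a one-hidden-variable Bayesian network: each trial secretly picks one of two biased coins and flips it n times, and we observe only the head counts. This is an illustrative example under a fixed equal prior over the coins, not the course's implementation:

```python
def em_two_coins(heads, n, iters=100):
    """EM for a two-coin mixture observed only through head counts.

    E-step: compute the responsibility (posterior probability) that
    coin A produced each trial; the binomial coefficient cancels.
    M-step: re-estimate each coin's bias from expected head/tail counts.
    A fixed equal prior over the two coins is assumed for simplicity.
    """
    pA, pB = 0.6, 0.4  # asymmetric initial guesses to break symmetry
    for _ in range(iters):
        hA = tA = hB = tB = 0.0
        for h in heads:
            lA = pA ** h * (1 - pA) ** (n - h)
            lB = pB ** h * (1 - pB) ** (n - h)
            w = lA / (lA + lB)       # responsibility of coin A
            hA += w * h
            tA += w * (n - h)
            hB += (1 - w) * h
            tB += (1 - w) * (n - h)
        pA = hA / (hA + tA)          # expected head fraction, coin A
        pB = hB / (hB + tB)          # expected head fraction, coin B
    return pA, pB

# Head counts from n = 10 flips per trial; two clusters are visible by eye
heads = [9, 8, 7, 2, 3, 9, 1, 8, 2, 3]
pA, pB = em_two_coins(heads, 10)
print(pA, pB)  # pA converges near the high-bias coin, pB near the low-bias one
```

Each iteration provably does not decrease the data likelihood, which is the key property the course develops for EM in general Bayesian networks.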