<h1>Probabilistic Machine Learning: Advanced Topics</h1>
by <a href="https://www.cs.ubc.ca/~murphyk/">Kevin Patrick Murphy</a>.
<br>
MIT Press, 2023.
<p>
<img src="https://raw.githubusercontent.com/probml/pml2-book/main/cover2.jpg"
alt="Book cover"
style="height:200;">
<p>
<h2>Key links</h2>
<ul>
<li> <a href="#toc">Short table of contents</a>
<li> <a href="https://github.com/probml/pml2-book/blob/main/toc2-long-2022-07-29.pdf">
Long table of contents</a>
<li> <a href="https://github.com/probml/pml2-book/blob/main/preface2-2022-07-29.pdf">Preface</a>
<li> <a href="https://github.com/probml/pml2-book/releases/latest">
Draft pdf of the main book</a>, 2022-10-16. CC-BY-NC-ND license. (Please cite the official reference below.)
<li> <a href="https://github.com/probml/pml-book/blob/main/supp2.md">Supplementary material</a>
<li> <a href="https://github.com/probml/pml2-book/issues">Issue tracker</a>.
<li> <a href="https://github.com/probml/pyprobml/tree/master/notebooks/book2">
Code to reproduce most of the figures</a>
<li> <a href="#ack">Acknowledgements</a>
<li> <a href="#endorsements">Endorsements</a>
</ul>
If you use this book, please be sure to cite
<pre><code>
@book{pml2Book,
  author = "Kevin P. Murphy",
  title = "Probabilistic Machine Learning: Advanced Topics",
  publisher = "MIT Press",
  year = 2023,
  url = "http://probml.github.io/book2"
}
</code></pre>
<p> Downloads since 2022-02-28.
<img src="https://img.shields.io/github/downloads/probml/pml2-book/total"
alt="download stats shield">
<h2><a id="toc">Table of contents</h2>
<img src="https://raw.githubusercontent.com/probml/pml2-book/main/toc2-short-2022-07-29.png"
alt="TOC 2022-06-28"
style="height:700;">
<p>
<h2><a id="endorsements"></a>Endorsements</h2>
<ul>
<p>
<li> "Kevin Murphy had already impressed and greatly benefited the machine learning community with his introductory
book on probabilistic ML
and I am delighted to see the depth and breadth of material in his new sequel on advanced probabilistic ML.
The book covers topics which I believe are at the heart of past and upcoming advances in our field,
while often lacking in the training of graduate students in computer science,
and I therefore recommend it highly to all of them."
-- <a href="https://yoshuabengio.org/">Yoshua Bengio</a>, U. Montreal
<p>
<li> "This book is an amazing tour de force: Murphy and his co-authors have described and systematized virtually
all of the important advances in machine learning over the past 30 years. Pick any topic, and they provide a
crisp description of the state-of-the-art methods in a common, well-chosen notation and using a set of core concepts
in modeling, statistics, and optimization. This book will be a valuable starting point for students entering
the field and a wonderful reference for seasoned researchers. It is hard to imagine a better antidote to the vast,
confusing, and voluminous literature in machine learning. This is such an amazing book! I learned things even in the sections
where I’m fairly up to date with the literature." -- <a href="https://web.engr.oregonstate.edu/~tgd/">Tom Dietterich</a>,
Oregon State University
<p>
<li> "The prior version of Dr. Murphy's book was amongst the 3-4 books I recommended to students and machine learning colleagues
at all levels as being essential to own, and, perhaps more importantly, to always have readily at hand.
As machine learning has matured and evolved, no other comprehensive resource of this nature has even remotely
kept pace with modern methodological developments. This new version is a blessing for the machine learning community and frankly,
at this moment in time, is the only truly necessary machine learning book to own.
Very few people in the world can do what Dr. Murphy has done here and the world owes him its thanks."
-- <a href="https://www.cs.ubc.ca/~fwood/">Frank Wood</a>, UBC.
<p>
<li> "Whether teaching machine learning to undergrads, master students, or PhD students,
I found myself time and time again choosing the 2012 "Machine Learning: A Probabilistic Perspective" as the primary textbook.
When I heard about the new "Probabilistic Machine Learning" series, I was thrilled to see the expanded and modernized set of topics;
this will be the go-to book for my ML courses at Stanford. Kevin Murphy has a phenomenal ability to go deep while making topics
digestible to a broad audience. His writing is clear and concise with great visuals throughout. I highly recommend this as "the book"
for anyone wanting to become a well-versed ML expert." -- <a href="https://emilybfox.su.domains/">Emily Fox</a>, Stanford.
<p>
<li> "Murphy’s book is certainly the most comprehensive resource on machine learning available today.
With the growing body of research in the field,
it is a daunting challenge to provide an organized perspective of the current state of knowledge.
This book achieves this feat by integrating classic material, like MCMC inference,
with very recent developments like denoising diffusion models.
The material is organized and presented in a very accessible and intuitive manner,
making the book an asset for any researcher or practitioner in the field." --
<a href="https://cs3801.wixsite.com/amirgloberson">Amir Globerson</a>, Tel Aviv University.
<p>
<li> "This new Advanced Topics volume will provide a crucial path from the foundations of machine
learning to state-of-the-art probabilistic models, many of which have only been described in the research literature.
I particularly like the way it draws parallels between variational and Monte Carlo inference,
and between graphical models and (Bayesian) deep learning, so one can see the deep links between
methods that are sometimes cast as competitors.
I look forward to the next generations of probabilistic machine learning that this volume inspires."
-- <a href="https://www.ics.uci.edu/~sudderth/">Erik Sudderth</a>, UC Irvine.
<p>
<li> "This book provides an outstanding and deep tour of the most foundational ideas in probabilistic machine learning.
It explains the essential mathematical and computational tools that a student needs to move beyond the basics.
With this book,
Murphy provides a comprehensive resource that will be great not just to learn from, but also as a reference
to return to again and again." -- <a href="https://www.cs.princeton.edu/~rpa/">Ryan Adams</a>, Princeton.
<p>
<li> "Kevin Murphy's book is a landmark achievement in machine learning. It provides an in-depth coverage of
a wide range of topics in probabilistic machine learning, from inference methods to generative models
and decision making. It gives a modern perspective on these topics, bringing them up to date with recent
advances in deep learning and representation learning. The insights in this
book are essential for a solid understanding of the field,
and I highly recommend it to students and experts alike."
-- <a href="http://mlg.eng.cam.ac.uk/zoubin/">Zoubin Ghahramani</a>, Cambridge/Google.
<p>
<li> "As a
researcher trained by Murphy's 2012 book, I was excited to read
its sequel "Probabilistic Machine Learning: Advanced Topics." This new
book brings us to the forefront of cutting-edge probabilistic ML,
distilling the recent advances into a systemic exposition. It
communicates the core ideas of these recent advances in an
impressively clear and intuitive way, while managing to convey the
depth of these ideas by situating them in a broad context. It will
become a major reference that I constantly return to."
-- <a href="https://yixinwang.github.io/">Yixin Wang</a>, U. Michigan.
</ul>
<p>
<h2><a id="ack"></a>Acknowledgements</h2>
I would like to thank the following people for helping with this book.
<ul>
<li> People who helped to write various sections and chapters:
Alex Alemi (Google),
Jeff Bilmes (U. Washington),
Peter Chang,
Marco Cuturi (Apple, work done at Google),
Alexander D'Amour (Google),
Finale Doshi-Velez (Harvard),
Roy Frostig (Google),
Justin Gilmer (Google),
Giles Harper-Donnelly,
Been Kim (Google),
Durk Kingma (Google),
Simon Kornblith (Google),
Balaji Lakshminarayanan (Google),
Lihong Li (Amazon, work done at Google),
Xinglong Li (UBC),
Shakir Mohamed (DeepMind),
George Papamakarios (DeepMind),
Zeel Patel (IIT Gandhinagar),
Ben Poole (Google),
Mihaela Rosca (DeepMind / UCL),
Vinayak Rao (Purdue),
Yang Song (Stanford),
Victor Veitch (Google / U. Chicago),
Andrew Wilson (NYU).
<li> Many people who helped make or improve the figures, including:
Vishal Ghoniya, Ankita Kumari Jain, Aleyna Kara, Zeel Patel, Karm Patel,
Dhruv Patel, Nitish Sharma, Mahmoud Soliman.
<li> Participants in the Google Summer of Code (GSOC) for 2021,
including
Ming Liang Ang,
Aleyna Kara, Gerardo Duran-Martin,
Srikar Reddy Jilugu, Drishti Patel,
and co-mentor Mahmoud Soliman.
<li> Participants in the Google Summer of Code (GSOC) for 2022,
including
Peter Chang, Giles Harper-Donnelly,
Xinglong Li,
Zeel Patel, Karm Patel, Qingyao Sun,
and co-mentors Nipun Batra and Scott Linderman.
<li> Many other people who contributed code
(see auto-generated list <a href="https://github.com/probml/pyprobml#acknowledgements">here</a>).
<li> Many people who helped with proofreading (see book preface for list).
</ul>