[Academic Lecture] Short-Term Lecture Series by Prof. John Paisley (Data Science Institute, Columbia University)
Date: 2018-03-14 16:15:00

    Lecture topic: Computational Intelligence and Big Data Statistics

    Speaker: Prof. John Paisley (Data Science Institute, Columbia University)

    Host: Associate Professor Delu Zeng (曾德炉)


    Schedule:
 


                       The School of Mathematics welcomes all faculty and students to attend.

             

                                                March 9, 2018


Attachments:
    1. Speaker biography:
Prof. John Paisley holds appointments in the Data Science Institute and the Department of Electrical Engineering (EE) at Columbia University. He trained under M. I. Jordan, widely regarded as a founding father of statistical machine learning, and David Blei, the father of topic models. He has a deep research foundation in statistical computation, Bayesian inference, and machine learning, and has published many influential papers in leading and top-tier journals and conferences in these frontier areas.

John Paisley has been an Assistant Professor at Columbia University since 2013, where he is a member of the Data Science Institute. He received his Ph.D. in Electrical and Computer Engineering from Duke University in 2010 and, prior to that, his B.S. in Electrical and Computer Engineering from Duke University in 2004. From 2010 to 2013 he was a postdoctoral fellow in the Computer Science departments at Princeton University (with David Blei) and UC Berkeley (with Michael I. Jordan). He has published nearly 75 journal and conference papers on Bayesian statistics, machine learning, and pattern recognition. His course “Machine Learning” on edX has enrolled nearly 50,000 students around the world, and he is currently under contract to write a textbook, “Bayesian Models for Machine Learning,” for Cambridge University Press, which will appear in 2018. He has received the following awards:

Distinguished Faculty Teaching Award, Columbia University, 2017

Top 10% Paper Recognition, IEEE International Conference on Image Processing, 2013

Notable Paper Award, International Conference on Artificial Intelligence and Statistics, 2011

Confucius Institute Language Scholarship, Xiamen University (host), 12/2011-1/2012

Charles R. Vail Outstanding Graduate Scholarship Award, Duke University, May 2010

David Randall Fuller Prize, Duke University, May 2004


       2. Course description:

       This course provides an introduction to Bayesian approaches to machine learning. Topics will include mixed-membership models, latent factor models and Bayesian nonparametric methods. We will also focus on mean-field variational Bayesian inference, an optimization-based approach to approximate posterior learning. Applications of these methods include image processing, topic modeling, collaborative filtering and recommendation systems. We will discuss a selection of these in class.
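For orientation, mean-field variational inference replaces the intractable posterior p(θ | x) with a factorized approximation q(θ) and turns posterior inference into optimization of the evidence lower bound (ELBO). A standard statement of that objective, included here as background only and not taken from the course materials, is

    \log p(x) \;\ge\; \mathcal{L}(q) \;=\; \mathbb{E}_{q(\theta)}[\log p(x, \theta)] \;-\; \mathbb{E}_{q(\theta)}[\log q(\theta)], \qquad q(\theta) = \prod_i q_i(\theta_i),

and maximizing \mathcal{L}(q) over the factors q_i is equivalent to minimizing the KL divergence between q(θ) and the true posterior p(θ | x).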

Textbooks:
Christopher Bishop, Pattern Recognition and Machine Learning
David MacKay, Information Theory, Inference, and Learning Algorithms

Lecture outline:

      1. Probability review, Bayes rule, conjugate priors, exponential family (a short illustrative Python sketch follows this outline);

      2. Bayesian approaches to regression and classification;

      3. Hierarchical models, matrix factorization, sparse regression models, sampling;

      4. EM algorithm, mixture models, factor models;

      5. Variational inference I, mixture of Gaussians;

      6. Variational inference II, conjugate exponential models, latent Dirichlet allocation;

      7. Variational inference III, approximations for non-conjugate models;

      8. Variational inference IV, inference for big data sets.
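To make the first topic in the outline concrete, the sketch below shows Bayes' rule with a conjugate prior in a few lines of Python: a Beta prior on a Bernoulli success probability has a closed-form posterior. It is a minimal illustrative example, not part of the course materials, and the data values are hypothetical.

# Minimal sketch (illustrative only): conjugate Bayesian updating.
# A Beta(a, b) prior on a Bernoulli success probability yields the closed-form
# posterior Beta(a + number of successes, b + number of failures).

def beta_bernoulli_update(a, b, observations):
    """Return the posterior Beta parameters after observing a list of 0/1 outcomes."""
    successes = sum(observations)
    failures = len(observations) - successes
    return a + successes, b + failures

if __name__ == "__main__":
    prior_a, prior_b = 1.0, 1.0      # uniform Beta(1, 1) prior
    data = [1, 0, 1, 1, 0, 1, 1, 1]  # hypothetical coin-flip outcomes
    post_a, post_b = beta_bernoulli_update(prior_a, prior_b, data)
    posterior_mean = post_a / (post_a + post_b)  # mean of Beta(a, b) is a / (a + b)
    print("Posterior: Beta(%.1f, %.1f), mean %.3f" % (post_a, post_b, posterior_mean))

Conjugacy of this kind is also what gives the closed-form coordinate updates used in the conjugate exponential-family variational inference lectures (topics 5 and 6).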

