"學萃講壇" (Xuecui Forum) Lecture No. 880: Deep Neural Networks and Topic Modelling

Date: 2018-11-22    Source: 伟徳国际官网登录入口    Views: 1271

    The "Xuecui Forum" upholds the ideal of learning from the example of distinguished scholars and distilling the essence of science and technology. With scholarship as its soul and education as its foundation, it pursues technological innovation, raises academic standards, fosters a rich academic atmosphere, and invites everyone to share in a feast of science and technology.
Title: Deep Neural Networks and Topic Modelling
Speaker: Wray Buntine
Time: 21 November 2018, 9:00
Venue: Multimedia Lecture Hall, Room 426, Building 21
Organizer: Institute of Science and Technology
Host: 伟徳国际官网登录入口
Speaker bio: Wray Buntine has been a full professor at Monash University since 2014 and is director of the Master of Data Science, the Faculty of IT's newest and most in-demand degree. He was previously at NICTA Canberra, the Helsinki Institute for Information Technology (where he ran a semantic search project), NASA Ames Research Center, the University of California, Berkeley, and Google. He is known for his theoretical and applied work in probabilistic methods for document and text analysis, social networks, data mining, and machine learning.
Abstract: Something Old: In this talk I will first describe some of our recent work with hierarchical probabilistic models that are not deep neural networks. Nevertheless, these are currently among the state of the art in classification and in topic modelling: k-dependence Bayesian networks and hierarchical topic models, respectively, and both are deep models in a different sense. These represent some of the leading-edge machine learning technology prior to the advent of deep neural networks.
Something New: Turning to deep neural networks, I will describe, as a point of comparison, some of the state-of-the-art applications I am familiar with: multi-task learning, document classification, and learning to learn. These build on the RNNs widely used in semi-structured learning. The old and the new are remarkably different. So what new capabilities have deep neural networks yielded? Do we even need the old technology? What can we do next?
Something Borrowed: To complete the story, I'll introduce some efforts to combine the two approaches, borrowing from earlier work in statistics.
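For readers unfamiliar with the topic-modelling side of the talk, the following is a minimal, flat topic-modelling sketch in Python. It is illustrative only and is not the speaker's hierarchical topic models or k-dependence Bayesian networks; it assumes scikit-learn is installed and uses a made-up toy corpus.

# Minimal flat topic-modelling sketch (illustration only; toy corpus is invented).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "neural networks learn representations from data",
    "deep learning uses neural networks with many layers",
    "topic models discover themes in document collections",
    "latent dirichlet allocation is a probabilistic topic model",
]

# Bag-of-words counts, the standard input to classical topic models.
vectorizer = CountVectorizer(stop_words="english")
X = vectorizer.fit_transform(docs)

# Fit a 2-topic LDA model; real corpora need far more documents and topics.
lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(X)          # per-document topic proportions

# Print the top words per topic and the document-topic mixture matrix.
terms = vectorizer.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top = [terms[i] for i in weights.argsort()[::-1][:5]]
    print(f"topic {k}: {', '.join(top)}")
print(doc_topics.round(2))

Hierarchical topic models of the kind discussed in the talk extend this flat setup by organizing topics into a tree or hierarchy rather than a single layer.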



