BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//talks.cam.ac.uk//v3//EN
BEGIN:VTIMEZONE
TZID:Europe/London
BEGIN:DAYLIGHT
TZOFFSETFROM:+0000
TZOFFSETTO:+0100
TZNAME:BST
DTSTART:19700329T010000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=-1SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0100
TZOFFSETTO:+0000
TZNAME:GMT
DTSTART:19701025T020000
RRULE:FREQ=YEARLY;BYMONTH=10;BYDAY=-1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
CATEGORIES:Isaac Newton Institute Seminar Series
SUMMARY:The tensor graphical lasso (Teralasso) - Alfred Hero (University of Michigan)
DTSTART;TZID=Europe/London:20171031T140000
DTEND;TZID=Europe/London:20171031T145000
UID:TALK94117AThttp://talks.cam.ac.uk
URL:http://talks.cam.ac.uk/talk/index/94117
DESCRIPTION:Co-authors: Kristjan Greenewald (Harvard University)\, Shuheng Zhou (University of Michigan)\, Alfred Hero (University of Michigan)\n\nWe propose a new ultrasparse graphical model for representing multiway data based on a Kronecker sum representation of the process inverse covariance matrix. This statistical model decomposes the inverse covariance into a linear Kronecker sum representation with sparse Kronecker factors.\n\nUnder the assumption that the multiway observations are matrix-normal\, the l1 sparsity-regularized log-likelihood function is convex and admits significantly faster statistical rates of convergence than other sparse matrix-normal algorithms such as graphical lasso or Kronecker graphical lasso.\n\nWe specify a scalable composite gradient descent method for minimizing the objective function and analyze both the statistical and the computational convergence rates\, showing that the composite gradient descent algorithm is guaranteed to converge at a geometric rate to the global minimizer. We will illustrate the method on several real multiway datasets\, showing that we can recover sparse graphical structures in high-dimensional data.