BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Higher Order Fused Regularization for Supervised Learning with Gro
 uped Parameters - Koh Takeuchi (NTT Communication Science Laboratories)
DTSTART:20150915T093000Z
DTEND:20150915T103000Z
UID:TALK60263@talks.cam.ac.uk
CONTACT:Dr Jes Frellsen
DESCRIPTION:We often encounter situations in supervised learning where the
 re exist possibly overlapping groups that consist of more than two paramet
 ers. For example\, we might work on parameters that correspond to words ex
 pressing the same meaning\, music pieces in the same genre\, and books rel
 eased in the same year. Based on such auxiliary information\, we could sup
 pose that parameters in a group have similar roles in a problem and simila
 r values. In this paper\, we propose the Higher Order Fused (HOF) regulari
 zation that can incorporate smoothness among parameters with group structu
 res as prior knowledge in supervised learning. We define the HOF penalty a
 s the Lovász extension of a submodular higher-order potential functio
 n\, which encourages parameters in a group to take similar estimated value
 s when used as a regularizer. Moreover\, we develop an efficient network f
 low algorithm for calculating the proximity operator for the regularized p
 roblem. We investigate the empirical performance of the proposed algorithm
  by using synthetic and real-world data.
LOCATION:Engineering Department\, CBL Room BE-438
END:VEVENT
END:VCALENDAR
