# Logistic regression with a Laplacian prior on the singular values: convex duality and application to EEG classification.

Note unusual time and location

We consider a probabilistic classification problem with a matrix coefficient. We put a Laplacian prior on the singular values of the coefficient matrix. The Laplacian prior not only keeps the singular value spectrum of the regression coefficient sparse, thus offering a good interpretation of the solution, but is also key to good generalization. In addition, we propose an efficient optimization algorithm based on an interior-point method; convex duality plays the key role in this implementation. We apply the method to a motor-imagery EEG classification problem in the context of Brain-Computer Interface (BCI). Classification results on 162 BCI datasets show significant improvement in classification accuracy over $\ell_2$-regularized logistic regression, rank-2 approximated logistic regression, as well as a Common Spatial Pattern (CSP) based classifier, which is a popular technique in BCI. Connections to LASSO, GP classification with a second-order polynomial kernel, and SVM are discussed.
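A Laplacian prior on the singular values corresponds, at the MAP estimate, to penalizing the sum of singular values (the trace norm) of the coefficient matrix. As a rough illustration of the idea, not of the talk's interior-point algorithm, the following sketch fits trace-norm-regularized logistic regression by proximal gradient descent, where the proximal step is soft-thresholding of the singular values; all function names and parameters here are illustrative assumptions.

```python
import numpy as np

def sv_soft_threshold(W, tau):
    """Proximal operator of tau * (sum of singular values):
    soft-threshold the singular value spectrum of W."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def fit_trace_norm_logreg(Xs, y, lam=0.05, lr=0.2, iters=800):
    """Xs: (n, p, q) array of matrix inputs; y in {-1, +1}.
    Minimizes mean logistic loss + lam * trace norm of W
    by proximal gradient descent (illustrative sketch only)."""
    n, p, q = Xs.shape
    W = np.zeros((p, q))
    for _ in range(iters):
        margins = y * np.einsum('npq,pq->n', Xs, W)
        # derivative of log(1 + exp(-m)) w.r.t. m is -1/(1 + exp(m))
        sig = 1.0 / (1.0 + np.exp(margins))
        grad = -np.einsum('n,npq->pq', y * sig, Xs) / n
        W = sv_soft_threshold(W - lr * grad, lr * lam)
    return W
```

Because the penalty thresholds the spectrum directly, small singular values are driven exactly to zero, giving the low-rank (spectrally sparse) coefficient matrix mentioned in the abstract.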

I would also like to talk about “Multiple-output Gaussian Processes for Nonlinear System Identification”, which is still a vague idea, but I hope to get some stimulating feedback.

This talk is part of the Inference Group series.