News
News 12/15/08: Abstracts Posted
The submitted abstracts for the accepted talks and posters are linked in the schedule below. Once we have collected all the presented slides, they will also be posted.
News 09/25/08: Call for Submissions
Submissions are solicited for a Kernel Learning workshop to be held on December 13th, 2008, as part of this year's NIPS workshop session in Whistler, Canada. Submissions should address automatic kernel selection or, more broadly, feature selection, multi-task learning, and multi-view learning.
Submissions should concern using sampled data to select or learn a kernel function or kernel matrix appropriate for the specific task at hand. Submissions are not constrained regarding the area in which the learned kernel is applied: this can include classification, regression, and ranking, where the use of kernels is ubiquitous, as well as different settings such as inductive, transductive, or semi-supervised learning. Presentations on the closely related topics of feature selection, multi-task learning, and multi-view learning are also encouraged. Submissions should focus on theoretical, algorithmic, or large-scale empirical results.
Accepted submissions will be presented as a 25-minute talk or a poster. The deadline for submission is October 24th, and notifications will be sent out by November 7th. Submissions should be sent in NIPS format (maximum 4 pages) to rostami (at) cs (dot) nyu (dot) edu. The program committee will consist of the workshop organizers and the invited speakers.
News 09/12/08: Workshop Date
The workshop has been scheduled for December 13th and will be located in Whistler, B.C., at the Hilton Whistler and Spa. This is part of a two-day workshop session (December 12-13), held after the main NIPS conference.
Workshop Description
Kernel methods are widely used to address a variety of learning tasks including classification, regression, ranking, clustering, and dimensionality reduction. The appropriate choice of a kernel is often left to the user, but poor selections may lead to sub-optimal performance. Furthermore, searching for an appropriate kernel manually can be a time-consuming and imperfect art. Instead, the kernel selection process can be included as part of the overall learning problem. In this way, better performance guarantees can be given and the kernel selection process can be made automatic.
In this workshop, we will be concerned with using sampled data to select or learn a kernel function or kernel matrix appropriate for the specific task at hand. We will discuss several scenarios, including classification, regression, and ranking, where the use of kernels is ubiquitous, and different settings including inductive, transductive, or semi-supervised learning.
The goal is to cover all questions related to the problem of learning kernels: different problem formulations, the computational efficiency and accuracy of the algorithms that address them, their respective strengths and weaknesses, and the theoretical guarantees provided. What is the computational complexity? Does it work in practice? Equally important, we will discuss improvements and new algorithms, and motivate further interesting scenarios. The formulation of some other learning problems, e.g. multi-task learning problems, is often very similar; these problems and their solutions will also be discussed in this workshop.
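To make the idea of automatic kernel selection concrete, here is a minimal sketch of one simple approach: learning a convex combination of base kernels by weighting each candidate kernel according to its alignment with the labels (kernel-target alignment). This is an illustrative example only, not a method endorsed by the workshop; the RBF bandwidths, the toy data, and the alignment-based weighting scheme are all assumptions chosen for brevity.

```python
import numpy as np

def rbf_kernel(X, gamma):
    # Gaussian (RBF) kernel matrix from pairwise squared distances.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-gamma * d2)

def alignment(K, y):
    # Kernel-target alignment: <K, yy^T>_F / (||K||_F * ||yy^T||_F).
    Y = np.outer(y, y)
    return np.sum(K * Y) / (np.linalg.norm(K) * np.linalg.norm(Y))

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
y = np.sign(X[:, 0])  # toy labels in {-1, +1}

# Candidate base kernels: RBF kernels at several bandwidths (hypothetical grid).
gammas = [0.01, 0.1, 1.0, 10.0]
kernels = [rbf_kernel(X, g) for g in gammas]

# Weight each base kernel by its (clipped, non-negative) alignment with the
# labels, then normalize so the weights form a convex combination.
scores = np.array([max(alignment(K, y), 0.0) for K in kernels])
weights = scores / scores.sum()
K_learned = sum(w * K for w, K in zip(weights, kernels))
```

The learned matrix `K_learned` can then be handed to any kernel method (e.g. an SVM); more sophisticated formulations discussed at the workshop instead optimize the kernel weights jointly with the learning objective.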
Schedule
PDF w/ abstracts
7:30am | Shai Ben-David (Invited Speaker): The Sample Complexity of Learning the Kernel |
8:00am | Olivier Chapelle and Alain Rakotomamonjy: Second Order Optimization of Kernel Parameters |
8:20am | William Stafford Noble (Invited Speaker): Multi-Kernel Learning for Biology |
8:50am | Poster Session, Discussion and Coffee Break |
9:20am | Corinna Cortes, Mehryar Mohri and Afshin Rostamizadeh: Learning Sequence Kernels |
9:40am | Maria-Florina Balcan, Avrim Blum and Nathan Srebro: Learning with Multiple Similarity Functions |
10:00am | Andreas Argyriou (Invited Speaker): Multi-Task Learning via Matrix Regularization |
10:30am | Break until afternoon session |
3:30pm | Isabelle Guyon (Invited Speaker): Feature Selection - From Correlation to Causality |
4:00pm | Nathan Srebro and Shai Ben-David: Learning Bounds for Support Vector Machines with Learned Kernels |
4:20pm | Alex Smola (Invited Speaker): Mixed Norm Kernels, Hyperkernels and Other Variants |
4:50pm | Poster Session, Discussion and Coffee Break |
5:20pm | Marius Kloft, Ulf Brefeld, Pavel Laskov and Sören Sonnenburg: Non-sparse Multiple Kernel Learning |
5:40pm | Peter Gehler: Infinite Kernel Learning |
6:00pm | John Shawe-Taylor (Invited Speaker): Kernel Learning for Novelty Detection |
6:30pm | Closing Remarks |
Poster Presentations:
- Ravi S. Ganti, Nikolaos Vasiloglou and Alexander Gray: Hyperkernel Based Density Estimation
- Andrew G. Howard and Tony Jebara: Learning Large Margin Mappings
- S. Mosci, M. Santoro, A. Verri, S. Villa and L. Rosasco: A New Algorithm to Learn an Optimal Kernel Based on Fenchel Duality
- Hua Ouyang and Alexander Gray: Learning Nearest-Neighbor Classifiers with Hyperkernels
- Nikolaos Vasiloglou, Alexander G. Gray and David V. Anderson: Learning Isometric Separation Maps
Authors of all other accepted talks are also encouraged to present posters.
Organizers:
- Corinna Cortes
- Google Research New York
- corinna at google.com
- Arthur Gretton
- Max Planck Institute for Biological Cybernetics
- arthur.gretton at tuebingen.mpg.de
- Gert Lanckriet
- University of California, San Diego
- gert at ece.ucsd.edu
- Mehryar Mohri
- Courant Institute of Mathematical Sciences & Google Research
- mohri at cims.nyu.edu
- Afshin Rostamizadeh
- Courant Institute of Mathematical Sciences
- rostami at cs.nyu.edu
Invited Speakers:
- Shai Ben-David
- Andreas Argyriou
- Isabelle Guyon
- William Stafford Noble
- John Shawe-Taylor
- Alex Smola
Important Dates:
- Submission Deadline: Oct 24
- Notification Sent: Nov 7
- Workshop Date: Dec. 13