Title: Factor Graphs for Relational Regression


Authors: Sumit Chopra, Trivikraman Thampy, John Leahy, Andrew Caplin, Yann LeCun

Traditional methods for supervised learning involve treating the input
data as a set of independent, identically distributed
samples. However, in many situations, the samples are related in
such a way that variables associated with one sample depend on those
of other samples. We present a new form of relational graphical model that, in
addition to capturing the dependence of the output on sample specific
features, can also capture hidden relationships among samples through
a non-parametric latent manifold.
Learning in the proposed graphical model involves
simultaneously learning the non-parametric latent manifold along with
a non-relational parametric model.
Efficient inference algorithms are introduced to accomplish this task.
The method is applied to the prediction of house prices.
A non-relational model predicts an ``intrinsic'' price of the house,
which depends only on its individual characteristics,
and a relational model estimates a hidden surface of ``desirability''
coefficients which links the price of a house to that of similar
houses in the neighborhood.
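The two-component model described above can be illustrated with a toy sketch: a parametric ``intrinsic'' price fit to house-specific features, plus a smooth non-parametric ``desirability'' surface over location. The alternating least-squares/kernel-smoothing scheme, the variable names, and the data here are illustrative assumptions, not the paper's actual algorithm or data.

```python
# Hedged toy sketch: intrinsic price (linear in features) plus a latent
# desirability surface over 2-D locations, fit by alternating estimation.
# All specifics below are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 200 houses with 2 features and a 2-D location each.
n = 200
X = rng.normal(size=(n, 2))             # house-specific features
loc = rng.uniform(0, 1, size=(n, 2))    # (x, y) coordinates
true_w = np.array([3.0, 1.5])           # true intrinsic-price weights
true_desir = np.sin(2 * np.pi * loc[:, 0])  # hidden desirability surface
y = X @ true_w + true_desir + 0.1 * rng.normal(size=n)  # observed prices

def smooth(loc, r, bw=0.1):
    """Nadaraya-Watson kernel smoothing of residuals r over locations:
    each house's desirability is a kernel-weighted average of its
    neighbors' residuals, linking its price to nearby houses."""
    d2 = ((loc[:, None, :] - loc[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2 * bw ** 2))
    return (K @ r) / K.sum(axis=1)

# Alternate between fitting the parametric model and the latent surface.
desir = np.zeros(n)
for _ in range(10):
    # Intrinsic price: least-squares fit on desirability-adjusted prices.
    w, *_ = np.linalg.lstsq(X, y - desir, rcond=None)
    # Desirability: smooth surface over the remaining residuals.
    desir = smooth(loc, y - X @ w)

print("learned weights:", w)
```

The alternation mirrors the idea of jointly learning a non-relational parametric model and a non-parametric latent manifold: each component explains the part of the price the other cannot.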