Theses & Reports
Instructions for submitting a technical report or thesis.
You can find technical reports published prior to 1990 archived here.
Title: Function Space Reasoning in Gaussian Processes and Neural Networks
Candidate: Benton, Gregory
Advisor(s): Andrew Gordon Wilson
Abstract:
In a typical modeling setting we have prior notions of what types of functions we want to learn. For example, in regression we may want to learn a smooth or periodic function, and in image classification we may want to learn a function that is invariant to rotations. While working in function space lets us reason about traits like invariance or smoothness, it is often difficult to directly quantify the functional properties of models, in particular for large parametric models like neural networks.
In this thesis we leverage our ability to reason about function space to build more powerful models in both Gaussian processes (GPs) and neural networks. By generating GP kernels that are themselves functions of latent processes, we introduce methods that provide uncertainty not only over the functions a GP produces but also over what types of functions it produces. We also introduce methods for learning levels of invariance and equivariance in neural networks, enabling us to imbue the functions our models produce with soft or limited equivariance constraints. Finally, we show how our understanding of parameter space in neural networks can be leveraged to efficiently ensemble diverse collections of functions, improving the accuracy and robustness of our models. Through these methods we show that by carefully considering the types of functions we produce, we can build models with a range of desirable properties: models that are more flexible, models that better align with domain knowledge, and models that are both accurate and robust. We demonstrate these results on a broad range of problems, including time series forecasting, image classification, and reinforcement learning.
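To make the first idea concrete, the sketch below (not the thesis code; all prior choices, names, and kernel forms are illustrative assumptions) shows how placing a prior over kernel-level quantities, such as a lengthscale and a mixing weight between smooth and periodic components, yields a GP that expresses uncertainty over types of functions, not just over functions drawn from one fixed kernel.

```python
# Minimal sketch, assuming a log-normal prior on the lengthscale and a Beta
# prior on the mix between an RBF and a periodic kernel (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 4.0, 80)[:, None]      # 1-D inputs

def rbf(x1, x2, ell):
    d = x1 - x2.T
    return np.exp(-0.5 * (d / ell) ** 2)

def periodic(x1, x2, ell, period):
    d = np.abs(x1 - x2.T)
    return np.exp(-2.0 * np.sin(np.pi * d / period) ** 2 / ell ** 2)

samples = []
for _ in range(5):
    # Latent "kernel-level" variables: how smooth, how periodic the draw is.
    ell = np.exp(rng.normal(-1.0, 0.5))      # lengthscale ~ log-normal prior
    w = rng.beta(2.0, 2.0)                   # weight between kernel families
    K = w * rbf(x, x, ell) + (1.0 - w) * periodic(x, x, ell, period=1.0)
    f = rng.multivariate_normal(np.zeros(len(x)), K + 1e-8 * np.eye(len(x)))
    samples.append(f)                        # each draw has a different "function type"
```

Each of the five draws comes from a different kernel, so the sample set reflects uncertainty over function types as well as over individual functions.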
Title: Expanding Structural Design through Shape Optimization and Microstructures
Candidate: Tozoni, Davi Colli
Advisor(s): Denis Zorin
Abstract:
3D printing and other modern manufacturing tools allow users to design and produce customized objects for their needs at considerably low cost. However, designing structures that perform well is not an easy task, and doing it manually can be a slow and tedious process. In this context, structural optimization techniques can be very useful and help automate the design and analysis process.
This thesis describes techniques that expand the use of structural optimization for digital fabrication by formulating the optimization around simulation models that are closer to reality, through the addition of contact and friction. Moreover, we present a fast method to compute gradients from differentiable simulations, which can be used to optimize the shape, material, and physical properties of our domain. In addition, we expand the use of two-scale topology optimization by presenting microstructures that have a smooth map from material to geometry and can be used on curved shapes defined by irregular lattices with close-to-rhombic cells. Finally, we introduce two low-parametric microstructures that together cover almost the whole possible range of elastic properties for isotropic metamaterials.
Our results in simulation and physical experiments, for both static and time-dependent scenarios, show the advantages of our techniques and how they can be used in practice.
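As a rough illustration of design gradients computed through a simulation (a minimal sketch, not the thesis method, and without contact or friction), the example below differentiates a compliance objective through a static equilibrium solve for a 1-D bar whose per-element stiffnesses are the design variables. Problem sizes, bounds, and the volume normalization are assumptions made for the example; for compliance the problem is self-adjoint, so the gradient reuses the displacement solution directly.

```python
# Minimal sketch: gradient of compliance J = f^T u subject to K(p) u = f,
# with dJ/dp_e = -u^T (dK/dp_e) u. The design p holds per-element stiffnesses.
import numpy as np

n = 20                                       # number of bar elements (assumed)
f = np.zeros(n + 1); f[-1] = 1.0             # unit load at the free end
free = np.arange(1, n + 1)                   # node 0 is clamped

def assemble(p):
    K = np.zeros((n + 1, n + 1))
    for e, k in enumerate(p):                # element matrix k * [[1,-1],[-1,1]]
        K[e:e + 2, e:e + 2] += k * np.array([[1.0, -1.0], [-1.0, 1.0]])
    return K

p = np.full(n, 1.0)                          # uniform initial design
for it in range(100):
    K = assemble(p)
    u = np.zeros(n + 1)
    u[free] = np.linalg.solve(K[np.ix_(free, free)], f[free])
    J = f @ u                                # compliance of the current design
    # Self-adjoint gradient: no extra linear solve is needed.
    grad = np.array([-(u[e] - u[e + 1]) ** 2 for e in range(n)])
    p = np.clip(p - 0.5 * grad, 0.1, 2.0)    # projected gradient step
    p *= n / p.sum()                         # keep total material roughly fixed
```

The same adjoint pattern extends to richer differentiable simulations, where shape or material parameters enter the stiffness operator and one extra adjoint solve per objective provides all design gradients.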