# Vitruvion: A Generative Model of Parametric CAD Sketches

Parametric computer-aided design (CAD) tools are the predominant way that engineers specify physical structures, from bicycle pedals to airplanes to printed circuit boards.

# Autobahn: Automorphism-based Graph Neural Nets

Our formalism also encompasses novel architectures: as an example, we introduce a graph neural network that decomposes the graph into paths and cycles.

# SketchGraphs: A Large-Scale Dataset for Modeling Relational Geometry in Computer-Aided Design

1 code implementation · 16 Jul 2020

Parametric computer-aided design (CAD) is the dominant paradigm in mechanical engineering for physical design.

# Error bounds in estimating the out-of-sample prediction error using leave-one-out cross validation in high-dimensions

We study the problem of out-of-sample risk estimation in the high dimensional regime where both the sample size $n$ and number of features $p$ are large, and $n/p$ can be less than one.

# Discrete Object Generation with Reversible Inductive Construction

The success of generative modeling in continuous domains has led to a surge of interest in generating discrete data such as molecules, source code, and graphs.

# Approximate Leave-One-Out for High-Dimensional Non-Differentiable Learning Problems

$\mathcal{C} \subset \mathbb{R}^{p}$ is a closed convex set.

# Approximate Leave-One-Out for Fast Parameter Tuning in High Dimensions

Consider the following class of learning schemes: $$\hat{\boldsymbol{\beta}} := \arg\min_{\boldsymbol{\beta}}\;\sum_{j=1}^n \ell(\boldsymbol{x}_j^\top\boldsymbol{\beta}; y_j) + \lambda R(\boldsymbol{\beta}),\qquad\qquad (1)$$ where $\boldsymbol{x}_i \in \mathbb{R}^p$ and $y_i \in \mathbb{R}$ denote the $i^{\text{th}}$ feature and response variable respectively.
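Scheme (1) specializes to ridge regression when $\ell$ is the squared loss and $R(\boldsymbol{\beta}) = \tfrac{1}{2}\|\boldsymbol{\beta}\|_2^2$. A minimal sketch of that special case, under those assumptions (the data here are synthetic, not from the paper): for ridge, leave-one-out predictions are available exactly through the hat matrix, so the leave-one-out risk can be computed without $n$ refits.

```python
import numpy as np

# Instance of scheme (1) with l(z; y) = (z - y)^2 / 2 and R(beta) = ||beta||^2 / 2.
# The minimizer has the closed form beta_hat = (X^T X + lam I)^{-1} X^T y, and
# leave-one-out residuals follow the exact shortcut r_i / (1 - H_ii), where
# H = X (X^T X + lam I)^{-1} X^T is the hat matrix.

rng = np.random.default_rng(0)
n, p, lam = 50, 20, 1.0
X = rng.standard_normal((n, p))
y = X @ rng.standard_normal(p) + 0.5 * rng.standard_normal(n)

# Full-data ridge fit.
G = X.T @ X + lam * np.eye(p)
beta_hat = np.linalg.solve(G, X.T @ y)

# Leave-one-out risk via the hat-matrix shortcut (no refitting).
H_diag = np.einsum("ij,ij->i", X @ np.linalg.inv(G), X)
resid = y - X @ beta_hat
loo_risk = np.mean((resid / (1.0 - H_diag)) ** 2)

# Brute-force check: refit n times, each with one observation held out.
brute = []
for i in range(n):
    mask = np.arange(n) != i
    Gi = X[mask].T @ X[mask] + lam * np.eye(p)
    bi = np.linalg.solve(Gi, X[mask].T @ y[mask])
    brute.append((y[i] - X[i] @ bi) ** 2)

assert np.allclose(loo_risk, np.mean(brute))
```

For non-quadratic losses or non-smooth penalties this exact shortcut no longer applies, which is the regime where approximate leave-one-out methods like the one in this paper become useful.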

# Empirical Risk Minimization and Stochastic Gradient Descent for Relational Data

We solve this problem using recent ideas from graph sampling theory to (i) define an empirical risk for relational data and (ii) obtain stochastic gradients for this empirical risk that are automatically unbiased.

# Non-Vacuous Generalization Bounds at the ImageNet Scale: A PAC-Bayesian Compression Approach

Our main technical result is a generalization bound for compressed networks based on the compressed size.
