Search Results for author: Minseong Kim

Found 5 papers, 1 paper with code

ContrastCAD: Contrastive Learning-based Representation Learning for Computer-Aided Design Models

1 code implementation • 2 Apr 2024 • Minseop Jung, Minseong Kim, Jibum Kim

However, learning CAD models is still a challenge because they can be represented as complex shapes with long construction sequences.

Contrastive Learning • Data Augmentation • +1
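As an illustration of the contrastive objective the title refers to, the sketch below implements a generic InfoNCE-style loss on paired embeddings; it is a minimal stand-in, not the paper's actual ContrastCAD loss or architecture.

```python
import numpy as np

def info_nce(z1, z2, tau=0.5):
    """Generic InfoNCE-style contrastive loss between two batches of
    embeddings z1, z2 of shape (n, d); row i of z1 and row i of z2 are
    assumed to be two augmented views of the same sample."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / tau                      # (n, n) cosine similarities
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))            # positives on the diagonal
```

Aligned pairs (matching rows) yield a lower loss than mismatched ones, which is the property a contrastive representation learner exploits.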

Brick partition problems in three dimensions

no code implementations • 20 Jan 2021 • Ilkyoo Choi, Minseong Kim, Kiwon Seo

As a generalization of the above question, we also seek the minimum size $s(d, k)$ of a brick partition $\mathcal{P}_d$ of a $d$-dimensional brick where each axis-parallel plane intersects at least $k$ bricks in $\mathcal{P}_d$.

Combinatorics
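To make the definition of $s(d, k)$ concrete in the case $d = 2$ (where an axis-parallel plane is a line), the sketch below checks a hypothetical pinwheel partition of a square into 5 bricks; the partition and the checker are illustrative examples, not constructions from the paper.

```python
# Pinwheel partition of the square [0,3]x[0,3] into 5 bricks,
# each given as (x0, x1, y0, y1). A hypothetical example.
BRICKS = [
    (0, 2, 2, 3),  # top-left
    (2, 3, 1, 3),  # right
    (1, 3, 0, 1),  # bottom-right
    (0, 1, 0, 2),  # left
    (1, 2, 1, 2),  # centre
]

def min_line_hits(bricks):
    """Minimum, over axis-parallel lines in general position, of the
    number of bricks whose interior the line crosses -- the d = 2
    analogue of the plane condition in the definition of s(d, k)."""
    xs = sorted({v for x0, x1, _, _ in bricks for v in (x0, x1)})
    ys = sorted({v for _, _, y0, y1 in bricks for v in (y0, y1)})
    best = float("inf")
    for a, b in zip(xs, xs[1:]):      # generic vertical lines
        c = (a + b) / 2
        best = min(best, sum(x0 < c < x1 for x0, x1, _, _ in bricks))
    for a, b in zip(ys, ys[1:]):      # generic horizontal lines
        c = (a + b) / 2
        best = min(best, sum(y0 < c < y1 for _, _, y0, y1 in bricks))
    return best
```

For this partition every axis-parallel line crosses at least 2 bricks, so it witnesses the $k = 2$ condition with 5 bricks.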

Total Style Transfer with a Single Feed-Forward Network

no code implementations • ICLR 2019 • Minseong Kim, Hyun-Chul Choi

To transfer the style of an arbitrary image to a content image, these methods used a feed-forward network with a feature transformer at the lowest scale, or a cascade of networks, each with a feature transformer at the corresponding scale.

Style Transfer

Uncorrelated Feature Encoding for Faster Image Style Transfer

no code implementations • 4 Jul 2018 • Minseong Kim, Jongju Shin, Myung-Cheol Roh, Hyun-Chul Choi

Although the pre-trained network is used to generate receptive-field responses effective for representing the style and content of an image, it is optimized not for image style transfer but for image classification.

Image Classification • Style Transfer
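One simple way to quantify how "uncorrelated" an encoding is, in the spirit of the title, is to penalize off-diagonal entries of the feature covariance; the sketch below is an illustrative penalty, not the paper's actual encoding method.

```python
import numpy as np

def decorrelation_penalty(feats):
    """Sum of squared off-diagonal covariance entries of a feature
    matrix feats of shape (n_samples, n_channels); zero exactly when
    the encoded channels are pairwise uncorrelated."""
    f = feats - feats.mean(axis=0)
    cov = f.T @ f / (len(f) - 1)
    off = cov - np.diag(np.diag(cov))
    return float(np.sum(off ** 2))
```

Uncorrelated channels carry no redundant information, which is what allows a smaller (and therefore faster) feature encoding.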

Unbiased Image Style Transfer

no code implementations • 4 Jul 2018 • Hyun-Chul Choi, Minseong Kim

To solve this biased-network problem, we propose an unbiased learning technique that uses unbiased training data and a corresponding unbiased loss for alpha = 0.0, so that the feed-forward network generates a zero-style image, i.e., the content image, when alpha = 0.0.

Regression • Style Transfer
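The alpha = 0.0 condition in the excerpt can be pictured with a standard alpha-blended style/content loss, sketched below on flattened feature arrays; this is a minimal illustration of the trade-off, assuming a Gram-matrix style term, not the paper's actual unbiased loss.

```python
import numpy as np

def gram(feat):
    # feat: (channels, n) flattened feature map
    return feat @ feat.T / feat.shape[1]

def blended_loss(out_feat, content_feat, style_gram, alpha):
    """Alpha-blended style/content trade-off. At alpha = 0.0 it reduces
    to pure content reconstruction -- the 'zero-style' case in which the
    network should reproduce the content image."""
    content_term = np.mean((out_feat - content_feat) ** 2)
    style_term = np.mean((gram(out_feat) - style_gram) ** 2)
    return (1.0 - alpha) * content_term + alpha * style_term
```

An output that matches the content features exactly drives the loss to zero at alpha = 0.0 regardless of the style target, which is the unbiased behaviour the paper trains for.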
