no code implementations • 24 Sep 2023 • Stone Yun, Alexander Wong
Graph Hypernetworks (GHN) can predict the parameters of varying unseen CNN architectures with surprisingly good accuracy at a fraction of the cost of iterative optimization.
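The core idea — a shared network that maps a graph description of a CNN architecture to that architecture's parameters — can be sketched as below. This is a minimal illustration, not the paper's model: the node descriptors, the single message-passing step, and the fixed-size parameter head are all simplifying assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical architecture graph: each node is a layer described by a
# small feature vector (e.g. one-hot layer type plus kernel/channel sizes).
node_feats = rng.standard_normal((5, 8))   # 5 layers, 8-dim descriptors
adj = np.eye(5, k=1)                       # simple chain topology

# One round of message passing so each node embedding reflects its neighbours.
w_gnn = rng.standard_normal((8, 16)) * 0.1
h = np.tanh((node_feats + adj @ node_feats + adj.T @ node_feats) @ w_gnn)

# Shared hypernetwork head: maps each node embedding to that layer's
# flattened parameter tensor (fixed size here for simplicity).
w_head = rng.standard_normal((16, 3 * 3 * 4)) * 0.1
predicted_params = h @ w_head   # one flattened 3x3x4 kernel bank per layer
```

In the real GHN the head is trained so these predicted parameters achieve good accuracy when loaded into the target architecture, amortizing the cost of iterative optimization across many architectures.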
no code implementations • 26 Aug 2022 • Stone Yun, Alexander Wong
We conduct the first-ever study exploring the use of graph hypernetworks for predicting parameters of unseen quantized CNN architectures.
1 code implementation • 10 Jan 2022 • Harry Nguyen, Stone Yun, Hisham Mohammad
The original paper describes BowNet as a network consisting of a convolutional feature extractor $\Phi(\cdot)$ and a dense-softmax layer $\Omega(\cdot)$ trained to predict BoW features from images.
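The two-stage structure — $\Phi(\cdot)$ producing features, $\Omega(\cdot)$ mapping them to a distribution over visual words — can be sketched as follows. This is a toy stand-in under stated assumptions: the extractor is reduced to a single linear map, and the bag-of-words targets are random; the real BowNet uses convolutional layers and targets derived from a pretrained network.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def phi(images, conv_w):
    # Stand-in for the convolutional feature extractor Phi(.):
    # here just a linear map over flattened images.
    return images.reshape(images.shape[0], -1) @ conv_w

def omega(features, dense_w):
    # Dense-softmax layer Omega(.): predicts a distribution over K visual words.
    return softmax(features @ dense_w)

def bow_loss(pred, target_bow):
    # Cross-entropy between predicted and target bag-of-words distributions.
    return -float(np.mean(np.sum(target_bow * np.log(pred + 1e-9), axis=-1)))

rng = np.random.default_rng(0)
images = rng.standard_normal((4, 8, 8))        # 4 toy 8x8 images
conv_w = rng.standard_normal((64, 16)) * 0.1   # extractor weights
dense_w = rng.standard_normal((16, 10)) * 0.1  # K = 10 visual words
target = softmax(rng.standard_normal((4, 10))) # hypothetical BoW targets

pred = omega(phi(images, conv_w), dense_w)
loss = bow_loss(pred, target)
```

Training minimizes this cross-entropy so that $\Omega(\Phi(x))$ matches the BoW distribution of each image.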
no code implementations • 24 Apr 2021 • Stone Yun, Alexander Wong
As the "Mobile AI" revolution continues to grow, so does the need to understand the behaviour of edge-deployed deep neural networks.
no code implementations • 30 Nov 2020 • Stone Yun, Alexander Wong
Depth factorization and quantization have emerged as two of the principal strategies for designing efficient deep convolutional neural network (CNN) architectures tailored for low-power inference on the edge.
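Of the two strategies, the quantization half can be illustrated with a symmetric uniform quantizer, a common baseline for low-power integer inference. This is a generic sketch, not necessarily the exact scheme studied in the paper.

```python
import numpy as np

def quantize_uniform(w, n_bits=8):
    # Symmetric uniform quantizer: map floats to signed n-bit integers
    # with a single per-tensor scale (epsilon guards against all-zero w).
    scale = np.abs(w).max() / (2 ** (n_bits - 1) - 1) + 1e-12
    q = np.clip(np.round(w / scale),
                -(2 ** (n_bits - 1)), 2 ** (n_bits - 1) - 1)
    return q.astype(np.int8), scale

def dequantize(q, scale):
    # Recover approximate float weights from integers and scale.
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.standard_normal((3, 3, 16, 32)).astype(np.float32) * 0.1  # toy conv kernel
q, scale = quantize_uniform(w)
w_hat = dequantize(q, scale)
```

The round-trip error is bounded by half the quantization step, which is why the shape of a layer's weight distribution (and hence its dynamic range) matters so much for quantized accuracy.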
no code implementations • 30 Nov 2020 • Stone Yun, Alexander Wong
The fine-grained, layerwise analysis enables us to gain deep insights into how initial weight distributions affect final accuracy and quantized behaviour.
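A layerwise analysis of this kind can be sketched by measuring, per layer, the error a uniform quantizer introduces under different weight distributions. The distributions and layer shapes below are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

def layer_quant_mse(w, n_bits=8):
    # Per-layer mean-squared error from symmetric uniform quantization.
    scale = np.abs(w).max() / (2 ** (n_bits - 1) - 1) + 1e-12
    w_hat = np.round(w / scale) * scale
    return float(np.mean((w - w_hat) ** 2))

rng = np.random.default_rng(2)
# Two hypothetical weight initializations for a 3-layer stack:
layers_gauss = [rng.standard_normal((64, 64)) * 0.05 for _ in range(3)]
layers_heavy = [rng.standard_normal((64, 64)) * 0.05
                + rng.standard_normal((64, 64)) ** 3 * 0.01 for _ in range(3)]

errs_gauss = [layer_quant_mse(w) for w in layers_gauss]
errs_heavy = [layer_quant_mse(w) for w in layers_heavy]
```

Comparing such per-layer error profiles across initializations is one concrete way the link between initial weight distributions and quantized behaviour can be made visible.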