1 code implementation • Distill 2021 • Gabriel Goh, Nick Cammarata, Chelsea Voss, Shan Carter, Michael Petrov, Ludwig Schubert, Alec Radford, Chris Olah
“It’s the fact that you plug visual information into the rich tapestry of memory that brings it to life.”
no code implementations • Distill 2020 • Nick Cammarata, Gabriel Goh, Shan Carter, Ludwig Schubert, Michael Petrov, Chris Olah
Every vision model we've explored in detail contains neurons which detect curves.
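The claim above is about units whose response is tuned to curve segments of a preferred orientation. A toy numpy sketch of that tuning idea, using hand-rendered arc templates as stand-ins for learned filters (the sizes, radii, and the template construction here are illustrative assumptions, not the paper's actual InceptionV1 units):

```python
# Toy sketch of the curve-detector idea: a unit tuned to a curve shape
# responds most strongly when the input curve matches its preferred
# orientation. All parameters below are illustrative, not from the paper.
import numpy as np

def curve_template(size=21, radius=8.0, angle=0.0, width=1.5):
    """Render a small arc (curve segment) rotated by `angle` radians."""
    ys, xs = np.mgrid[0:size, 0:size] - (size - 1) / 2.0
    # rotate coordinates so the arc's orientation varies with `angle`
    c, s = np.cos(angle), np.sin(angle)
    xr, yr = c * xs + s * ys, -s * xs + c * ys
    # pixels near a circle of the given radius, centered below the patch
    dist = np.abs(np.hypot(xr, yr + radius) - radius)
    return (dist < width).astype(float)

def unit_response(image, preferred_angle):
    """Dot-product 'activation' of a unit tuned to one curve orientation."""
    return float(np.sum(image * curve_template(angle=preferred_angle)))

# A curve at 0 rad drives the 0-rad unit more than an orthogonal unit does.
stimulus = curve_template(angle=0.0)
aligned = unit_response(stimulus, 0.0)
orthogonal = unit_response(stimulus, np.pi / 2)
```

Sweeping `preferred_angle` over a stimulus set is the same kind of orientation-tuning-curve measurement the paper performs on real neurons.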
no code implementations • Distill 2020 • Nick Cammarata, Shan Carter, Gabriel Goh, Chris Olah, Michael Petrov, Ludwig Schubert, Chelsea Voss, Ben Egan, Swee Kiat Lim
To facilitate exploration of this direction, Distill is inviting a “thread” of short articles on circuits, interspersed with critical commentary by experts in adjacent fields.
1 code implementation • Distill 2019 • Shan Carter, Zan Armstrong, Ludwig Schubert, Ian Johnson, Chris Olah
By using feature inversion to visualize millions of activations from an image classification network, we create an explorable activation atlas of the features the network has learned, which can reveal how the network typically represents some concepts.
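The atlas pipeline just described can be sketched in a few steps: collect activation vectors, lay them out in 2D, and average the vectors falling into each grid cell (each averaged vector is what feature inversion would then render as an atlas image). A minimal sketch, assuming random vectors in place of real network activations and a plain PCA projection in place of the paper's UMAP layout:

```python
# Minimal activation-atlas sketch: random vectors stand in for network
# activations, and PCA replaces the paper's UMAP layout. Feature
# inversion of each cell's mean vector (not shown) would give the atlas.
import numpy as np

rng = np.random.default_rng(0)
acts = rng.normal(size=(1000, 64))        # stand-in activation vectors

# 1) Project activations to 2D via the top two principal directions.
centered = acts - acts.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
coords = centered @ vt[:2].T              # (1000, 2) layout positions

# 2) Bin the 2D layout into a grid and average activations per cell.
grid = 8
scaled = (coords - coords.min(axis=0)) / np.ptp(coords, axis=0) * grid
ix = np.clip(scaled.astype(int), 0, grid - 1)
cell_means = np.zeros((grid, grid, acts.shape[1]))
counts = np.zeros((grid, grid))
for (gx, gy), a in zip(ix, acts):
    cell_means[gx, gy] += a
    counts[gx, gy] += 1
nonempty = counts > 0
cell_means[nonempty] /= counts[nonempty][:, None]
```

Averaging within cells is the design choice that makes the atlas tractable: one inversion per cell instead of one per example.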
1 code implementation • Distill 2018 • Chris Olah, Arvind Satyanarayan, Ian Johnson, Shan Carter, Ludwig Schubert, Katherine Ye, Alexander Mordvintsev
In this article, we treat existing interpretability methods as fundamental and composable building blocks for rich user interfaces.
no code implementations • 12 Aug 2017 • Daniel Smilkov, Shan Carter, D. Sculley, Fernanda B. Viégas, Martin Wattenberg
The recent successes of deep learning have led to a wave of interest from non-experts.