Shifted Softplus is an activation function ${\rm ssp}(x) = \ln(0.5\, e^{x} + 0.5)$, equivalently $\mathrm{softplus}(x) - \ln 2$, which SchNet employs as the non-linearity throughout the network in order to obtain a smooth potential energy surface. The shift ensures that ${\rm ssp}(0) = 0$ and improves the convergence of the network. The activation is similar to ELUs, but it is infinitely differentiable.
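A minimal PyTorch sketch of the activation, using the identity ${\rm ssp}(x) = \mathrm{softplus}(x) - \ln 2$ for numerical stability (the function name `shifted_softplus` is illustrative, not SchNet's reference implementation):

```python
import math
import torch
import torch.nn.functional as F

def shifted_softplus(x: torch.Tensor) -> torch.Tensor:
    # ssp(x) = ln(0.5 * e^x + 0.5) = softplus(x) - ln(2).
    # F.softplus is numerically stable for large |x|, unlike
    # exponentiating x directly, which can overflow.
    return F.softplus(x) - math.log(2.0)

# Sanity check: ssp(0) = 0, and ssp(x) approaches x - ln(2) for large x.
x = torch.tensor([-10.0, 0.0, 10.0])
print(shifted_softplus(x))  # ~ [-0.6931, 0.0000, 9.3069]
```

Computing `softplus(x) - log(2)` rather than evaluating $\ln(0.5\, e^{x} + 0.5)$ directly is the usual design choice, since `F.softplus` handles large inputs without overflow.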
Source: SchNet: A continuous-filter convolutional neural network for modeling quantum interactions
| Task | Papers | Share |
|---|---|---|
| Formation Energy | 3 | 33.33% |
| Property Prediction | 2 | 22.22% |
| BIG-bench Machine Learning | 2 | 22.22% |
| Drug Discovery | 1 | 11.11% |
| Total Energy | 1 | 11.11% |