Growing Cosine Unit (GCU)

The Growing Cosine Unit is an oscillatory activation function defined as $f(x) = x \cdot \cos(x)$ that is reported to outperform Sigmoid, Mish, Swish, and ReLU on several benchmarks.

Source: Growing Cosine Unit: A Novel Oscillatory Activation Function That Can Speedup Training and Reduce Parameters in Convolutional Neural Networks
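The activation and its derivative can be sketched as follows (a minimal illustration in NumPy; the derivative follows from the product rule, and the function names are illustrative, not from the paper's code):

```python
import numpy as np

def gcu(x):
    """Growing Cosine Unit: f(x) = x * cos(x).

    Unlike ReLU or Sigmoid, this function is oscillatory and
    non-monotonic, crossing zero wherever cos(x) = 0 or x = 0.
    """
    return x * np.cos(x)

def gcu_grad(x):
    """Derivative by the product rule: f'(x) = cos(x) - x * sin(x)."""
    return np.cos(x) - x * np.sin(x)

# The function passes through the origin with unit slope,
# so it behaves like the identity for small inputs.
x = np.linspace(-5.0, 5.0, 101)
y = gcu(x)
```

Because $f'(0) = 1$, GCU acts approximately linearly near zero, while its oscillations let a single neuron produce multiple decision-boundary crossings, which the paper argues can reduce the number of parameters needed.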

Tasks

| Task | Papers | Share |
| --- | --- | --- |
| Feature Engineering | 1 | 100.00% |

Categories

Activation Functions