We consider the problem of learning function classes computed by neural networks with various activations (e.g., ReLU or sigmoid), a task believed to be computationally intractable in the worst case. A major open problem is to understand the minimal assumptions under which these classes admit provably efficient algorithms...