@motivic
Created October 30, 2016 16:13
Comparing neural network activation units: tanh, relu, sigmoid/softmax, softplus
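For reference, the four activations plotted below have their standard definitions: tanh(x) = (e^x − e^{−x}) / (e^x + e^{−x}), sigmoid(x) = 1 / (1 + e^{−x}), relu(x) = max(0, x), and softplus(x) = log(1 + e^x). Note that tanh maps into (−1, 1), the sigmoid into (0, 1), while relu and softplus are unbounded above.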
import numpy as np
import matplotlib.pyplot as plt

x = np.arange(-2, 2, 0.1)

# Hyperbolic tangent
y_tanh = np.tanh(x)

# Sigmoid/softmax: 1 / (1 + e^{-x}); np.exp operates elementwise on arrays
y_sigmoid = 1 / (1 + np.exp(-x))

# Rectified linear unit: max(0, x)
y_relu = np.maximum(x, 0)

# Softplus: log(1 + e^x), a smooth approximation of relu
y_softplus = np.log(1 + np.exp(x))

# Plot all four activations on the same axes
plt.plot(x, y_tanh, label='tanh')
plt.plot(x, y_sigmoid, label='sigmoid')
plt.plot(x, y_relu, label='relu')
plt.plot(x, y_softplus, label='softplus')
plt.legend()
plt.show()