Comparing neural network activation units: tanh, relu, sigmoid/softmax, softplus
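For reference, these are the standard definitions the script below implements (the sigmoid is the two-class special case of softmax):

\tanh(x) = \frac{e^{x} - e^{-x}}{e^{x} + e^{-x}}, \quad
\sigma(x) = \frac{1}{1 + e^{-x}}, \quad
\mathrm{relu}(x) = \max(0, x), \quad
\mathrm{softplus}(x) = \ln(1 + e^{x})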
import numpy as np
import matplotlib.pyplot as plt

x = np.arange(-2, 2, 0.1)

# Hyperbolic tangent
y_tanh = np.tanh(x)

# Sigmoid (the two-class case of softmax)
sigmoid = lambda t: 1 / (1 + np.exp(-t))
sigmoid_vec = np.vectorize(sigmoid, otypes=[float])  # np.float was removed in NumPy 1.24
y_sigmoid = sigmoid_vec(x)

# Rectified linear unit (ReLU)
relu = lambda t: float(t) if t > 0 else 0.0
relu_vec = np.vectorize(relu, otypes=[float])
y_relu = relu_vec(x)

# Softplus
softplus = lambda t: np.log(1 + np.exp(t))
softplus_vec = np.vectorize(softplus, otypes=[float])
y_softplus = softplus_vec(x)

plt.plot(x, y_tanh, x, y_sigmoid, x, y_relu, x, y_softplus)
plt.legend(['tanh', 'sigmoid', 'relu', 'softplus'])
plt.show()
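As a quick sanity check (a minimal sketch, not part of the original gist): the derivative of softplus is the sigmoid, so a finite-difference gradient of y_softplus should closely match y_sigmoid, assuming the arrays from the script above are still in scope.

# Numerical gradient of softplus (central differences in the interior)
numerical_grad = np.gradient(y_softplus, x)
# Maximum deviation from the sigmoid; should be small (finite-difference error only)
print(np.max(np.abs(numerical_grad - y_sigmoid)))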