{ "cells": [ { "cell_type": "markdown", "metadata": { "id": "view-in-github", "colab_type": "text" }, "source": [ "\"Open" ] }, { "cell_type": "markdown", "metadata": { "id": "2nUaq79P2PSd" }, "source": [ "## Calculating cost with gradient descent and learning rate\n", "- Change the iteration and learning rate values and observe the impact on cost.\n", "- Few iterations combined with a high learning rate (i.e. big steps) may overshoot and miss the minimum.\n", "- The goal is to reach the minimum cost in as few iterations as possible." ] }, { "cell_type": "code", "execution_count": null, "metadata": { "id": "fS49cDDW2PSe" }, "outputs": [], "source": [ "# code credit: codebasics https://codebasics.io/coming-soon\n", "\n", "import numpy as np\n", "\n", "def gradient_descent(x, y):\n", "    # Fit y = m*x + b by minimizing the mean squared error\n", "    m_curr = b_curr = 0\n", "    iterations = 100  # change value\n", "    n = len(x)\n", "    learning_rate = 0.08  # change value\n", "\n", "    for i in range(iterations):\n", "        y_predicted = m_curr * x + b_curr\n", "        cost = (1/n) * sum((y - y_predicted)**2)  # mean squared error\n", "        md = -(2/n) * sum(x * (y - y_predicted))  # partial derivative of cost w.r.t. m\n", "        bd = -(2/n) * sum(y - y_predicted)        # partial derivative of cost w.r.t. b\n", "        m_curr = m_curr - learning_rate * md\n", "        b_curr = b_curr - learning_rate * bd\n", "        print(\"m {}, b {}, cost {}, iteration {}\".format(m_curr, b_curr, cost, i))\n", "\n", "x = np.array([1, 2, 3, 4, 5])\n", "y = np.array([5, 7, 9, 11, 13])\n", "\n", "gradient_descent(x, y)" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "id": "ZM_oaGzK2PSf" }, "outputs": [], "source": [] } ], "metadata": { "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.7.6" }, "colab": { "provenance": [], "include_colab_link": true } }, "nbformat": 4, "nbformat_minor": 0 }