Created October 16, 2025 12:01
Unit08 Ex4 gradient_descent_cost_function.ipynb
{
  "cells": [
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "view-in-github",
        "colab_type": "text"
      },
      "source": [
        "<a href=\"https://colab.research.google.com/gist/mattbullen/e6f91c776ab3ffca8f165c98f2de7fba/unit08-ex4-gradient_descent_cost_function.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "2nUaq79P2PSd"
      },
      "source": [
        "## Calculating cost with gradient descent and learning rate\n",
        "- Change the iteration and learning rate values and observe the impact on cost.\n",
        "- A low iteration count combined with a high learning rate (i.e. big steps) may overshoot and miss the global minimum.\n",
        "- The goal is to reach the minimum cost in as few iterations as possible."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "fS49cDDW2PSe"
      },
      "outputs": [],
      "source": [
        "# code credit: codebasics https://codebasics.io/coming-soon\n",
        "\n",
        "import numpy as np\n",
        "\n",
        "def gradient_descent(x, y):\n",
        "    m_curr = b_curr = 0\n",
        "    iterations = 100  # change value\n",
        "    n = len(x)\n",
        "    learning_rate = 0.08  # change value\n",
        "\n",
        "    for i in range(iterations):\n",
        "        y_predicted = m_curr * x + b_curr\n",
        "        # Mean squared error of the current fit\n",
        "        cost = (1/n) * sum([val**2 for val in (y - y_predicted)])\n",
        "        # Partial derivatives of the cost with respect to m and b\n",
        "        md = -(2/n) * sum(x * (y - y_predicted))\n",
        "        bd = -(2/n) * sum(y - y_predicted)\n",
        "        m_curr = m_curr - learning_rate * md\n",
        "        b_curr = b_curr - learning_rate * bd\n",
        "        print(\"m {}, b {}, cost {}, iteration {}\".format(m_curr, b_curr, cost, i))\n",
        "\n",
        "x = np.array([1, 2, 3, 4, 5])\n",
        "y = np.array([5, 7, 9, 11, 13])\n",
        "\n",
        "gradient_descent(x, y)"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "ZM_oaGzK2PSf"
      },
      "outputs": [],
      "source": []
    }
  ],
  "metadata": {
    "kernelspec": {
      "display_name": "Python 3",
      "language": "python",
      "name": "python3"
    },
    "language_info": {
      "codemirror_mode": {
        "name": "ipython",
        "version": 3
      },
      "file_extension": ".py",
      "mimetype": "text/x-python",
      "name": "python",
      "nbconvert_exporter": "python",
      "pygments_lexer": "ipython3",
      "version": "3.7.6"
    },
    "colab": {
      "provenance": [],
      "include_colab_link": true
    }
  },
  "nbformat": 4,
  "nbformat_minor": 0
}
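The notebook's markdown cell claims that a high learning rate may miss the minimum while a suitably small one converges. A minimal standalone sketch of the same algorithm illustrates this; the parameterized function signature and the specific learning-rate values (0.01 and 0.5) are my own choices for demonstration, not from the notebook:

```python
import numpy as np

def gradient_descent(x, y, iterations, learning_rate):
    """Fit y = m*x + b by gradient descent on the mean squared error."""
    m_curr = b_curr = 0.0
    n = len(x)
    cost = None
    for _ in range(iterations):
        y_predicted = m_curr * x + b_curr
        cost = np.mean((y - y_predicted) ** 2)
        # Partial derivatives of the MSE with respect to m and b
        md = -(2 / n) * np.sum(x * (y - y_predicted))
        bd = -(2 / n) * np.sum(y - y_predicted)
        m_curr -= learning_rate * md
        b_curr -= learning_rate * bd
    return m_curr, b_curr, cost

x = np.array([1, 2, 3, 4, 5], dtype=float)
y = np.array([5, 7, 9, 11, 13], dtype=float)  # exactly y = 2x + 3

# Small learning rate, many iterations: converges toward m = 2, b = 3.
m, b, cost = gradient_descent(x, y, iterations=10000, learning_rate=0.01)

# Large learning rate: each step overshoots the minimum and cost blows up.
m_bad, b_bad, cost_bad = gradient_descent(x, y, iterations=50, learning_rate=0.5)
```

With a learning rate of 0.01, ten thousand iterations recover the exact line `y = 2x + 3`; with 0.5, the updates overshoot on every step and the cost grows without bound, which is the behavior the exercise asks you to observe.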