Revisions

  1. mattbullen revised this gist Oct 16, 2025. 1 changed file with 12 additions and 1 deletion.
    13 changes: 12 additions & 1 deletion unit08-ex4-gradient_descent_cost_function.ipynb
@@ -1,5 +1,15 @@
 {
 "cells": [
+{
+"cell_type": "markdown",
+"metadata": {
+"id": "view-in-github",
+"colab_type": "text"
+},
+"source": [
+"<a href=\"https://colab.research.google.com/gist/mattbullen/e6f91c776ab3ffca8f165c98f2de7fba/unit08-ex4-gradient_descent_cost_function.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
+]
+},
 {
 "cell_type": "markdown",
 "metadata": {
@@ -74,7 +84,8 @@
 "version": "3.7.6"
 },
 "colab": {
-"provenance": []
+"provenance": [],
+"include_colab_link": true
 }
 },
 "nbformat": 4,
  2. mattbullen created this gist Oct 16, 2025.
    82 changes: 82 additions & 0 deletions unit08-ex4-gradient_descent_cost_function.ipynb
    @@ -0,0 +1,82 @@
    {
    "cells": [
    {
    "cell_type": "markdown",
    "metadata": {
    "id": "2nUaq79P2PSd"
    },
    "source": [
    "## Calculating cost with gradient descent and learning rate\n",
    "- Change the iteration and learning rate vaules and see the impact on cost.\n",
    "- Low iteration values with high learning rate (i.e. big steps) may lead to miss the global minimum\n",
    "- Goal is to reach minimum cost with minimum iteration"
    ]
    },
    {
    "cell_type": "code",
    "execution_count": null,
    "metadata": {
    "id": "fS49cDDW2PSe"
    },
    "outputs": [],
    "source": [
    "# code credit:codebasics https://codebasics.io/coming-soon\n",
    "\n",
    "import numpy as np\n",
    "\n",
    "def gradient_descent(x,y):\n",
    " m_curr = b_curr = 0\n",
    " iterations = 100 #change value\n",
    " n = len(x)\n",
    " learning_rate = 0.08 #change value\n",
    "\n",
    " for i in range(iterations):\n",
    " y_predicted = m_curr * x + b_curr\n",
    " cost = (1/n) * sum([val**2 for val in (y-y_predicted)])\n",
    " md = -(2/n)*sum(x*(y-y_predicted))\n",
    " bd = -(2/n)*sum(y-y_predicted)\n",
    " m_curr = m_curr - learning_rate * md\n",
    " b_curr = b_curr - learning_rate * bd\n",
    " print (\"m {}, b {}, cost {} iteration {}\".format(m_curr,b_curr,cost, i))\n",
    "\n",
    "x = np.array([1,2,3,4,5])\n",
    "y = np.array([5,7,9,11,13])\n",
    "\n",
    "gradient_descent(x,y)"
    ]
    },
    {
    "cell_type": "code",
    "execution_count": null,
    "metadata": {
    "id": "ZM_oaGzK2PSf"
    },
    "outputs": [],
    "source": []
    }
    ],
    "metadata": {
    "kernelspec": {
    "display_name": "Python 3",
    "language": "python",
    "name": "python3"
    },
    "language_info": {
    "codemirror_mode": {
    "name": "ipython",
    "version": 3
    },
    "file_extension": ".py",
    "mimetype": "text/x-python",
    "name": "python",
    "nbconvert_exporter": "python",
    "pygments_lexer": "ipython3",
    "version": "3.7.6"
    },
    "colab": {
    "provenance": []
    }
    },
    "nbformat": 4,
    "nbformat_minor": 0
    }
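
The markdown cell in the notebook suggests changing the iteration count and learning rate and watching the impact on cost. As a quick way to try that outside the notebook, here is a minimal standalone sketch that wraps the gist's update rule in a helper and runs it with three learning rates; the helper name run_gd and the 0.01 and 0.1 rates are illustrative choices, not part of the gist.

import numpy as np

def run_gd(x, y, learning_rate, iterations=100):
    # Same update rule as the notebook: start at m = b = 0 and step against
    # the gradient of the mean squared error cost.
    m_curr = b_curr = 0.0
    n = len(x)
    for _ in range(iterations):
        y_predicted = m_curr * x + b_curr
        cost = (1 / n) * np.sum((y - y_predicted) ** 2)
        md = -(2 / n) * np.sum(x * (y - y_predicted))
        bd = -(2 / n) * np.sum(y - y_predicted)
        m_curr -= learning_rate * md
        b_curr -= learning_rate * bd
    return m_curr, b_curr, cost

x = np.array([1, 2, 3, 4, 5])
y = np.array([5, 7, 9, 11, 13])  # generated from y = 2x + 3

for lr in (0.01, 0.08, 0.1):
    m, b, cost = run_gd(x, y, lr)
    print("learning_rate {}: m {:.3f}, b {:.3f}, cost {:.6f}".format(lr, m, b, cost))

With these points the exact fit is m = 2, b = 3. The two smaller rates steadily shrink the cost, while the 0.1 rate makes the steps overshoot so the cost grows, which is the "big steps" behaviour the markdown cell warns about.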