@CodeKiwi
Created June 24, 2014 04:37

gradientdescentforlinearregression

class LinearRegression

  def initialize(options)
    @training_set = options['training_set']
    @theta0 = options['theta0start']
    @theta1 = options['theta1start']
    @alpha = options['alpha']
  end

  # Repeat the simultaneous update of theta0 and theta1 until both stop changing.
  def minimize_theta
    converged = false
    until converged
      temp_theta0 = perform_gradient_descent_for_theta0(@theta0)
      temp_theta1 = perform_gradient_descent_for_theta1(@theta1)
      # Exact float equality rarely holds, so treat a very small change as convergence.
      if (temp_theta0 - @theta0).abs < 1e-9 && (temp_theta1 - @theta1).abs < 1e-9
        converged = true
      end
      @theta0 = temp_theta0
      @theta1 = temp_theta1
    end
    [@theta0, @theta1]
  end

  # One gradient descent step for theta0: theta := theta - alpha * dJ/dtheta0
  def perform_gradient_descent_for_theta0(theta)
    theta - @alpha * calculate_derivative_for_theta0(theta, @theta1)
  end

  # One gradient descent step for theta1: theta := theta - alpha * dJ/dtheta1
  def perform_gradient_descent_for_theta1(theta)
    theta - @alpha * calculate_derivative_for_theta1(@theta0, theta)
  end

  # Partial derivative of the cost with respect to theta0:
  # dJ/dtheta0 = (1/m) * sum((theta0 + theta1 * x) - y)
  def calculate_derivative_for_theta0(theta0, theta1)
    errors = @training_set.map { |pair| (theta0 + theta1 * pair['x']) - pair['y'] }
    errors.inject(:+) / @training_set.size.to_f
  end

  # Partial derivative of the cost with respect to theta1:
  # dJ/dtheta1 = (1/m) * sum(((theta0 + theta1 * x) - y) * x)
  def calculate_derivative_for_theta1(theta0, theta1)
    errors = @training_set.map { |pair| ((theta0 + theta1 * pair['x']) - pair['y']) * pair['x'] }
    errors.inject(:+) / @training_set.size.to_f
  end

  # Squared-error cost over the training set (useful for monitoring progress):
  # J(theta0, theta1) = sum(((theta0 + theta1 * x) - y)^2)
  def cost_function(theta0, theta1)
    sum_of_squared_errors = 0.0
    @training_set.each do |training_pair|
      error = (theta0 + theta1 * training_pair['x']) - training_pair['y']
      sum_of_squared_errors += error ** 2
    end
    sum_of_squared_errors
  end
end
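
A minimal usage sketch (not part of the original gist): it assumes the training set is an array of hashes keyed by 'x' and 'y', as cost_function reads them, and uses made-up points on the line y = 2x + 1 with illustrative starting values and learning rate.

# Illustrative example: fit points that lie on y = 2x + 1.
training_set = [
  { 'x' => 0.0, 'y' => 1.0 },
  { 'x' => 1.0, 'y' => 3.0 },
  { 'x' => 2.0, 'y' => 5.0 }
]

regression = LinearRegression.new({
  'training_set' => training_set,
  'theta0start'  => 0.0,
  'theta1start'  => 0.0,
  'alpha'        => 0.1
})

theta0, theta1 = regression.minimize_theta
puts "theta0 = #{theta0}, theta1 = #{theta1}"   # should approach 1.0 and 2.0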