Logistic Regression
@rishabhjain, created May 11, 2013
require 'matrix'

def online_gradient_descent(sample, label, weight, learning_rate = 0.1)
  # Expecting sample and weight to be Vectors of equal length,
  # label to be from {1, -1}, and learning_rate to be a Float.
  # Single SGD step on the logistic loss log(1 + exp(-label * score)):
  # its gradient is -label * sample / (1 + exp(label * score)), so stepping
  # against it adds a scaled copy of the sample to the weight vector.
  weight + sample * (learning_rate * label / (1 + Math.exp(label * sample.inner_product(weight))))
end
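
# Illustrative single step (hypothetical numbers, not from the run below):
# starting from zero weights, exp(0) = 1 makes the scaling factor
# learning_rate / 2, so a positively labelled sample pulls the weights
# towards itself.
#   online_gradient_descent(Vector[1, 1], 1, Vector[0, 0], 0.1)
#   # => Vector[0.05, 0.05]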
def prediction(weight, sample)
  # Sigmoid of the linear score: P(label = 1 | sample).
  # (The original exponentiated the score twice by mistake.)
  score = weight.inner_product(sample)
  1.0 / (1 + Math.exp(-score))
end
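
# Illustrative usage (hypothetical weights, not the ones learned below):
#   prediction(Vector[2.0, -1.0], Vector[1, 1])
#   # => 0.7310585786300049, the sigmoid of a score of 1.0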
def stochastic_gradient_descent(samples, labels, weight = nil, learning_rate = 0.1, epochs = 5, shuffle = true)
  # Expecting samples to be an N x M Matrix (N samples, M features) and
  # labels to be a Vector of N entries from {1, -1}.
  weight ||= Vector.elements([0] * samples.column_count) # zero init; small random values also work
  indices = (0...samples.row_count).to_a
  epochs.times do
    # Shuffle the visit order rather than the sample matrix itself, so each
    # sample stays paired with its label (shuffling only the rows of the
    # matrix, as the original did, breaks that correspondence).
    indices.shuffle! if shuffle
    indices.each do |row_index|
      weight = online_gradient_descent(samples.row(row_index), labels[row_index], weight, learning_rate)
    end
  end
  weight
end
samples = Matrix[[1, 1], [0, 1], [0, 0], [1, 0]]
labels = Vector[1, -1, -1, 1] # the label is 1 exactly when the first feature is 1
p stochastic_gradient_descent(samples, labels)
# Regularization
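
# A minimal sketch of how an L2 penalty could be folded into the update.
# This helper and the `lambda_reg` strength are assumptions, not part of
# the original gist: the penalty (lambda / 2) * ||w||^2 contributes
# lambda * w to the gradient, shrinking the weights towards zero each step.
def online_gradient_descent_l2(sample, label, weight, learning_rate = 0.1, lambda_reg = 0.01)
  margin = label * sample.inner_product(weight)              # label * (w . x)
  loss_gradient = sample * (-label / (1 + Math.exp(margin))) # logistic-loss part
  weight - (loss_gradient + weight * lambda_reg) * learning_rate
end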