Saturday, May 19, 2018

MATLAB program to implement the gradient descent algorithm to fit linear regression parameters

Data: fisherIrisSetosaData

Implementation of linear regression to predict sepal length given sepal width.



This program finds the regression parameters w0 and w1 by batch gradient descent. With these parameters, the output for a test instance can be obtained.
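At each iteration, batch gradient descent updates both parameters using all m training samples (a sketch of the standard update rule; here m = 50 and alpha is the learning rate):

w0 := w0 - (alpha/m) * sum( (w0 + w1*x(i)) - y(i) )
w1 := w1 - (alpha/m) * sum( ((w0 + w1*x(i)) - y(i)) * x(i) )

The loop below stops once both parameter changes fall below 1e-6.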
Code:



clc
clear all

fisherIrisSetosaData = importdata("fisherIrisSetosaData.txt");

Y = fisherIrisSetosaData(1:end,1);%sepal length
X = fisherIrisSetosaData(1:end,2);%sepal Width

w0_old = 0;     % initial intercept
w1_old = 0;     % initial slope
w0diff = 1;     % parameter changes, start above the threshold
w1diff = 1;
alpha = 0.1;    % learning rate
count = 0;      % iteration counter
tic;
while (w0diff > 0.000001 || w1diff > 0.000001) % stop when both changes fall below 1e-6
        count = count + 1;
        sum_w0 = 0;
        sum_w1 = 0;
        for i = 1:50
            % accumulate gradient terms over all 50 training samples
            sum_w0 = sum_w0 + ((w0_old + w1_old.*X(i,1)) - Y(i,1));
            sum_w1 = sum_w1 + ((w0_old + w1_old.*X(i,1)) - Y(i,1)).*X(i,1);
        end
        % Vectorized equivalents of the loop above:
        % sum_w0 = sum((w0_old + w1_old.*X) - Y);
        % sum_w1 = sum(((w0_old + w1_old.*X) - Y).*X);
        cost_w0 = alpha/50 * sum_w0;   % update step for w0 (alpha times the averaged gradient)
        cost_w1 = alpha/50 * sum_w1;   % update step for w1
        w0_new = w0_old - cost_w0;
        w1_new = w1_old - cost_w1;
        w0diff = abs(w0_old - w0_new); % change in each parameter this iteration
        w1diff = abs(w1_old - w1_new);
        w0_old = w0_new;
        w1_old = w1_new;
end

time_descent = toc;


fprintf('Values of parameters w0 and w1 with alpha = %f:\n', alpha);
fprintf('w0 = %f\n', w0_old);
fprintf('w1 = %f\n', w1_old);

plot(X,Y,"g*");
hold on
calculated_y = w0_old+w1_old.* X;
plot(X,calculated_y,'r')
legend("real data","calculated line",'Location','NorthWest');

xlabel("SepalWidth");
ylabel("Sepal Length");
title("linear regression with gradient descent");
fprintf("Solving linear function takes %f seconds.\n",time_descent);
fprintf("Total number of Iteration  = %d.\n",count);
disp(count)
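Once the loop converges, the fitted line can be used to predict the output for a test instance. A minimal sketch (the sepal width of 3.2 is just a made-up example value):

test_width = 3.2;                               % hypothetical test sepal width
predicted_length = w0_old + w1_old*test_width;  % prediction from the fitted line
fprintf("Predicted sepal length for width %.1f = %f\n", test_width, predicted_length);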


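As the commented-out lines inside the loop suggest, the gradient sums can also be computed without the inner for loop. A minimal vectorized sketch of one update step (same math as the loop version, assuming X and Y are the 50-by-1 column vectors loaded above):

err = (w0_old + w1_old.*X) - Y;     % residuals for all 50 samples
sum_w0 = sum(err);
sum_w1 = sum(err.*X);
w0_new = w0_old - alpha/50*sum_w0;
w1_new = w1_old - alpha/50*sum_w1;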
