Visualizing Loss Surface Of A Regression Line

This article shows how a point on the loss surface is equivalent to a regression line between X and y. The example used here is a simple linear regression model between two variables: sunshine (in hours) and attendance (in thousands).
# import libraries
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd

# import the dataset
dataset = pd.read_csv('sunshine.csv')

# check the data
dataset.head()

# check correlation between the dependent and
# independent variables
dataset.corr()

# assign columns to X and y
X = dataset.iloc[:, [0]].values
y = dataset.iloc[:, 1].values
print(X.shape)
print(y.shape)

# check the scatter plot
plt.scatter(X, y)
plt.xlabel("Sunshine in hrs")
plt.ylabel("Attendance in '000s")
plt.title("Sunshine vs Attendance")
plt.show()

# create a LinearRegression model
from sklearn.linear_model import LinearRegression
model = LinearRegression()
model.fit(X, y)
print(model.coef_)
print(model.intercept_)

# draw the predicted line
plt.scatter(X, y)
plt.plot(X, model.predict(X))
plt.xlabel("Sunshine in hrs")
plt.ylabel("Attendance in '000s")
plt.title("Sunshine vs Attendance")
plt.show()
 
 
Now, the best-fit line has a loss, defined as the least sum of squared errors, i.e. the L2 loss, which has the formula

min Σ(actual y − predicted y)²
 
So for the fitted coefficient of 5.45 we can compute this loss from the model's predictions.

Let's plot this loss against the coefficient, side by side with our regression line:
# compute predictions and the L2 loss for the fitted coefficient
ypred = model.predict(X)
loss = sum((y - ypred)**2)
plt.scatter(model.coef_, loss)
plt.xlabel('w')
plt.ylabel('loss')
plt.show()
 
Now, let's vary the coefficient over the range 2.5 to 9 and plot the different lines that we get.
 
For each coefficient you get a line and a corresponding loss, so each loss point on the LHS figure is actually a regression line on the RHS figure. We have ignored the bias/intercept so far in this visualization.
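
Here is a minimal sketch of that side-by-side plot, assuming the X, y, and model objects from above. The 0.5 step size is an assumption, and the bias is held fixed at the fitted intercept rather than varied (set b = 0 if you want to ignore it entirely):

# sketch: loss vs coefficient (LHS) and the corresponding lines (RHS)
coefs = np.arange(2.5, 9.0, 0.5)   # assumed step size
b = model.intercept_               # bias held fixed, not varied
fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
for w in coefs:
    ypred = w * X.ravel() + b
    loss = np.sum((y - ypred) ** 2)
    ax1.scatter(w, loss, color='b')   # one loss point per coefficient...
    ax2.plot(X, ypred)                # ...and one regression line
ax2.scatter(X, y)
ax1.set_xlabel('w')
ax1.set_ylabel('loss')
ax2.set_xlabel("Sunshine in hrs")
ax2.set_ylabel("Attendance in '000s")
plt.show()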
 
 

Plotting L2 loss

 
Now suppose we also vary the bias: we get the surface below. The L2 loss function is quadratic in nature, hence we get a bowl-shaped curve.
slope = np.arange(2.5, 7.5, 0.5)
bias = np.arange(13.2, 18, 0.5)
w0, w1 = np.meshgrid(slope, bias)
# predictions for every (slope, bias) pair, shape (grid, grid, n)
ypred = w0[..., None] * X.ravel() + w1[..., None]
# sum squared errors over the data points: one loss value per pair
loss = np.sum((y - ypred) ** 2, axis=-1)
fig = plt.figure()
ax = fig.add_subplot(projection='3d')
ax.plot_surface(w0, w1, loss, cmap='viridis', edgecolor='none')
ax.set_xlabel('Slope')
ax.set_ylabel('Bias')
ax.set_zlabel('Loss')
ax.set_title('Loss surface')
plt.show()
 
Geometrically, the loss function is convex, as shown above.
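
To see why, note that each squared term is convex in (w0, w1), and a sum of convex functions is convex; equivalently, the Hessian of the loss is positive semidefinite:

L(w_0, w_1) = \sum_i (y_i - w_0 x_i - w_1)^2,
\qquad
\nabla^2 L = 2 \sum_i \begin{pmatrix} x_i^2 & x_i \\ x_i & 1 \end{pmatrix} \succeq 0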
 

Plotting L1 Loss

 
Similarly, you can plot the L1 loss, which is abs(y − ypred). There is no quadratic term here, so how does the geometry of this loss function look? It is V-shaped, as the sketch below shows.
You can visualize the other loss functions in the same way.
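
Here is a minimal sketch of the L1 surface, reusing the slope/bias grid from the L2 plot (the grid ranges are assumptions, as above):

# sketch: L1 loss surface over the same slope/bias grid
slope = np.arange(2.5, 7.5, 0.5)
bias = np.arange(13.2, 18, 0.5)
w0, w1 = np.meshgrid(slope, bias)
ypred = w0[..., None] * X.ravel() + w1[..., None]
# sum absolute errors over the data points: one loss value per pair
l1_loss = np.sum(np.abs(y - ypred), axis=-1)
fig = plt.figure()
ax = fig.add_subplot(projection='3d')
ax.plot_surface(w0, w1, l1_loss, cmap='viridis', edgecolor='none')
ax.set_xlabel('Slope')
ax.set_ylabel('Bias')
ax.set_zlabel('L1 loss')
ax.set_title('L1 loss surface')
plt.show()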
I have made a video on this topic and uploaded it here.
 
The code is also uploaded.

