# Six Types Of Regression | Detailed Explanation

## Introduction

In this article we will learn about:

- What is Regression
- Types of Regression

## What is Regression?

If you are a student, Machine Learning developer, or Data Science practitioner, you have probably heard about Regression; in this article we will look at exactly what it means. Regression can be defined as a statistical method that models, or attempts to model, the relationship between one variable and one or more other variables. It is generally used to estimate the effect of one variable on another by establishing such a relationship.

## Types of Regression

1. Linear Regression
2. Ordinal Regression
3. Polynomial Regression
4. Logistic Regression
5. Ridge Regression
6. Principal Components Regression

### Linear Regression

Linear Regression is one of the simplest forms of regression and is commonly used for predictive analysis. In this technique, the relationship between the dependent variable and the independent variable is assumed to be linear; that is, the model relates a scalar response to one or more explanatory (independent) variables. Linear regression is usually used for forecasting or prediction, and the model is most often fitted using the least-squares approach.
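
As a minimal sketch, simple linear regression with one predictor has a closed-form least-squares solution; the data points below are illustrative:

```python
# Minimal sketch: simple linear regression y = intercept + slope * x,
# fitted with the closed-form least-squares solution (one predictor only).

def fit_simple_linear(xs, ys):
    """Return (intercept, slope) minimizing the sum of squared residuals."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # slope = covariance(x, y) / variance(x)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sxx = sum((x - mean_x) ** 2 for x in xs)
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return intercept, slope

# Usage: points lying exactly on y = 1 + 2x recover those coefficients.
b0, b1 = fit_simple_linear([0, 1, 2, 3], [1, 3, 5, 7])
```

In practice a library solver handles multiple predictors, but the one-variable formula shows what "least squares" actually computes.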

### Ordinal Regression

Ordinal Regression is a regression method used to predict ranked values; in simple words, it is a type of regression where the dependent variable is ordinal, i.e. its values have a natural order. Typical examples of ordinal regression models are the ordered logit and the ordered probit.
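
To make the ordered-logit idea concrete, here is an illustrative sketch of its prediction step only; the threshold and coefficient values are assumed (as if already fitted), since real libraries estimate them by maximum likelihood:

```python
import math

# Sketch of the ordered-logit prediction step. The model says
# P(y <= k) = sigmoid(threshold_k - x * beta); category probabilities are
# differences of consecutive cumulative probabilities. Thresholds and beta
# below are illustrative, not estimated.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def predict_ordinal(x, beta, thresholds):
    """Return the most probable ordered category 0..K for a 1-D input x."""
    score = x * beta
    cums = [sigmoid(t - score) for t in thresholds] + [1.0]
    probs = [cums[0]] + [cums[k] - cums[k - 1] for k in range(1, len(cums))]
    return max(range(len(probs)), key=lambda k: probs[k])

# Usage: with thresholds at -1 and 1, a score near zero lands in the
# middle rank, while a large score lands in the top rank.
rank = predict_ordinal(0.0, 1.0, [-1.0, 1.0])
```

The key difference from plain multiclass classification is that the thresholds enforce the ordering of the categories.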

### Polynomial Regression

Polynomial Regression is a type of regression used to fit a nonlinear relationship by adding polynomial terms of the predictors as features. It is a modification of the linear model that can fit complicated, nonlinear functions while remaining linear in its coefficients, so the model can still be fitted using the least-squares method.
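
As a sketch, fitting a degree-2 polynomial is ordinary least squares on the expanded features [1, x, x²]; the coefficients solve the normal equations (XᵀX)w = Xᵀy. A tiny Gaussian elimination stands in here for a linear-algebra library:

```python
# Sketch: degree-2 polynomial regression via the normal equations.

def solve(A, b):
    """Solve A w = b by Gauss-Jordan elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(n):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * c for a, c in zip(M[r], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)]

def fit_poly2(xs, ys):
    """Return coefficients (c0, c1, c2) of c0 + c1*x + c2*x^2."""
    X = [[1.0, x, x * x] for x in xs]                       # expanded features
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(3)] for i in range(3)]
    Xty = [sum(r[i] * y for r, y in zip(X, ys)) for i in range(3)]
    return solve(XtX, Xty)

# Usage: points on y = 2 - x + 3x^2 recover those coefficients exactly.
xs = [-2, -1, 0, 1, 2]
coef = fit_poly2(xs, [2 - x + 3 * x * x for x in xs])
```

Note that the model is nonlinear in x but linear in (c0, c1, c2), which is why least squares still applies.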

### Logistic Regression

Logistic regression applies when the dependent variable is categorical, most commonly binary; when the outcome is dichotomous, logistic regression is the standard tool for the analysis. It is used in many fields, from machine learning to medicine, and comes in binomial, multinomial, and ordinal variants.
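
A minimal sketch of the binary case, fitted with plain gradient ascent on the log-likelihood (the data, learning rate, and step count are illustrative; production code uses a library solver):

```python
import math

# Sketch: binary logistic regression for a single input, fitted by
# stochastic gradient ascent on the log-likelihood.

def fit_logistic(xs, ys, lr=0.1, steps=5000):
    """Return (bias, weight) for the model P(y=1|x) = sigmoid(b + w*x)."""
    b, w = 0.0, 0.0
    for _ in range(steps):
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b + w * x)))  # predicted P(y=1)
            b += lr * (y - p)            # gradient of log-likelihood wrt b
            w += lr * (y - p) * x        # gradient of log-likelihood wrt w
    return b, w

# Usage: negatives sit below 0, positives above; the fitted model should
# assign a high probability of class 1 to a clearly positive input.
b, w = fit_logistic([-2, -1, 1, 2], [0, 0, 1, 1])
prob_pos = 1.0 / (1.0 + math.exp(-(b + w * 1.5)))
```

Unlike linear regression, the output is a probability between 0 and 1, which is what makes it suitable for dichotomous outcomes.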

### Ridge Regression

Ridge regression is a technique for analyzing multiple regression data that suffer from multicollinearity, i.e. strong correlation between explanatory variables. It belongs to the family of L2 regularization: a penalty proportional to the sum of the squared coefficients is added to the least-squares objective, which shrinks the coefficients and stabilizes the estimates. The important practical question when applying this method is how strongly to penalize, i.e. how to choose the regularization parameter.
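
The shrinkage effect is easiest to see in one dimension without an intercept, where ridge has a closed form: the penalty simply adds to the denominator of the least-squares slope. The data and penalty value below are illustrative:

```python
# Sketch: one-dimensional ridge regression (no intercept).
# Objective: sum((y - w*x)^2) + lam * w^2, minimized by w = Sxy / (Sxx + lam).

def ridge_slope(xs, ys, lam):
    """Closed-form ridge solution for a single feature with penalty lam."""
    sxy = sum(x * y for x, y in zip(xs, ys))
    sxx = sum(x * x for x in xs)
    return sxy / (sxx + lam)

xs, ys = [1, 2, 3], [2, 4, 6]          # data lying exactly on y = 2x
w_ols = ridge_slope(xs, ys, 0.0)       # lam = 0 recovers ordinary least squares
w_ridge = ridge_slope(xs, ys, 14.0)    # a positive penalty shrinks the slope
```

With lam = 0 the ordinary least-squares slope of 2 comes back; any positive lam pulls the slope toward zero, trading a little bias for lower variance.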

### Principal Components Regression

Principal Components Regression (PCR) is considered when you have many independent variables and those variables are correlated: new features (principal components) are extracted from the original ones, and the regression is run on the components instead. It is mainly used when the data exhibit multicollinearity; in that case the least-squares estimates are unbiased, but their variances are large, and regressing on a few principal components trades a little bias for much lower variance.
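
A small sketch of the idea with two perfectly collinear predictors: find the first principal component of the centered features by power iteration, project the rows onto it, then run simple least squares on the single component score. The data are illustrative:

```python
import math

# Sketch of principal components regression (PCR) with two predictors.

def first_pc(rows):
    """Means and dominant eigenvector of the 2x2 covariance (power iteration)."""
    n = len(rows)
    means = [sum(r[j] for r in rows) / n for j in range(2)]
    X = [[r[j] - means[j] for j in range(2)] for r in rows]
    C = [[sum(a[i] * a[j] for a in X) / n for j in range(2)] for i in range(2)]
    v = [1.0, 0.5]
    for _ in range(100):
        v = [C[0][0] * v[0] + C[0][1] * v[1], C[1][0] * v[0] + C[1][1] * v[1]]
        norm = math.hypot(v[0], v[1])
        v = [v[0] / norm, v[1] / norm]
    return means, v

rows = [[0, 0], [1, 1], [2, 2], [3, 3]]   # second feature duplicates the first
ys = [0, 2, 4, 6]                          # y = 2 * x1
means, v = first_pc(rows)
# Project each row onto the first principal component:
scores = [(r[0] - means[0]) * v[0] + (r[1] - means[1]) * v[1] for r in rows]
# Simple least squares of y on the single component score:
my = sum(ys) / len(ys)
slope = sum(s * (y - my) for s, y in zip(scores, ys)) / sum(s * s for s in scores)
pred = my + slope * scores[0]              # fitted value for the first row
```

Because the two predictors carry the same information, one component captures all of it; ordinary least squares on both columns would face a singular (or near-singular) design matrix instead.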

The above were some of the regression types used in practice. Note that there are many other types of regression that may be chosen depending on the data at hand and the output you want. Whether you use Linear, Ordinal, Polynomial, Logistic, Ridge regression, or PCR, you should understand the application and, most importantly, the use cases of each: what type of data you have been given and what output you need. Applying them to your own problems will then show you which one is the most efficient for you in terms of both resource consumption and model accuracy.