I wrote an article titled “Kernel Logistic Regression Using C#” in the November 2017 issue of MSDN Magazine. See https://msdn.microsoft.com/en-us/magazine/mt845620.

The goal of kernel logistic regression (KLR) is to make a binary prediction — a scenario where there are just two possible outcomes. For example, KLR could predict if a person will repay a loan (fail to repay = 0, successfully repay = 1) based on predictor variables such as age, income and existing debt amount. KLR is an advanced variation of ordinary logistic regression.

Ordinary logistic regression is very simple but can only handle data that is "linearly separable", meaning you can conceptually draw a straight line that separates the two classes to predict. KLR extends ordinary logistic regression so that it can handle non-linearly separable data, like the demo data I used in the article.

Kernel logistic regression is a fascinating topic. It uses something called a radial basis function (RBF) kernel. An RBF kernel is a clever function that measures the similarity (or, equivalently, the difference) between two numeric vectors. The RBF kernel returns a value between 0 and 1, where 1 means the two vectors are identical.
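The idea can be sketched in a few lines. This is a minimal illustration in Python rather than the article's C#; the function name and the sigma parameter (which controls how quickly similarity falls off with distance) are my notation, not necessarily the article's:

```python
import math

def rbf_kernel(v1, v2, sigma):
    # RBF similarity: exp( -||v1 - v2||^2 / (2 * sigma^2) )
    # Returns 1.0 when the vectors are identical, and approaches
    # 0.0 as the vectors become more different.
    sq_dist = sum((a - b) ** 2 for a, b in zip(v1, v2))
    return math.exp(-sq_dist / (2.0 * sigma * sigma))
```

For example, `rbf_kernel([1.0, 2.0], [1.0, 2.0], 1.0)` gives 1.0 because the two vectors are identical.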

KLR isn’t used very much, and I’m not quite sure why, because KLR is relatively simple to implement. The major downside to KLR is that it doesn’t scale well to very large datasets.
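The scaling problem is visible in how a KLR prediction is made: the trained model must store the training data, and making one prediction requires a kernel computation against every training item. Here is a hypothetical sketch in Python, assuming alpha weights and a bias produced by training (the names and parameters are illustrative, not from the article):

```python
import math

def rbf_kernel(v1, v2, sigma):
    # RBF similarity between two numeric vectors.
    sq_dist = sum((a - b) ** 2 for a, b in zip(v1, v2))
    return math.exp(-sq_dist / (2.0 * sigma * sigma))

def klr_predict(x, train_data, alphas, bias, sigma):
    # A KLR prediction sums one kernel evaluation per training
    # item -- this loop is why KLR scales poorly to huge datasets.
    z = bias
    for xi, ai in zip(train_data, alphas):
        z += ai * rbf_kernel(xi, x, sigma)
    # Logistic sigmoid maps the sum to a probability of class 1.
    return 1.0 / (1.0 + math.exp(-z))
```

With all alpha weights and the bias set to zero, the computed probability is 0.5, i.e. the model is completely undecided.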

*“Not Stable Kernel” – Dmitry Bitus*