The Happy Optimist

Video time! Discretization methods on quadratic and logistic regression

Hi all, this is another “tiny project” that I’ve been wanting to do for a while. Lately, I’ve been somewhat obsessed with gradient flows and discretization methods. While we can sit here and write integrals until our faces turn blue, I think it’s time to just simulate some stuff and see what happens. In this post I explore four problems:

Problems

The four problems, matching the sections below, are:

- quadratic
- normalized quadratic
- well separated logistic regression
- poorly separated logistic regression

The methods include explicit and implicit Euler discretizations of the gradient flow $\dot{x} = -\nabla f(x)$. Here, an approximate implicit method solves the implicit update with 1 inner gradient step, and a full implicit method uses 100 gradient steps.
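To make “1 gradient step vs. 100 gradient steps” concrete, here is a minimal sketch of how an implicit Euler step $x_{k+1} = x_k - h\nabla f(x_{k+1})$ can be solved by gradient descent on the proximal objective. This is my own reconstruction, not the post’s actual code; the function names and the inner step size `inner_lr` are assumptions.

```python
import numpy as np

def explicit_euler_step(grad_f, x, h):
    """Explicit Euler on the gradient flow = plain gradient descent:
    x+ = x - h * grad_f(x)."""
    return x - h * grad_f(x)

def implicit_euler_step(grad_f, x, h, inner_steps, inner_lr=0.1):
    """Implicit Euler step x+ = x - h * grad_f(x+), solved by running
    gradient descent on the proximal objective
        g(z) = f(z) + ||z - x||^2 / (2h),
    whose gradient is grad_f(z) + (z - x) / h.
    inner_steps=1 gives the 'approximate implicit' method,
    inner_steps=100 the 'full implicit' method."""
    z = x.copy()
    for _ in range(inner_steps):
        z = z - inner_lr * (grad_f(z) + (z - x) / h)
    return z
```

On $f(x) = \tfrac{1}{2}\|x\|^2$ the implicit step has the closed form $x_{k+1} = x_k/(1+h)$, so the full implicit method with many inner steps should land essentially on that value, while one inner step only moves part of the way there.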

 

 

Quadratic

$$f(x) = \frac{1}{2}x^TQx$$
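For this quadratic, both discretizations of the gradient flow have closed forms, which makes the stability difference easy to see: explicit Euler gives $x_{k+1} = (I - hQ)x_k$, which diverges once $h > 2/\lambda_{\max}(Q)$, while implicit Euler gives $x_{k+1} = (I + hQ)^{-1}x_k$, which contracts for every $h > 0$. A quick sketch, under my own choice of $Q$ and step size (not necessarily the post’s settings):

```python
import numpy as np

# Example ill-conditioned quadratic (my choice): eigenvalues 1 and 10.
Q = np.diag([1.0, 10.0])
h = 0.3  # above the explicit stability limit 2/10 = 0.2

x_exp = np.array([1.0, 1.0])  # explicit Euler iterate
x_imp = np.array([1.0, 1.0])  # implicit Euler iterate
for _ in range(50):
    x_exp = x_exp - h * (Q @ x_exp)                    # x+ = (I - hQ) x
    x_imp = np.linalg.solve(np.eye(2) + h * Q, x_imp)  # x+ = (I + hQ)^{-1} x

print(np.linalg.norm(x_exp))  # explicit: the fast mode grows by |1 - 3| = 2 per step
print(np.linalg.norm(x_imp))  # implicit: every mode shrinks, for any h > 0
```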

 

 

Normalized quadratic

$$f(x) = \sqrt{x^TQx}$$
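The normalized quadratic is an interesting stress test because its gradient, $\nabla f(x) = Qx/\sqrt{x^TQx}$, has constant magnitude along each ray: it does not shrink as you approach the minimum at the origin, so fixed-step explicit methods tend to chatter around the minimum instead of converging. A tiny sketch of that gradient (the `eps` guard at the origin is my own addition):

```python
import numpy as np

def grad_normalized_quadratic(Q, x, eps=1e-12):
    """Gradient of f(x) = sqrt(x^T Q x), namely Qx / sqrt(x^T Q x).
    The small eps avoids dividing by zero exactly at the origin,
    where f is nonsmooth and the gradient is undefined."""
    return (Q @ x) / (np.sqrt(x @ Q @ x) + eps)
```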

 

Logistic regression

In the logistic regression models,

$$f(\theta) = \frac{1}{m}\sum_{i=1}^m \log(1+\exp(-y_ix_i^T\theta))$$

where we generate data using the 2-blob model

$$x_i = y_i c + z_i $$

and $\|c\|_2$ is tuned to control the separation between the two blobs. Each figure shows not just the trajectory over the loss landscape, but also the classification boundary, so you can see whether the suboptimality actually translates into worse classification or not.
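A minimal sketch of this setup as I read it (the Gaussian noise model for $z_i$, the direction of $c$, and all names here are my assumptions; the post doesn’t spell them out):

```python
import numpy as np

def make_two_blobs(m, d, sep, rng):
    """Two-blob data x_i = y_i * c + z_i with labels y_i in {-1, +1}.
    Assumption: z_i ~ N(0, I) and c = sep * e_1, so ||c||_2 = sep
    directly controls how separated the two blobs are."""
    y = rng.choice([-1.0, 1.0], size=m)
    c = np.zeros(d)
    c[0] = sep
    X = y[:, None] * c[None, :] + rng.standard_normal((m, d))
    return X, y

def logistic_loss(theta, X, y):
    """f(theta) = (1/m) sum_i log(1 + exp(-y_i x_i^T theta)),
    computed stably with logaddexp."""
    margins = y * (X @ theta)
    return float(np.mean(np.logaddexp(0.0, -margins)))
```

At $\theta = 0$ the loss is exactly $\log 2$ regardless of the data, which is a handy sanity check before running any optimizer.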

 

Well separated logistic regression

 

 

 

 

Poorly separated logistic regression

 

 

 

 

Discussion

I had a few questions going into this tiny experiment, some of which were answered, and some of which remain open.

More experiments are needed to capture embedding learning as well, but anyway, it’s called a tiny project for a reason!

Finally, I leave you with an image of our two favorite cheerleaders.

 
