mlpack.logistic_regression

logistic_regression(...)
L2-regularized Logistic Regression and Prediction

>>> from mlpack import logistic_regression

An implementation of L2-regularized logistic regression using either the L-BFGS optimizer or SGD (stochastic gradient descent). This solves the regression problem

y = 1 / (1 + e^(-(X * b)))

where y takes values 0 or 1.
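
Written out directly, the model's predicted probability is just the logistic (sigmoid) function applied to X * b. As a minimal NumPy sketch (the names 'X' and 'b' here are illustrative, not part of the binding, and points are assumed to be stored one per row):

>>> import numpy as np
>>> def sigmoid(z):
...     return 1.0 / (1.0 + np.exp(-z))
...
>>> # Probability of class 1 for each point in X, given coefficients b:
>>> y_prob = sigmoid(X @ b)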

This program allows loading a logistic regression model (via the 'input_model' parameter) or training a logistic regression model given training data (specified with the 'training' parameter), or both of those things at once. In addition, this program allows classification on a test dataset (specified with the 'test' parameter), and the classification results may be saved with the 'output' output parameter. The trained logistic regression model may be saved using the 'output_model' output parameter.
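
For instance, training on 'data'/'labels' and classifying 'test_data' in a single call might look like this (the array names are assumed to be already-loaded NumPy matrices):

>>> result = logistic_regression(training=data, labels=labels,
...                              test=test_data)
>>> predictions = result['output']
>>> lr_model = result['output_model']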

The training data, if specified, may have class labels as its last dimension. Alternatively, the 'labels' parameter may be used to specify a separate matrix of labels.
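
For example (a small sketch, with illustrative array names 'X' and 'y', assuming points are stored one per row so that the labels occupy the last column):

>>> import numpy as np
>>> # Labels given separately:
>>> output = logistic_regression(training=X, labels=y)
>>> # Labels appended as the last dimension of the training matrix:
>>> output = logistic_regression(training=np.hstack([X, y.reshape(-1, 1)]))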

When a model is being trained, there are many options. L2 regularization (to prevent overfitting) can be specified with the 'lambda_' option, and the optimizer used to train the model can be specified with the 'optimizer' parameter. Available options are 'sgd' (stochastic gradient descent) and 'lbfgs' (the L-BFGS optimizer). There are also various parameters for the optimizer; the 'max_iterations' parameter specifies the maximum number of allowed iterations, and the 'tolerance' parameter specifies the tolerance for convergence. For the SGD optimizer, the 'step_size' parameter controls the step size taken at each iteration by the optimizer. The batch size for SGD is controlled with the 'batch_size' parameter. If the objective function for your data is oscillating between Inf and 0, the step size is probably too large. There are more parameters for the optimizers, but the C++ interface must be used to access these.
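
For example, an SGD run with an explicitly chosen step size, batch size, and iteration budget (the numeric values are only illustrative) might look like:

>>> output = logistic_regression(training=data, labels=labels,
...                              optimizer='sgd', step_size=0.001,
...                              batch_size=64, max_iterations=10000,
...                              tolerance=1e-7)
>>> sgd_model = output['output_model']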

For SGD, an iteration refers to a single point. So to take a single pass over the dataset with SGD, 'max_iterations' should be set to the number of points in the dataset.
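
For example, to take a single pass (one epoch) over a dataset, assuming one point per row:

>>> n_points = data.shape[0]
>>> output = logistic_regression(training=data, labels=labels,
...                              optimizer='sgd', max_iterations=n_points)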

Optionally, the model can be used to predict the responses for another matrix of data points, if 'test' is specified. The 'test' parameter can be specified without the 'training' parameter, so long as an existing logistic regression model is given with the 'input_model' parameter. The output predictions from the logistic regression model may be saved with the 'output' parameter.

This implementation of logistic regression does not support the general multi-class case but instead only the two-class case. Any labels must be either 0 or 1. For more classes, see the softmax_regression program.
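
If your labels are binary but encoded with other values, they can be remapped to 0/1 before training; a small sketch, assuming 'raw_labels' holds exactly two distinct values:

>>> import numpy as np
>>> classes = np.unique(raw_labels)  # the two distinct label values
>>> labels = (raw_labels == classes[1]).astype(np.int64)  # now 0 or 1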

As an example, to train a logistic regression model on the data 'data' with labels 'labels', applying L2 regularization of 0.1 and saving the model to 'lr_model', the following command may be used:

>>> output = logistic_regression(training=data, labels=labels, lambda_=0.1)
>>> lr_model = output['output_model']

Then, to use that model to predict classes for the dataset 'test', storing the output predictions in 'predictions', the following command may be used:

>>> output = logistic_regression(input_model=lr_model, test=test)
>>> predictions = output['output']
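
If true labels for the test set are available (here called 'test_labels', which is not produced by the binding), a quick accuracy check is:

>>> import numpy as np
>>> accuracy = np.mean(np.asarray(predictions).ravel()
...                    == np.asarray(test_labels).ravel())
>>> print('test accuracy: {:.2%}'.format(accuracy))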

input options

The input parameters accepted by the binding, as described above, are:

- 'training': matrix of training data (may include the labels as its last dimension)
- 'labels': labels for the training points (each 0 or 1)
- 'input_model': an existing logistic regression model to use
- 'test': matrix of points to classify
- 'lambda_': L2 regularization strength
- 'optimizer': optimizer to use for training ('lbfgs' or 'sgd')
- 'max_iterations': maximum number of optimizer iterations
- 'tolerance': convergence tolerance for the optimizer
- 'step_size': step size for SGD
- 'batch_size': batch size for SGD

output options

The return value from the binding is a dict containing the following elements:

- 'output': predicted labels for the points in 'test'
- 'output_model': the trained logistic regression model