Scikit Learn - Multi-task LASSO



It allows fitting multiple regression problems jointly while enforcing the selected features to be the same for all of them; these individual regression problems are also called tasks. Sklearn provides a linear model named MultiTaskLasso, trained with a mixed L1, L2-norm for regularisation, which estimates sparse coefficients for multiple regression problems jointly. Here the response y is a 2D array of shape (n_samples, n_tasks).

The parameters and the attributes of MultiTaskLasso are like those of Lasso. The only difference lies in the alpha parameter: in Lasso, alpha is a constant that multiplies the L1 norm, whereas in MultiTaskLasso it is a constant that multiplies the mixed L1/L2 term.
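
In this mixed norm, each feature's coefficients across all the tasks are grouped together. As a reference, the objective minimised by MultiTaskLasso can be sketched as follows (using the form documented by scikit-learn, with W the matrix of coefficients)

$$\min_{W}\; \frac{1}{2\,n_{\text{samples}}} \lVert Y - XW \rVert_{\text{Fro}}^{2} \;+\; \alpha \lVert W \rVert_{21}, \qquad \lVert W \rVert_{21} = \sum_{i} \sqrt{\sum_{j} w_{ij}^{2}}$$

where the Frobenius term measures the squared fitting error over all samples and tasks, and the L21 penalty sums the Euclidean norms of the rows of W, which is what pushes entire rows (i.e. whole features) to zero at once.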

Also, unlike Lasso, MultiTaskLasso does not have a precompute parameter.
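
Before moving to the toy example below, here is a minimal, self-contained sketch (using made-up synthetic data from make_regression and an arbitrarily chosen alpha) that illustrates the 2D shape of y and the shared feature selection across tasks −

import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import MultiTaskLasso

# Hypothetical data: 100 samples, 10 features and 3 tasks, so that
# y has shape (n_samples, n_tasks) as MultiTaskLasso expects.
X, y = make_regression(n_samples=100, n_features=10, n_targets=3,
                       noise=0.1, random_state=0)

model = MultiTaskLasso(alpha=1.0).fit(X, y)

# coef_ has shape (n_tasks, n_features), here (3, 10).
print(model.coef_.shape)

# A feature (column of coef_) is either used by every task or by none,
# i.e. the selected features are shared across the tasks.
nonzero = model.coef_ != 0
print(np.array_equal(nonzero.any(axis=0), nonzero.all(axis=0)))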

Implementation Example

The following Python script uses the MultiTaskLasso linear model, which in turn uses coordinate descent as the algorithm to fit the coefficients.

from sklearn import linear_model

# X has 3 samples with 2 features each; y has 2 targets (tasks) per sample.
MTLReg = linear_model.MultiTaskLasso(alpha=0.5)
MTLReg.fit([[0, 0], [1, 1], [2, 2]], [[0, 0], [1, 1], [2, 2]])

Output

MultiTaskLasso(alpha=0.5, copy_X=True, fit_intercept=True, max_iter=1000,
   normalize=False, random_state=None, selection='cyclic', tol=0.0001,
   warm_start=False)

Example

Now, once fitted, the model can predict new values as follows −

MTLReg.predict([[0,1]])

Output

array([[0.53033009, 0.53033009]])

Example

For the above example, we can get the weight matrix (one row of coefficients per task) with the help of the following Python script −

MTLReg.coef_

Output

array([[0.46966991, 0.        ],
       [0.46966991, 0.        ]])

Example

Similarly, we can get the value of the intercept (one per task) with the help of the following Python script −

MTLReg.intercept_

Output

array([0.53033009, 0.53033009])
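
As a sanity check, the earlier prediction can be reproduced by hand from coef_ and intercept_, since for a linear model predict computes X @ coef_.T + intercept_. A minimal sketch, assuming MTLReg from above is still in scope −

import numpy as np

# Each task's prediction is the dot product of the sample with that
# task's row of coefficients, plus that task's intercept.
X_new = np.array([[0, 1]])
print(X_new @ MTLReg.coef_.T + MTLReg.intercept_)   # matches MTLReg.predict([[0,1]])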

Example

We can get the number of iterations taken to reach the specified tolerance with the help of the following Python script −

MTLReg.n_iter_

Output

2

We can change the values of the parameters, such as alpha, to tune the model and obtain the desired output.
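
For example, the following sketch refits the same toy data with a smaller, arbitrarily chosen alpha; since a smaller alpha penalises the coefficients less, they are shrunk less towards zero than with alpha = 0.5 −

from sklearn import linear_model

# Same toy data as above, but with a weaker penalty.
MTLReg2 = linear_model.MultiTaskLasso(alpha=0.1)
MTLReg2.fit([[0, 0], [1, 1], [2, 2]], [[0, 0], [1, 1], [2, 2]])

# With less regularisation the non-zero coefficients end up closer
# to the unpenalised least-squares fit.
print(MTLReg2.coef_)
print(MTLReg2.intercept_)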
