sklearn study notes[3]


Non-Negative Least Squares

  1. Non-Negative Least Squares (NNLS) constrains all the coefficients of a linear regression to be non-negative, which is useful when they represent physical or naturally non-negative quantities. In scikit-learn this is enabled by setting the positive parameter to True when constructing LinearRegression.
    For example:
from sklearn.linear_model import LinearRegression
import numpy as np

np.random.seed(42)

n_samples, n_features = 500, 100
X = np.random.randn(n_samples, n_features)
actual_coef = 2.9 * np.random.randn(n_features)
# Generate sample data
y = np.dot(X, actual_coef)
y += 2 * np.random.normal(size=(n_samples,))

from sklearn.model_selection import train_test_split

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.6)

# Standard linear regression (may have negative coefficients)
lr = LinearRegression().fit(X_train, y_train)
print("Linear regression coefficients:", lr.coef_)

# Non-negative least squares
nnls = LinearRegression(positive=True).fit(X_train, y_train)
print("NNLS coefficients:", nnls.coef_)

Calling LinearRegression without positive=True builds an ordinary least-squares estimator; calling its fit method trains it on the data, and the fitted coefficients may be negative.
Calling LinearRegression with positive=True builds an estimator that solves the non-negative least squares problem, so every fitted coefficient is constrained to be non-negative, as the quick check below illustrates.
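A minimal check, reusing the lr and nnls objects fitted above, is to confirm that the NNLS coefficients contain no negative values while the unconstrained fit usually does:

import numpy as np

# every NNLS coefficient is constrained to be non-negative,
# while the unconstrained OLS fit typically contains negative values
print("smallest NNLS coefficient:", nnls.coef_.min())
print("number of negative OLS coefficients:", np.sum(lr.coef_ < 0))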
2. Subsequently, you can predict y on the test set with the fitted nnls estimator as follows.

y_pred_nnls = nnls.predict(X_test)
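To see how the non-negativity constraint affects accuracy, one option, continuing the script above, is to score both models on the held-out data with scikit-learn's r2_score:

from sklearn.metrics import r2_score

# predictions from the unconstrained model, for comparison
y_pred_ols = lr.predict(X_test)

print("NNLS R^2 on test set:", r2_score(y_test, y_pred_nnls))
print("OLS R^2 on test set:", r2_score(y_test, y_pred_ols))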

References

  1. scikit-learn documentation: https://scikit-learn.org/