Linear model

LinearRegression

from sklearn.linear_model import LinearRegression

reg = LinearRegression()
reg.fit(X_train, y_train)
predictions = reg.predict(X_test)
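A minimal end-to-end sketch of the above, assuming synthetic data (the coefficients, seed, and shapes below are illustrative, not from these notes):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

# Synthetic regression data: y is a linear combination of 3 features plus noise.
# (Illustrative values; any tabular X / numeric y works the same way.)
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(scale=0.1, size=200)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

reg = LinearRegression()
reg.fit(X_train, y_train)
predictions = reg.predict(X_test)
print(reg.score(X_test, y_test))  # score() reports R^2 on held-out data
```

`score()` for regressors returns R², scikit-learn's default regression metric.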

Ridge

$$
\text{Loss Function} = \text{OLS Loss Function} + \alpha * \sum_{i=1}^n a_{i}^2
$$
from sklearn.linear_model import Ridge

ridge = Ridge(alpha=10.0)
ridge.fit(X_train, y_train)
print(ridge.score(X_test, y_test))
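One way to see the penalty at work is to sweep `alpha` and watch the coefficient norm shrink; a sketch on synthetic data (seed, values, and variable names are assumptions for illustration):

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

# Same style of synthetic data as above (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(scale=0.1, size=200)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

scores, coef_norms = [], []
for alpha in [0.1, 1.0, 10.0, 100.0]:
    ridge = Ridge(alpha=alpha)
    ridge.fit(X_train, y_train)
    scores.append(ridge.score(X_test, y_test))
    coef_norms.append(np.linalg.norm(ridge.coef_))

print(scores)      # test-set R^2 per alpha
print(coef_norms)  # larger alpha -> smaller coefficients (stronger shrinkage)
```

Larger `alpha` shrinks all coefficients toward zero; unlike Lasso, Ridge rarely makes them exactly zero.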

Lasso

$$
\text{Loss Function} = \text{OLS Loss Function} + \alpha * \sum_{i=1}^n \lvert a_{i} \rvert
$$

from sklearn.linear_model import Lasso

lasso = Lasso(alpha=0.3)

  • Useful for feature selection: Lasso shrinks the coefficients of non-significant features to exactly zero (0).
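A sketch of that zeroing behaviour on synthetic data where only the first two of five features actually drive the target (data, seed, and true coefficients are made up for illustration):

```python
import numpy as np
from sklearn.linear_model import Lasso

# Five features, but y depends only on the first two; the rest are noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.1, size=200)

lasso = Lasso(alpha=0.3)
lasso.fit(X, y)
print(lasso.coef_)  # coefficients of the noise features are driven to exactly 0
```

Inspecting `lasso.coef_` after fitting is how the feature-selection effect is read off in practice.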

LogisticRegression

from sklearn.linear_model import LogisticRegression

logreg = LogisticRegression()
logreg.fit(X_train, y_train)
y_pred_probs = logreg.predict_proba(X_test)[:, 1] # (1)
  1. predict_proba returns one column of probabilities per class (0 and 1); [:, 1] selects, for every record, the probability of class 1 ("yes").
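Those class-1 probabilities feed directly into threshold-free metrics such as ROC AUC; a self-contained sketch on synthetic binary data (names and seed are assumptions):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic binary target driven by both features plus noise (illustrative).
rng = np.random.default_rng(1)
X = rng.normal(size=(300, 2))
y = (X[:, 0] + X[:, 1] + rng.normal(scale=0.5, size=300) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

logreg = LogisticRegression()
logreg.fit(X_train, y_train)
y_pred_probs = logreg.predict_proba(X_test)[:, 1]  # probability of class 1 ("yes")
auc = roc_auc_score(y_test, y_pred_probs)
print(auc)
```

ROC AUC scores the ranking of probabilities, so it needs `predict_proba` output rather than the hard 0/1 labels from `predict`.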