Hi,
Is there an out-of-the-box implementation of the log loss function in Breeze, along the lines of the Python versions below?
Implementation from Kaggle:

import scipy as sp

def logloss(act, pred):
    epsilon = 1e-15
    # clamp predictions away from 0 and 1 so the logs stay finite
    pred = sp.maximum(epsilon, pred)
    pred = sp.minimum(1 - epsilon, pred)
    # binary cross-entropy, averaged over the samples
    ll = sum(act * sp.log(pred) + sp.subtract(1, act) * sp.log(sp.subtract(1, pred)))
    ll = ll * -1.0 / len(act)
    return ll
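In case it helps, here is roughly what I would write myself with Breeze if nothing exists already: a minimal sketch, assuming DenseVector[Double] inputs of 0/1 labels and predicted probabilities. logLoss is my own name, not an existing Breeze function, and the element-wise multiply may be spelled :* or *:* depending on the Breeze version.

import breeze.linalg.{DenseVector, sum}
import breeze.numerics.log

// Sketch only: binary log loss. act holds 0/1 labels, pred holds
// predicted probabilities of the same length.
def logLoss(act: DenseVector[Double], pred: DenseVector[Double]): Double = {
  val epsilon = 1e-15
  // clamp predictions away from 0 and 1 so log never sees 0
  val p = pred.map(x => math.min(math.max(x, epsilon), 1.0 - epsilon))
  val ones = DenseVector.ones[Double](act.length)
  val ll = sum((act :* log(p)) + ((ones - act) :* log(ones - p)))
  -ll / act.length.toDouble
}

// example: logLoss(DenseVector(1.0, 0.0, 1.0), DenseVector(0.9, 0.2, 0.8))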
Implementation from scikit-learn:
import numpy as np

def log_loss(y_true, y_prob):
    """Compute Logistic loss for classification.

    Parameters
    ----------
    y_true : array-like or label indicator matrix
        Ground truth (correct) labels.

    y_prob : array-like of float, shape = (n_samples, n_classes)
        Predicted probabilities, as returned by a classifier's
        predict_proba method.

    Returns
    -------
    loss : float
        The degree to which the samples are correctly predicted.
    """
    y_prob = np.clip(y_prob, 1e-10, 1 - 1e-10)

    if y_prob.shape[1] == 1:
        y_prob = np.append(1 - y_prob, y_prob, axis=1)

    if y_true.shape[1] == 1:
        y_true = np.append(1 - y_true, y_true, axis=1)

    return -np.sum(y_true * np.log(y_prob)) / y_prob.shape[0]
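A matrix variant along the lines of the scikit-learn version could look like the sketch below, again with names of my own choosing. It assumes DenseMatrix[Double] for both the one-hot label matrix and the predicted probabilities, and it leaves out the single-column expansion that scikit-learn does for the binary case.

import breeze.linalg.{DenseMatrix, sum}
import breeze.numerics.log

// Sketch only, not a Breeze built-in. yTrue is an n_samples x n_classes
// one-hot label matrix, yProb the matching matrix of predicted probabilities.
def logLossMulti(yTrue: DenseMatrix[Double], yProb: DenseMatrix[Double]): Double = {
  // clamp probabilities like np.clip(y_prob, 1e-10, 1 - 1e-10)
  val p = yProb.map(x => math.min(math.max(x, 1e-10), 1.0 - 1e-10))
  // average negative log-likelihood over the rows (samples)
  -sum(yTrue :* log(p)) / yTrue.rows.toDouble
}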
Regards, Eirik