bgd.cost module
This module contains the cost (loss) functions implemented in the library. The classes are split between regression and classification costs.
Each cost function must implement both the computation of the cost itself and the gradient of the cost w.r.t. y_hat.
Any new cost function must inherit from either bgd.cost.RegressionCost or bgd.cost.ClassificationCost, depending on its use, as sketched in the example below.
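As an illustration, here is a minimal sketch of a custom regression cost. The hook names (_eval, _grad, _print_fitness) follow the abstract methods documented for bgd.cost.Cost below; the MAE class itself and the assumption that f is a single writable file-like object are hypothetical and not part of the library:

    import numpy as np

    from bgd.cost import RegressionCost


    class MAE(RegressionCost):
        """Hypothetical mean absolute error cost (illustration only)."""

        def _eval(self, y, y_hat):
            # Per-sample cost, shape (len(y),), as documented for eval().
            return np.mean(np.abs(y - y_hat), axis=1)

        def _grad(self, y, y_hat):
            # Gradient of the per-sample mean of |y - y_hat| w.r.t. y_hat.
            return -np.sign(y - y_hat) / y.shape[1]

        def _print_fitness(self, y, y_hat, end, f):
            # Assumes f is a single writable file-like object (see print_fitness()).
            f.write('MAE: %f%s' % (float(np.mean(self._eval(y, y_hat))), end))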
- class bgd.cost.ClassificationCost[source]
Bases:
bgd.cost.Cost
Base class for all the cost functions used in classification.
- class bgd.cost.Cost[source]
Bases:
object
Base class for error operators.
- Parameters
  - y (np.ndarray) – Array of shape (n_samples, output_size) representing the expected outputs. Must have the same number of samples as y_hat.
  - y_hat (np.ndarray) – Array of shape (n_samples, output_size) representing the predictions. Must have the same number of samples as y.
- abstract _eval(y, y_hat)[source]
Wrapped method for evaluating the error operator. Subclasses must override this method.
- abstract _grad(y, y_hat)[source]
Wrapped method for evaluating the error gradient. Subclasses must override this method.
- abstract _print_fitness(y, y_hat, end, f)[source]
Wrapped method for printing the cost value/accuracy of the model on the provided data. See print_fitness() for a description of the parameters.
- eval(y, y_hat)[source]
Evaluates the error operator and returns an array of shape (n_samples,) with the per-sample cost.
- Returns
- loss:
The error function value for each sample of the batch. shape == (len(y),)
- Return type
np.ndarray
- grad(y, y_hat)[source]
Evaluates the gradient of the error. The error must not depend on the model parameters.
- Returns
- grad:
The gradient of the error for each sample of the batch. shape == (len(y),)
- Return type
np.ndarray
- print_fitness(y, y_hat, end='\n', files=<_io.TextIOWrapper name='<stdout>' mode='w' encoding='utf-8'>)[source]
Prints a measure of the fitness/cost.
- Parameters
  - y (numpy.ndarray) – Ground truth.
  - y_hat (numpy.ndarray) – Prediction of the model.
  - end (str) – String to write at the end. Defaults to '\n'.
  - files (tuple of file-like objects or file-like object) – File(s) to write the fitness to. Defaults to stdout.
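Concrete subclasses such as bgd.cost.MSE and bgd.cost.CrossEntropy (documented below) are used through this eval()/grad()/print_fitness() interface. A short usage sketch, with array shapes taken from the docstrings above (illustrative, not verified against the library):

    import numpy as np

    from bgd.cost import MSE

    y = np.array([[1.0], [2.0], [3.0]])      # expected outputs, shape (3, 1)
    y_hat = np.array([[1.1], [1.8], [3.5]])  # model predictions, shape (3, 1)

    cost = MSE()
    per_sample_loss = cost.eval(y, y_hat)    # expected shape: (3,)
    gradient = cost.grad(y, y_hat)           # gradient of the cost w.r.t. y_hat
    cost.print_fitness(y, y_hat)             # writes the fitness to stdout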
- class bgd.cost.CrossEntropy(epsilon=1e-15)[source]
Bases:
bgd.cost.ClassificationCost
Differentiable cross-entropy operator.
\[C(y, \hat y) = -\frac 1n\sum_{i=1}^n\big[y_i\log\hat y_i + (1-y_i)\log(1-\hat y_i)\big]\]
- epsilon
  Parameter for numerical stability.
- _eval(y, y_hat)[source]
Return cross-entropy metric.
- Parameters
  - y (np.ndarray) – Ground truth labels.
  - y_hat (np.ndarray) – Predicted values.
- Returns
- loss:
The error function value for each sample of the batch. shape == (len(y),)
- Return type
np.ndarray
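The role of epsilon can be illustrated with a direct NumPy transcription of the formula above. Clipping y_hat away from 0 and 1 before taking logarithms is an assumption about how the class uses epsilon, not a statement about its actual implementation:

    import numpy as np

    def cross_entropy(y, y_hat, epsilon=1e-15):
        # Clip predictions so that log(0) is never evaluated.
        y_hat = np.clip(y_hat, epsilon, 1.0 - epsilon)
        # C(y, y_hat) = -1/n * sum_i [y_i*log(y_hat_i) + (1 - y_i)*log(1 - y_hat_i)]
        return -np.mean(y * np.log(y_hat) + (1.0 - y) * np.log(1.0 - y_hat))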
- class bgd.cost.MSE[source]
Bases:
bgd.cost.RegressionCost
Differentiable Mean Squared Error operator.
\[C(y, \hat y) = \frac 1{2n}\sum_{i=1}^n\big(y_i - \hat y_i\big)^2\]
- _eval(y, y_hat)[source]
Return mean squared error.
- Parameters
  - y (np.ndarray) – Ground truth values.
  - y_hat (np.ndarray) – Predicted values.
- Returns
- loss:
The error function value for each sample of the batch. shape == (len(y),)
- Return type
np.ndarray
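For reference, differentiating the formula above w.r.t. a prediction gives \(\partial C/\partial \hat y_i = (\hat y_i - y_i)/n\). A direct NumPy transcription of the cost and its gradient, written for 1-D arrays of length n (an illustration of the math, not the class implementation):

    import numpy as np

    def mse(y, y_hat):
        # C(y, y_hat) = 1/(2n) * sum_i (y_i - y_hat_i)^2
        return 0.5 * np.mean((y - y_hat) ** 2)

    def mse_grad(y, y_hat):
        # dC/dy_hat_i = (y_hat_i - y_i) / n
        return (y_hat - y) / len(y)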
- class bgd.cost.RegressionCost[source]
Bases:
bgd.cost.Cost
Base class for all the cost functions used in regression.