Losses
centimators.losses
Custom loss functions for neural network training.
This module provides specialized loss functions that go beyond the standard Keras losses. The main focus is rank-based losses, which capture the relative ordering of predictions rather than only their absolute values; this is particularly useful for financial and ranking tasks.
Highlights
- SpearmanCorrelation – Differentiable approximation of Spearman's rank correlation coefficient that can be used as a loss function.
- CombinedLoss – Weighted combination of MSE and Spearman correlation losses for balancing absolute accuracy with rank preservation.
SpearmanCorrelation
Bases: Loss
Differentiable Spearman rank correlation loss.
This loss function computes a soft approximation of Spearman's rank correlation coefficient between predictions and targets. Unlike the standard non-differentiable rank correlation, this implementation uses sigmoid-based soft rankings that allow gradient flow during backpropagation.
The loss is computed as the negative correlation (to minimize during training) between the soft ranks of predictions and targets.
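The exact implementation lives in src/centimators/losses.py; the snippet below is only a rough sketch of the general idea (pairwise sigmoid comparisons turned into soft ranks, whose Pearson correlation is then negated), not the package's code:

```python
from keras import ops

# Illustrative sketch only; the shipped SpearmanCorrelation in
# src/centimators/losses.py is the authoritative implementation.
def soft_rank(x, temperature=1e-3):
    # diff[i, j] = x[i] - x[j]; sigmoid(diff / temperature) is ~1 when
    # x[i] > x[j], so summing over j gives a differentiable rank for x[i].
    # Smaller temperatures make the ranking sharper, larger ones smoother.
    diff = ops.expand_dims(x, 1) - ops.expand_dims(x, 0)
    return ops.sum(ops.sigmoid(diff / temperature), axis=1)

def soft_spearman_loss(y_true, y_pred, temperature=1e-3):
    y_true = ops.reshape(y_true, (-1,))
    y_pred = ops.reshape(y_pred, (-1,))
    r_t = soft_rank(y_true, temperature)
    r_p = soft_rank(y_pred, temperature)
    r_t = r_t - ops.mean(r_t)
    r_p = r_p - ops.mean(r_p)
    denom = ops.sqrt(ops.sum(r_t * r_t) * ops.sum(r_p * r_p)) + 1e-8
    # Negative correlation of the soft ranks: minimizing this
    # maximizes rank agreement between predictions and targets.
    return -ops.sum(r_t * r_p) / denom
```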
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| regularization_strength | float | Temperature parameter for the sigmoid function used in soft ranking. Smaller values create sharper (more discrete) rankings, while larger values create smoother approximations. Typically ranges from 1e-4 to 1e-1. | 0.001 |
| name | str | Name of the loss function. | 'spearman_correlation' |
| **kwargs | | Additional keyword arguments passed to the base Loss class. | {} |
Examples:
>>> import keras
>>> loss_fn = SpearmanCorrelation(regularization_strength=0.01)
>>> model = keras.Sequential([...])
>>> model.compile(optimizer='adam', loss=loss_fn)
Source code in src/centimators/losses.py
call(y_true, y_pred)
Compute the Spearman correlation loss.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| y_true | | Ground truth values of shape (batch_size,) or (batch_size, 1). | required |
| y_pred | | Predicted values of shape (batch_size,) or (batch_size, 1). | required |

Returns:

| Type | Description |
|---|---|
| | Scalar loss value (negative correlation). |
Source code in src/centimators/losses.py
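The loss can also be invoked directly on tensors for a quick sanity check. A minimal sketch, assuming SpearmanCorrelation is importable from centimators.losses and using made-up values:

```python
import keras
from centimators.losses import SpearmanCorrelation

loss_fn = SpearmanCorrelation(regularization_strength=1e-2)

# Made-up values whose predicted ordering matches the targets exactly,
# so the negative-correlation loss should be close to -1.
y_true = keras.ops.convert_to_tensor([[0.1], [0.4], [0.2], [0.9]])
y_pred = keras.ops.convert_to_tensor([[0.0], [0.5], [0.3], [0.8]])

print(float(loss_fn(y_true, y_pred)))  # approximately -1.0
```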
CombinedLoss
Bases: Loss
Weighted combination of MSE and Spearman correlation losses.
This loss function combines mean squared error (for absolute accuracy) with Spearman correlation loss (for rank preservation). This can be particularly useful when both the exact values and their relative ordering are important.
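Conceptually, the objective is mse_weight * MSE + spearman_weight * (negative Spearman correlation). A minimal sketch of that weighting, reusing SpearmanCorrelation for the rank term (not the shipped implementation):

```python
from keras import ops
from centimators.losses import SpearmanCorrelation

# Illustrative sketch; the shipped CombinedLoss in
# src/centimators/losses.py is the authoritative implementation.
def combined_loss_sketch(y_true, y_pred, mse_weight=2.0, spearman_weight=1.0,
                         spearman_regularization=1e-3):
    # Absolute-accuracy term.
    mse = ops.mean(ops.square(y_pred - y_true))
    # Rank-preservation term (already negated inside SpearmanCorrelation).
    rank_term = SpearmanCorrelation(
        regularization_strength=spearman_regularization
    )(y_true, y_pred)
    return mse_weight * mse + spearman_weight * rank_term
```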
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| mse_weight | float | Weight applied to the MSE component. Higher values prioritize absolute accuracy. | 2.0 |
| spearman_weight | float | Weight applied to the Spearman correlation component. Higher values prioritize rank preservation. | 1.0 |
| spearman_regularization | float | Regularization strength passed to the SpearmanCorrelation loss. | 0.001 |
| name | str | Name of the loss function. | 'combined_loss' |
| **kwargs | | Additional keyword arguments passed to the base Loss class. | {} |
Examples:
>>> # Prioritize ranking accuracy over absolute values
>>> loss_fn = CombinedLoss(mse_weight=0.5, spearman_weight=2.0)
>>> model.compile(optimizer='adam', loss=loss_fn)
Source code in src/centimators/losses.py
call(y_true, y_pred)
Compute the combined loss.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| y_true | | Ground truth values of shape (batch_size,) or (batch_size, 1). | required |
| y_pred | | Predicted values of shape (batch_size,) or (batch_size, 1). | required |

Returns:

| Type | Description |
|---|---|
| | Scalar loss value (weighted sum of MSE and negative Spearman correlation). |
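As a rough end-to-end illustration of training with CombinedLoss (the model, data, and hyperparameters below are made up; only the CombinedLoss import path and arguments come from this reference):

```python
import numpy as np
import keras
from centimators.losses import CombinedLoss

# Synthetic regression data, purely illustrative.
rng = np.random.default_rng(0)
X = rng.normal(size=(256, 16)).astype("float32")
y = (X @ rng.normal(size=(16, 1))).astype("float32")

model = keras.Sequential([
    keras.layers.Input(shape=(16,)),
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(1),
])

# Weight rank preservation above absolute accuracy for this run.
model.compile(
    optimizer="adam",
    loss=CombinedLoss(mse_weight=1.0, spearman_weight=2.0),
)
model.fit(X, y, batch_size=64, epochs=5, verbose=0)
```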