@qiqipipioioi
Forked from sfblake/negbin_loss.py
Created May 10, 2024 01:55
Revisions

  1. @sfblake created this gist Dec 16, 2019.
import tensorflow as tf


def negative_binomial_loss(y_true, y_pred):
    """
    Negative binomial loss function.
    Assumes tensorflow backend.

    Parameters
    ----------
    y_true : tf.Tensor
        Ground truth values of predicted variable.
    y_pred : tf.Tensor
        n and p values of predicted distribution.

    Returns
    -------
    nll : tf.Tensor
        Negative log likelihood.
    """
    # Separate the parameters
    n, p = tf.unstack(y_pred, num=2, axis=-1)

    # Add one dimension to make the right shape
    n = tf.expand_dims(n, -1)
    p = tf.expand_dims(p, -1)

    # Calculate the negative log likelihood
    nll = (
        tf.math.lgamma(n)
        + tf.math.lgamma(y_true + 1)
        - tf.math.lgamma(n + y_true)
        - n * tf.math.log(p)
        - y_true * tf.math.log(1 - p)
    )

    return nll
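
The function above computes the negative log-likelihood of a negative binomial distribution with parameters n and p, i.e. -[lgamma(n + y) - lgamma(y + 1) - lgamma(n) + n*log(p) + y*log(1 - p)], so the model feeding it must output both parameters per sample. Below is a minimal usage sketch, not part of the original gist: the model name, layer sizes, and the softplus/sigmoid activations used to keep n positive and p inside (0, 1) are assumptions for illustration only.

import tensorflow as tf

# Hypothetical example (assumed, not from the gist): a model with two output
# heads producing n > 0 (softplus) and 0 < p < 1 (sigmoid), concatenated along
# the last axis in the order negative_binomial_loss unstacks them.
def build_model(n_features):
    inputs = tf.keras.Input(shape=(n_features,))
    hidden = tf.keras.layers.Dense(32, activation="relu")(inputs)
    n = tf.keras.layers.Dense(1, activation="softplus")(hidden)  # n > 0
    p = tf.keras.layers.Dense(1, activation="sigmoid")(hidden)   # 0 < p < 1
    outputs = tf.keras.layers.Concatenate(axis=-1)([n, p])       # shape (batch, 2)
    model = tf.keras.Model(inputs, outputs)
    model.compile(optimizer="adam", loss=negative_binomial_loss)
    return model

# Example call, assuming X_train has shape (num_samples, 10) and y_train holds
# non-negative counts with shape (num_samples, 1):
# model = build_model(n_features=10)
# model.fit(X_train, y_train, epochs=10)

In practice a small epsilon can be added inside the activations (or to p before the logs) to keep log(p) and log(1 - p) finite if the sigmoid saturates; that detail is omitted here to keep the sketch short.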