Implement Sigmoid Cross-entropy Loss with Masking in TensorFlow – TensorFlow Tutorial

August 24, 2020

Sigmoid cross-entropy loss is often used in deep learning models, especially for multi-label classification. In this tutorial, we will introduce how to compute sigmoid cross-entropy loss with masking in TensorFlow.

Sigmoid cross-entropy loss

In TensorFlow, we can use the tf.nn.sigmoid_cross_entropy_with_logits() function to calculate the sigmoid cross-entropy loss. Here is the tutorial:

Understand tf.nn.sigmoid_cross_entropy_with_logits(): A Beginner Guide – TensorFlow Tutorial
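Before adding a mask, it helps to see the built-in function on its own. Below is a minimal sketch with made-up logits and labels; note that it returns one loss value per label element, not a single scalar:

```python
import tensorflow as tf

# Hypothetical batch: 2 samples, 3 independent binary labels each.
logits = tf.constant([[2.0, -1.0, 0.5],
                      [0.0, 1.5, -2.0]])
labels = tf.constant([[1.0, 0.0, 1.0],
                      [0.0, 1.0, 0.0]])

# Element-wise loss:
#   labels * -log(sigmoid(logits)) + (1 - labels) * -log(1 - sigmoid(logits))
loss = tf.nn.sigmoid_cross_entropy_with_logits(logits=logits, labels=labels)
print(loss.shape)  # (2, 3) -- one loss value per label element
```

Because the result keeps the input shape, we must reduce it ourselves before applying a mask, which is exactly what the custom function below does.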

However, tf.nn.sigmoid_cross_entropy_with_logits() does not support masking. To compute the sigmoid cross-entropy loss with a mask, we have to write a custom function.

We have already shown how to compute softmax cross-entropy loss with a mask in this tutorial:

Implement Softmax Cross-entropy Loss with Masking in TensorFlow – TensorFlow Tutorial

We can write a similar function to compute sigmoid cross-entropy loss with a mask.

Calculate sigmoid cross-entropy loss with a mask

Here is the function:

    import tensorflow as tf

    def masked_sigmoid_cross_entropy(logits, labels, mask):
        """Sigmoid cross-entropy loss with masking."""
        labels = tf.cast(labels, dtype=tf.float32)
        # Element-wise loss, shape [batch_size, num_classes]
        loss = tf.nn.sigmoid_cross_entropy_with_logits(logits=logits, labels=labels)
        # Average over the class dimension, shape [batch_size]
        loss = tf.reduce_mean(loss, axis=1)
        mask = tf.cast(mask, dtype=tf.float32)
        # Rescale the mask so masked-out samples do not shrink the final mean
        mask /= tf.reduce_mean(mask)
        loss *= mask
        return tf.reduce_mean(loss)
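The line mask /= tf.reduce_mean(mask) is the key trick: it rescales the mask so that the final tf.reduce_mean() averages only over the valid (unmasked) samples instead of the whole batch. A small sketch with hypothetical per-sample losses shows the difference:

```python
import tensorflow as tf

# Hypothetical per-sample losses for a batch of 4; the last two samples are padding.
per_sample_loss = tf.constant([0.8, 0.4, 0.3, 0.9])
mask = tf.constant([1.0, 1.0, 0.0, 0.0])

# Naive masking averages over all 4 samples, shrinking the loss:
naive = tf.reduce_mean(per_sample_loss * mask)  # (0.8 + 0.4) / 4 = 0.3

# Rescaling the mask by its mean (0.5 here) restores the average
# over the 2 valid samples only:
rescaled_mask = mask / tf.reduce_mean(mask)  # [2.0, 2.0, 0.0, 0.0]
corrected = tf.reduce_mean(per_sample_loss * rescaled_mask)  # (0.8 + 0.4) / 2 = 0.6
```

Without this rescaling, batches with many masked samples would report an artificially small loss and produce correspondingly small gradients.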
