# Implement Softmax Cross-entropy Loss with Masking in TensorFlow – TensorFlow Tutorial

By | August 24, 2020

We often need to process variable-length sequences in deep learning. In that situation, we need to use a mask in our model. In this tutorial, we will introduce how to calculate softmax cross-entropy loss with masking in TensorFlow.

## Softmax cross-entropy loss

In TensorFlow, we can use tf.nn.softmax_cross_entropy_with_logits() to compute the cross-entropy. For example:

    loss = tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=labels)
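For instance, here is a minimal sketch assuming TensorFlow 2.x with eager execution, using a batch of two examples and three classes with one-hot labels (the concrete values are illustrative):

```python
import tensorflow as tf

# Two examples, three classes; labels are one-hot.
logits = tf.constant([[2.0, 1.0, 0.1],
                      [0.5, 2.5, 0.3]])
labels = tf.constant([[1.0, 0.0, 0.0],
                      [0.0, 1.0, 0.0]])

# Per-example cross-entropy, shape [2]
loss = tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=labels)
```

Note that the function returns one loss value per example; you usually reduce it with tf.reduce_mean() afterwards.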

But how do we calculate softmax cross-entropy loss with masking?

We will implement it with a TensorFlow function.

## Calculate softmax cross-entropy loss with masking

This function is:

    def masked_softmax_cross_entropy(logits, labels, mask):
        """Softmax cross-entropy loss with masking."""
        loss = tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=labels)
        mask = tf.cast(mask, dtype=tf.float32)
        mask /= tf.reduce_mean(mask)
        loss *= mask
        return tf.reduce_mean(loss)

Here the mask is rescaled by its mean, so that masked positions contribute zero loss while the final value equals the average loss over the valid positions only.
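A quick sanity check of that equivalence, sketched assuming TensorFlow 2.x with eager execution (the function body is repeated so the snippet runs on its own, and the batch values are hypothetical):

```python
import tensorflow as tf

def masked_softmax_cross_entropy(logits, labels, mask):
    """Softmax cross-entropy loss with masking (repeated so the snippet is self-contained)."""
    loss = tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=labels)
    mask = tf.cast(mask, dtype=tf.float32)
    mask /= tf.reduce_mean(mask)  # masked positions get 0, valid ones are scaled up
    loss *= mask
    return tf.reduce_mean(loss)

# Hypothetical batch of 4 positions with 2 classes; the last position is padding.
logits = tf.constant([[2.0, 0.5], [0.1, 1.5], [1.0, 1.0], [3.0, -1.0]])
labels = tf.constant([[1.0, 0.0], [0.0, 1.0], [1.0, 0.0], [0.0, 1.0]])
mask = tf.constant([1.0, 1.0, 1.0, 0.0])

masked = masked_softmax_cross_entropy(logits, labels, mask)

# Equivalent computation: plain mean of the per-position loss over valid positions only.
per_position = tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=labels)
valid_mean = tf.reduce_mean(tf.boolean_mask(per_position, mask > 0))
```

The two values agree, which confirms that the padding position contributes nothing to the loss.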