# Implement GELU Activation Function in TensorFlow – TensorFlow Tutorial

By | January 4, 2022

In this tutorial, we will show how to implement the GELU activation function in TensorFlow.

You should know that this activation function is also used in the BERT model.

To understand GELU, you can read:

An Explanation of the GELU Activation Function – Deep Learning Tutorial
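Briefly, GELU weights its input by the standard Gaussian CDF \(\Phi(x)\); the code in this tutorial uses the tanh approximation from the original paper:

```latex
\mathrm{GELU}(x) = x\,\Phi(x) \approx 0.5\,x\left(1 + \tanh\left(\sqrt{2/\pi}\,\bigl(x + 0.044715\,x^{3}\bigr)\right)\right)
```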

## How to Implement GELU Activation Function in TensorFlow

We will use an example to show you how to do it.

Here is an example:

```python
import tensorflow as tf
import numpy as np

def gelu(x):
    """Gaussian Error Linear Unit.

    This is a smoother version of the ReLU.
    Original paper: https://arxiv.org/abs/1606.08415

    Args:
        x: float Tensor to perform activation.

    Returns:
        `x` with the GELU activation applied.
    """
    cdf = 0.5 * (1.0 + tf.tanh(
        np.sqrt(2 / np.pi) * (x + 0.044715 * tf.pow(x, 3))))
    return x * cdf

b = tf.constant([[2, 0, 4, 1], [1, 2, 4, 1]], dtype=tf.float32)
bc = gelu(b)

# TensorFlow 1.x session API; in TensorFlow 2.x, use tf.compat.v1.Session()
# together with tf.compat.v1.disable_eager_execution().
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    sess.run(tf.local_variables_initializer())
    np.set_printoptions(precision=3, suppress=True)
    c = sess.run(bc)
    print(c)
```

Run this code, and we will get the following result:

```
[[1.955 0.    4.    0.841]
 [0.841 1.955 4.    0.841]]
```
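As a quick sanity check (not part of the original tutorial), the same tanh approximation can be reproduced with plain NumPy, without a TensorFlow session:

```python
import numpy as np

def gelu_np(x):
    # Same tanh approximation of GELU as the TensorFlow version above.
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2 / np.pi) * (x + 0.044715 * x ** 3)))

b = np.array([[2, 0, 4, 1], [1, 2, 4, 1]], dtype=np.float32)
np.set_printoptions(precision=3, suppress=True)
print(gelu_np(b))
```

This prints the same matrix as the TensorFlow example. Note also that recent TensorFlow 2.x releases (2.4 and later) ship a built-in `tf.nn.gelu(x, approximate=True)` that computes this same approximation directly.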