# Understand the tanh(x) Activation Function: Why We Use It in Neural Networks

October 17, 2020

The tanh(x) activation function is widely used in neural networks. In this tutorial, we will discuss some of its features and explain why we use it in neural networks.

## tanh(x)

tanh(x) is defined as:

$tanh(x)=\frac{e^{x}-e^{-x}}{e^{x}+e^{-x}}$

The graph of tanh(x) is an S-shaped curve that passes through the origin and saturates toward -1 and 1 as x decreases or increases.
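As a quick sanity check, the definition can be evaluated directly from exponentials and compared against the built-in implementation (a minimal sketch using only Python's standard library):

```python
import math

def tanh_from_definition(x):
    # tanh(x) = (e^x - e^-x) / (e^x + e^-x)
    return (math.exp(x) - math.exp(-x)) / (math.exp(x) + math.exp(-x))

# agrees with math.tanh up to floating-point rounding
print(tanh_from_definition(1.0), math.tanh(1.0))
```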

Evaluating it at a few points:

• tanh(1) = 0.761594156
• tanh(1.5) = 0.905148254
• tanh(2) = 0.96402758
• tanh(3) = 0.995054754
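These values can be reproduced with Python's standard-library `math.tanh`:

```python
import math

# evaluate tanh at the points listed above
for x in (1, 1.5, 2, 3):
    print(f"tanh({x}) = {math.tanh(x):.9f}")
```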

## Features of tanh(x)

tanh(x) has some important features:

• tanh(x) ∈ (-1, 1)
• it is nonlinear and differentiable

## tanh(x) derivative

The derivative is:

$tanh'(x)=1-tanh^2(x)$

Its graph is a bell-shaped curve: the derivative reaches its maximum value of 1 at x = 0 and decays toward 0 as |x| grows.
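The identity can be spot-checked numerically by comparing the closed form with a central finite difference (a sketch using only the standard library):

```python
import math

def tanh_prime(x):
    # closed-form derivative: 1 - tanh(x)^2
    return 1.0 - math.tanh(x) ** 2

def numeric_derivative(f, x, h=1e-6):
    # central finite difference approximation of f'(x)
    return (f(x + h) - f(x - h)) / (2 * h)

for x in (-2.0, 0.0, 1.0):
    print(x, tanh_prime(x), numeric_derivative(math.tanh, x))
```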

## Why should we use tanh(x) in neural networks?

There are two main reasons:

1. tanh(x) limits its output to the range (-1, 1), keeping activations bounded
2. tanh(x) is nonlinear and differentiable, so stacked layers can represent nonlinear functions and can be trained with gradient-based methods such as backpropagation
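Both points show up in a single-neuron forward pass (a minimal sketch; the weight and bias values are made-up examples): the pre-activation w*x + b is linear and unbounded, while wrapping it in tanh makes the output nonlinear and confined to (-1, 1).

```python
import math

def neuron(x, w=0.8, b=0.1):
    # w and b are arbitrary example values, not from the article
    z = w * x + b          # linear pre-activation, unbounded
    return math.tanh(z)    # nonlinear, bounded output in (-1, 1)

for x in (-5.0, 0.0, 5.0):
    print(x, neuron(x))
```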

## Useful Equations

$tanh(x+y)=\frac{tanh(x)+tanh(y)}{1+tanh(x)tanh(y)}$

$tanh(x-y)=\frac{tanh(x)-tanh(y)}{1-tanh(x)tanh(y)}$

$tanh(2x)=\frac{2tanh(x)}{1+tanh^2(x)}$
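These identities can be spot-checked numerically (the test points x and y are arbitrary):

```python
import math

t = math.tanh
x, y = 0.7, -0.3  # arbitrary test points

# addition formula: tanh(x+y) = (tanh x + tanh y) / (1 + tanh x tanh y)
lhs_add = t(x + y)
rhs_add = (t(x) + t(y)) / (1 + t(x) * t(y))

# subtraction formula: tanh(x-y) = (tanh x - tanh y) / (1 - tanh x tanh y)
lhs_sub = t(x - y)
rhs_sub = (t(x) - t(y)) / (1 - t(x) * t(y))

# double-argument formula: tanh(2x) = 2 tanh x / (1 + tanh^2 x)
lhs_dbl = t(2 * x)
rhs_dbl = 2 * t(x) / (1 + t(x) ** 2)

print(lhs_add, rhs_add)
print(lhs_sub, rhs_sub)
print(lhs_dbl, rhs_dbl)
```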