Implement Highway Networks in TensorFlow: A Step Guide – TensorFlow Tutorial

March 16, 2022

In this tutorial, we will use an example to show you how to implement highway networks in TensorFlow.

Highway Networks

You can view detailed information on highway networks in this tutorial:

A Beginner Introduction to Highway Networks – Machine Learning Tutorial

How to implement highway networks in tensorflow?

Here is a TensorFlow function for creating a highway network layer:

def highwaynet(inputs, scope):
  with tf.variable_scope(scope):
    # H: the transform path, a dense layer with ReLU activation
    H = tf.layers.dense(
      inputs,
      units=128,
      activation=tf.nn.relu,
      name='H')
    # T: the transform gate, a dense layer with sigmoid activation;
    # the negative bias keeps the gate mostly closed at initialization
    T = tf.layers.dense(
      inputs,
      units=128,
      activation=tf.nn.sigmoid,
      name='T',
      bias_initializer=tf.constant_initializer(-1.0))
    # Blend the transformed output H with the carried-over input
    return H * T + inputs * (1.0 - T)

Here T is the transform gate in a highway network.
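Written out as an equation, a highway layer computes:

```latex
y = H(x, W_H) \odot T(x, W_T) + x \odot \big(1 - T(x, W_T)\big)
```

When the gate T is close to 0, the layer passes the input x through almost unchanged; when T is close to 1, the output is dominated by the transform H.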

In this function, we use tf.layers.dense() to create a feed-forward network H with the ReLU activation function. Of course, you can replace H with your own network. However, the gate T stays the same.
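The gating arithmetic itself is independent of TensorFlow. Here is a minimal pure-Python sketch (not part of the tutorial's code) that shows how T interpolates element-wise between the transform H and the identity path:

```python
# Element-wise highway combination: y = H * T + x * (1 - T).
# highway_combine is a hypothetical helper, written only to
# illustrate the formula on plain Python lists.
def highway_combine(h, t, x):
    return [hi * ti + xi * (1.0 - ti) for hi, ti, xi in zip(h, t, x)]

x = [1.0, 2.0, 3.0]   # input (carry path)
h = [0.5, -0.5, 0.0]  # transformed output

print(highway_combine(h, [0.0, 0.0, 0.0], x))  # gate closed -> identity: [1.0, 2.0, 3.0]
print(highway_combine(h, [1.0, 1.0, 1.0], x))  # gate open -> transform: [0.5, -0.5, 0.0]
```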

Understand tf.layers.Dense(): How to Use and Regularization – TensorFlow Tutorial
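One detail worth noting in the function above is the bias_initializer of -1.0 for T. With the gate's weights near zero at initialization, its pre-activation is about -1, and sigmoid(-1) ≈ 0.27, so roughly 73% of each output initially comes from the carry (identity) path. This is a common trick to make deep highway stacks trainable from the start. A quick check:

```python
import math

# Gate value at initialization: sigmoid of the initial bias (-1.0)
gate = 1.0 / (1.0 + math.exp(-(-1.0)))
print(round(gate, 3))  # 0.269
```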

Here is a complete evaluation example:

import tensorflow as tf

def highwaynet(inputs, scope):
  with tf.variable_scope(scope):
    H = tf.layers.dense(
      inputs,
      units=128,
      activation=tf.nn.relu,
      name='H')
    T = tf.layers.dense(
      inputs,
      units=128,
      activation=tf.nn.sigmoid,
      name='T',
      bias_initializer=tf.constant_initializer(-1.0))
    return H * T + inputs * (1.0 - T)

# Create a random 32 x 128 input tensor
w = tf.Variable(tf.glorot_uniform_initializer()([32, 128]), name="w")

v = highwaynet(w, scope='highwaynet')
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    x = sess.run(v)
    print(x.shape)
    print(x)

We will get an x with shape (32, 128):

(32, 128)
[[-0.10192373 -0.01479287 -0.04186717 ...  0.12031765  0.14088765
   0.1117066 ]
 [ 0.01563119 -0.08516118 -0.01701555 ... -0.03293652  0.01203252
   0.12288218]
 [-0.09575637  0.11603582 -0.03526285 ...  0.15853445  0.01996651
   0.06135607]
 ...
 [ 0.16240281 -0.06246712 -0.07050803 ...  0.13739622  0.05379849
   0.10180917]
 [ 0.12460369  0.09667172  0.09879336 ...  0.09712815  0.02070327
  -0.08640412]
 [ 0.00021924  0.06265902  0.16971348 ...  0.1329451  -0.10995799
  -0.04932537]]
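The code above uses TF1 APIs (tf.layers, tf.variable_scope, tf.Session), which are only available through tf.compat.v1 in TensorFlow 2. As a rough sketch, the same layer can be written in TF2/Keras style; the 128-unit size, layer names, and gate bias mirror the tutorial's function, but this is an equivalent reimplementation, not the tutorial's exact code:

```python
import tensorflow as tf

# A TF2/Keras sketch of the highwaynet() function above
class HighwayLayer(tf.keras.layers.Layer):
    def __init__(self, units=128, **kwargs):
        super().__init__(**kwargs)
        self.H = tf.keras.layers.Dense(units, activation="relu", name="H")
        self.T = tf.keras.layers.Dense(
            units,
            activation="sigmoid",
            name="T",
            # Negative bias keeps the gate mostly closed at the start,
            # so the layer initially behaves close to an identity mapping.
            bias_initializer=tf.constant_initializer(-1.0))

    def call(self, inputs):
        h = self.H(inputs)
        t = self.T(inputs)
        return h * t + inputs * (1.0 - t)

x = tf.random.uniform([32, 128])
y = HighwayLayer()(x)
print(y.shape)  # (32, 128)
```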