Can End-To-End Memory Networks Solve Time Series Problems? – A Simple Tutorial for NLP Beginners

By | June 10, 2019

The End-To-End Memory Network as described in the paper:

If you ask me whether End-To-End Memory Networks can solve time series problems the way an LSTM does, my answer is that the model in the paper cannot. I call this model the basic memory network. It has no notion of time order, unlike an LSTM.

Here is the reason why.

The basic structure of a memory network looks like this:

As inputs, xi and q are simply words; each sentence xi is mapped to a memory vector mi, and the query q is mapped to a vector u, by summing word embeddings.

Here you will find that mi and u carry no time information.

For example:

Sentence A: I like this movie

Sentence B: This movie I like

You will find that these two sentences map to the same mi, because they contain exactly the same words.
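A minimal sketch of why this happens, assuming a toy vocabulary and a random embedding matrix A (both hypothetical, for illustration only): summing word embeddings discards word order, so the two sentences get identical memory vectors.

```python
import numpy as np

# Hypothetical toy vocabulary and embedding matrix A (random, for illustration).
vocab = {"i": 0, "like": 1, "this": 2, "movie": 3}
rng = np.random.default_rng(0)
A = rng.standard_normal((len(vocab), 8))  # embedding dimension 8

def memory_vector(sentence):
    # Bag-of-words encoding: mi is the sum of word embeddings,
    # so word order is discarded.
    return sum(A[vocab[w]] for w in sentence.lower().split())

m_a = memory_vector("I like this movie")
m_b = memory_vector("This movie I like")
print(np.allclose(m_a, m_b))  # True: identical memory vectors
```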

As for pi:

You will also find that sentence A and sentence B produce the same pi.

The same goes for the output o:

Sentence A and sentence B also produce the same o. However, if you use an LSTM, you will find that sentences A and B are different in the embedding space.
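This contrast can be sketched side by side. The recurrent encoder below is a minimal tanh RNN standing in for an LSTM (hypothetical weights, for illustration): the sum encoding is permutation-invariant, while the recurrent hidden state depends on word order.

```python
import numpy as np

rng = np.random.default_rng(1)
vocab = {"i": 0, "like": 1, "this": 2, "movie": 3}
E = rng.standard_normal((len(vocab), 4))   # toy word embeddings
Wh = rng.standard_normal((4, 4))           # recurrent weights
Wx = rng.standard_normal((4, 4))           # input weights

def sum_encode(words):
    # Memory-network style: permutation-invariant sum of embeddings.
    return sum(E[vocab[w]] for w in words)

def rnn_encode(words):
    # A minimal tanh RNN (standing in for an LSTM): the final
    # hidden state depends on the order of the words.
    h = np.zeros(4)
    for w in words:
        h = np.tanh(Wh @ h + Wx @ E[vocab[w]])
    return h

a = "i like this movie".split()
b = "this movie i like".split()
print(np.allclose(sum_encode(a), sum_encode(b)))  # True: same representation
print(np.allclose(rnn_encode(a), rnn_encode(b)))  # False: order matters
```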

On the other hand, if the inputs xi (and hence the output o) are given time-series structure, for example by pre-processing them with an LSTM or Bi-LSTM, then memory networks can also handle time series problems.
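One minimal sketch of that idea, again using a toy tanh RNN in place of an LSTM (all weights and names are hypothetical): each sentence is encoded into an order-aware memory vector, and the usual attention read-out pi = softmax(u·mi), o = Σ pi·mi is applied on top.

```python
import numpy as np

rng = np.random.default_rng(2)
d = 4
vocab = {"i": 0, "like": 1, "this": 2, "movie": 3, "do": 4, "you": 5}
E = rng.standard_normal((len(vocab), d))
Wh = rng.standard_normal((d, d))
Wx = rng.standard_normal((d, d))

def rnn_encode(sentence):
    # Order-aware sentence encoder (toy tanh RNN in place of an LSTM).
    h = np.zeros(d)
    for w in sentence.lower().split():
        h = np.tanh(Wh @ h + Wx @ E[vocab[w]])
    return h

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

# Memories built from RNN states now differ for reordered sentences.
memories = np.stack([rnn_encode(s) for s in
                     ["I like this movie", "This movie I like"]])
u = rnn_encode("do you like this movie")  # query vector
p = softmax(memories @ u)                 # attention weights pi
o = p @ memories                          # output o = sum_i pi * mi
print(p.shape, o.shape)  # (2,) (4,)
```

Because the memory vectors are no longer permutation-invariant, the attention weights and the output o can now distinguish the two word orders.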