# LSTM Python

LSTM Python refers to building simple Long Short-Term Memory (LSTM) models with the flexible Python programming language, typically when working with big data. Python is well suited for fast scripting and rapid application development, which in turn makes it attractive for the machine learning modeling process in general and for LSTMs in particular. **One key reason is that this modeling process requires an iterative and highly flexible approach to network topology prototyping and hyper-parameter tuning, such as finding the right number of neurons, batch size, number of epochs, regularization values, etc.**

Keras is a high-level deep learning library implemented in Python that runs on top of lower-level deep learning frameworks such as TensorFlow, CNTK, or Theano. **The key idea behind using Keras for LSTM Python models is to enable faster experimentation with these recurrent neural networks.** The resulting deep learning models run seamlessly on CPUs and GPUs via those low-level frameworks, so one can focus on designing the LSTM network topology and setting its parameters. The following code shows the key building blocks for creating a simple LSTM Python model very quickly. Close to a graphical representation of a deep learning network, the Sequential model in Keras provides a linear stack of layers. One way to add layers to a model is the `.add()` method, which in the example below is used to add one LSTM layer and one Dense layer.

```python
# imports for specific model elements
from keras.models import Sequential
from keras.layers import LSTM
from keras.layers import Dense
...

# design network topology
model = Sequential()
model.add(LSTM(...))
model.add(Dense(...))
```
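As a minimal runnable sketch, the placeholders above can be filled in with concrete values. The input shape (10 timesteps, 1 feature), 50 LSTM units, and single output unit below are illustrative assumptions, not values prescribed by the text:

```python
from keras.models import Sequential
from keras.layers import Input, LSTM, Dense

# design network topology with concrete (assumed) parameters
model = Sequential()
model.add(Input(shape=(10, 1)))  # sequences of 10 timesteps with 1 feature each
model.add(LSTM(50))              # fully connected LSTM layer with 50 units
model.add(Dense(1))              # fully connected output layer with 1 unit

# compile for a regression-style task (loss/optimizer are common defaults)
model.compile(loss='mean_squared_error', optimizer='adam')
```

After compiling, the model can be trained with `model.fit(X, y, epochs=..., batch_size=...)` on data shaped `(samples, 10, 1)`.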

**The example above is the simplest LSTM Python model, consisting of an input layer, a fully connected LSTM layer, and a fully connected output layer (Dense).** There are more elaborate LSTM architectures, such as stacked LSTMs or encoder-decoder LSTM models.
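A stacked LSTM can be sketched in the same style. The key detail is that every LSTM layer except the last must be created with `return_sequences=True`, so it emits one output vector per timestep for the next LSTM layer to consume. The shapes and unit counts here are illustrative assumptions:

```python
from keras.models import Sequential
from keras.layers import Input, LSTM, Dense

# stacked LSTM: two recurrent layers before the output layer
model = Sequential()
model.add(Input(shape=(10, 1)))              # 10 timesteps, 1 feature (assumed)
model.add(LSTM(32, return_sequences=True))   # outputs the full sequence: (batch, 10, 32)
model.add(LSTM(16))                          # outputs only the last step: (batch, 16)
model.add(Dense(1))                          # single-value output
model.compile(loss='mean_squared_error', optimizer='adam')
```

Forgetting `return_sequences=True` on the first LSTM layer is a common source of shape errors, because the second LSTM layer then receives a 2D tensor instead of the 3D `(batch, timesteps, features)` input it expects.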

## LSTM Python Details

We recommend taking a look at the following video: