A tour of recurrent neural networks

Denisa Buzan

15/6/2021 - 4 min read

Key Takeaways

Machine learning unlocks tremendous insights into how humans work, while providing incredibly powerful tools for pattern prediction, trend forecasting, and sentiment analysis. Understanding machine learning is just the first step into the future.

The adoption of machine learning in an era that has embraced technology has given us the power to utilize information as a resource of immense value — if interpreted correctly.

Its business value has increased tremendously and is still growing, since we can now extract information from the client side in great volumes while retaining high quality. With this newfound knowledge, we are (ideally) able to accurately predict patterns, forecast trends, and understand the behavior and habits of an application’s users.

In order to uncover this treasure of useful information in a vast sea of noise, we can implement a variety of algorithms.

One particular category under which all of these fall is neural networks. Their ability to mimic aspects of human brain activity makes them a strong candidate for many real-world problems.

This article is part of a series that will focus on one of the most remarkable architectures developed in recent decades: the Recurrent Neural Network.

A short introduction to RNN

Recurrent Neural Networks (RNNs) have enjoyed great popularity in the machine learning field since they were first proposed in the 1980s as a way of handling and forecasting sequential information, that is, order-dependent data.

RNNs are called recurrent because they perform the same task for every element of a sequence, with the output being dependent on the previous computations.

When it comes to their application, using RNNs over more traditional ANNs (Artificial Neural Networks) has proven advantageous in systems that involve speech recognition, time-series prediction, handwriting recognition, music composition, or short-term traffic forecasting.

One of the most prominent applications of RNNs can be found in the field of sentiment analysis. Often applied in Market Research, sentiment analysis is considered to be a powerful tool when it is used to collect information about some aspects of the market, such as the audience’s reaction to a product.

Sentiment analysis is one of the most exciting applications of recurrent neural networks. The reason for that is simple — versatility.

The RNN concept

Before diving deeper into the RNN field, let’s take a glance at traditional artificial neural networks, also known as feed-forward neural networks. In a traditional neural network, the information moves linearly in one direction, starting from the input layer, sequentially passing through the hidden layers, until it reaches the output layer.

An illustration of an ANN’s architecture can be seen in Figure 1. It consists of an input layer, followed by one or more hidden layers, and finally the output layer.

Figure 1. The architecture of an artificial neural network
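
To make that one-directional flow concrete, here is a minimal NumPy sketch of a feed-forward pass; the layer sizes and the tanh activation are arbitrary choices made for this illustration, not details taken from the figure.

```python
import numpy as np

# A minimal feed-forward pass: input -> hidden -> output, information flows one way only.
# The layer sizes and the tanh activation are arbitrary choices for this example.
rng = np.random.default_rng(0)

n_in, n_hidden, n_out = 4, 8, 2
W_hidden = rng.normal(scale=0.1, size=(n_hidden, n_in))
b_hidden = np.zeros(n_hidden)
W_out = rng.normal(scale=0.1, size=(n_out, n_hidden))
b_out = np.zeros(n_out)

def feed_forward(x):
    h = np.tanh(W_hidden @ x + b_hidden)  # hidden layer
    return W_out @ h + b_out              # output layer; no state is kept between calls

print(feed_forward(rng.normal(size=n_in)))
```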

In contrast to this, just like the human brain, an RNN has a 'memory' that holds the information computed in the previous step. This information is carried over time through the so-called 'hidden state'.

A visual representation of a recurrent neural network can be seen in Figure 2:

Figure 2: Representation of a recurrent neural network

Now, let us unpack the RNN and present it as a sequence of neural networks, through which information can circulate. In order to do this, we will take a glance at Figure 3 below, which illustrates an unrolled recurrent neural network.

Structurally speaking, this type of network consists of a chain of multiple feed-forward neural networks (labeled A) that receive a vector of inputs (labeled x) in order to produce a vector of outputs (labeled y). During the feedforward process, at every timestep, the network receives not only the current input but also information from the previous timestep in order to create an output.

Figure 3: Illustration of an unrolled recurrent neural network

Supposing we are feeding the RNN with a sentence, we will first encode every word of it. So, when it has to read the second word in the sentence, x2, instead of only predicting y2 using x2, it also receives some information from timestep 1. This process is repeated, and the information computed at every timestep is passed on as an activation to the next timestep.
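
To make the role of the hidden state concrete, the following is a minimal NumPy sketch of the recurrent forward pass described above. The weight shapes, the tanh activation, and the toy 'encoded sentence' are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_hidden, n_out = 4, 8, 3
W_xh = rng.normal(scale=0.1, size=(n_hidden, n_in))      # input  -> hidden
W_hh = rng.normal(scale=0.1, size=(n_hidden, n_hidden))  # hidden -> hidden (the "memory")
W_hy = rng.normal(scale=0.1, size=(n_out, n_hidden))     # hidden -> output
b_h = np.zeros(n_hidden)
b_y = np.zeros(n_out)

def rnn_forward(xs):
    """Run one forward pass over a sequence of input vectors xs."""
    h = np.zeros(n_hidden)          # hidden state, starts empty
    outputs = []
    for x in xs:                    # the same weights are reused at every timestep
        h = np.tanh(W_xh @ x + W_hh @ h + b_h)   # new state depends on input AND previous state
        outputs.append(W_hy @ h + b_y)
    return outputs, h

# A toy "encoded sentence": five words, each encoded as a 4-dimensional vector.
sentence = [rng.normal(size=n_in) for _ in range(5)]
ys, final_state = rnn_forward(sentence)
print(len(ys), final_state.shape)
```

Note how the same three weight matrices are reused at every timestep, and how each new hidden state depends on both the current input and the previous state.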

In order to train the network, the feedforward process has to be followed by the back-propagation one.

Back-propagation through time

RNNs follow a back-propagation through time (BPTT) technique, developed independently by several researchers, in order to learn how to retain the relevant information.

Being an extension of the back-propagation algorithm, BPTT is applied to sequential data, like time series.

Figure 4: Illustration of an unrolled recurrent neural network

For a better understanding of the BPTT process, we will present the steps that describe it:

  1. First, the network receives a sequence of input-output pairs. In Figure 4, X represents the input sequence while y is the output vector.
  2. After the network is unrolled in time, the error is computed at each timestep and the errors are added up. For a better understanding, Figure 4 illustrates how all the losses are accumulated together in a new variable called L_total.
  3. The next step consists of computing the gradient of the total loss with respect to each parameter and adjusting the weights to minimize the loss function.

BPTT can be computationally expensive as the number of timesteps increases.
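
The three steps above can be condensed into a short NumPy sketch. This is a toy illustration under assumed choices (a tanh cell, a squared-error loss at every timestep, plain gradient descent), not a reference implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 4, 8, 3

# Same parameter layout as in the forward-pass sketch above (example shapes, not from the article).
W_xh = rng.normal(scale=0.1, size=(n_hidden, n_in))
W_hh = rng.normal(scale=0.1, size=(n_hidden, n_hidden))
W_hy = rng.normal(scale=0.1, size=(n_out, n_hidden))
b_h, b_y = np.zeros(n_hidden), np.zeros(n_out)

def bptt_step(xs, targets, lr=0.01):
    """One training step: forward pass, total loss, back-propagation through time, weight update."""
    # Step 1: forward pass over the whole input/target sequence, caching hidden states and outputs.
    hs, ys = {-1: np.zeros(n_hidden)}, {}
    L_total = 0.0
    for t, x in enumerate(xs):
        hs[t] = np.tanh(W_xh @ x + W_hh @ hs[t - 1] + b_h)
        ys[t] = W_hy @ hs[t] + b_y
        L_total += 0.5 * np.sum((ys[t] - targets[t]) ** 2)   # Step 2: losses added up into L_total

    # Step 3: gradients of L_total w.r.t. every parameter, walking backwards through time.
    dW_xh, dW_hh, dW_hy = np.zeros_like(W_xh), np.zeros_like(W_hh), np.zeros_like(W_hy)
    db_h, db_y = np.zeros_like(b_h), np.zeros_like(b_y)
    dh_next = np.zeros(n_hidden)
    for t in reversed(range(len(xs))):
        dy = ys[t] - targets[t]                  # gradient of the squared error at timestep t
        dW_hy += np.outer(dy, hs[t]); db_y += dy
        dh = W_hy.T @ dy + dh_next               # gradient flowing into h_t from the output and from t+1
        dz = dh * (1.0 - hs[t] ** 2)             # back through the tanh non-linearity
        dW_xh += np.outer(dz, xs[t]); dW_hh += np.outer(dz, hs[t - 1]); db_h += dz
        dh_next = W_hh.T @ dz                    # pass the gradient one timestep further back

    # Adjust the weights to reduce the loss (plain gradient descent).
    for param, grad in [(W_xh, dW_xh), (W_hh, dW_hh), (W_hy, dW_hy), (b_h, db_h), (b_y, db_y)]:
        param -= lr * grad
    return L_total

xs = [rng.normal(size=n_in) for _ in range(5)]
targets = [rng.normal(size=n_out) for _ in range(5)]
print(bptt_step(xs, targets))
```

The repeated multiplication by W_hh in the backward loop is also where the vanishing and exploding gradient problems mentioned below originate, and it is why the cost grows with the number of timesteps.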

Advantages and disadvantages of RNNs

Advantages of Recurrent Neural Networks:

  • RNNs are extremely useful when it comes to time-series prediction, because they can remember previous inputs.
  • An RNN can process inputs of any length, and the model size does not grow with the input length. This makes the network more flexible than traditional artificial neural networks.
  • In an RNN, weights are shared across timesteps, which results in fewer parameters and a lower memory cost.

Disadvantages of Recurrent Neural Networks:

  • The occurrence of the 'vanishing gradient' or 'exploding gradient' problem (this will be explained later in the series).
  • Computation is slow, and training an RNN can become very difficult.
  • It becomes difficult for the network to access information that was given to it many timesteps in the past.

Conclusion

Recurrent Neural Networks as a concept represent a good option for processing large amounts of information but they are not an all-around solution, having both advantages and disadvantages compared to other types of networks.

They facilitate an advanced and exciting way of developing applications, by first simulating and subsequently taking advantage of the versatility of the human brain.

Stay tuned to find out more about RNN and its types in our next articles.

Further reading:

A Gentle Tutorial of Recurrent Neural Network with Error Back-propagation

Back-propagation through time — Recurrent Neural Networks

Recurrent Neural Networks Applications Guide

Recurrent Neural Networks, Part 1 — Introduction to RNNs

Tags

neural network;recurrent neural network;neural network advantages;neural network disadvantages;pattern prediction;data processing;artificial neural network architecture

Denisa Buzan

Denisa Buzan is a past Lead Data Scientist at Linnify.

She is truly passionate about Data Science, AI/ML Software Development, and Bioengineering. Denisa also holds an MA in Applied Computational Intelligence.
