Deep Learning
What is this mythical beast I keep hearing about? Today, Deep Learning is a buzzword for a well-deserved reason. Some of its applications include:
- Automatic Machine Translation
- Object Classification in Photographs
- Adding Sounds to Silent Movies
- Image Caption Generation
- Voice Search & Voice-Activated Assistants
Most of us have come across at least one application on our phones that uses deep learning. We also keep hearing about self-driving cars, which reminds me of Back to the Future, I, Robot, and so on. Self-driving cars were imagined in so many movies, and now we finally have them as a reality. To understand what made all this possible and how it works, let's tackle this topic layer by layer:
Series:
- The Origins: Every superhero has an Origin story! We will take a look at the overall motive and the inspiration behind neural networks. Link
- Architecture: What are Neural Networks made of? Understanding the components of Neural Networks. Link
- Inside the Black box: What is going on inside this Black box algorithm? Trying to build intuition and understanding of what is going on in the different layers of a neural network. Link
- Mechanics of Deep Learning: Understand the concepts of Gradient Descent and Back-propagation to get some idea of how Neural Networks learn. Warning: some math involved! Don’t worry, we will first explain it in an intuitive manner and then explore some of the math behind it. Link
- Activation: Understand the different types of activation functions and explore their characteristics. Link
- Learning Rates: An optimal learning rate not only helps manage the computational load but also helps prevent over-fitting. It is one of the most important hyperparameters, and we will go over some techniques for choosing a learning rate. I am working on this post and will update the link soon.
- Invasive Species: Don’t get alarmed, we are going to put it into practice on a playground kaggle data set explaining the code along the way. Link
- Convolutional Neural Networks: CNNs are a bit different. In the post titled “Invasive Species” we use a CNN to train on a data set of images and classify them as invasive/harmless. Let’s take a look at what was happening in more detail. However, I am still in the process of putting together a post to explain this concept.
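Before the series dives in, here is a minimal sketch of the gradient-descent idea mentioned above, using plain Python and a toy loss function chosen purely for illustration (the loss, learning rate, and step count are not from any of the posts):

```python
# Toy example: gradient descent nudging a single parameter w downhill
# on the loss (w - 3)^2, whose minimum sits at w = 3.

def loss(w):
    # toy loss: (w - 3)^2
    return (w - 3) ** 2

def grad(w):
    # derivative of the toy loss with respect to w
    return 2 * (w - 3)

def gradient_descent(w=0.0, lr=0.1, steps=100):
    for _ in range(steps):
        w = w - lr * grad(w)  # step in the direction opposite the gradient
    return w

print(gradient_descent())  # converges close to 3
```

Each step shrinks the distance to the minimum by a constant factor here; the series explores what happens when the loss surface is far less friendly than this parabola.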
References
This blog series was put together using several references, and it is only fair to point out some of them so that readers can draw understanding and inspiration from the same sources.
- Walter Pitts and Warren McCulloch
- Frank Rosenblatt’s Perceptron
- Ian Goodfellow, Yoshua Bengio, and Aaron Courville. Deep Learning
- CS 231n
- He et al.
- Fjodor Van Veen
- Neural Networks Feature Visualization
- Neural Networks Deep Visualization
- Efficient Backprop
- fastai
- Deep Sparse Rectifier
- Nielsen
- Gradient Based Learning for Document Recognition
- Understand CNN
- ImageNet Classification
- Gradient Descent
- Cyclical Learning Rates for Training Neural Networks
I will continue to update this list of references and add more articles as we progress on our deep learning exploration.
About Us
Data science discovery is a step on the path of your data science journey. Please follow us on LinkedIn to stay updated.
About the writers:
- Ankit Gadi: A knack and passion for data science, coupled with a strong foundation in Operations Research and Statistics, helped me embark on my data science journey.