Using a Keras Long Short-Term Memory (LSTM) Model to Predict Stock Prices

Articles

Editor’s note: This tutorial illustrates how to get started forecasting time series with LSTM models. Stock market data is a great choice for this because it’s sampled at regular intervals and widely available. Please don’t take this as financial advice or use it to make any trades of your own.

In this tutorial, we’ll build a Python deep learning model that will predict the future behavior of stock prices. We assume that the reader is familiar with the concepts of deep learning in Python, especially Long Short-Term Memory.
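The tutorial itself builds the full Keras model; as a taste, here is a minimal sketch of the sliding-window step that turns a raw price series into the `(samples, timesteps, features)` arrays a Keras LSTM layer expects. The function name, window size, and toy prices are illustrative, not taken from the tutorial.

```python
import numpy as np

def make_windows(series, window=5):
    """Slice a 1-D price series into overlapping (input, target) pairs:
    each sample is `window` consecutive prices; the target is the next price."""
    X, y = [], []
    for i in range(len(series) - window):
        X.append(series[i:i + window])
        y.append(series[i + window])
    # Keras LSTM layers expect input shaped (samples, timesteps, features)
    return np.array(X).reshape(-1, window, 1), np.array(y)

prices = np.array([10.0, 10.5, 10.2, 10.8, 11.0, 11.3, 11.1, 11.6])
X, y = make_windows(prices, window=5)
print(X.shape, y.shape)  # (3, 5, 1) (3,)
```

Arrays shaped like `X` can then be fed to a model whose first layer is `keras.layers.LSTM`, with `y` as the regression target.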

Continue reading Using a Keras Long Short-Term Memory (LSTM) Model to Predict Stock Prices

Using foreground services for executing long-running processes in Android

Articles

In my last blog, I talked about how devs can use Kotlin coroutines to efficiently handle long-running tasks in their apps.

The method outlined there works well while the user is actively using your app, but as soon as the user exits, the system kills the app and all the processes it spawned. I ran into this issue while working on AfterShoot, where I had to run my machine learning model over all of a given user’s images.

Continue reading Using foreground services for executing long-running processes in Android

Using DeOldify to Colorize and Restore Grayscale Images and Videos

Articles

Image colorization is an engaging topic in the field of image-to-image translation. Even though color photography was invented in 1907, it didn’t become popular for the average person until the 1960s because it was expensive and inaccessible. Until then, nearly all photography and videography was done in black and white. Colorizing these images was impossible, until the DeOldify deep learning model came to life.

Continue reading Using DeOldify to Colorize and Restore Grayscale Images and Videos

Transfer Learning with PyTorch

Articles

When we learn something in our daily lives, similar things become much easier to learn because we apply our existing knowledge to the new task. For example: when I learned how to ride a bicycle, it became very easy to learn how to ride a motorcycle, because from riding the bicycle I already knew how to sit and maintain balance, hold the handles firmly, and pedal to accelerate. Using that prior knowledge, I could easily adapt to a motorcycle’s design and how to ride it. That is the general idea behind transfer learning.

Continue reading Transfer Learning with PyTorch

Understanding the Mathematics behind Principal Component Analysis

Articles

In this post, we’re going to learn the foundations of a very famous and interesting dimensionality reduction technique known as principal component analysis (PCA).

Specifically, we’re going to learn what principal components are, how the data’s variance concentrates along them, and how their orthogonality makes it easier to extract the most important structure in the data.

In other words, principal component analysis is a procedure for reducing the dimensionality of a variable space by representing it with a few orthogonal (uncorrelated) variables that capture most of its variability.
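The full post walks through the mathematics; as a quick preview, here is a minimal NumPy sketch of the textbook recipe — center the data, form the covariance matrix, and eigendecompose it so the eigenvectors become the principal components. The synthetic data and variable names are my own, for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
# Correlated 2-D data: the second coordinate is a noisy copy of the first,
# so most of the variance lies along a single direction.
x = rng.normal(size=200)
data = np.column_stack([x, x + 0.1 * rng.normal(size=200)])

# 1) Center the data, 2) form the covariance matrix,
# 3) eigendecompose it: the eigenvectors are the principal components.
centered = data - data.mean(axis=0)
cov = np.cov(centered, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)          # eigh returns ascending order
order = np.argsort(eigvals)[::-1]                # sort largest-variance first
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

explained = eigvals / eigvals.sum()
print(explained)  # the first component captures almost all the variance
```

The orthogonality mentioned above shows up directly: the dot product of any two distinct principal components is zero.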

Continue reading Understanding the Mathematics behind Principal Component Analysis

Understanding the Mathematics Behind Naive Bayes

Articles

In this post, we’re going to dive deep into one of the most popular and simplest machine learning classification algorithms: Naive Bayes, which is based on the Bayes Theorem for calculating probabilities and conditional probabilities.

Before we jump into the Naive Bayes classifier/algorithm, we need to know the fundamentals of Bayes Theorem, on which it’s based.
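The post covers those fundamentals in depth; as a tiny worked preview, here is Bayes Theorem applied to a toy spam-filter question. All the numbers are made up for illustration.

```python
# Bayes Theorem: P(A|B) = P(B|A) * P(A) / P(B)
# Toy question: given that a mail contains the word "offer",
# what is the probability it is spam?
p_spam = 0.2                 # prior: 20% of all mail is spam
p_word_given_spam = 0.6      # "offer" appears in 60% of spam
p_word_given_ham = 0.05      # ...and in 5% of legitimate mail

# Law of total probability: overall chance of seeing the word at all
p_word = p_word_given_spam * p_spam + p_word_given_ham * (1 - p_spam)

# Posterior: probability a mail containing "offer" is spam
p_spam_given_word = p_word_given_spam * p_spam / p_word
print(round(p_spam_given_word, 3))  # 0.75
```

The "naive" part of Naive Bayes is that, with many words instead of one, their conditional probabilities are simply multiplied together as if the words were independent.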

Continue reading Understanding the Mathematics Behind Naive Bayes