Generative algorithms have opened a new window for AI applications. Machine learning has traditionally been concerned with classifying or learning the behavior of a given process without trying to mimic it, or, more precisely, without generating similar behavior itself.
We've all witnessed the evolution of style transfer applications such as FaceApp, where a given image can be altered to add features like a beard, different hair, increased age, or even a smile or laugh.
Continue reading Artificial Art: How GANs are making machines creative
Knowledge distillation is a model compression technique whereby a small network (student) is taught by a larger trained neural network (teacher). The smaller network is trained to behave like the large neural network. This enables the deployment of such models on small devices such as mobile phones or other edge devices. In this guide, we’ll look at a couple of papers that attempt to tackle this challenge.
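As a rough illustration of the idea (not taken from any of the papers covered in the guide), here's a minimal distillation loss sketch in PyTorch; the temperature and weighting values are placeholder assumptions:

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Blend a soft-target loss (match the teacher) with hard-label cross-entropy.

    T is the softmax temperature; alpha weights the soft-target term.
    Both are illustrative defaults, not values from the papers in the guide.
    """
    # Soft targets: the student matches the teacher's softened output distribution
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: ordinary cross-entropy against the true labels
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```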
Continue reading Research Guide: Model Distillation Techniques for Deep Learning
One question app developers must ask themselves is: what do I do when users complain that an app feels laggy? The answer isn't always immediately clear, but most of the time it comes down to CPU-intensive tasks blocking the main thread; in other cases, these performance issues are related to memory.
Continue reading Profiling your app with Android Studio
Tracking is an important problem in computer vision: it involves following an object through a sequence of frames. An ID is assigned to an object the first time it appears, and that ID is carried forward in subsequent frames.
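As a toy illustration of that ID assignment step (the full tutorial uses a proper detector and tracker running on the Raspberry Pi), here's a minimal greedy IoU-matching sketch; the class name and threshold are assumptions for illustration:

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / float(area_a + area_b - inter + 1e-9)

class NaiveTracker:
    """Assign a persistent ID to each detection by greedy IoU matching."""

    def __init__(self, iou_threshold=0.3):
        self.next_id = 0
        self.tracks = {}  # id -> last known box
        self.iou_threshold = iou_threshold

    def update(self, detections):
        assigned = {}
        for box in detections:
            # Match against the best-overlapping track not already claimed this frame
            best_id, best_iou = None, self.iou_threshold
            for tid, prev in self.tracks.items():
                if tid in assigned.values():
                    continue
                score = iou(box, prev)
                if score > best_iou:
                    best_id, best_iou = tid, score
            if best_id is None:  # first appearance: assign a new ID
                best_id = self.next_id
                self.next_id += 1
            self.tracks[best_id] = box
            assigned[tuple(box)] = best_id
        return assigned
```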
Continue reading Real-Time Person Tracking on the Edge with a Raspberry Pi
Editor’s note: This tutorial illustrates how to get started forecasting time series with LSTM models. Stock market data is a great choice for this because it’s quite regular and widely available to everyone. Please don’t take this as financial advice or use it to make any trades of your own.
In this tutorial, we'll build a Python deep learning model that predicts the future behavior of stock prices. We assume the reader is familiar with deep learning in Python, especially Long Short-Term Memory (LSTM) networks.
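To give a flavor of the kind of model involved, here's a minimal Keras LSTM sketch; the window length, layer sizes, and training settings are placeholder assumptions, not the tutorial's exact values:

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

WINDOW = 60  # number of past time steps used to predict the next price

def make_windows(prices):
    """Slice a 1-D price series into (samples, WINDOW, 1) inputs and next-step targets."""
    X, y = [], []
    for i in range(len(prices) - WINDOW):
        X.append(prices[i:i + WINDOW])
        y.append(prices[i + WINDOW])
    return np.array(X)[..., np.newaxis], np.array(y)

model = Sequential([
    LSTM(50, return_sequences=True, input_shape=(WINDOW, 1)),
    LSTM(50),
    Dense(1),  # next-step price prediction
])
model.compile(optimizer="adam", loss="mse")
# X, y = make_windows(scaled_prices); model.fit(X, y, epochs=20, batch_size=32)
```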
Continue reading Using a Keras Long Short-Term Memory (LSTM) Model to Predict Stock Prices
In my last blog post, I talked about how devs can use Kotlin coroutines to efficiently handle long-running tasks in their apps:
The method outlined there works well while the user is actively using your app, but as soon as they exit, the system kills the app and all the processes it spawned. I faced this issue while working on AfterShoot, where I had to run my machine learning model over all of a given user's images.
Continue reading Using foreground services for executing long-running processes in Android
There are various techniques for handling text data in machine learning. In this article, we'll look at one such technique: working with word embeddings in Keras. For a deeper introduction to Keras, refer to this tutorial:
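As a minimal sketch of what an embedding layer looks like in a Keras text classifier (the vocabulary size, sequence length, and embedding dimension below are illustrative assumptions):

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, GlobalAveragePooling1D, Dense

VOCAB_SIZE = 10000   # number of distinct tokens kept from the corpus
MAX_LEN = 100        # padded/truncated sequence length
EMBED_DIM = 16       # size of each learned word vector

model = Sequential([
    # Maps each integer-encoded token to a dense EMBED_DIM-dimensional vector
    Embedding(input_dim=VOCAB_SIZE, output_dim=EMBED_DIM, input_length=MAX_LEN),
    GlobalAveragePooling1D(),        # average word vectors into one document vector
    Dense(1, activation="sigmoid"),  # e.g. a binary sentiment label
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```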
Continue reading Using a Keras Embedding Layer to Handle Text Data
Image colorization is an engaging topic in the field of image-to-image translation. Even though color photography was invented in 1907, it didn't become popular with the average person until the 1960s because it was expensive and inaccessible. Until then, nearly all photography and videography was done in black and white. Colorizing those images was all but impossible, until the DeOldify deep learning model came to life.
Continue reading Using DeOldify to Colorize and Restore Grayscale Images and Videos
When we learn something in our daily lives, similar things become much easier to learn because we apply our existing knowledge to the new task. For example: when I learned how to ride a bicycle, it became very easy to learn how to ride a motorcycle, because from riding the bicycle I already knew I had to sit and maintain balance, hold the handles firmly, and pedal to accelerate. Using that prior knowledge, I could easily adapt to a motorcycle's design and how it's driven. That is the general idea behind transfer learning.
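In PyTorch, this idea typically looks something like the sketch below: reuse a pretrained backbone, freeze it, and train only a new head. The model choice and class count are assumptions for illustration, not necessarily what the article uses:

```python
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 5  # placeholder: number of classes in the new task

# Start from a network that already "knows how to ride a bicycle":
# a ResNet-18 pretrained on ImageNet.
model = models.resnet18(pretrained=True)

# Freeze the pretrained layers so their existing knowledge stays intact
for param in model.parameters():
    param.requires_grad = False

# Replace the final layer with a fresh one for the new task;
# only this layer's weights will be trained.
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)
```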
Continue reading Transfer Learning with PyTorch
In this post, we’re going to learn the foundations of a very famous and interesting dimensionality reduction technique known as principal component analysis (PCA).
Specifically, we're going to learn what principal components are, how data is concentrated within them, and about the orthogonality properties that make it easier to extract the most important information.
In other words, PCA is a procedure for reducing the dimensionality of the variable space by representing it with a few orthogonal (uncorrelated) variables that capture most of its variability.
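As a small numerical sketch of that idea, here's a NumPy version of PCA via the eigendecomposition of the covariance matrix; the function name and defaults are assumptions, not code from the article:

```python
import numpy as np

def pca(X, n_components=2):
    """Project data onto the top principal components.

    X is an (n_samples, n_features) array; the components are the orthogonal
    eigenvectors of the covariance matrix with the largest eigenvalues.
    """
    X_centered = X - X.mean(axis=0)             # center each variable
    cov = np.cov(X_centered, rowvar=False)      # covariance of the features
    eigvals, eigvecs = np.linalg.eigh(cov)      # symmetric matrix -> real eigenpairs
    order = np.argsort(eigvals)[::-1]           # sort by explained variance
    components = eigvecs[:, order[:n_components]]
    explained = eigvals[order[:n_components]] / eigvals.sum()
    return X_centered @ components, explained
```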
Continue reading Understanding the Mathematics behind Principal Component Analysis