Articles Fritz has written:

A 2019 Guide to Speech Synthesis with Deep Learning

Speech synthesis is the artificial production of human speech. This machine learning-based technique is used in text-to-speech, music generation, speech generation, speech-enabled devices, navigation systems, and accessibility tools for visually impaired people.

In this article, we’ll look at the research and model architectures that have been developed to do just that using deep learning.

But before we jump in, there are two traditional strategies for speech synthesis that we need to briefly outline: concatenative and parametric.

Continue reading A 2019 Guide to Speech Synthesis with Deep Learning

Applications of Matrix Decompositions for Machine Learning

In machine learning and statistics, we often have to deal with structured data, which is generally represented as a table of rows and columns, or a matrix. A lot of problems in machine learning can be solved using matrix algebra and vector calculus. In this blog post, I’m going to discuss a few problems that can be solved using matrix decomposition techniques, and which particular decomposition techniques have been shown to work well for a number of ML problems. This post is my effort to summarize matrix decompositions as taught by Rachel Thomas and Xuemei Chen in the Computational Linear Algebra course at the University of San Francisco. The whole course is available for free as part of the fast.ai online courses. Here is the link to the introductory post by Rachel Thomas about the course.
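
As a quick illustration of what a matrix decomposition looks like in code (my own sketch, not an example from the post or the course), here is NumPy’s SVD used to build a low-rank approximation of a small matrix:

```python
import numpy as np

# A small matrix standing in for structured data (rows = samples, columns = features).
A = np.array([
    [4.0, 2.0, 0.0, 1.0],
    [3.0, 1.0, 0.0, 1.0],
    [0.0, 0.0, 5.0, 4.0],
    [0.0, 1.0, 4.0, 5.0],
])

# Singular value decomposition: A = U @ diag(s) @ Vt.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Keep only the top-k singular values/vectors to form a rank-k approximation.
k = 2
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

print("Singular values:", np.round(s, 3))
print("Rank-2 approximation:\n", np.round(A_k, 2))
```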

Continue reading Applications of Matrix Decompositions for Machine Learning

Benchmarking deep learning activation functions on MNIST

Over the years, many activation functions have been introduced by various machine learning researchers. And with so many different activation functions to choose from, aspiring machine learning practitioners might not be able to see the forest for the trees. Although this range of options allows practitioners to train more accurate networks, it also makes it harder to know which one to use.
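
To make the benchmarking idea concrete, here is a minimal sketch (my own, not code from the article) that trains the same small Keras network on MNIST with a few different activation functions and compares test accuracy; the network size, epoch count, and set of activations are arbitrary choices:

```python
import tensorflow as tf

# Load and normalize MNIST.
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

results = {}
for act in ["relu", "tanh", "sigmoid", "elu"]:
    # Same architecture each time; only the hidden activation changes.
    model = tf.keras.Sequential([
        tf.keras.layers.Flatten(input_shape=(28, 28)),
        tf.keras.layers.Dense(128, activation=act),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(x_train, y_train, epochs=2, verbose=0)
    _, acc = model.evaluate(x_test, y_test, verbose=0)
    results[act] = acc

print(results)  # test accuracy per activation function
```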

Continue reading Benchmarking deep learning activation functions on MNIST

Building Production Machine Learning Systems

Machine learning systems built for production need to efficiently train, deploy, and update your machine learning models. Various factors have to be considered when deciding on the architecture of each system. Parts of this blog post are based on the Coursera and GCP (Google Cloud Platform) course on building production machine learning systems. Below, I’ll list some of the concerns in building a scalable machine learning system:

Continue reading Building Production Machine Learning Systems

Building an App Introduction Slider in React Native

In this tutorial, we’ll learn the basics of creating an intro slider using React Native. To do this, we’ll use a React Native module known as react-native-app-walkthrough, and the result will be a simple and configurable app introduction slider.

A detailed explanation of each step in this tutorial is provided below:

First, we simply need to install the react-native-app-walkthrough package. We can do this using npm (Node Package Manager) or yarn; for this tutorial, we’ll be using npm. Open your console or command prompt within your project directory and enter the following code snippet:

Continue reading Building an App Introduction Slider in React Native

5 Regression Loss Functions All Machine Learners Should Know

All algorithms in machine learning rely on minimizing or maximizing a function, which we call the “objective function”. Functions that are minimized are called “loss functions”. A loss function is a measure of how well a prediction model is able to predict the expected outcome. The most commonly used method for finding the minimum point of a function is “gradient descent”. Think of the loss function as an undulating mountain, and gradient descent as sliding down the mountain to reach its lowest point.
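
As a minimal sketch of these ideas (my own example, not code from the article), here are two common regression losses implemented with NumPy, followed by a toy gradient descent loop that “slides down” a simple one-parameter objective:

```python
import numpy as np

def mse(y_true, y_pred):
    """Mean Squared Error: average of squared differences."""
    return np.mean((y_true - y_pred) ** 2)

def mae(y_true, y_pred):
    """Mean Absolute Error: average of absolute differences."""
    return np.mean(np.abs(y_true - y_pred))

y_true = np.array([3.0, -0.5, 2.0, 7.0])
y_pred = np.array([2.5, 0.0, 2.0, 8.0])

print("MSE:", mse(y_true, y_pred))  # 0.375
print("MAE:", mae(y_true, y_pred))  # 0.5

# Minimal gradient descent on a toy objective f(w) = (w - 3)^2.
w, lr = 0.0, 0.1
for _ in range(100):
    grad = 2 * (w - 3)   # derivative of the objective with respect to w
    w -= lr * grad       # step downhill
print("w after gradient descent:", round(w, 3))  # converges to ~3.0
```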

Continue reading 5 Regression Loss Functions All Machine Learners Should Know

A 2019 Guide to Semantic Segmentation

Semantic segmentation refers to the process of linking each pixel in an image to a class label. These labels could include a person, car, flower, or piece of furniture, to mention just a few.

We can think of semantic segmentation as image classification at the pixel level. For example, in an image that contains many cars, semantic segmentation will label all of them with a single “car” class. However, a separate class of models, known as instance segmentation models, is able to label each separate instance where an object appears in an image. This kind of segmentation can be very useful in applications that count objects, such as counting the amount of foot traffic in a mall.
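
As a minimal sketch of what “classification at the pixel level” means in practice (my own example, with made-up shapes and class names, not code from the article), a segmentation model typically outputs a score per class for every pixel, and taking the argmax over the class dimension yields the label mask:

```python
import numpy as np

num_classes = 3          # e.g. 0 = background, 1 = person, 2 = car
height, width = 4, 4

# Stand-in for a model's output: one score per class for every pixel.
logits = np.random.randn(height, width, num_classes)

# Per-pixel argmax over the class dimension gives the segmentation mask.
mask = np.argmax(logits, axis=-1)

print(mask)          # a (4, 4) array of class indices, one label per pixel
print(mask.shape)    # (4, 4)
```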

Continue reading A 2019 Guide to Semantic Segmentation

A Definitive Guide for Audio Processing in Android with TensorFlow Lite Models

This guide describes how to process audio files in Android, in order to feed them into deep learning models built using TensorFlow.

TensorFlow Lite’s launch and subsequent progress have narrowed the gap between mobile development and AI. Don’t be surprised if app stores eventually end up flooded with AI/ML-powered apps.

Continue reading A Definitive Guide for Audio Processing in Android with TensorFlow Lite Models