Creating a 17 KB style transfer model with layer pruning and quantization

Articles

There are now a bunch of off-the-shelf tools for training artistic style transfer models and thousands of open source implementations. Most use a variation of the network architecture described by Johnson et al. to perform fast, feed-forward stylization.

As a result, the majority of the style transfer models you find are the same size: 7 MB. That’s not an unreasonably large asset to add to your application, but it’s also not insignificant.

Research suggests that neural networks are often way larger than they need to be—that many of the millions of weights they contain are insignificant and needlessly precise. So I wondered: What’s the smallest model I can create that still reliably performs style transfer?
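To make the "insignificant weights" idea concrete, here is a toy sketch of magnitude-based pruning: zero out the smallest-magnitude weights and keep only the largest. This is an illustration of the general technique, not the article's actual pipeline.

```python
# Toy magnitude pruning: keep only the largest-magnitude fraction of
# weights and zero the rest. The zeros then compress almost for free
# in a sparse or entropy-coded format.

def prune(weights, keep_ratio):
    """Return a copy of weights with all but the top keep_ratio
    fraction (by absolute value) set to zero."""
    k = max(1, int(len(weights) * keep_ratio))
    threshold = sorted(abs(w) for w in weights)[-k]
    return [w if abs(w) >= threshold else 0.0 for w in weights]

weights = [0.91, -0.02, 0.005, -0.78, 0.013, 0.44, -0.001, 0.09]
pruned = prune(weights, keep_ratio=0.5)
# Half the weights survive; the near-zero ones are dropped outright.
```

In practice pruning is applied layer by layer (or whole layers are removed), and the model is usually fine-tuned afterwards to recover accuracy, but the core intuition is this simple.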

Continue reading “Creating a 17 KB style transfer model with layer pruning and quantization”

Analyze and Visualize Detected Video Objects Using Keras and ImageAI

Articles

Last month, I wrote an article that explored the nature of videos and how to use Keras, OpenCV, and ImageAI to easily run object detection code on videos and a live camera feed. If you haven’t read that article, I’d recommend starting there, as it will be very helpful in understanding what will be covered in this post.

The previous article walked us through installing all the Python dependencies, downloading the detection models, running the code on both video files and a live camera feed, and saving the detected video.

Continue reading “Analyze and Visualize Detected Video Objects Using Keras and ImageAI”

Engineering Feature Engineering

Articles
What is feature engineering?

Feature engineering is the use of domain knowledge to create features that make machine learning algorithms work. It’s a paramount step in the real-world application of ML.

It’s also both difficult and expensive.

Feature engineering is essentially the process of creating new input features from existing attributes to improve model performance. It’s about isolating/highlighting key information to help your algorithm “focus” on what’s important. Feature engineering takes place in both data preparation and model building.
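As a concrete sketch of "creating new input features from existing attributes", here is a hypothetical transaction record being enriched before modeling. The field names (`amount`, `avg_amount`, `timestamp`) are invented for illustration.

```python
# Hypothetical feature engineering on a raw transaction record:
# derive features that isolate patterns the model should "focus" on.
from datetime import datetime

def engineer(record):
    ts = datetime.fromisoformat(record["timestamp"])
    return {
        **record,
        # Ratio feature: spend relative to the account's typical spend.
        "amount_vs_avg": record["amount"] / record["avg_amount"],
        # Extracted feature: hour of day is often more predictive
        # than the raw timestamp string.
        "hour": ts.hour,
        # Boolean flag highlighting a behavioral pattern.
        "is_weekend": ts.weekday() >= 5,
    }

raw = {"timestamp": "2021-06-05T23:15:00", "amount": 320.0, "avg_amount": 40.0}
features = engineer(raw)
```

This is where the domain knowledge comes in: someone has to know that an 8x-normal purchase late on a weekend night is worth surfacing as its own signal.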

Continue reading “Engineering Feature Engineering”

Exploring Convolutional Neural Networks (CNNs) from an iOS Developer’s Perspective

Articles

This article is for anyone who wants to learn the basics of neural networks applied to graphical content, which can be either images or videos.

As an iOS developer myself, I was initially intimidated by anything related to machine learning. But one day I stumbled upon a computer vision book, Computer Vision: Algorithms and Applications by Richard Szeliski. I wouldn’t recommend it for beginners, but it was a good introduction for me.

Continue reading “Exploring Convolutional Neural Networks (CNNs) from an iOS Developer’s Perspective”

Creator Conversations: An Interview With Julien Van Dorland

Articles Interviews

Julien Van Dorland is a digital creator and entrepreneur who specializes in social augmented reality. He is an official lens creator at Snapchat, which allows him to work with some of the world’s biggest and most influential brands. Outside of his work with Snapchat, Julien has created multiple business ventures in the digital and fashion marketplace, including Arvable.

Jonah Cohn is an Augmented Reality (AR) artist specializing in creations for Snapchat and Instagram. Since 2018, Jonah has been enhancing the analog world around us through unique and attention-grabbing AR experiences. Obsessed with digital art since childhood, Jonah spent years perfecting his craft before offering his services professionally.

Continue reading “Creator Conversations: An Interview With Julien Van Dorland”

Implementing a Natural Language Classifier in iOS with Keras + Core ML

Articles

IBM Watson NLC and Conversation services (as well as many other NLU cloud platforms) provide a Swift SDK to use in custom apps to implement intent understanding from natural language utterances.

These SDKs and the corresponding NLU platforms are super powerful. They provide much more than intent understanding alone: they also detect entities/slots and provide tools to manage complex, long-running conversation dialogs.
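To make "intent understanding" concrete, here is a deliberately tiny keyword-overlap classifier. It stands in for the idea only; it is not the article's Keras model, and the intent names and vocabularies are invented for illustration.

```python
# Minimal intent classifier: score each intent by how much of its
# vocabulary overlaps with the tokens in the utterance.
INTENTS = {
    "turn_on_lights": {"turn", "on", "lights", "lamp"},
    "play_music":     {"play", "music", "song"},
    "set_alarm":      {"set", "alarm", "wake"},
}

def classify(utterance):
    tokens = set(utterance.lower().split())
    scores = {name: len(tokens & vocab) for name, vocab in INTENTS.items()}
    # Pick the intent with the largest overlap.
    return max(scores, key=scores.get)

intent = classify("please turn on the lights")
```

A real NLU platform replaces the hand-written vocabularies with a trained model (the Keras network this article builds), but the input/output contract — utterance in, intent label out — is the same.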

Continue reading “Implementing a Natural Language Classifier in iOS with Keras + Core ML”

Emotion Recognition for Cats — Custom Vision & Core ML on a Swift Playground

Articles

I love Swift Playgrounds, I really do. That’s why it was time to pick up my Playgrounds repository again to add some new features (along with adapting it to Swift 4).

This particular Playground is an intro to Microsoft Azure’s Cognitive Services — and I really fell in love with the emotion detection feature 😍. So I decided that it’s time to get my hands dirty with the Custom Vision Service.

The Computer Vision API is already in the Playground; it tells you what’s in a picture and gives you a proper description of it. But the underlying model is trained quite broadly and with general terms, so detecting very specific things is quite difficult…like, for instance, cat emotions!

Continue reading “Emotion Recognition for Cats — Custom Vision & Core ML on a Swift Playground”

Building Blocks of Image Segmentation

Articles

Computers’ ability to deeply understand images gives us the power to extract information from them in an automated manner. A technique as simple as thresholding can extract enough information to be useful in a range of applications.

More advanced algorithms can extract much more complex information about the context and content of images, enabling tasks such as classification and detection.
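Thresholding really is as simple as it sounds — here is a minimal sketch: every pixel brighter than a cutoff becomes foreground, everything else background.

```python
# Binary thresholding: the simplest form of image segmentation.
def threshold(image, cutoff):
    """image: 2-D list of grayscale values in [0, 255].
    Returns a binary mask (1 = foreground, 0 = background)."""
    return [[1 if px > cutoff else 0 for px in row] for row in image]

gray = [
    [ 12,  30, 200],
    [ 25, 210, 220],
    [  5,  18,  40],
]
mask = threshold(gray, cutoff=128)
# The mask isolates the bright region in the upper-right corner.
```

Real segmentation pipelines use smarter cutoffs (e.g. Otsu's method picks one automatically from the histogram), but the output is the same kind of per-pixel mask.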

Continue reading “Building Blocks of Image Segmentation”

Neural Network Quantization Research Review

Articles

Neural network quantization is the process of reducing the precision of the weights in a neural network, thus reducing its memory footprint, computation cost, and energy consumption.

Quantization, and model compression in general, is particularly desirable when deploying NN models on mobile or edge devices. Because the memory and computational budgets of these devices are very limited, it is often the only plausible way to deploy a model at all.
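As a sketch of the core idea, here is affine (asymmetric) quantization: map each float weight onto an 8-bit integer via a scale and a zero point, so every 32-bit float becomes one byte. This is an illustration of the general scheme, not any particular framework's implementation.

```python
# Affine quantization: float w  ->  int q = round(w / scale) + zero_point
# Dequantization recovers an approximation: w' = (q - zero_point) * scale

def quantize(weights, n_bits=8):
    lo, hi = min(weights), max(weights)
    qmax = 2 ** n_bits - 1
    scale = (hi - lo) / qmax
    zero_point = round(-lo / scale)
    # Clamp so rounding at the extremes stays inside [0, qmax].
    q = [min(qmax, max(0, round(w / scale) + zero_point)) for w in weights]
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    return [(qi - zero_point) * scale for qi in q]

weights = [-1.0, -0.5, 0.0, 0.4, 1.0]
q, scale, zp = quantize(weights)
approx = dequantize(q, scale, zp)
# approx matches weights to within one quantization step (scale),
# at a quarter of the storage cost.
```

The reconstruction error is bounded by the step size `scale`, which is why narrow weight ranges quantize well — and why the research covered in this review spends so much effort on choosing ranges, per-channel scales, and quantization-aware training.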

Continue reading “Neural Network Quantization Research Review”