Implementing a Natural Language Classifier in iOS with Keras + Core ML

Articles

IBM Watson NLC and Conversation services (as well as many other NLU cloud platforms) provide Swift SDKs that custom apps can use to implement intent understanding from natural language utterances.

These SDKs and the corresponding NLU platforms are super powerful. They provide much more than intent understanding alone: they also detect entities/slots and provide tools to manage complex, long-running conversation dialogs.
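To make the idea concrete, intent classification at its simplest maps an utterance to the closest-matching training example. The sketch below is a deliberately toy version (bag-of-words plus cosine similarity, with made-up intents and utterances), nothing like the neural models Watson or a Keras classifier would use, but it shows the input/output contract:

```python
from collections import Counter
import math

# Toy intent "training set": intent name -> example utterances.
# Intents and utterances here are invented for illustration only.
TRAINING = {
    "weather": ["what is the weather today", "will it rain tomorrow"],
    "greeting": ["hello there", "hi how are you"],
}

def vectorize(text):
    # Bag-of-words: token -> count
    return Counter(text.lower().split())

def cosine(a, b):
    common = set(a) & set(b)
    dot = sum(a[t] * b[t] for t in common)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def classify(utterance):
    # Return the intent whose example is most similar to the utterance.
    vec = vectorize(utterance)
    best_intent, best_score = None, 0.0
    for intent, examples in TRAINING.items():
        for ex in examples:
            score = cosine(vec, vectorize(ex))
            if score > best_score:
                best_intent, best_score = intent, score
    return best_intent

print(classify("will it be rainy today"))  # → weather
```

A real NLC model replaces the similarity lookup with a trained classifier, but the contract is the same: utterance in, intent label out.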

Continue reading “Implementing a Natural Language Classifier in iOS with Keras + Core ML”

Emotion Recognition for Cats — Custom Vision & Core ML on a Swift Playground

Articles

I love Swift Playgrounds, I really do. That’s why it was time to pick up my Playgrounds repository again to add some new features (along with adapting it to Swift 4).

This particular Playground is an intro to Microsoft Azure’s Cognitive Services — and I really fell in love with the emotion detection feature 😍. So I decided that it’s time to get my hands dirty with the Custom Vision Service.

The Computer Vision API is already in the Playground, which tells you what’s on a picture and gives you a proper description of it. But the underlying model is trained quite broadly and with general terms, so detecting very specific things is quite difficult…like, for instance, cat emotions!

Continue reading “Emotion Recognition for Cats — Custom Vision & Core ML on a Swift Playground”

Building Blocks of Image Segmentation

Articles

Computers can now understand images deeply enough that we can use them to extract information in an automated manner. A technique as simple as thresholding can extract enough information to be useful in many applications.

More advanced algorithms can extract much more complex information about the context and content of images, enabling tasks such as classification and detection.
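As an illustration of how far a simple technique can go, here is a minimal global-thresholding sketch in plain Python. The 4×4 "image" and the threshold value of 128 are made up for illustration; a real pipeline would use a library like OpenCV and pick the threshold from the data (e.g., Otsu's method):

```python
# A tiny grayscale "image" (pixel intensities 0-255), invented for this example.
image = [
    [ 12,  40, 200, 210],
    [ 30,  25, 220, 190],
    [ 15, 180, 205,  35],
    [ 10,  20,  30,  45],
]

def threshold(img, t):
    # Global thresholding: pixels brighter than t become foreground (1),
    # everything else becomes background (0).
    return [[1 if px > t else 0 for px in row] for px_row_idx, row in enumerate(img)]

mask = threshold(image, 128)
# mask now marks the bright region (the "object") with 1s
```

The resulting binary mask is the simplest possible segmentation: one foreground region, one background region.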

Continue reading “Building Blocks of Image Segmentation”

Neural Network Quantization Research Review

Articles

Neural network quantization is the process of reducing the precision of a network's weights, which in turn reduces its memory footprint, computation cost, and energy consumption.

Particularly when deploying NN models on mobile or edge devices, quantization (and model compression in general) is desirable and often the only feasible option, as the memory and computational budgets of these devices are very limited.
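For a concrete picture of what quantization does, here is a minimal sketch of affine (asymmetric) 8-bit quantization in plain Python. Real toolchains do this per layer (often per channel) and handle many edge cases; the weight values below are made up:

```python
# Made-up float32 weights for a single layer.
weights = [-0.7, -0.1, 0.0, 0.3, 1.2]

def quantize(ws, num_bits=8):
    # Affine quantization: map [w_min, w_max] onto integers [0, 2^bits - 1].
    qmin, qmax = 0, 2 ** num_bits - 1
    w_min, w_max = min(ws), max(ws)
    scale = (w_max - w_min) / (qmax - qmin)
    zero_point = round(qmin - w_min / scale)
    q = [max(qmin, min(qmax, round(w / scale) + zero_point)) for w in ws]
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    # Recover approximate float weights from the integer codes.
    return [(qi - zero_point) * scale for qi in q]

q, scale, zp = quantize(weights)
recovered = dequantize(q, scale, zp)
# Each recovered weight is within one quantization step (scale) of the original,
# but the weights now need 8 bits each instead of 32.
```

The memory saving is the 4× drop from 32-bit floats to 8-bit integers; the cost is the small rounding error bounded by the scale.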

Continue reading “Neural Network Quantization Research Review”

An architecture for production-ready natural speech synthesizer

Articles

The recent improvements in the field of speech synthesis have led to many innovative technologies, offering a wide range of useful applications such as automatic speech recognition, natural speech synthesis, voice cloning, digital dictation, and so forth.

Deep learning has played an immensely important role in improving existing speech synthesis approaches by replacing the entire pipeline with neural networks trained on data alone.

Continue reading “An architecture for production-ready natural speech synthesizer”

Build Custom Image Classification Models for Mobile with Flutter, ML Kit, and AutoML

Articles

Building a machine learning model to identify custom images can require a lot of data collection and a lot of time to do correctly.

This article is not a theoretical one—rather, I’ll walk you through the steps for creating an app that you could distribute amongst your peers to collect, create, and analyze TensorFlow models.

If you want a custom image classifier, but don’t have the right data or the know-how to build it, you’ve come to the right place.

Continue reading “Build Custom Image Classification Models for Mobile with Flutter, ML Kit, and AutoML”

Practical tips for better quantization results

Articles

Quantization can make your deep learning models smaller, faster, and more energy-efficient (I’ve written about this previously).

But the process may cause significant accuracy loss, or fail to improve prediction speed, if done incorrectly. So here, I'm sharing some practical tips for minimizing accuracy loss while maintaining good inference speed. These points are valid for both post-training quantization and quantization-aware training.
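To illustrate the kind of tip involved: when weight ranges differ widely across channels, per-channel scales usually lose far less accuracy than a single per-tensor scale. A toy sketch with symmetric int8 quantization and made-up values (this is a common recommendation in quantization toolchains, not a claim about any specific framework):

```python
# Two made-up weight "channels" with very different magnitudes.
channels = [
    [0.01, -0.02, 0.015],   # small-magnitude channel
    [2.0, -1.5, 1.8],       # large-magnitude channel
]

def quant_error(values, scale):
    # Symmetric int8: q = round(v / scale), clamped to [-127, 127].
    # Returns total absolute round-trip error for this scale.
    err = 0.0
    for v in values:
        q = max(-127, min(127, round(v / scale)))
        err += abs(q * scale - v)
    return err

# Per-tensor: one scale for everything, set by the global max magnitude.
flat = [v for ch in channels for v in ch]
scale_t = max(abs(v) for v in flat) / 127
per_tensor_err = sum(quant_error(ch, scale_t) for ch in channels)

# Per-channel: each channel gets its own scale from its own max magnitude.
per_channel_err = sum(
    quant_error(ch, max(abs(v) for v in ch) / 127) for ch in channels
)

# The small channel is crushed by the global scale, so per-channel wins.
assert per_channel_err < per_tensor_err
```

The intuition: a single scale must cover the largest channel, leaving the small channel only a handful of usable integer levels.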

Continue reading “Practical tips for better quantization results”

Character AI Review (2024): A New Realm of Imagination

AI Tools Articles

So I just had a brief chat with Elon Musk about his SpaceX program. I also discussed life and its meaning with Socrates.

I’m not joking – it’s real and all this is possible with artificial intelligence. I’m talking about a chatbot service called Character AI where you can chat with your favorite characters.

There are many chatbot services on the market, but with Character AI, it's like you're actually chatting with the character you choose.

Continue reading “Character AI Review (2024): A New Realm of Imagination”

Five Contemporary Natural Language Processing Problems Pushed Forward by DL

Articles

Computer scientists have been attempting to solve many different Natural Language Processing (NLP) problems since the time computers were conceived.

But the field became somewhat stagnant around the 2000s, and did not gain traction again until the ‘Deep Learning’ boom that occurred during the last decade.

Among the many factors that helped build this traction was the spike in available textual data, thanks to the rise in the number of web and mobile applications.

Continue reading “Five Contemporary Natural Language Processing Problems Pushed Forward by DL”