Model Management and Optimization on iOS
When building and optimizing machine learning models for iOS deployment, there are a number of unique factors to consider: resource constraints, inference speed, model conversion, and access to advanced hardware features. Getting an ML model working effectively inside an iOS app can take quite a bit of legwork.
Given the inherent difficulties of managing and optimizing iOS-ready ML models, we’ve curated an authoritative collection of articles and tutorials that explore this range of concerns.
Swipeless Tinder Using iOS 14 Vision Hand Pose Estimation
Let’s use the power of computer vision to detect hand gestures in iOS
Evolving Your iOS App’s Intelligence with Core ML Model Deployment
Continuously improve your models without rogue on-device training — or updating your app entirely.
— by Danny Bolella
How to Run Core ML Models in Swift Playground
Test your Core ML models quickly and efficiently with Swift Playgrounds.
— by Özgür Şahin
Protecting Core ML Models
Exploring various methods to protect your valuable on-device IP.
— by Georguy
Swift loves TensorFlow and Core ML
Hacking Core ML protobuf data structures to export Swift for TensorFlow models to Core ML, and personalizing S4TF models on-device using Core ML 3 model personalization.
— by Jacopo Mangiavacchi
iOS 12 Core ML Benchmarks
Performance benchmarks for Core ML in iOS 12, on Apple’s A12 Bionic Processor.
— by Jameson Toole
Reverse Engineering Core ML
When a developer deploys a machine learning model to mobile, they lose control over how the model is accessed or used. This post shows how to take a Core ML model and reconstruct the original model.
— by Christopher Kelly
Using coremltools to Convert a Keras Model to Core ML for iOS
Learn the simple steps for converting a Keras machine learning model to Core ML format to power your next iOS app with ML features.
— by Carlos Gonzalez
How to Fit Large Neural Networks on the Edge
Exploring techniques used to fit neural networks in memory-constrained edge settings.
Machine learning on iOS and Android
Exploring machine learning on mobile with benefits, use cases, and developer environments.
— by Austin Kodra
Organizing mobile machine learning projects with the Fritz CLI
Learn how to use the Fritz CLI to more effectively organize, track, and manage your mobile machine learning projects.
— by Jameson Toole
Reducing Core ML 2 Model Size by 4X Using Quantization in iOS 12
Learn how to reduce the size of Core ML 2 models by up to 4x using quantization.
— by Alexis Creuzot
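The 4x figure in the post above comes from storing each 32-bit float weight as an 8-bit integer. Here is a minimal, framework-free sketch of that idea in plain Python with made-up numbers — in practice, coremltools performs this for you on a whole model, and this toy `quantize_8bit` helper is purely illustrative:

```python
# Linear 8-bit quantization: store one scale, one offset, and one byte
# per weight instead of four bytes -- roughly a 4x size reduction.

def quantize_8bit(weights):
    """Map float weights onto the integer range 0..255."""
    lo, hi = min(weights), max(weights)
    scale = (hi - lo) / 255.0 or 1.0  # avoid divide-by-zero if all equal
    q = [round((w - lo) / scale) for w in weights]
    return q, scale, lo

def dequantize(q, scale, lo):
    """Recover approximate float weights at inference time."""
    return [lo + qi * scale for qi in q]

weights = [-1.2, -0.4, 0.0, 0.33, 0.9, 1.5]
q, scale, lo = quantize_8bit(weights)
restored = dequantize(q, scale, lo)

# Rounding to the nearest step means each restored weight is within
# half a quantization step of the original.
assert all(abs(w - r) <= scale / 2 + 1e-9 for w, r in zip(weights, restored))
```

The accuracy cost is the per-weight rounding error (at most half a quantization step), which is why quantized models usually perform nearly as well as their full-precision originals.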
How smartphones handle huge Neural Networks
6 powerful techniques that enable neural networks to run on mobile phones in real-time.
— by Julien Despois
Does my Core ML model run on Apple’s Neural Engine?
Taking advantage of hardware acceleration via Apple’s Neural Engine is important if you want your application to run fast and not drain a user’s battery.
— by Jameson Toole
Figuring out if Core ML models use the Apple Neural Engine
A follow-up to the post above on ANE coverage. Learn a programmatic technique to determine whether your Core ML model is running on Apple’s Neural Engine.
— by Konrad Mokiejewski
Advanced Tips for Core ML
In this post, you’ll learn 4 advanced tips to help manage your mobile machine learning projects as they grow in scale and complexity.
— by Jameson Toole
Creating a 17 KB style transfer model with layer pruning and quantization
Learn how to drastically shrink the size of your machine learning models using pruning and quantization.
— by Jameson Toole
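The pruning half of that recipe is worth a quick illustration. The post itself prunes layers of a Keras model; the sketch below shows only the underlying magnitude-pruning idea in plain Python, with a hypothetical `prune_by_magnitude` helper and made-up numbers:

```python
# Magnitude pruning: zero out the smallest-magnitude weights so the
# tensor becomes sparse and compresses well (e.g. with gzip or a
# sparse storage format).

def prune_by_magnitude(weights, sparsity):
    """Zero out (at least) the `sparsity` fraction of weights closest to zero.

    Ties at the threshold may zero slightly more than the requested
    fraction; that's acceptable for this sketch.
    """
    k = int(len(weights) * sparsity)
    threshold = sorted(abs(w) for w in weights)[k - 1] if k else 0.0
    return [0.0 if abs(w) <= threshold else w for w in weights]

weights = [0.01, -0.8, 0.003, 1.2, -0.05, 0.7, -0.002, 0.4]
pruned = prune_by_magnitude(weights, sparsity=0.5)

# Half of the weights are now exactly zero; the large ones survive.
assert sum(1 for w in pruned if w == 0.0) == len(weights) // 2
```

Pruning alone does not shrink a dense weight file — the zeros still occupy space — which is why the article pairs it with quantization and compression-friendly storage to reach such a tiny final model.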
Distributing on-device machine learning models with tags and metadata
Developers using Fritz AI can now add tags and metadata to on-device machine learning models, affording more control over distribution and usage.
— by Jameson Toole
Create ML for iOS — Increasing model accuracy
Learn methods for increasing model accuracy metrics when building iOS-ready machine learning models with Apple’s Create ML.
— by Navdeep Singh
Editor’s Note: Heartbeat is a contributor-driven online publication and community dedicated to providing premier educational resources for data science, machine learning, and deep learning practitioners. We’re committed to supporting and inspiring developers and engineers from all walks of life.
Editorially independent, Heartbeat is sponsored and published by Comet, an MLOps platform that enables data scientists & ML teams to track, compare, explain, & optimize their experiments. We pay our contributors, and we don’t sell ads.
If you’d like to contribute, head on over to our call for contributors. You can also sign up to receive our weekly newsletters (Deep Learning Weekly and the Comet Newsletter), join us on Slack, and follow Comet on Twitter and LinkedIn for resources, events, and much more that will help you build better ML models, faster.