Running deep learning inference on mobile and edge devices is more popular than ever, and developers now have more options for AI development on these little companions than they could have guessed.
Not only has deploying machine learning models, the standard approach for tasks such as computer vision, become faster and easier on mobile devices, but renewed competition among the frameworks that support them has also pushed the process to new heights of performance, flexibility, and adaptability.