Initially controversial, the idea that deep learning is eating software has become accepted wisdom. Not now, not next year, but very soon, the way we approach software development will be fundamentally different.
We’re seeing the first signs of these changes with increased accessibility to machine learning at the edge. Inevitably, in much the same way as we saw an explosion in the number of form factors in the microcontroller and single-board computing market, manufacturers are now experimenting with how they should offer deep learning to developers.
Billed as the “first embedded ultra-compact artificial intelligence processing card,” and built around the same Intel Movidius chip as Intel’s own Neural Compute Stick, the UP AI Core is machine learning on a mini-PCIe board.
The UP AI Core has 512MB of DDR SDRAM and is a standard-looking mini-PCIe board measuring 51×30 mm, while the onboard Movidius chip supports both the TensorFlow and Caffe frameworks.
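In practice, "supporting" TensorFlow and Caffe means compiling a trained model into the Movidius graph format before it can run on the chip. As a rough sketch of that workflow using the Neural Compute SDK's `mvNCCompile` tool — the file names here (`deploy.prototxt`, `weights.caffemodel`, `model.meta`) and the input/output layer names are placeholders for your own model:

```shell
# Compile a trained Caffe model into a Movidius graph file.
# -w supplies the weights, -s the number of SHAVE cores to use,
# -in/-on name the input and output layers, -o the output graph.
mvNCCompile deploy.prototxt -w weights.caffemodel \
    -s 12 -in input -on output -o caffe_graph

# The equivalent for a TensorFlow model saved as a checkpoint:
mvNCCompile model.meta -s 12 -in input -on output -o tf_graph
```

The resulting graph file is then loaded onto the device at runtime via the SDK's API, so the heavy framework dependencies live only on the machine where you compile the model.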
To support the board, the host computer needs at least 1GB of RAM and 4GB of free storage space. Right now, only 64-bit x86 boards running Ubuntu 16.04 are fully supported. However, that’s a requirement of the Movidius toolchain rather than something inherent in the design of the UP board itself, so presumably as the support requirements around the Movidius chip change, so will the requirements around the board.
However, there has been a lot of work since the release of the Movidius Neural Compute Stick to get it working on the Raspberry Pi, so you may well be able to use the board with an Arm-based single-board computer that has an appropriate mini-PCIe slot, like the Pine H64. But without official support, your mileage may vary.
The UP AI Core is now available for $69. It is designed to pair with the UP Core Plus, but should work with any single-board computer that has a mini-PCIe interface — although you should check toolchain support for the Movidius chip before buying the board to use on another platform.